WorldWideScience

Sample records for big bayou creeks

  1. Big Bayou Creek and Little Bayou Creek Watershed Monitoring Program

    Energy Technology Data Exchange (ETDEWEB)

    Kszos, L.A.; Peterson, M.J.; Ryon, M.G.; Smith, J.G.

    1999-03-01

    Biological monitoring of Little Bayou and Big Bayou creeks, which border the Paducah Site, has been conducted since 1987. Biological monitoring was conducted by the University of Kentucky from 1987 to 1991 and by staff of the Environmental Sciences Division (ESD) at Oak Ridge National Laboratory (ORNL) from 1991 through March 1999. In March 1998, renewed Kentucky Pollutant Discharge Elimination System (KPDES) permits were issued to the US Department of Energy (DOE) and US Enrichment Corporation. The renewed DOE permit requires that a watershed monitoring program be developed for the Paducah Site within 90 days of the effective date of the renewed permit. This plan outlines the sampling and analysis that will be conducted for the watershed monitoring program. The objectives of the watershed monitoring are to (1) determine whether discharges from the Paducah Site and the Solid Waste Management Units (SWMUs) associated with the Paducah Site are adversely affecting instream fauna, (2) assess the ecological health of Little Bayou and Big Bayou creeks, (3) assess the degree to which abatement actions ecologically benefit Big Bayou Creek and Little Bayou Creek, (4) provide guidance for remediation, (5) provide an evaluation of changes in potential human health concerns, and (6) provide data which could be used to assess the impact of inadvertent spills or fish kills. According to the plan, cleanup will result in these watersheds [Big Bayou and Little Bayou creeks] achieving compliance with the applicable water quality criteria.

  2. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  3. Big Canyon Creek Ecological Restoration Strategy.

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Lynn; Richardson, Shannon

    2007-10-01

    He-yey, Nez Perce for steelhead or rainbow trout (Oncorhynchus mykiss), are a culturally and ecologically significant resource within the Big Canyon Creek watershed; they are also part of the federally listed Snake River Basin Steelhead DPS. The majority of the Big Canyon Creek drainage is considered critical habitat for that DPS as well as for the federally listed Snake River fall chinook (Oncorhynchus tshawytscha) ESU. The Nez Perce Soil and Water Conservation District (District) and the Nez Perce Tribe Department of Fisheries Resources Management-Watershed (Tribe), in an effort to support the continued existence of these and other aquatic species, have developed this document to direct funding toward priority restoration projects in priority areas for the Big Canyon Creek watershed. In order to achieve this, the District and the Tribe: (1) Developed a working group and technical team composed of managers from a variety of stakeholders within the basin; (2) Established geographically distinct sub-watershed areas called Assessment Units (AUs); (3) Created a prioritization framework for the AUs and prioritized them; and (4) Developed treatment strategies to utilize within the prioritized AUs. Assessment Units were delineated by significant shifts in sampled juvenile O. mykiss (steelhead/rainbow trout) densities, which were found to fall at fish passage barriers. The prioritization framework considered four aspects critical to determining the relative importance of performing restoration in a certain area: density of critical fish species, physical condition of the AU, water quantity, and water quality. It was established, through rigorous data analysis within these four areas, that the geographic priority areas for restoration within the Big Canyon Creek watershed are Big Canyon Creek from stream km 45.5 to the headwaters, Little Canyon from km 15 to 30, and the mainstem corridors of Big Canyon (mouth to 7 km) and Little Canyon (mouth to 7 km).
The District and the Tribe

  4. Final report. Paducah Gaseous Diffusion Plant PCB sediment survey: Big Bayou Creek and Little Bayou Creek, Paducah, Kentucky

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    This report documents laboratory analysis of samples collected along drainage features at the Paducah Gaseous Diffusion Plant (PGDP). It reports levels of PCB contamination and considers the locations of contamination and known releases to theorize probable sources for further investigation.

  5. A validation test for Adagio through replication of Big Hill and Bayou Choctaw JAS3D models.

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung Yoon

    2013-06-01

    JAS3D, a three-dimensional iterative solid mechanics code, has been used for structural analyses of the Strategic Petroleum Reserve system since the 1990s. JAS3D is no longer supported by Sandia National Laboratories and has been replaced by Adagio. To validate the transition from JAS3D to Adagio, the existing JAS3D input decks and user subroutines for the Bayou Choctaw and Big Hill models were converted for use with Adagio. The calculation results from the Adagio runs are compared to those from JAS3D. Because the Adagio results are very similar to the JAS3D results, Adagio is judged to be performing satisfactorily.
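Code-to-code validation of this kind typically reduces to checking that corresponding outputs from the two solvers agree within a tolerance. A minimal sketch (the displacement values and the 5% tolerance are illustrative, not taken from the report):

```python
import numpy as np

def results_agree(ref, new, rel_tol=0.05):
    """True if the new code's output matches the reference within rel_tol."""
    ref, new = np.asarray(ref, float), np.asarray(new, float)
    return bool(np.allclose(new, ref, rtol=rel_tol))

# Illustrative displacement histories from the reference and replacement solvers
jas3d_disp  = [0.00, 0.12, 0.25, 0.37]
adagio_disp = [0.00, 0.12, 0.26, 0.36]
print(results_agree(jas3d_disp, adagio_disp))  # True -> satisfactory agreement
```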

  6. Fish Passage Assessment: Big Canyon Creek Watershed, Technical Report 2004.

    Energy Technology Data Exchange (ETDEWEB)

    Christian, Richard

    2004-02-01

    This report presents the results of the fish passage assessment outlined as part of the Protect and Restore the Big Canyon Creek Watershed project, as detailed in the CY2003 Statement of Work (SOW). As part of the Northwest Power Planning Council's Columbia Basin Fish and Wildlife Program (FWP), this project is one of Bonneville Power Administration's (BPA) many efforts at off-site mitigation for damage to salmon and steelhead runs, their migration, and wildlife habitat caused by the construction and operation of federal hydroelectric dams on the Columbia River and its tributaries. The proposed restoration activities within the Big Canyon Creek watershed follow the watershed restoration approach mandated by the Fisheries and Watershed Program. The Nez Perce Tribal Fisheries/Watershed Program vision focuses on protecting, restoring, and enhancing watersheds and treaty resources within the ceded territory of the Nez Perce Tribe under the Treaty of 1855 with the United States Federal Government. The program uses a holistic approach, which encompasses entire watersheds, ridge top to ridge top, emphasizing all cultural aspects, and strives to maximize historic ecosystem health and productivity for the restoration of anadromous and resident fish populations. The Nez Perce Tribal Fisheries/Watershed Program (NPTFWP) sponsors the Protect and Restore the Big Canyon Creek Watershed project. The NPTFWP has the authority to allocate funds under the provisions set forth in their contract with BPA. In the state of Idaho, vast numbers of relatively small obstructions, such as road culverts, block thousands of miles of habitat suitable for a variety of fish species. To date, most agencies and land managers have not had sufficient, quantifiable data to adequately address these barrier sites. The ultimate objective of this comprehensive inventory and assessment was to identify all barrier crossings within the watershed. The barriers were then prioritized according to the

  7. Thermal discharges from Paducah Gaseous Diffusion Plant outfalls: Impacts on stream temperatures and fauna of Little Bayou and Big Bayou Creeks

    Energy Technology Data Exchange (ETDEWEB)

    Roy, W.K.; Ryon, M.G.; Hinzman, R.L. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.]

    1996-03-01

    The development of a biological monitoring plan for the receiving streams of the Paducah Gaseous Diffusion Plant (PGDP) began in the late 1980s, because of an Agreed Order (AO) issued in September 1987 by the Kentucky Division of Water (KDOW). Five years later, in September 1992, more stringent effluent limitations were imposed upon the PGDP operations when the KDOW reissued Kentucky Pollutant Discharge Elimination System permit No. KY 0004049. This action prompted the US Department of Energy (DOE) to request a stay of certain limits contained in the permit. An AO is being negotiated between KDOW, the US Enrichment Corporation (USEC), and DOE that will require that several studies be conducted, including this stream temperature evaluation study, in an effort to establish permit limitations. All issues associated with this AO have been resolved, and the AO is currently being signed by all parties involved. The proposed effluent temperature limit is 89 F (31.7 C) as a mean monthly temperature. In the interim, temperatures are not to exceed 95 F (35 C) as a monthly mean or 100 F (37.8 C) as a daily maximum. This study includes detailed monitoring of instream temperatures, benthic macroinvertebrate communities, fish communities, and a laboratory study of thermal tolerances.
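The Fahrenheit permit limits and their Celsius equivalents quoted above follow the standard conversion C = (F - 32) * 5/9. A quick check of the three quoted limits (a simple verification, not part of the study itself):

```python
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32.0) * 5.0 / 9.0

# The three permit limits quoted in the abstract, rounded to one decimal
for deg_f, deg_c in [(89, 31.7), (95, 35.0), (100, 37.8)]:
    assert round(f_to_c(deg_f), 1) == deg_c
```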

  8. Thermal discharges from Paducah Gaseous Diffusion Plant outfalls: Impacts on stream temperatures and fauna of Little Bayou and Big Bayou Creeks

    International Nuclear Information System (INIS)

    Roy, W.K.; Ryon, M.G.; Hinzman, R.L.

    1996-03-01

    The development of a biological monitoring plan for the receiving streams of the Paducah Gaseous Diffusion Plant (PGDP) began in the late 1980s, because of an Agreed Order (AO) issued in September 1987 by the Kentucky Division of Water (KDOW). Five years later, in September 1992, more stringent effluent limitations were imposed upon the PGDP operations when the KDOW reissued Kentucky Pollutant Discharge Elimination System permit No. KY 0004049. This action prompted the US Department of Energy (DOE) to request a stay of certain limits contained in the permit. An AO is being negotiated between KDOW, the US Enrichment Corporation (USEC), and DOE that will require that several studies be conducted, including this stream temperature evaluation study, in an effort to establish permit limitations. All issues associated with this AO have been resolved, and the AO is currently being signed by all parties involved. The proposed effluent temperature limit is 89 F (31.7 C) as a mean monthly temperature. In the interim, temperatures are not to exceed 95 F (35 C) as a monthly mean or 100 F (37.8 C) as a daily maximum. This study includes detailed monitoring of instream temperatures, benthic macroinvertebrate communities, fish communities, and a laboratory study of thermal tolerances.

  9. Thermal Discharges from Paducah Gaseous Diffusion Plant Outfalls: Impacts on Stream Temperatures and Fauna of Little Bayou and Big Bayou Creeks

    International Nuclear Information System (INIS)

    Roy, W.K.

    1999-01-01

    The development of a biological monitoring plan for the receiving streams of the Paducah Gaseous Diffusion Plant (PGDP) began in the late 1980s, because of an Agreed Order (AO) issued in September 1987 by the Kentucky Division of Water (KDOW). Five years later, in September 1992, more stringent effluent limitations were imposed upon the PGDP operations when the KDOW reissued Kentucky Pollutant Discharge Elimination System permit No. KY 0004049. This action prompted the US Department of Energy (DOE) to request a stay of certain limits contained in the permit. An AO is being negotiated between KDOW, the United States Enrichment Corporation (USEC), and DOE that will require that several studies be conducted, including this stream temperature evaluation study, in an effort to establish permit limitations. All issues associated with this AO have been resolved, and the AO is currently being signed by all parties involved. The proposed effluent temperature limit is 89 F (31.7 C) as a mean monthly temperature. In the interim, temperatures are not to exceed 95 F (35 C) as a monthly mean or 100 F (37.8 C) as a daily maximum. This study includes detailed monitoring of instream temperatures, benthic macroinvertebrate communities, fish communities, and a laboratory study of thermal tolerances.

  10. Thermal Discharges from Paducah Gaseous Diffusion Plant Outfalls: Impacts on Stream Temperatures and Fauna of Little Bayou and Big Bayou Creeks

    Energy Technology Data Exchange (ETDEWEB)

    Roy, W.K.

    1999-01-01

    The development of a biological monitoring plan for the receiving streams of the Paducah Gaseous Diffusion Plant (PGDP) began in the late 1980s, because of an Agreed Order (AO) issued in September 1987 by the Kentucky Division of Water (KDOW). Five years later, in September 1992, more stringent effluent limitations were imposed upon the PGDP operations when the KDOW reissued Kentucky Pollutant Discharge Elimination System permit No. KY 0004049. This action prompted the US Department of Energy (DOE) to request a stay of certain limits contained in the permit. An AO is being negotiated between KDOW, the United States Enrichment Corporation (USEC), and DOE that will require that several studies be conducted, including this stream temperature evaluation study, in an effort to establish permit limitations. All issues associated with this AO have been resolved, and the AO is currently being signed by all parties involved. The proposed effluent temperature limit is 89 F (31.7 C) as a mean monthly temperature. In the interim, temperatures are not to exceed 95 F (35 C) as a monthly mean or 100 F (37.8 C) as a daily maximum. This study includes detailed monitoring of instream temperatures, benthic macroinvertebrate communities, fish communities, and a laboratory study of thermal tolerances.

  11. Flood-inundation maps for Big Creek from the McGinnis Ferry Road bridge to the confluence of Hog Wallow Creek, Alpharetta and Roswell, Georgia

    Science.gov (United States)

    Musser, Jonathan W.

    2015-08-20

    Digital flood-inundation maps for a 12.4-mile reach of Big Creek that extends from 260 feet above the McGinnis Ferry Road bridge to the U.S. Geological Survey (USGS) streamgage at Big Creek below Hog Wallow Creek at Roswell, Georgia (02335757), were developed by the USGS in cooperation with the cities of Alpharetta and Roswell, Georgia. The inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage at Big Creek near Alpharetta, Georgia (02335700). Real-time stage information from this USGS streamgage may be obtained at http://waterdata.usgs.gov/ and can be used in conjunction with these maps to estimate near real-time areas of inundation. The National Weather Service (NWS) is incorporating results from this study into the Advanced Hydrologic Prediction Service (AHPS) flood-warning system (http://water.weather.gov/ahps/). The NWS forecasts flood hydrographs for many streams where the USGS operates streamgages and provides flow data. The forecasted peak-stage information for the USGS streamgage at Big Creek near Alpharetta (02335700), available through the AHPS Web site, may be used in conjunction with the maps developed for this study to show predicted areas of flood inundation.

  12. 75 FR 5758 - Bridger-Teton National Forest, Big Piney Ranger District, WY; Piney Creeks Vegetation Treatment

    Science.gov (United States)

    2010-02-04

    ... analysis area is approximately 20,000 acres within this watershed and includes the creeks of South, Middle... and for further site specific analysis of effects. It is approximately 25 miles west of Big Piney, Wyoming in the Green River drainage, on the east slope of the Wyoming range. All lands within the analysis...

  13. Restoring Anadromous Fish Habitat in Big Canyon Creek Watershed, 2004-2005 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Rasmussen, Lynn (Nez Perce Soil and Water Conservation District, Lewiston, ID)

    2006-07-01

    The "Restoring Anadromous Fish Habitat in the Big Canyon Creek Watershed" is a multi-phase project to enhance steelhead trout in the Big Canyon Creek watershed by improving salmonid spawning and rearing habitat. Habitat is limited by extreme high runoff events, low summer flows, high water temperatures, poor instream cover, spawning gravel siltation, and sediment, nutrient and bacteria loading. Funded by the Bonneville Power Administration (BPA) as part of the Northwest Power Planning Council's Fish and Wildlife Program, the project assists in mitigating damage to steelhead runs caused by the Columbia River hydroelectric dams. The project is sponsored by the Nez Perce Soil and Water Conservation District. Target fish species include steelhead trout (Oncorhynchus mykiss). Steelhead trout within the Snake River Basin were listed in 1997 as threatened under the Endangered Species Act. Accomplishments for the contract period September 1, 2004 through October 31, 2005 include: 2.7 riparian miles treated, 3.0 wetland acres treated, 5,263.3 upland acres treated, 106.5 riparian acres treated, 76,285 general public reached, 3,000 students reached, 40 teachers reached, 18 maintenance plans completed, temperature data collected at 6 sites, 8 landowner applications received and processed, 14 land inventories completed, 58 habitat improvement project designs completed, 5 newsletters published, 6 habitat plans completed, 34 projects installed, 2 educational workshops, 6 displays, 1 television segment, 2 public service announcements, a noxious weed GIS coverage, and completion of NEPA, ESA, and cultural resources requirements.

  14. Ecological Health and Water Quality Assessments in Big Creek Lake, AL

    Science.gov (United States)

    Childs, L. M.; Frey, J. W.; Jones, J. B.; Maki, A. E.; Brozen, M. W.; Malik, S.; Allain, M.; Mitchell, B.; Batina, M.; Brooks, A. O.

    2008-12-01

    Big Creek Lake (aka J.B. Converse Reservoir) serves as the water supply for the majority of residents in Mobile County, Alabama. The area surrounding the reservoir serves as a gopher tortoise mitigation bank and is protected from further development; however, previous disasters and construction have greatly affected the Big Creek Lake area. The Escatawpa Watershed drains into the lake, and of the seven drainage streams, three have received a 303(d) (impaired water bodies) designation in the past. In the adjacent ecosystem, the forest is experiencing major stress from drought and pine bark beetle infestations. Various agencies are using control methods such as pesticide treatment to eradicate the beetles. There are many concerns about these control methods and the run-off into the ecosystem. In addition to pesticide control methods, the Highway 98 construction projects cross the north area of the lake. The community has expressed concern about both direct and indirect impacts of these construction projects on the lake. This project addresses concerns about water quality, increasing drought in the Southeastern U.S., forest health as it relates to vegetation stress, and state and federal needs for improved assessment methods supported by remotely sensed data to determine coastal forest susceptibility to pine bark beetles. Landsat TM, ASTER, MODIS, and EO-1/ALI imagery was employed to compute the Normalized Difference Vegetation Index (NDVI) and Normalized Difference Moisture Index (NDMI), as well as to detect concentrations of suspended solids, chlorophyll, and water turbidity. This study utilizes NASA Earth Observation Systems to determine how environmental conditions and human activity relate to pine tree stress and the onset of pine beetle invasion, as well as to relate current water quality data to community concerns and gain a better understanding of human impacts upon water resources.
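The vegetation and moisture indices named in this record have standard band-ratio definitions. A minimal sketch (the band values are illustrative; for Landsat TM, red is band 3, NIR is band 4, and SWIR is band 5):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def ndmi(nir, swir):
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
    nir, swir = np.asarray(nir, float), np.asarray(swir, float)
    return (nir - swir) / (nir + swir)

# Healthy vegetation reflects strongly in the NIR band, so NDVI approaches 1;
# drought or beetle stress drives NDVI and NDMI downward.
print(ndvi([0.45], [0.05]))  # high NDVI -> vigorous canopy
print(ndmi([0.45], [0.20]))  # moderate NDMI -> some canopy moisture
```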

  15. The coal deposits of the Alkali Butte, the Big Sand Draw, and the Beaver Creek fields, Fremont County, Wyoming

    Science.gov (United States)

    Thompson, Raymond M.; White, Vincent L.

    1952-01-01

    Large coal reserves are present in three areas located between 12 and 20 miles southeast of Riverton, Fremont County, central Wyoming. Coal in two of these areas, the Alkali Butte coal field and the Big Sand Draw coal field, is exposed on the surface and has been developed to some extent by underground mining. The Beaver Creek coal field is known only from drill cuttings and cores from wells drilled for oil and gas in the Beaver Creek oil and gas field. These three coal areas can be reached most readily from Riverton, Wyo. State Route 320 crosses Wind River about 1 mile south of Riverton. A few hundred yards south of the river a graveled road branches off the highway and extends south across the Popo Agie River toward Sand Draw oil and gas field. About 8 miles south of the highway along the Sand Draw road, a dirt road bears east, and along this road it is about 12 miles to the Bell coal mine in the Alkali Butte coal field. Three miles southeast of the Alkali Butte turn-off, 3 miles of oiled road extends southwest into the Beaver Creek oil and gas field. About 6 miles southeast of the Beaver Creek turn-off, in the valley of Little Sand Draw Creek, a dirt road extends east 1 mile and then southeast 1 mile to the Downey mine in the Big Sand Draw coal field. Location of these coal fields is shown on figure 1 with their relationship to the Wind River basin and other coal fields, place localities, and wells mentioned in this report. The coal in the Alkali Butte coal field is exposed partly on the Wind River Indian Reservation in Tps. 1 and 2 S., R. 6 E., and partly on public land. Coal in the Beaver Creek and Big Sand Draw coal fields is mainly on public land. The region has a semiarid climate with rainfall averaging less than 10 in. per year. When rain does fall, the sandy-bottomed stream channels fill rapidly and are frequently impassable for a few hours.
Beaver Creek, Big Sand Draw, Little Sand Draw, and Kirby Draw and their smaller tributaries drain the area and flow

  16. 2015 Strategic Petroleum Reserve Bayou Choctaw Well Integrity Grading Report.

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Barry L.; Lord, David; Lord, Anna C. Snider; Bettin, Giorgia; Park, Byoung; Rudeen, D.K.; Eldredge, L.L.; Wynn, K.; Checkai, D.; Osborne, G.; Moore, D.

    2015-10-01

    This report summarizes the work performed in the prioritization of cavern access wells for remediation and monitoring at the Bayou Choctaw Strategic Petroleum Reserve site. The grading included consideration of all 15 wells at the Bayou Choctaw site, with each active well receiving a separate grade for remediation and monitoring. Numerous factors affecting well integrity were incorporated into the grading including casing survey results, cavern pressure history, results from geomechanical simulations, and site geologic factors. The factors and grading framework used here are the same as those used in developing similar well remediation and monitoring priorities at the Big Hill, Bryan Mound, and West Hackberry Strategic Petroleum Reserve Sites.

  17. Flood-inundation maps for a 12.5-mile reach of Big Papillion Creek at Omaha, Nebraska

    Science.gov (United States)

    Strauch, Kellan R.; Dietsch, Benjamin J.; Anderson, Kayla J.

    2016-03-22

    Digital flood-inundation maps for a 12.5-mile reach of the Big Papillion Creek from 0.6 mile upstream from the State Street Bridge to the 72nd Street Bridge in Omaha, Nebraska, were created by the U.S. Geological Survey (USGS) in cooperation with the Papio-Missouri River Natural Resources District. The flood-inundation maps, which can be accessed through the USGS Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the USGS streamgage on the Big Papillion Creek at Fort Street at Omaha, Nebraska (station 06610732). Near-real-time stages at this streamgage may be obtained on the Internet from the USGS National Water Information System at http://waterdata.usgs.gov/ or the National Weather Service Advanced Hydrologic Prediction Service at http://water.weather.gov/ahps/, which also forecasts flood hydrographs at this site.

  18. Time of travel of solutes in Buffalo Bayou and selected tributaries, Houston, Texas, August 1999

    Science.gov (United States)

    East, Jeffery W.; Schaer, Jasper D.

    2000-01-01

    The U.S. Geological Survey (USGS), in cooperation with the U.S. Environmental Protection Agency, conducted a time-of-travel study in the Buffalo Bayou watershed during low flow in August 1999. The study was done as part of the U.S. Environmental Protection Agency Environmental Monitoring for Public Access and Community Tracking (EMPACT) program. The EMPACT program was designed for the U.S. Environmental Protection Agency to work with communities to “make timely, accurate, and understandable environmental information available to millions of people in the largest metropolitan areas across the country” (U.S. Environmental Protection Agency, 2000). Buffalo Bayou, located in Houston, Texas, was chosen as a pilot project because it is a frequently used recreational water source, it has many water-treatment facilities located along its stream segments, and it has a history of water-quality problems (Houston-Galveston Area Council, 2000). One component of the pilot project is to develop a water-quality simulation model that can be used to assess the effects of noncompliance events on Buffalo Bayou. Because accurate estimates of time of travel during low flow are required to develop the model, the time of travel of solutes in Buffalo Bayou and selected tributaries was determined using dye-tracing methods. The study was conducted during low flow in a 38.7-mile reach of Buffalo Bayou, a 9.6-mile reach of Whiteoak Bayou, a 5.9-mile reach of Mason Creek, and a 6.6-mile reach of Bear Creek. Efforts to determine the time of travel in a 7.5-mile reach of Horsepen Creek were unsuccessful. This report explains the approach used to conduct the study and presents the results of the study.

  19. Peak discharge, flood frequency, and peak stage of floods on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado, and Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado, 2016

    Science.gov (United States)

    Kohn, Michael S.; Stevens, Michael R.; Mommandi, Amanullah; Khan, Aziz R.

    2017-12-14

    The U.S. Geological Survey (USGS), in cooperation with the Colorado Department of Transportation, determined the peak discharge, annual exceedance probability (flood frequency), and peak stage of two floods that took place on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado (hereafter referred to as “Big Cottonwood Creek site”), on August 23, 2016, and on Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado (hereafter referred to as “Fountain Creek site”), on August 29, 2016. A one-dimensional hydraulic model was used to estimate the peak discharge. To define the flood frequency of each flood, peak-streamflow regional-regression equations or statistical analyses of USGS streamgage records were used to estimate annual exceedance probability of the peak discharge. A survey of the high-water mark profile was used to determine the peak stage, and the limitations and accuracy of each component also are presented in this report. Collection and computation of flood data, such as peak discharge, annual exceedance probability, and peak stage at structures critical to Colorado’s infrastructure are an important addition to the flood data collected annually by the USGS. The peak discharge of the August 23, 2016, flood at the Big Cottonwood Creek site was 917 cubic feet per second (ft3/s) with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The peak discharge of the August 29, 2016, flood at the Fountain Creek site was 5,970 ft3/s with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The August 23, 2016, flood at the Big Cottonwood Creek site had an annual exceedance probability of less than 0.01 (return period greater than the 100-year flood) and had an annual exceedance probability of greater than 0.005 (return period less than the 200-year flood). The August 23, 2016, flood event was caused by a precipitation event having an annual exceedance probability of 1.0 (return
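The annual exceedance probabilities and return periods quoted in this record are reciprocals of one another (T = 1/AEP). A minimal sketch of that relationship:

```python
def return_period(aep):
    """Return period in years corresponding to an annual exceedance probability."""
    return 1.0 / aep

# The two thresholds bracketing the Big Cottonwood Creek flood
assert round(return_period(0.01)) == 100    # AEP 0.01  -> 100-year flood
assert round(return_period(0.005)) == 200   # AEP 0.005 -> 200-year flood
```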

  20. Features of Bayou Choctaw SPR caverns and internal structure of the salt dome.

    Energy Technology Data Exchange (ETDEWEB)

    Munson, Darrell E.

    2007-07-01

    The intent of this study is to examine the internal structure of the Bayou Choctaw salt dome utilizing the information obtained from graphical representations of sonar survey data of the internal cavern surfaces. Many of the Bayou Choctaw caverns have been abandoned. Some existing caverns were purchased by the Strategic Petroleum Reserve (SPR) program and have rather convoluted histories and complex cavern geometries. In fact, these caverns are typically poorly documented and are not particularly constructive to this study. Only two Bayou Choctaw caverns, 101 and 102, which were constructed using well-controlled solutioning methods, are well documented. One of these was constructed by the SPR for their use while the other was constructed and traded for another existing cavern. Consequently, compared to the SPR caverns of the West Hackberry and Big Hill domes, it is more difficult to obtain a general impression of the stratigraphy of the dome. Indeed, caverns of Bayou Choctaw show features significantly different from those encountered in the other two SPR facilities. In a number of the abandoned caverns, and in some of the existing caverns purchased by the SPR, extremely irregular solutioning has occurred. The two SPR-constructed caverns suggest that some sections of the caverns may have undergone very regular solutioning to form uniform cylindrical shapes. Although it is not usually productive to speculate, some suggestions that point to the behavior of the Bayou Choctaw dome are examined. The primary differences between the Bayou Choctaw dome and the other SPR domes are also noted.

  1. Neural network prediction of carbonate lithofacies from well logs, Big Bow and Sand Arroyo Creek fields, Southwest Kansas

    Science.gov (United States)

    Qi, L.; Carr, T.R.

    2006-01-01

    In the Hugoton Embayment of southwestern Kansas, St. Louis Limestone reservoirs have relatively low recovery efficiencies, attributed to the heterogeneous nature of the oolitic deposits. This study establishes quantitative relationships between digital well logs and core description data, and applies these relationships in a probabilistic sense to predict lithofacies in 90 uncored wells across the Big Bow and Sand Arroyo Creek fields. In 10 wells, a single hidden-layer neural network based on digital well logs and core-described lithofacies of the limestone depositional texture was used to train and establish a non-linear relationship between lithofacies assignments from detailed core descriptions and selected log curves. Neural network models were optimized by selecting six predictor variables and automated cross-validation of neural network parameters, and then used to predict lithofacies on the whole data set of 2,023 half-foot intervals from the 10 cored wells with the selected network size of 35 and a damping parameter of 0.01. Predicted lithofacies compared to actual lithofacies display absolute accuracies of 70.37-90.82%. Incorporating adjoining (within-one) lithofacies improves accuracy slightly (93.72%). Digital logs from uncored wells were batch processed to predict lithofacies and probabilities related to each lithofacies at half-foot resolution corresponding to log units. The results were used to construct interpolated cross-sections, and useful depositional patterns of St. Louis lithofacies were illustrated, e.g., the concentration of oolitic deposits (including lithofacies 5 and 6) along local highs and the relative dominance of quartz-rich carbonate grainstone (lithofacies 1) in zones A and B of the St. Louis Limestone. Neural network techniques are applicable to other complex reservoirs, in which facies geometry and distribution are the key factors controlling heterogeneity and distribution of rock properties.

  2. 33 CFR 117.959 - Chocolate Bayou.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Chocolate Bayou. 117.959 Section 117.959 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Texas § 117.959 Chocolate Bayou. The draw of the Union...

  3. 33 CFR 117.965 - Cow Bayou.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Cow Bayou. 117.965 Section 117.965 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Texas § 117.965 Cow Bayou. The draws of the Orange County...

  4. 33 CFR 117.987 - Taylor Bayou.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Taylor Bayou. 117.987 Section 117.987 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Texas § 117.987 Taylor Bayou. The draws of the Union Pacific...

  5. Effects of geothermal energy utilization on stream biota and water quality at The Geysers, California. Final report. [Big Sulphur, Little Sulphur, Squaw, and Pieta Creeks

    Energy Technology Data Exchange (ETDEWEB)

    LeGore, R.S.

    1975-01-01

    The discussion is presented under the following section headings: biological studies, including fish, insects, and microbiology; stream hydrology; stream water quality, including methods and results; the contribution of tributaries to Big Sulphur Creek, including methods, results, and tributary characterization; standing water at wellheads; steam condensate quality; accidental discharges; trout spawning bed quality; major conclusions; list of references; and appendices. It is concluded that present operational practices at The Geysers geothermal field do not harm the biological resources in adjacent streams. The only effects of geothermal development observed during the study were related to operational accidents. (JGB)

  6. Ground-Water System in the Chimacum Creek Basin and Surface Water/Ground Water Interaction in Chimacum and Tarboo Creeks and the Big and Little Quilcene Rivers, Eastern Jefferson County, Washington

    Science.gov (United States)

    Simonds, F. William; Longpre, Claire I.; Justin, Greg B.

    2004-01-01

    throughout most of the year and the lower reaches have little or no gains. The Big Quilcene River generally gains water from the shallow ground-water system after it emerges from a bedrock canyon and loses water from the town of Quilcene to the mouth of the river in Quilcene Bay. The Little Quilcene River generally loses water to the shallow ground-water system, although two localized areas were found to have gaining conditions. The Big Quilcene and Little Quilcene Rivers incur significant losses on the alluvial plain at the head of Quilcene Bay. Each of the creeks examined had a unique pattern of gaining and losing reaches, owing to the hydraulic conductivity of the streambed material and the relative altitude of the surrounding water table. Although the magnitudes of gains and losses varied seasonally, the spatial distribution did not vary greatly, suggesting that patterns of gains and losses in surface-water systems depend greatly on the geology underlying the streambed.

  7. Data and statistical summaries of background concentrations of metals in soils and streambed sediments in part of Big Soos Creek drainage basin, King County, Washington

    Science.gov (United States)

    Prych, E.A.; Kresch, D.L.; Ebbert, J.C.; Turney, G.L.

    1995-01-01

    Twenty-nine soil samples from 14 holes at 9 sites in part of the Big Soos Creek drainage basin in southwest King County, Washington, were collected and analyzed to obtain data on the magnitude and variability of background concentrations of metals in soils. Seven streambed-sediment samples and three streamwater samples from three sites also were collected and analyzed. These data are needed by regulating government agencies to determine whether soils at sites of suspected contamination have elevated concentrations of metals, and to evaluate the effectiveness of remediation at sites with known contamination. Concentrations of 43 metals were determined by a total method, and concentrations of 17 metals were determined by a total-recoverable method and two different leaching methods. Metals analyzed by all methods included most of those on the U.S. Environmental Protection Agency list of priority pollutants, plus aluminum, iron, and manganese. Ranges of concentrations of metals determined by the total method are within ranges found by others for the conterminous United States. Concentrations of mercury, manganese, phosphorus, lead, selenium, antimony, and zinc as determined by the total method, and of some of these plus other metals as determined by the other methods, were larger in shallow soil (less than 12 inches deep) than in deep soil (greater than 12 inches). Concentrations of metals in streambed sediments were more typical of shallow soils than of deep soils.
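The report's shallow-versus-deep contrast can be illustrated with a simple rank test on one metal. This is a hedged sketch: the concentrations below are invented placeholders, and the Mann-Whitney U test is one reasonable choice for small, non-normal soil data, not necessarily the study's method.

```python
# Illustrative comparison of shallow (<12 in) versus deep (>12 in) soil
# concentrations for a single metal; values are synthetic, not the study's.
import numpy as np
from scipy.stats import mannwhitneyu

shallow = np.array([48.0, 55.0, 60.0, 52.0, 70.0, 66.0])  # e.g. lead, mg/kg
deep    = np.array([20.0, 25.0, 18.0, 30.0, 22.0, 27.0])

stat, p = mannwhitneyu(shallow, deep, alternative="greater")
print(f"U = {stat:.1f}, one-sided p = {p:.4f}")
```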

  8. Geology and coal resources of the Hanging Woman Creek Study Area, Big Horn and Powder River Counties, Montana

    Science.gov (United States)

    Culbertson, William Craven; Hatch, Joseph R.; Affolter, Ronald H.

    1978-01-01

    In an area of 7,200 acres (29 sq km) in the Hanging Woman Creek study area, the Anderson coal bed contains potentially surface-minable resources of 378 million short tons (343 million metric tons) of subbituminous C coal that ranges in thickness from 26 to 33 feet (7.9-10.1 m) at depths of less than 200 feet (60 m). Additional potentially surface-minable resources of 55 million short tons (50 million metric tons) are contained in the 9-12 foot (2.7-3.7 m) thick Dietz coal bed, which lies 50-100 feet (15-30 m) below the Anderson. Analyses of coal from 5 core holes indicate that the Anderson bed contains 0.4 percent sulfur and 5 percent ash and has a heating value of 8,540 Btu/lb (4,750 kcal/kg). The trace element content of the coal is generally similar to other coals in the Powder River Basin. The two coal beds are in the Fort Union Formation of Paleocene age, which consists of sandstone, siltstone, shale, coal beds, and locally impure limestone. A northeast-trending normal fault through the middle of the area, downthrown on the southeast side, has displaced the generally flat-lying strata as much as 300 feet (91 m). Most of the minable coal lies northwest of this fault.
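The Anderson bed tonnage can be checked with a back-of-the-envelope calculation. The conversion factor of about 1,770 short tons of subbituminous coal per acre-foot is the commonly used USGS figure, introduced here as an assumption since the report does not state its factor; the bed-thickness midpoint is likewise my simplification.

```python
# Rough check of the Anderson bed resource estimate: area x thickness x
# tons-per-acre-foot for subbituminous coal.
acres = 7200.0
thickness_ft = (26.0 + 33.0) / 2.0  # midpoint of the reported 26-33 ft range
tons_per_acre_ft = 1770.0           # standard USGS subbituminous factor (assumed)

short_tons = acres * thickness_ft * tons_per_acre_ft
print(f"{short_tons / 1e6:.0f} million short tons")  # ~376, close to the reported 378
```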

  9. September 2016 Bayou Choctaw Subsidence Report

    Energy Technology Data Exchange (ETDEWEB)

    Moriarty, Dylan Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, Anna C. Snider [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    Subsidence monitoring is a crucial component of understanding the integrity of salt storage caverns. This report examines historical and current data at the Bayou Choctaw Strategic Petroleum Reserve site. Data from the most recent land-based annual surveys, GPS, and tiltmeters indicate that subsidence rates across the site are approximately 0.0 ft/yr. Consequently, the subsidence surveys provide no evidence that any of the DOE caverns have been structurally compromised.

  10. 77 FR 46286 - Drawbridge Operation Regulation; Bayou Boeuf, Amelia, LA

    Science.gov (United States)

    2012-08-03

    ... (BNSF) Railway Company swing span bridge across Bayou Boeuf, mile 10.2, at Amelia, St. Mary Parish... the swing span railroad bridge across Bayou Boeuf, mile 10.2, at Amelia, St. Mary Parish, Louisiana... equipment. The deviation is necessary to complete scheduled repairs necessitated by a bridge allision. This...

  11. Effects of Storm Events on Bacteria and Nutrients in the Bayou Chico Watershed

    Science.gov (United States)

    Hobbs, S. E.; Truong, S.

    2017-12-01

    Levels of Escherichia coli and abiotic nutrients often increase in response to storm events due to urban runoff. The urban setting, aging septic systems, and ample pet waste (predominant sources of bacterial and nutrient contamination) surrounding Bayou Chico provide abundant opportunities for contamination. E. coli is a gram-negative, rod-shaped bacterium commonly found in the intestines of animals; while some strains are harmless, others produce dangerous toxins that can cause severe illness and sometimes death. Along with E. coli, inorganic nutrient concentrations (orthophosphate, nitrate/nitrite, and ammonium) are key indicators of water quality: dissolved nutrients promote the growth of primary producers, and excessive amounts lead to algal blooms, often reducing biodiversity. Four sites, representing three sub-watersheds of the Bayou Chico Watershed with differing land use at each site, were sampled weekly in June and July 2017; that June had the highest rainfall of the preceding three years. Historical nutrient and bacterial data from the Bream Fishermen Association were also examined to determine long-term trends and to obtain a more in-depth understanding of water-quality dynamics in the urban setting. E. coli levels were universally high (ranging from 98 to 12,997 MPN/100 mL) at all sites and showed no observable correlation with rainfall, possibly because of the sustained, anomalously heavy precipitation during most of the summer study period. Nitrate was detected at levels between 2.5 and 154.0 µM, while ammonium levels ranged from 0 to 16.1 µM. Three of the four stations showed extremely elevated dissolved inorganic nitrogen and ammonium, while one station showed low levels of these nutrients. Correlations between these nutrient loads and rainfall support the hypothesis that runoff into tributary creeks contributes significant inorganic nutrient loads to the Bayou Chico urban estuary.
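The kind of rainfall-nutrient association reported above can be sketched with a rank correlation. This is a minimal illustration: the weekly rainfall and nitrate values are invented (though the nitrate range echoes the abstract's 2.5-154.0 µM), and Spearman correlation is an assumed, not stated, choice of statistic.

```python
# Hedged sketch: Spearman rank correlation between weekly rainfall and a
# dissolved-nutrient concentration, on illustrative numbers only.
from scipy.stats import spearmanr

rain_in    = [0.2, 1.5, 3.1, 0.8, 2.4, 4.0, 0.1, 2.9]            # weekly rainfall, inches
nitrate_uM = [5.0, 40.0, 95.0, 18.0, 75.0, 154.0, 2.5, 110.0]    # µM

rho, p = spearmanr(rain_in, nitrate_uM)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```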

  12. Construction of hexahedral elements mesh capturing realistic geometries of Bayou Choctaw SPR site

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung Yoon [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roberts, Barry L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    A three-dimensional finite element mesh capturing the realistic geometry of the Bayou Choctaw site has been constructed using sonar and seismic survey data obtained from the field. The mesh consists of hexahedral elements because the salt constitutive model is coded for hexahedral elements. Various ideas and techniques for constructing finite element meshes that capture artificially and naturally formed geometries are provided, along with techniques for reducing the number of elements as much as possible, to save computer run time while maintaining computational accuracy. The same steps and methodologies could be applied to construct meshes of the Big Hill, Bryan Mound, and West Hackberry Strategic Petroleum Reserve sites, and to other complicated shapes arising not only in civil and geological structures but also in biological applications such as artificial limbs.
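To make the hexahedral-element bookkeeping concrete, the sketch below generates node coordinates and 8-node connectivity for a simple structured grid. This is only the trivial starting point: capturing a real salt-dome geometry, as in the report, further requires conforming such a grid to sonar- and seismic-derived surfaces.

```python
# Minimal hex-mesh illustration: nodes and 8-node element connectivity for
# a structured nx*ny*nz box, with nodes numbered x-fastest, then y, then z.
import numpy as np

def hex_grid(nx, ny, nz, lx=1.0, ly=1.0, lz=1.0):
    xs = np.linspace(0, lx, nx + 1)
    ys = np.linspace(0, ly, ny + 1)
    zs = np.linspace(0, lz, nz + 1)
    nid = lambda i, j, k: (k * (ny + 1) + j) * (nx + 1) + i  # (i,j,k) -> flat id
    nodes = np.array([[x, y, z] for z in zs for y in ys for x in xs])
    elems = []
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                elems.append([nid(i, j, k),     nid(i+1, j, k),
                              nid(i+1, j+1, k), nid(i, j+1, k),
                              nid(i, j, k+1),     nid(i+1, j, k+1),
                              nid(i+1, j+1, k+1), nid(i, j+1, k+1)])
    return nodes, np.array(elems)

nodes, elems = hex_grid(4, 3, 2)
print(nodes.shape, elems.shape)  # (60, 3) (24, 8)
```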

  13. 77 FR 69564 - Drawbridge Operation Regulation; Bayou Boeuf, Amelia, LA

    Science.gov (United States)

    2012-11-20

    ... schedule that governs the Burlington Northern Santa Fe (BNSF) Railway Company swing span bridge across... schedule of the swing span railroad bridge across Bayou Boeuf, mile 10.2, at Amelia, St. Mary Parish... scheduled repairs necessitated by a bridge allision. This deviation allows the bridge to remain in the...

  14. 77 FR 27115 - Drawbridge Operation Regulation; Bayou Boeuf, Amelia, LA

    Science.gov (United States)

    2012-05-09

    ... schedule that governs the Burlington Northern Santa Fe (BNSF) Railway Company swing span bridge across... swing span railroad bridge across Bayou Boeuf, mile 10.2, at Amelia, St. Mary Parish, Louisiana. The... scheduled repairs necessitated by a bridge allision.

  15. 77 FR 42637 - Drawbridge Operation Regulation; Bayou Boeuf, Amelia, LA

    Science.gov (United States)

    2012-07-20

    ... schedule that governs the Burlington Northern Santa Fe (BNSF) Railway Company swing span bridge across... schedule of the swing span railroad bridge across Bayou Boeuf, mile 10.2, at Amelia, St. Mary Parish... to complete scheduled repairs necessitated by a bridge allision. This deviation allows the bridge to...

  16. Microseismic monitoring of Chocolate Bayou, Texas: The Pleasant Bayou no. 2 geopressured/geothermal energy test well program

    Science.gov (United States)

    Mauk, F. J.; Kimball, B.; Davis, R. A.

    The Brazoria seismic network, instrumentation, design, and specifications are described. The data analysis procedures are presented. Seismicity is described in relation to the Pleasant Bayou production history. Seismicity originating near the chemical plant east of the geopressured/geothermal well is discussed.

  17. Testing of the Pleasant Bayou Well through October 1990

    Energy Technology Data Exchange (ETDEWEB)

    Randolph, P.L.; Hayden, C.G.; Mosca, V.L.; Anhaiser, J.L.

    1992-08-01

    The Pleasant Bayou location was inactive from 1983 until the cleanout of the production and disposal wells in 1986. The surface facilities were rehabilitated, and after shakedown of the system, additional repair of wellhead valves, and injection of an inhibitor pill, continuous long-term production started in 1988. More than two years of subsequent production are reviewed here, including production data; brine sampling and analysis; hydrocarbon sampling and analysis; solids sampling and analysis; and scale control and corrosion monitoring and control.

  18. Hail creek

    Energy Technology Data Exchange (ETDEWEB)

    Chadwick, J.

    2005-09-01

    The paper examines the development of one of the largest coking coal deposits in the world. Hail Creek is 100 km west of Mackay and 35 km northeast of Nebo, Queensland, and has proven opencut reserves of 195.6 as at December 2003. Coal processing started in July 2003. The award-winning project included construction of a coal handling and preparation plant, a railway, a village, offsite infrastructure, and mine buildings and site services. Coal is mined by conventional dragline and truck/shovel techniques. 1 photo.

  19. Bayou Choctaw Well Integrity Grading Component Based on Geomechanical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geotechnology & Engineering Dept.

    2016-09-08

    This letter report provides a well grading system for the Bayou Choctaw (BC) Strategic Petroleum Reserve (SPR) based on geomechanical simulation. The analyses described here were used to evaluate the caverns' geomechanical effect on wellbore integrity, an important component of the well integrity grading system recently developed by Roberts et al. [2015]. Based on these analyses, the wellbores for caverns BC-17 and BC-20 are expected to be significantly impacted by cavern geomechanics, those for BC-18 and BC-19 moderately impacted, and those for the other caverns less impacted.

  20. Bubbles, Bubbles, Tremors & Trouble: The Bayou Corne Sinkhole

    Science.gov (United States)

    Nunn, J. A.

    2013-12-01

    In May 2012, thermogenic methane bubbles were first observed in Bayou Corne in Assumption Parish, Louisiana. As of July 2013, ninety-one bubbling sites had been identified. Gas was also found at the top of the Mississippi River Alluvial Aquifer (MRAA), about 125 ft below the surface; vent wells drilled into the MRAA have flared more than 16 million SCF of gas, and trace amounts of hydrogen sulfide have also been detected. Bayou Corne flows above the Napoleonville salt dome, which has been an active area for oil and gas exploration since the 1920s. The dome is also a site of salt solution mining, which has produced large caverns with diameters of up to 300 ft and heights of 2,000 ft; some caverns are used for storage of natural gas. Microseismic activity was confirmed by an EarthScope seismic station in White Castle, LA in July 2012. An array of microseismic stations set up in the area recorded more than 60 microseismic events in late July and early August 2012. These events were located on the western side of the dome, with estimated focal depths just above the top of salt. In August 2012, a sinkhole developed overnight just northwest of a plugged and abandoned brine-filled cavern. The sinkhole has continued to grow to more than 20 acres in area, has consumed a pipeline right of way, and is more than 750 ft deep at its center. Microseismic activity was reduced for several months following the formation of the sinkhole and has since recurred episodically, with periods of frequent events preceding slumping of material into the sinkhole or a 'burp', in which fluid levels in the sinkhole drop and then rebound, followed by a decrease in microseismic activity. Some gas and/or oil may appear at the surface of the sinkhole following a 'burp'. Very long period events, believed to be related to subsurface fluid movement, have also been observed.
A relief well drilled into the abandoned brine cavern found that

  1. Three dimensional simulation for bayou choctaw strategic petroleum reserve (SPR).

    Energy Technology Data Exchange (ETDEWEB)

    Ehgartner, Brian L. (Sandia National Laboratories, Albuquerque, NM); Park, Byoung Yoon; Lee, Moo Yul

    2006-12-01

    Three-dimensional finite element analyses were performed to evaluate the structural integrity of the caverns at the Bayou Choctaw (BC) site, which is considered a candidate for expansion. Fifteen active and nine abandoned caverns exist at BC, with a total cavern volume of some 164 MMB. Because cavern locations, depths, and excavation dates are irregular, a 3D model allowing individual control of each cavern was constructed. The total cavern volume is of practical interest, as this void space affects total creep closure in the BC salt mass. The operations modeled include cavern workovers, in which wellhead pressures are temporarily reduced to atmospheric, and cavern enlargement due to leaching during oil drawdowns that use water to displace oil from the caverns, accounting for as many as five future oil drawdowns in the six SPR caverns. The impacts on cavern stability, underground creep closure, surface subsidence, infrastructure, and well integrity were quantified.

  2. Marine Tech in the Bayou City: An Experiential Education Experience

    Science.gov (United States)

    Barnard, A.; Mackay, E.; Zarate, A.; Coutts, A.; Craig, A.; Roman, R.; Max, M. D.; Detiveaux, G.; Dement, G.; Trufan, E.; Sager, W.; Wellner, J.; Stewart, R. R.; Mayer, L. A.; Coffin, R. B.; Vielstädte, L.; Skarke, A. D.; VanSumeren, H.; Cardenas, I.; Mir, R.

    2017-12-01

    Training and expertise in underwater exploration with advanced marine technology is a must for today's STEM graduates. Much of this training can be initiated in relatively inexpensive ways using local expertise and technologies. Instead of going to sea, we have previously demonstrated marine survey techniques by exploring the shallow waters of Houston's bayous, a setting that encourages participants' intrinsic motivation. In this study, attendees were given the opportunity to pilot a remotely operated vehicle (ROV) and an autonomous surface vehicle (ASV) and to participate in relevant lectures and data analysis. To evaluate the training quantitatively, participants responded to a list of focused questions before and after the survey exercises. Initially, a multibeam survey (200 m x 50 m) was conducted by Survey Equipment Services, Inc. using a Teledyne Oceanscience Z-Boat ASV with an integrated 200-460 kHz Odom MB2 multibeam system. Using the multibeam survey data, research students identified acoustic targets on the bayou floor for further investigation. Target identification was achieved using a Predator II ROV (Seatronics, Inc., an Acteon company) mounted with a Teledyne BlueView Technologies M900-130 900 kHz imaging sonar and a 1.4-3 MHz Sound Metrics ARIS Explorer 3000 imaging sonar. Multibeam data delineated a 90 m long, 45 m wide, and 8 m deep hollow, interpreted as a confluence scour created at the junction of the Buffalo and White Oak Bayous. A raised bank downstream of the hollow within the main channel is attributed to rapid sedimentation in a region of post-confluence flow deceleration. Targets in the imaging sonar were identified as boulder-sized transported riprap, fluvial sediment, and sand waves. Review of participants' survey results using a Wilcoxon signed rank test indicated statistically significant results across all 30 survey questions. Positive improvements were reported across the board in questions related to three
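The pre/post evaluation described above can be sketched with a Wilcoxon signed-rank test on paired responses for a single survey question. The scores below are invented for illustration (assuming a 1-5 Likert-style scale); only the choice of test comes from the abstract.

```python
# Hedged sketch: Wilcoxon signed-rank test on paired pre/post participant
# responses for one survey question (synthetic 1-5 scores).
from scipy.stats import wilcoxon

pre  = [2, 3, 2, 1, 3, 2, 2, 3, 1, 2]   # before the ROV/ASV exercises
post = [4, 4, 3, 3, 5, 4, 3, 4, 3, 4]   # after

stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.4f}")
```

A small p-value across all 30 questions is what the abstract reports as a statistically significant improvement.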

  3. Archeological Investigations at Big Hill Lake, Southeastern Kansas, 1980.

    Science.gov (United States)

    1982-09-01

    settled primarily along the Neosho river and Labette, Big Hill, and Pumpkin creeks. One of the first settlers in Osage township, in which Big Hill...slabs is not known at present. About 10 years later, in 1876, materials were reportedly collected from an aboriginal site along Pumpkin creek...and lengthening its lifetime of use. As would therefore be expected, cracks are present between each of the paired holes on both of the two restored

  4. Sonar atlas of caverns comprising the U.S. Strategic Petroleum Reserve. Volume 1, Bayou Choctaw site, Louisiana.

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, Christopher Arthur; Lord, Anna Snider

    2007-10-01

    Downhole sonar surveys from the four active U.S. Strategic Petroleum Reserve sites have been modeled and used to generate a four-volume sonar atlas, showing the three-dimensional geometry of each cavern. This volume 1 focuses on the Bayou Choctaw SPR site, located in southern Louisiana. Volumes 2, 3, and 4, respectively, present images for the Big Hill SPR site, Texas, the Bryan Mound SPR site, Texas, and the West Hackberry SPR site, Louisiana. The atlas uses a consistent presentation format throughout. The basic geometric measurements provided by the down-cavern surveys have also been used to generate a number of geometric attributes, the values of which have been mapped onto the geometric form of each cavern using a color-shading scheme. The intent of the various geometrical attributes is to highlight deviations of the cavern shape from the idealized cylindrical form of a carefully leached underground storage cavern in salt. The atlas format does not allow interpretation of such geometric deviations and anomalies. However, significant geometric anomalies, not directly related to the leaching history of the cavern, may provide insight into the internal structure of the relevant salt dome.

  5. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing......, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  6. Geomechanical Simulation of Bayou Choctaw Strategic Petroleum Reserve - Model Calibration.

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    A finite element numerical analysis model has been constructed that consists of a realistic mesh capturing the geometry of the Bayou Choctaw (BC) Strategic Petroleum Reserve (SPR) site and a multi-mechanism deformation (M-D) salt constitutive model, using daily data of actual wellhead pressure and oil-brine interface. The salt creep rate is not uniform in the salt dome, and the creep test data for BC salt are limited; model calibration is therefore necessary to simulate the geomechanical behavior of the salt dome. The cavern volumetric closures of SPR caverns calculated from CAVEMAN are used as the field baseline measurement. The structure factor, A2, and transient strain limit factor, K0, in the M-D constitutive model are used for the calibration. The A2 value obtained experimentally from BC salt and the K0 value of Waste Isolation Pilot Plant (WIPP) salt are used as baseline values. To adjust the magnitudes of A2 and K0, multiplication factors A2F and K0F are defined, respectively. The A2F and K0F values of the salt dome and of the salt drawdown skins surrounding each SPR cavern have been determined through a number of back-fitting analyses. The cavern volumetric closures calculated from this model correspond to the predictions from CAVEMAN for six SPR caverns. Therefore, this model is able to predict past and future geomechanical behaviors of the salt dome, caverns, caprock, and interbed layers. The geological concerns raised at the BC site will be explained from this model in a follow-up report.
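The back-fitting idea above can be sketched as a least-squares adjustment of the two multiplication factors so that a forward model matches measured volumetric closure. The forward model below is a made-up placeholder with a steady-state-like and a transient-like term, not the M-D constitutive law, and the "measured" closure curve is synthetic.

```python
# Conceptual back-fitting sketch: tune factors A2F and K0F so a toy closure
# model matches a target closure history (stand-in for CAVEMAN output).
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(1.0, 10.0, 20)                   # years
target = 0.8 * t**0.9 + 1.5 * (1 - np.exp(-t))   # "measured" closure (synthetic)

def model(params, t):
    a2f, k0f = params
    return a2f * t**0.9 + k0f * (1 - np.exp(-t))  # steady + transient terms (toy)

def residual(params):
    return model(params, t) - target

fit = least_squares(residual, x0=[1.0, 1.0])
print("A2F, K0F =", np.round(fit.x, 3))  # recovers [0.8, 1.5]
```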

  7. 77 FR 57492 - Drawbridge Operation Regulation; Grosse Tete Bayou, Iberville Parish, LA

    Science.gov (United States)

    2012-09-18

    ... modified from a moveable to a fixed bridge, without publishing an NPRM. The change removes the section of... the Union Pacific railroad swing bridge over Grosse Tete Bayou, mile 14.7, Iberville Parish, Louisiana. This bridge has been modified from a swing bridge to a fixed bridge and the current special operating...

  8. 75 FR 22737 - Final Damage Assessment and Restoration Plan for the Bayou Verdine and Calcasieu River

    Science.gov (United States)

    2010-04-30

    ..., and Liability Act (CERCLA), 42 U.S.C. 9607(f), Section 311 of the Federal Water Pollution and Control... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration Final Damage Assessment and Restoration Plan for the Bayou Verdine and Calcasieu River AGENCY: National Oceanic and Atmospheric...

  9. 77 FR 24840 - Safety Zone; Crowley Barge 750-2, Bayou Casotte, Pascagoula, MS

    Science.gov (United States)

    2012-04-26

    ... its possible commercial and contractual obligations. Under 5 U.S.C. 553(d)(3), the Coast Guard finds... process would unnecessarily interfere with launching the barge and its possible commercial and contractual obligations. Basis and Purpose VT-Halter Pascagoula is a ship yard and repair facility located on Bayou...

  10. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  11. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  12. Nutrient and sediment concentrations and loads in the Steele Bayou Basin, northwestern Mississippi, 2010–14

    Science.gov (United States)

    Hicks, Matthew B.; Murphy, Jennifer C.; Stocks, Shane J.

    2017-06-01

    The U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers-Vicksburg District, monitored streamflow, water quality, and sediment at two stations on Steele Bayou in northwestern Mississippi from October 2010 through September 2014 to characterize nutrient and sediment concentrations and loads in areas where substantial conservation efforts have been implemented. The motivation for this effort was to quantify improvements, or the lack thereof, in water quality in the Steele Bayou watershed resulting from large- and small-scale best-management practices aimed at reducing nutrient and sediment concentrations and loads. The results of this study document the hydrologic, water-quality, and sedimentation status of these basins following more than two decades of ongoing implementation of conservation practices. Results indicate that the two Steele Bayou stations have loads and yields of total nitrogen, phosphorus, and suspended sediment comparable to other agricultural basins in the southeastern and central United States. However, nitrate plus nitrite yields from basins in the Mississippi River alluvial plain, including the Steele Bayou Basin, are generally lower than those from other agricultural basins in the southeastern and central United States. Seasonal variation in nutrient and sediment loads was observed at both stations for most constituents: about 50 percent of the total annual nutrient and sediment load occurred during spring (February through May), and between 25 and 50 percent during late fall and winter (October through January).
These seasonal patterns probably reflect a combination of seasonal patterns in precipitation, runoff, and streamflow, and in the timing of fertilizer application. Median concentrations of total nitrogen, nitrate plus nitrite, total phosphorus, orthophosphate, and suspended sediment were slightly higher at the upstream station, Steele Bayou near Glen Allan
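The seasonal load apportionment described above can be sketched by binning daily loads into the report's seasons and computing each season's share of the annual total. The daily loads below are synthetic, and the water-year span and season boundaries are taken from the abstract.

```python
# Hedged sketch: share of annual constituent load by season (Feb-May
# "spring", Oct-Jan "late fall/winter", Jun-Sep remainder); synthetic loads.
import numpy as np
import pandas as pd

days = pd.date_range("2012-10-01", "2013-09-30", freq="D")  # one water year
rng = np.random.default_rng(1)
load = pd.Series(rng.lognormal(mean=2.0, sigma=1.0, size=len(days)), index=days)

def season(month):
    if month in (2, 3, 4, 5):
        return "spring (Feb-May)"
    if month in (10, 11, 12, 1):
        return "late fall/winter (Oct-Jan)"
    return "summer (Jun-Sep)"

shares = load.groupby(load.index.month.map(season)).sum() / load.sum()
print(shares.round(2))
```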

  13. Henretta Creek reclamation project

    International Nuclear Information System (INIS)

    Pumphrey, J.F.

    2009-01-01

    Teck Coal Ltd. operates six open-pit coal mines, five of which are located in the Elk Valley in southeastern British Columbia. The Fording River Operations (FRO) began in 1971, with mining areas in Eagle Mountain, Turnbull Mountain, and the Henretta Valley. The recovery of approximately 5 million tons of coal from the Henretta Creek Valley posed significant challenges to mine planners, hydrologists, and environmental experts because the coal had to be recovered from the valley flanks and also from under the main valley floor, on which the fish-bearing Henretta Creek runs. The Henretta Dragline Mining project is described along with the water control structures and fisheries management efforts for cutthroat trout. A detailed Environmental Impact Assessment and Stage 1 mining report for the Henretta Valley area was completed in December 1990, and FRO was granted a mining and reclamation permit in 1991. A temporary 1,270-metre relocation of the creek was required in April 1997 to enable mining on both sides of, and below, the creek bed. Innovative construction techniques included a diversion of Henretta Creek through large-diameter steel culverts and a specialized crossing of the creek to allow fish passage. The first water flowed through the reclaimed Henretta Creek channel in late 1998, and the first high flow occurred in the spring of 2000. Teck Coal FRO then launched an annual fish and fish habitat monitoring program focused on the Henretta Creek reclaimed channel and Henretta Lake. This document presents the results from the final year, 2006, and a summary of the 7-year aquatic monitoring program. It was concluded that, from mining through to reclamation, the Henretta project demonstrates the commitment and success of mining and reclamation practices at Teck Coal. Indicators of the project's success include riparian zone vegetation, fisheries re-establishment, aquatic communities, and habitat utilization by terrestrial and avian species. 33 refs., 1 fig.

  14. Vegetation - Pine Creek WA and Fitzhugh Creek WA [ds484]

    Data.gov (United States)

    California Natural Resource Agency — This fine-scale vegetation classification and map of the Pine Creek and Fitzhugh Creek Wildlife Areas, Modoc County, California was created following FGDC and...

  15. The Impact of Rainfall on Fecal Coliform Bacteria in Bayou Dorcheat (North Louisiana)

    Directory of Open Access Journals (Sweden)

    Paul B. Tchounwou

    2006-03-01

Fecal coliform bacteria are the most common pollutant in rivers and streams. In Louisiana, it has been reported that 37% of surveyed river miles, 31% of lakes, and 23% of estuarine waters had some level of contamination. The objective of this research was to assess the effect of surface runoff and rainfall amounts on fecal coliform bacterial densities in Bayou Dorcheat in Louisiana. Bayou Dorcheat has been designated by the Louisiana Department of Environmental Quality as a waterway with uses such as primary contact recreation, secondary contact recreation, propagation of fish and wildlife, and agriculture, and as an outstanding natural resource water. Samples from Bayou Dorcheat were collected monthly and analyzed for the presence of fecal coliforms. Fecal coliforms isolated from these samples were identified to the species level. The analysis of the bacterial levels was performed following standard test protocols as described in Standard Methods for the Examination of Water and Wastewater. Information on rainfall and surface runoff amounts for the selected years was retrieved from the Louisiana Office of State Climatology. It was found that a significant increase in fecal coliform numbers may be associated with average rainfall amounts. Possible sources of elevated coliform counts include sewage discharges from municipal treatment plants and septic tanks, storm water overflows, and runoff from pastures and range lands. It can be concluded that nonpoint source pollution carried by surface runoff has a significant effect on bacterial levels in water resources.

  16. Hepatic lesions in mollies (Poecilia latipinna) collected from Bayou Trepagnier, Louisiana

    International Nuclear Information System (INIS)

    Thiyagarajah, A.

    1993-01-01

Mollies, Poecilia latipinna, are a small fish species belonging to the family Poeciliidae. Mollies are surface feeders and are commonly found in Louisiana waters. Bayou Trepagnier is located in the Lake Pontchartrain Basin, in St. Charles Parish, Louisiana, and receives treated wastewater and stormwater from an oil refinery and manufacturing complex. The purpose of this study was to determine the effects of refinery discharges on mollies from Bayou Trepagnier. Fish were caught by beach seine, examined for gross lesions and then fixed in 10% neutral buffered formalin for histopathological analysis. Paraffin-embedded fish were cut at 6 μm and stained with hematoxylin and eosin. Lesions observed in mollies were grouped into (1) neoplasms, (2) preneoplastic lesions, and (3) cytotoxic lesions. Hepatocellular carcinoma was the only neoplasm found in these fish. The preneoplastic lesions included basophilic foci, eosinophilic foci, and clear-cell foci. Cytotoxic lesions observed were fatty change, focal necrosis, hyaline degeneration of hepatocytes, and fatty change in pancreatic acinar cells. These preliminary results suggest the presence of carcinogens in Bayou Trepagnier.

  17. Lead isotopes in tree rings: Chronology of pollution in Bayou Trepagnier, Louisiana

    International Nuclear Information System (INIS)

    Marcantonio, F.; Flowers, G.; Thien, L.; Ellgaard, E.

    1998-01-01

The authors have measured the Pb isotopic composition of tree rings from seven trees in both highly contaminated and relatively noncontaminated regions of Bayou Trepagnier, a bayou in southern Louisiana that has had oil refinery effluent discharged into it over the past 70 years. To their knowledge, this is the first time that Pb isotope tree-ring records have been used to assess the sources and extent of heavy-metal contamination of the environment through time. When tree-ring 206Pb/208Pb and 206Pb/207Pb isotope ratios are plotted against one another, a straight line is defined by four of the most contaminated trees. This linear correlation suggests mixing between two sources of Pb. One of the sources is derived from the highly polluted dredge spoils on the banks of the bayou and the other from the natural environment. The nature of the contaminant Pb is unique in that it is, isotopically, relatively homogeneous and extremely radiogenic, similar to ores of the Mississippi Valley (i.e., 206Pb/207Pb = 1.28). This singular pollutant isotope signature has enabled them to determine the extent of Pb contamination in each cypress wood sample. The isotope results indicate that Pb uptake by the tree is dominated by local-scale root processes and is, therefore, hydrologically and chemically controlled. In addition, the authors propose that the mobility and bioavailability of Pb in the environment depends on its chemical speciation.
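The two-endmember mixing implied by the linear isotope array reduces to a simple mass balance; a minimal sketch follows. Only the contaminant 206Pb/207Pb value of 1.28 comes from the abstract; the natural-background and sample ratios below are hypothetical values for illustration only.

```python
def mixing_fraction(r_sample, r_end1, r_end2):
    """Fraction of endmember 1 implied by an isotope ratio under
    two-endmember mixing. Linear mass-balance approximation, which
    holds when the denominator-isotope abundances of the two
    endmembers are similar."""
    return (r_sample - r_end2) / (r_end1 - r_end2)

# 206Pb/207Pb values: contaminant signature (1.28) quoted in the
# abstract; natural background and sample are assumed for this sketch.
r_contaminant = 1.28
r_natural = 1.20
r_sample = 1.26

f = mixing_fraction(r_sample, r_contaminant, r_natural)  # ≈ 0.75
```

With these assumed values, roughly three-quarters of the Pb in the sample would be attributed to the contaminant endmember; the paper's actual apportionment depends on its measured background ratios.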

  18. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  19. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; before "Big Data," the phenomenon was most recently referred to as the "information explosion." In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  20. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  1. Channel stability of Turkey Creek, Nebraska

    Science.gov (United States)

    Rus, David L.; Soenksen, Philip J.

    1998-01-01

    Channelization on Turkey Creek and its receiving stream, the South Fork Big Nemaha River, has disturbed the equilibrium of Turkey Creek and has led to channel-stability problems, such as degradation and channel widening, which pose a threat to bridges and land adjacent to the stream. As part of a multiagency study, the U.S. Geological Survey assessed channel stability at two bridge sites on upper and middle portions of Turkey Creek by analyzing streambed-elevation data for gradation changes, comparing recent cross-section surveys and historic accounts, identifying bank-failure blocks, and analyzing tree-ring samples. These results were compared to gradation data and trend results for a U.S. Geological Survey streamflow-gaging station near the mouth of Turkey Creek from a previous study. Examination of data on streambed elevations reveals that degradation has occurred. The streambed elevation declined 0.5 m at the upper site from 1967-97. The streambed elevation declined by 3.2 m at the middle site from 1948-97 and exposed 2 m of the pilings of the Nebraska Highway 8 bridge. Channel widening could not be verified at the two sites from 1967-97, but a historic account indicates widening at the middle site to be two to three times that of the 1949 channel width. Small bank failures were evident at the upper site and a 4-m-wide bank failure occurred at the middle site in 1987 according to tree ring analyses. Examination of streambed-elevation data from a previous study at the lower site reveals a statistically significant aggrading trend from 1958-93. Further examination of these data suggests minor degradation occurred until 1975, followed by aggradation.

  2. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  3. Geomechanical testing of Bayou Choctaw 102B core for SPR analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ingraham, Mathew Duffy; Broome, Scott Thomas; Bauer, Stephen J.; Barrow, Perry Carl; Flint, Gregory Mark.

    2014-02-01

A laboratory testing program was developed to examine the short-term mechanical and time-dependent (creep) behavior of salt from the Bayou Choctaw Salt Dome. This report documents the test methodologies, and constitutive properties inferred from tests performed. These are used to extend our understanding of the mechanical behavior of the Bayou Choctaw domal salt and provide a data set for numerical analyses. The resulting information will be used to support numerical analyses of the current state of the Bayou Choctaw Dome as it relates to its crude oil storage function as part of the US Strategic Petroleum Reserve. Core obtained from Drill Hole BC-102B was tested under creep and quasi-static constant mean stress axisymmetric compression, and constant mean stress axisymmetric extension conditions. Creep tests were performed at 100 degrees Fahrenheit, and the axisymmetric tests were performed at ambient temperatures (72-78 degrees Fahrenheit). The testing performed indicates that the dilation criterion is pressure and stress state dependent. It was found that as the mean stress increases, the shear stress required to cause dilation increases. The results for this salt are reasonably consistent with those observed for other domal salts. Also it was observed that tests performed under extensile conditions required consistently lower shear stress to cause dilation for the same mean stress, which is consistent with other domal salts. Young's moduli ranged from 3.95 × 10^6 to 8.51 × 10^6 psi with an average of 6.44 × 10^6 psi, with Poisson's ratios ranging from 0.10 to 0.43 with an average of 0.30. Creep testing indicates that the BC salt is intermediate in creep resistance when compared with other bedded and domal salt steady-state behavior.
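The reported averages (E ≈ 6.44 × 10^6 psi, ν ≈ 0.30) fix the remaining isotropic elastic constants; a minimal sketch, using the standard isotropic linear-elasticity relations rather than anything stated in the report itself:

```python
# Standard isotropic linear-elasticity conversions (textbook relations,
# not taken from the SPR report): derive shear and bulk moduli from the
# average Young's modulus and Poisson's ratio reported for BC-102B core.

def shear_modulus(E, nu):
    """G = E / (2 (1 + nu)) for an isotropic elastic solid."""
    return E / (2.0 * (1.0 + nu))

def bulk_modulus(E, nu):
    """K = E / (3 (1 - 2 nu)) for an isotropic elastic solid."""
    return E / (3.0 * (1.0 - 2.0 * nu))

E_avg = 6.44e6   # psi, average Young's modulus from the abstract
nu_avg = 0.30    # average Poisson's ratio from the abstract

G = shear_modulus(E_avg, nu_avg)  # ≈ 2.48e6 psi
K = bulk_modulus(E_avg, nu_avg)   # ≈ 5.37e6 psi
```

These derived values are only as precise as the reported averages; the numerical analyses described in the report would use the full test-by-test data set rather than these single-point estimates.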

  4. Pine Creek uranium province

    International Nuclear Information System (INIS)

    Bower, M.B.; Needham, R.S.; Page, R.W.; Stuart-Smith, P.G.; Wyborn, L.A.I.

    1985-01-01

    The objective of this project is to help establish a sound geological framework of the Pine Creek region through regional geological, geochemical and geophysical studies. Uranium ore at the Coronation Hill U-Au mine is confined to a wedge of conglomerate in faulted contact with altered volcanics. The uranium, which is classified as epigenetic sandstone type, is derived from a uranium-enriched felsic volcanic source

  5. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  6. 76 FR 79145 - Drawbridge Operation Regulations; Bayou Liberty, Mile 2.0, St. Tammany Parish, Slidell, LA

    Science.gov (United States)

    2011-12-21

    ... proposed rule concerning the regulation governing the operation of the SR 433 Bridge over Bayou Liberty... comments expressing concern with the impact the proposed changes would have on public access to the... on the Internet by going to http://www.regulations.gov , inserting USCG-2010-0972 in the ``Keyword...

  7. Expansion analyses of strategic petroleum reserve in Bayou Choctaw : revised locations.

    Energy Technology Data Exchange (ETDEWEB)

    Ehgartner, Brian L.; Park, Byoung Yoon

    2010-11-01

    This report summarizes a series of three-dimensional simulations for the Bayou Choctaw Strategic Petroleum Reserve. The U.S. Department of Energy plans to leach two new caverns and convert one of the existing caverns within the Bayou Choctaw salt dome to expand its petroleum reserve storage capacity. An existing finite element mesh from previous analyses is modified by changing the locations of two caverns. The structural integrity of the three expansion caverns and the interaction between all the caverns in the dome are investigated. The impacts of the expansion on underground creep closure, surface subsidence, infrastructure, and well integrity are quantified. Two scenarios were used for the duration and timing of workover conditions where wellhead pressures are temporarily reduced to atmospheric pressure. The three expansion caverns are predicted to be structurally stable against tensile failure for both scenarios. Dilatant failure is not expected within the vicinity of the expansion caverns. Damage to surface structures is not predicted and there is not a marked increase in surface strains due to the presence of the three expansion caverns. The wells into the caverns should not undergo yield. The results show that from a structural viewpoint, the locations of the two newly proposed expansion caverns are acceptable, and all three expansion caverns can be safely constructed and operated.

  8. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  9. Judy Creek and beyond

    International Nuclear Information System (INIS)

    Kerr, S.A.

    1999-01-01

The story of the Pengrowth Energy Trust, a company created in 1988 to provide investors with an opportunity to participate in the oil and gas industry without the higher investment risk associated with exploratory drilling, is the vehicle used to provide an overview of the development of the Judy Creek oil field, an historical sketch of Imperial Oil Limited, and of the development of the community of Swan Hills, a town carved out of muskeg by early pioneers in 1957-1958. The book is replete with anecdotes and photographs, depicting the indomitable spirit of the people whose determination and faith made the development of the oil industry in Alberta possible.

  10. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  11. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  12. Soil salinity data from Bayou Dupont and flanking marshes, New Orleans, LA, 2015-09-16 to 2016-03-30 (NCEI Accession 0151633)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The project restored both structural and habitat functions of Bayou Dupont and flanking marshes. The project created and nourished marsh and restored a ridge on the...

  13. Soil salinity data from Grand Liard Bayou and flanking marshes, New Orleans, LA, 2015-12-01 to 2016-03-30 (NCEI Accession 0151634)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The project restored both structural and habitat functions of Grand Liard Bayou and flanking marshes. The project created and nourished marsh and restored a ridge on...

  14. Ship Creek bioassessment investigations

    Energy Technology Data Exchange (ETDEWEB)

    Cushing, C.E.; Mueller, R.P.; Murphy, M.T.

    1995-06-01

Pacific Northwest Laboratory (PNL) was asked by Elmendorf Air Force Base (EAFB) personnel to conduct a series of collections of macroinvertebrates and sediments from Ship Creek to (1) establish baseline data on these populations for reference in evaluating possible impacts from Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) activities at two operable units, (2) compare current population indices with those found by previous investigations in Ship Creek, and (3) determine baseline concentrations of any contaminants in the sediments associated with the macroinvertebrates. A specific suite of indices established by the US Environmental Protection Agency (EPA) was requested for the macroinvertebrate analyses; these follow the Rapid Bioassessment Protocol developed by Plafkin et al. (1989) and will be described. Sediment sample analyses included a Microtox bioassay and chemical analysis for contaminants of concern. These analyses included volatile organic compounds, total gasoline and diesel hydrocarbons (EPA method 8015, CA modified), total organic carbon, and an inductively coupled plasma/mass spectrometry (ICP/MS) metals scan. Appendix A reports on the sediment analyses. The Work Plan is attached as Appendix B.

  15. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone "stars", or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions

  16. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  17. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)

    1979-06-28

The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with special reference to any future inquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors, and an attempt is made to tackle the problem that participation in a public inquiry comes far too late to be objective. The author feels that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries, but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  18. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

The aim of the bachelor thesis is to describe the Big Data issue and the OLAP aggregation operations for decision support that are applied to such data using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and the issues involved in implementing them. This is followed by an overall evaluation of the work and the possibilities for future use of the resulting system.

  19. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

The term 'Big Data' describes innovative methods and technologies to capture, store, distribute, manage, and analyze petabyte-sized or larger sets of data with high velocity and varied structures. Big data can be structured, unstructured, or semi-structured, rendering conventional data-management techniques inadequate. Data are generated from many different sources and can arrive in the system at various rates. In order to handle this...

  20. Bridge Creek IMW database - Bridge Creek Restoration and Monitoring Project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The incised and degraded habitat of Bridge Creek is thought to be limiting a population of ESA-listed steelhead (Oncorhynchus mykiss). A logical restoration approach...

  1. Geothermal Energy Geopressure Subprogram, GCO-DOE, Pleasant Bayou No. 1

    Energy Technology Data Exchange (ETDEWEB)

    none

    1978-03-01

This Environmental Assessment (EA) has been prepared to assess the environmental implications of the Department of Energy's proposal to drill, complete, and test one geopressure well located in Brazoria County on a 2-hectare (5-acre) test site 64 km (40 mi) south of Houston, Abstract 107, Perry and Austin Survey, Brazoria County, TX. The test well is herein referred to as GCO-DOE Pleasant Bayou No. 1. A maximum of four disposal wells will be located within 0.8 km (1/2 mi) of the proposed well. The DOE and the University of Texas Center for Energy Studies propose to operate the test facility for three years to evaluate the geopressure potential of the subsurface. Tests to be conducted include flow rates, fluid composition, temperature, gas content, geologic characteristics, and the land subsidence potential for subsequent production.

  2. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  3. BIG WAPWALLOPEN CREEK AND LATTIMER CREEK HYDRAULICS, LUZERNE COUNTY, PA, USA

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — Recent developments in digital terrain and geospatial database management technology make it possible to protect this investment for existing and future projects to...

  4. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  5. Tidal Creek Sentinel Habitat Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ecological Research, Assessment and Prediction's Tidal Creeks: Sentinel Habitat Database was developed to support the National Oceanic and Atmospheric...

  6. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off involving not only economic rationales and quality considerations, but also control over sensitive personal data and the ethical implications for the citizen. In the DAMD case, data are on the one hand used "in the service of a good cause" to...

  7. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  8. Pine creek geosyncline

    International Nuclear Information System (INIS)

    Needham, R.S.; Ewers, G.R.; Ferguson, J.

    1988-01-01

The Pine Creek Geosyncline is a 66,000 km² inlier of Early Proterozoic metasediments, mafic and felsic intrusives and minor extrusives, surrounding small late Archaean granitic domes. Economic uranium occurrences cluster into three fields, with the Alligator Rivers field being the most significant. The metasediments are alluvial and reduced shallow-water pelites and psammites. Evaporitic carbonate developed on shallow shelves around Archaean islands. Basin development and sedimentation (c. 2000-1870 Ma) were related to gradual subsidence induced by crustal extension. Facies variations and volcanism were in places controlled by the extensional faults. The rocks were metamorphosed to lower to high grade, complexly folded, and intruded by numerous granitoids from c. 1870 to 1730 Ma. Late orogenic felsic volcanics accumulated in local rift systems. Middle Proterozoic sandstone was deposited on a peneplaned and deeply weathered surface from about 1650 Ma. Uranium is enriched in some Archaean and Proterozoic igneous rocks, but there is no local or regional enrichment of the metasedimentary hosts or of the unconformably overlying sandstone. There is no regional gravity, magnetic or radiometric character attributable to the region's significance as a uranium province; contrasts with surrounding sedimentary basins reflect expected differences in rock properties between a heterogeneous igneous/metamorphic region and relatively homogeneous undeformed and unmineralized sediments. Uranium-enriched Archaean and Proterozoic granitoids and felsic volcanics with labile U are likely though not exclusive source rocks. U was probably transported in oxidized low temperature solutions as uranyl complexes and precipitated in reduced, structurally controlled, low-pressure traps. All uranium occurrences are broadly classified as 'Proterozoic unconformity related'. Greatest potential for further discovery is offered in the Alligator Rivers field, where perhaps at least 3 to 5.5 times the

  9. Water quality study at the Congaree Swamp National monument of Myers Creek, Reeves Creek and Toms Creek. Technical report

    International Nuclear Information System (INIS)

    Rikard, M.

    1991-11-01

The Congaree Swamp National Monument is one of the last significant near-virgin tracts of bottomland hardwood forest in the southeastern United States. The study documents a water quality monitoring program on Myers Creek, Reeves Creek and Toms Creek. Basic water quality parameters were analyzed. High levels of aluminum and iron were found, and recommendations were made for further monitoring.

  10. Asotin Creek Model Watershed Plan

    Energy Technology Data Exchange (ETDEWEB)

    Browne, D.; Holzmiller, J.; Koch, F.; Polumsky, S.; Schlee, D.; Thiessen, G.; Johnson, C.

    1995-04-01

The Asotin Creek Model Watershed Plan is the first to be developed in Washington State which is specifically concerned with habitat protection and restoration for salmon and trout. The plan is consistent with the habitat element of the "Strategy for Salmon". Asotin Creek is similar in many ways to other salmon-bearing streams in the Snake River system. Its watershed has been significantly impacted by human activities and catastrophic natural events, such as floods and droughts. It supports only remnant salmon and trout populations compared to earlier years. It will require protection and restoration of its fish habitat and riparian corridor in order to increase its salmonid productivity. The watershed coordinator for the Asotin County Conservation District led a locally based process that combined local concerns and knowledge with technology from several agencies to produce the Asotin Creek Model Watershed Plan.

  11. 33 CFR 117.331 - Snake Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Snake Creek. 117.331 Section 117.331 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.331 Snake Creek. The draw of the Snake Creek...

  12. Narrow hybrid zone between two subspecies of big sagebrush (Artemisia tridentata: Asteraceae): XI. Plant-insect interactions in reciprocal transplant gardens

    Science.gov (United States)

    John H. Graham; E. Durant McArthur; D. Carl Freeman

    2001-01-01

    Basin big sagebrush (Artemisia tridentata ssp. tridentata) and mountain big sagebrush (A. t. ssp. vaseyana) hybridize in a narrow zone near Salt Creek, Utah. Reciprocal transplant experiments in this hybrid zone demonstrate that hybrids are more fit than either parental subspecies, but only in the hybrid zone. Do hybrids experience greater, or lesser, use by...

  13. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care in particular. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than it solves; in short, when it comes to the use of data in healthcare, "size isn't everything."

  14. 78 FR 3909 - Big Oaks National Wildlife Refuge, IN; Glacial Ridge National Wildlife Refuge, MN; Northern...

    Science.gov (United States)

    2013-01-17

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N283; FXRS1265030000-134-FF03R06000] Big Oaks National Wildlife Refuge, IN; Glacial Ridge National Wildlife Refuge, MN; Northern Tallgrass Prairie National Wildlife Refuge, MN; Whittlesey Creek National Wildlife Refuge, WI AGENCY: Fish...

  15. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  16. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet Program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  17. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  18. Strategic Petroleum Reserve (SPR) additional geologic site characterization studies, Bayou Choctaw salt dome, Louisiana

    Energy Technology Data Exchange (ETDEWEB)

    Neal, J.T. [Sandia National Labs., Albuquerque, NM (United States); Magorian, T.R. [Magorian (Thomas R.), Amherst, NY (United States); Byrne, K.O.; Denzler, S. [Acres International Corp., Amherst, NY (United States)

    1993-09-01

    This report revises and updates the geologic site characterization report published in 1980. Revised structure maps and sections show interpretative differences in the dome shape and caprock structural contours, especially a major east-west trending shear zone not mapped in the 1980 report. Excessive gas influx in Caverns 18 and 20 may be associated with this shear zone. Subsidence values at Bayou Choctaw are among the lowest in the SPR system, averaging only about 10 mm/yr, but measurement and interpretation issues persist, as observed values often approximate measurement accuracy. Periodic, temporary flooding is a continuing concern because of the low site elevation (less than 10 ft), and this may intensify as future subsidence lowers the surface even further. Cavern 4 was re-sonared in 1992, and the profiles suggest that significant change has not occurred since 1980, thereby reducing the uncertainty of possible overburden collapse, as occurred at Cavern 7 in 1954. Other potential integrity issues persist, such as the proximity of Cavern 20 to the dome edge and the narrow web separating Caverns 15 and 17. Injection wells have been used for the disposal of brine but have been only marginally effective thus far; recompletions into more permeable lower Pleistocene gravels may be a practical way of increasing injection capacity and brinefield efficiency. Cavern storage space is limited on this already crowded dome, but 15 MMBBL could be gained by enlarging Cavern 19 and by constructing a new cavern beneath and slightly north of abandoned Cavern 13. Environmental issues center on the low site elevation: the backswamp environment combined with the potential for periodic flooding creates conditions that will require continuing surveillance.

  19. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  20. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  1. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  2. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  3. 75 FR 40034 - Northeastern Tributary Reservoirs Land Management Plan, Beaver Creek, Clear Creek, Boone, Fort...

    Science.gov (United States)

    2010-07-13

    ... TENNESSEE VALLEY AUTHORITY Northeastern Tributary Reservoirs Land Management Plan, Beaver Creek...-managed public land on Beaver Creek, Clear Creek, Boone, Fort Patrick Henry, South Holston, Watauga, and... Proposed Land Use Alternative) identified in the final environmental impact statement (FEIS). Under the...

  4. 78 FR 62616 - Salmon Creek Hydroelectric Company, Salmon Creek Hydroelectric Company, LLC; Notice of Transfer...

    Science.gov (United States)

    2013-10-22

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 3730-005] Salmon Creek Hydroelectric Company, Salmon Creek Hydroelectric Company, LLC; Notice of Transfer of Exemption 1. By letter filed September 23, 2013, Salmon Creek Hydroelectric Company informed the Commission that they have...

  5. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which relates to the storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which addresses big data management from both a technological and a business perspective.

  6. Characterizing Micro- and Macro-Scale Seismicity from Bayou Corne, Louisiana

    Science.gov (United States)

    Baig, A. M.; Urbancic, T.; Karimi, S.

    2013-12-01

    The initiation of felt seismicity in Bayou Corne, Louisiana, coupled with other phenomena detected by residents of the nearby housing development, prompted a call to install a broadband seismic network to monitor subsurface deformation. The initial deployment was in place to characterize the deformation contemporaneous with the formation of a sinkhole located in close proximity to a salt dome. Seismic events generated during this period followed a swarm-like behaviour, with moment magnitudes culminating around Mw 2.5. However, the seismic data recorded during this sequence suffer from poor signal-to-noise ratios, onsets that are very difficult to pick, and the presence of a significant amount of energy arriving later in the waveforms. Efforts to understand the complexity in these waveforms are ongoing, and involve invoking the complexities inherent in recording in a highly attenuating swamp overlying a complex three-dimensional structure with the strong material property contrast of the salt dome. In order to understand the event character, as well as to locally lower the completeness threshold of the sequence, a downhole array of 15 Hz sensors was deployed in a newly drilled well around the salt dome. Although the deployment lasted a little over a month, over 1000 events were detected down to moment magnitude Mw -3. Waveform quality tended to be excellent, with very distinct P and S wave arrivals observable across the array for most events. The highest magnitude events were also seen on the surface network, allowing for the opportunity to observe the complexities introduced by site effects while overcoming the saturation effects on the higher-frequency downhole geophones. This hybrid downhole and surface array illustrates how a full picture of subsurface deformation is only made possible by combining high-frequency downhole instrumentation to see the microseismicity, complemented with a broadband array to accurately characterize the source
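The magnitude range in this record (felt events near Mw 2.5 down to microseismicity near Mw -3) can be put in perspective with the standard Hanks-Kanamori moment-magnitude relation; this is a general seismological conversion, not the study's own processing, and the function names below are ours:

```python
import math

def moment_magnitude(m0_newton_meters: float) -> float:
    """Seismic moment M0 (N*m) to moment magnitude Mw,
    using the standard relation Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

def seismic_moment(mw: float) -> float:
    """Inverse relation: moment magnitude Mw to seismic moment M0 (N*m)."""
    return 10.0 ** (1.5 * mw + 9.1)

# The 5.5-magnitude-unit span between the largest felt events (Mw 2.5)
# and the smallest detected microseismicity (Mw -3) corresponds to a
# factor of 10**(1.5 * 5.5), roughly 1.8e8, in seismic moment.
print(f"Mw  2.5 -> M0 = {seismic_moment(2.5):.2e} N*m")
print(f"Mw -3.0 -> M0 = {seismic_moment(-3.0):.2e} N*m")
```

This moment ratio is why the low-noise downhole geophones were needed to see the small events while the broadband surface array kept the large ones on scale.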

  7. Sacaton riparian grasslands of the Sky Islands: Mapping distribution and ecological condition using state-and-transition models in Upper Cienega Creek Watershed

    Science.gov (United States)

    Ron Tiller; Melissa Hughes; Gita Bodner

    2013-01-01

    Riparian grasslands dominated by Sporobolus wrightii (big sacaton) were once widely distributed in the intermountain basins of the Madrean Archipelago. These alluvial grasslands are still recognized as key resources for watershed function, livestock, and wildlife. The upper Cienega Creek watershed in SE Arizona is thought to harbor some of the region’s most extensive...

  8. 33 CFR 117.917 - Battery Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Battery Creek. 117.917 Section 117.917 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements South Carolina § 117.917 Battery Creek. The draw of...

  9. 33 CFR 117.543 - Bear Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Bear Creek. 117.543 Section 117.543 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.543 Bear Creek. (a) The draws of the Baltimore...

  10. 27 CFR 9.211 - Swan Creek.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Swan Creek. 9.211 Section 9.211 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT OF THE TREASURY LIQUORS AMERICAN VITICULTURAL AREAS Approved American Viticultural Areas § 9.211 Swan Creek. (a) Name. The name of the viticultural are...

  11. 33 CFR 117.231 - Brandywine Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Brandywine Creek. 117.231 Section 117.231 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Delaware § 117.231 Brandywine Creek. The draw of the...

  12. 33 CFR 117.841 - Smith Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Smith Creek. 117.841 Section 117.841 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements North Carolina § 117.841 Smith Creek. The draw of the S117-S133...

  13. 33 CFR 117.324 - Rice Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Rice Creek. 117.324 Section 117.324 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.324 Rice Creek. The CSX Railroad Swingbridge, mile...

  14. Currents and siltation at Dharamtar creek, Bombay

    Digital Repository Service at National Institute of Oceanography (India)

    Swamy, G.N.; Kolhatkar, V.M.; Fernandes, A.A.

    Hydrographic data collected in Dharamtar Creek during 1976-77 have been analysed, showing that the waters in the creek are well mixed and that salinity varies with the tide. The tidal currents are generally strong. The distribution...

  15. 33 CFR 117.335 - Taylor Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Taylor Creek. 117.335 Section 117.335 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.335 Taylor Creek. The draw of US441 bridge, mile 0...

  16. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  17. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence of an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second later), and the fate of the Universe are all discussed. (U.K.)

  18. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The book explains that big data is where we use huge quantities of data to make better predictions based on patterns we identify in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  19. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  20. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  1. Buck Creek River Flow Analysis

    Science.gov (United States)

    Dhanapala, Yasas; George, Elizabeth; Ritter, John

    2009-04-01

    Buck Creek, flowing through Springfield, Ohio, has a number of low-head dams currently in place that cause safety issues and sometimes make it impossible for recreational boaters to pass through. The safety issues include hydraulic jumps and the back eddies created by the dams, which are known as "drowning machines." In this study we model the flow of Buck Creek using topographical and flow data provided by the Geology Department of Wittenberg University. The flow is analyzed using the Hydraulic Engineering Center - River Analysis System (HEC-RAS) software. As a first step, a model of the river near Snyder Park was created with the current structure in place for validation purposes. Afterwards, the low-head dam is replaced with four drop structures with V-notch overflow gates, and the river bed is altered to reflect plunge pools after each drop structure. This analysis will provide insight into how the flow will behave after the changes are made. In addition, a sediment transport analysis is being conducted to provide information about the stability of these structures.
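The V-notch overflow gates proposed in this record have a well-known rating behaviour; as a hedged sketch, the standard sharp-crested V-notch weir equation relates head to discharge (the discharge coefficient of 0.58 below is a typical textbook value we assume, not one taken from the study):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def v_notch_discharge(head_m: float, notch_angle_deg: float = 90.0,
                      cd: float = 0.58) -> float:
    """Discharge (m^3/s) over a sharp-crested V-notch weir for a given
    head above the notch vertex, using the standard weir equation
    Q = Cd * (8/15) * sqrt(2g) * tan(theta/2) * h^(5/2)."""
    theta = math.radians(notch_angle_deg)
    return (cd * (8.0 / 15.0) * math.sqrt(2.0 * G)
            * math.tan(theta / 2.0) * head_m ** 2.5)

# Example: 0.5 m of head on a 90-degree notch
print(f"Q = {v_notch_discharge(0.5):.3f} m^3/s")
```

The h^(5/2) dependence is what makes V-notches attractive at a drop structure: small changes in head produce easily measurable changes in discharge, and low flows remain concentrated in the notch.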

  2. Hydrology of the Johnson Creek Basin, Oregon

    Science.gov (United States)

    Lee, Karl K.; Snyder, Daniel T.

    2009-01-01

    The Johnson Creek basin is an important resource in the Portland, Oregon, metropolitan area. Johnson Creek forms a wildlife and recreational corridor through densely populated areas of the cities of Milwaukie, Portland, and Gresham, and rural and agricultural areas of Multnomah and Clackamas Counties. The basin has changed as a result of agricultural and urban development, stream channelization, and construction of roads, drains, and other features characteristic of human occupation. Flooding of Johnson Creek is a concern for the public and for water management officials. The interaction of the groundwater and surface-water systems in the Johnson Creek basin also is important. The occurrence of flooding from high groundwater discharge and from a rising water table prompted this study. As the Portland metropolitan area continues to grow, human-induced effects on streams in the Johnson Creek basin will continue. This report provides information on the groundwater and surface-water systems over a range of hydrologic conditions, as well as the interaction of these systems, and will aid in management of water resources in the area. High and low flows of Crystal Springs Creek, a tributary to Johnson Creek, were explained by streamflow and groundwater levels collected for this study, and results from previous studies. High flows of Crystal Springs Creek began in summer 1996, and did not diminish until 2000. Low streamflow of Crystal Springs Creek occurred in 2005. Flow of Crystal Springs Creek was related to water-level fluctuations in a nearby well, enabling prediction of streamflow based on groundwater level. Holgate Lake is an ephemeral lake in Southeast Portland that has inundated residential areas several times since the 1940s. The water-surface elevation of the lake closely tracked the elevation of the water table in a nearby well, indicating that the occurrence of the lake is an expression of the water table. Antecedent conditions of the groundwater level and autumn
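The record describes predicting streamflow from the groundwater level in a nearby well. In its simplest form that is a regression of paired observations; a minimal sketch with hypothetical numbers (the values below are illustrative only, not the study's data):

```python
import numpy as np

# Hypothetical paired observations: groundwater level in a nearby well
# (m above datum) and concurrent streamflow (m^3/s).
well_level = np.array([10.2, 10.8, 11.5, 12.1, 12.9, 13.4])
streamflow = np.array([0.21, 0.29, 0.40, 0.48, 0.61, 0.68])

# Ordinary least-squares fit: streamflow = slope * level + intercept
slope, intercept = np.polyfit(well_level, streamflow, 1)

def predict_flow(level_m: float) -> float:
    """Predict streamflow from a groundwater-level reading."""
    return slope * level_m + intercept

print(f"predicted flow at a well level of 11.0 m: {predict_flow(11.0):.2f} m^3/s")
```

A relation like this only holds where the stream is strongly groundwater-fed, as Crystal Springs Creek is described to be; it would not transfer to runoff-dominated reaches.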

  3. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of systematic registration, and IT-competent employees and customers that could enable a leading position, but only if companies ready themselves for the next big data wave.

  4. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  5. 77 FR 10960 - Drawbridge Operation Regulation; Snake Creek, Islamorada, FL

    Science.gov (United States)

    2012-02-24

    ... Operation Regulation; Snake Creek, Islamorada, FL AGENCY: Coast Guard, DHS. ACTION: Notice of temporary... deviation from the regulation governing the operation of Snake Creek Bridge, mile 0.5, across Snake Creek... schedule of Snake Creek Bridge in Islamorada, Florida. This deviation will result in the bridge opening...

  6. Flood discharges and hydraulics near the mouths of Wolf Creek, Craig Branch, Manns Creek, Dunloup Creek, and Mill Creek in the New River Gorge National River, West Virginia

    Science.gov (United States)

    Wiley, J.B.

    1994-01-01

    The U.S. Geological Survey, in cooperation with the National Park Service, studied the frequency and magnitude of flooding near the mouths of five tributaries to the New River in the New River Gorge National River. The 100-year peak discharge at each tributary was determined from regional frequency equations. The 100-year discharge at Wolf Creek, Craig Branch, Manns Creek, Dunloup Creek, and Mill Creek was 3,400 cubic feet per second, 640 cubic feet per second, 8,200 cubic feet per second, 7,100 cubic feet per second, and 9,400 cubic feet per second, respectively. Flood elevations for each tributary were determined by application of a steady-state, one-dimensional flow model. Manning's roughness coefficients for the stream channels ranged from 0.040 to 0.100. Bridges that would be unable to contain the 100-year flood within the bridge opening included: the State Highway 82 bridge on Wolf Creek, the second Fayette County Highway 25 bridge upstream from the confluence with New River on Dunloup Creek, and an abandoned log bridge on Mill Creek.
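The Manning's roughness coefficients quoted in this record (0.040 to 0.100) enter a steady-state flow model through Manning's equation. A minimal SI-units sketch shows how the choice of n scales the computed discharge (the cross-section values below are hypothetical, not taken from the report):

```python
import math

def manning_discharge(area_m2: float, wetted_perimeter_m: float,
                      slope: float, n: float) -> float:
    """Discharge (m^3/s) from Manning's equation in SI form:
    Q = (1/n) * A * R^(2/3) * S^(1/2), with hydraulic radius R = A/P."""
    r = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * r ** (2.0 / 3.0) * math.sqrt(slope)

# Hypothetical cross-section, with n spanning the report's 0.040-0.100 range
for n in (0.040, 0.100):
    q = manning_discharge(area_m2=20.0, wetted_perimeter_m=15.0,
                          slope=0.005, n=n)
    print(f"n = {n:.3f} -> Q = {q:.1f} m^3/s")
```

Because Q scales as 1/n, moving from the smoothest to the roughest channel in the report's range cuts the computed conveyance by a factor of 2.5 for the same geometry and slope, which is why the roughness calibration matters for the 100-year flood elevations.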

  7. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  8. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  9. CREEK Project's Internal Creek Habitat Survey for Eight Creeks in the North Inlet Estuary, South Carolina: January 1998.

    Data.gov (United States)

    Baruch Institute for Marine and Coastal Sciences, Univ of South Carolina — A group of eight intertidal creeks with high densities of oysters, Crassostrea virginica, in North Inlet Estuary, South Carolina, USA were studied using a replicated...

  10. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  11. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ, and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.

  12. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  13. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  14. Featured Partner: Saddle Creek Logistics Services

    Science.gov (United States)

    This EPA fact sheet spotlights Saddle Creek Logistics as a SmartWay partner committed to sustainability in reducing greenhouse gas emissions and air pollution caused by freight transportation, partly by growing its fleet of compressed natural gas (CNG) vehicles.

  15. Some Physicochemical Characteristics of Badagry Creek, Nigeria ...

    African Journals Online (AJOL)

    West African Journal of Applied Ecology ... Badagry Creek runs through Nigeria and Republic of Benin with access to the Atlantic Ocean. ... Colour, surface temperature, pH, salinity, turbidity, phenol, dissolved oxygen, biological oxygen ...

  16. Tritium at the Steel Creek Landing

    International Nuclear Information System (INIS)

    Arnett, M.; Heffner, J.D.; Fledderman, P.D.; Littrell, J.W.; Hayes, D.W.; Dodgen, M.S.

    1998-01-01

    In December 1997 and January 1998, the South Carolina Department of Health and Environmental Control (SCDHEC) collected routine weekly grab samples from the Savannah River near the Steel Creek Boat Landing

  17. Mercury in Thana creek, Bombay harbour

    Digital Repository Service at National Institute of Oceanography (India)

    Zingde, M.D.; Desai, B.N.

    weight) with a marked increase from the harbour to the creek region, suggesting substantial mercury input in the head region. Chemical extraction with hydrogen peroxide indicated that more than 70% of the mercury was leachable and probably organically bound...

  18. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data stand for? By way of introduction to

  19. Wolf Creek Generating Station containment model

    International Nuclear Information System (INIS)

    Nguyen, D.H.; Neises, G.J.; Howard, M.L.

    1995-01-01

    This paper presents a CONTEMPT-LT/28 containment model that has been developed by Wolf Creek Nuclear Operating Corporation (WCNOC) to predict containment pressure and temperature behavior during postulated events at Wolf Creek Generating Station (WCGS). The model has been validated against data provided in the WCGS Updated Safety Analysis Report (USAR). The CONTEMPT-LT/28 model has been used extensively at WCGS to support plant operations and, recently, to support its 4.5% thermal power uprate project

  20. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data, and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  1. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  2. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  3. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Big data is, and will increasingly be, used as a tool for everything that happens both online and offline. Online in particular, Big Data offers many advantages and is a real help for all consumers. In this paper we discuss Big Data as an asset in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architecture principles of this technology. The most important contribution of this paper is presented in the cloud section.

  4. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  5. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  6. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security: book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Ariel Hamlin, Nabil ...

  7. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  8. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects ... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g., via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  9. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  10. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore, we compare and contrast the two geometries throughout.

  11. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  12. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. The information value is defined not only by extracting value from huge data sets, as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data, in an innovative manner, using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria, and its architecture, as well as its impact on processes worldwide.

  13. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  14. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined 'open data' and 'big data', these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of 'order' and 'relationality'. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  15. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Given the importance that the term Big Data has acquired, this research sought to study and analyze the state of the art of Big Data exhaustively; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so that everything concerning the central topic of the research can be known. The methodology included reviewing the state of the art of Big Data and presenting its current situation; becoming familiar with Big Data technologies; introducing some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and showing data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the Big Data environment.

  16. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  17. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    The objective of this paper is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  18. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  19. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  20. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
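    The bias point above can be made concrete with a toy simulation (our own illustration, not from the paper, with made-up numbers): a very large but non-randomly collected sample retains a systematic error that no amount of extra data removes, while a much smaller properly randomized sample does not.

```python
import random
import statistics

random.seed(42)

# Population: standard normal, so the true mean is ~0.
population = [random.gauss(0, 1) for _ in range(200_000)]
true_mean = statistics.fmean(population)

# A "big" but biased sample: only easy-to-record observations are
# captured (here: values above -0.5), mimicking coverage bias in
# found data such as social media or sensor feeds.
biased = [x for x in population if x > -0.5]

# A small but properly randomized sample of the same population.
small_random = random.sample(population, 500)

bias_big = abs(statistics.fmean(biased) - true_mean)
bias_small = abs(statistics.fmean(small_random) - true_mean)

# The huge biased sample is off by roughly half a standard deviation;
# the 500-point random sample is far closer to the truth.
print(len(biased), round(bias_big, 2), round(bias_small, 2))
```

    The systematic error in the biased sample does not shrink as more data of the same kind is added, which is exactly the bias pitfall the abstract describes.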

  1. Hoe Creek groundwater restoration, 1989

    Energy Technology Data Exchange (ETDEWEB)

    Renk, R.R.; Crader, S.E.; Lindblom, S.R.; Covell, J.R.

    1990-01-01

    During the summer of 1989, approximately 6.5 million gallons of contaminated groundwater were pumped from 23 wells at the Hoe Creek underground coal gasification site, near Gillette, Wyoming. The organic contaminants were removed using activated carbon before the water was sprayed on 15.4 acres at the site. Approximately 2647 g (5.8 lb) of phenols and 10,714 g (23.6 lb) of benzene were removed from the site aquifers. Phenols, benzene, toluene, ethylbenzene, and naphthalene concentrations were measured in 43 wells. Benzene is the only contaminant at the site that exceeds the federal standard for drinking water (5 {mu}g/L). Benzene leaches into the groundwater and is slow to degrade biologically; therefore, the benzene concentration in the groundwater at the site has remained high. The pumping operation affected groundwater elevations across the entire 80-acre site. The water levels rebounded quickly when the pumping operation was stopped on October 1, 1989. Removing contaminated groundwater by pumping is not an effective way to clean up the site because the release of benzene from coal tars is slow and continuous. Benzene will continue to leach out of the tars for a long time unless its source is removed or the leaching rate is retarded through mitigation techniques. The application of the treated groundwater to the surface stimulated plant growth. No adverse effects were noted or recorded from some 60 soil samples taken from twenty locations in the spray field area. 20 refs., 52 figs., 8 tabs.
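    As a rough consistency check (our own back-of-the-envelope arithmetic, not from the report), the reported contaminant masses and pumped volume imply average concentrations in the treated water well above the 5 µg/L benzene drinking-water standard:

```python
# Average contaminant concentrations implied by the Hoe Creek figures:
# total mass removed divided by total volume of groundwater pumped.
GAL_TO_L = 3.785  # liters per US gallon

volume_l = 6.5e6 * GAL_TO_L   # ~6.5 million gallons pumped
phenols_ug = 2647 * 1e6       # 2647 g of phenols, in micrograms
benzene_ug = 10714 * 1e6      # 10,714 g of benzene, in micrograms

phenols_ugl = phenols_ug / volume_l  # average concentration, ug/L
benzene_ugl = benzene_ug / volume_l

print(round(phenols_ugl, 1), round(benzene_ugl, 1))  # → 107.6 435.5
```

    On average, then, the pumped water carried benzene at roughly 87 times the 5 µg/L federal standard, consistent with the report's emphasis on benzene as the contaminant of concern.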

  2. 78 FR 64003 - Notice of Availability of the Final Environmental Impact Statement for the Jump Creek, Succor...

    Science.gov (United States)

    2013-10-25

    Notice of Availability of the Final Environmental Impact Statement for the Jump Creek, Succor Creek, and Cow Creek Watersheds grazing permit renewal, published in the Federal Register. ADDRESSES: Copies of the Jump Creek, Succor Creek, and Cow Creek Watersheds ...

  3. 78 FR 26065 - Notice of Availability of the Draft Environmental Impact Statement for the Jump Creek, Succor...

    Science.gov (United States)

    2013-05-03

    Notice of Availability of the Draft Environmental Impact Statement for the Jump Creek, Succor Creek, and Cow Creek Watersheds Grazing Permit Renewal; written comments will be received on the Draft EIS for the Jump Creek, Succor Creek, and Cow Creek Watersheds ...

  4. Conceptual frameworks, geomorphic interpretation and storytelling: Tales from Lockyer Creek, Australia.

    Science.gov (United States)

    Croke, Jacky; Phillips, Jonathan; Van Dyke, Chris

    2017-04-01

    Earth science knowledge and insight begins with case studies, and theories should be derived from and ultimately evaluated against empirical, case study evidence. However, isolated case studies not linked conceptually to other locations or embedded within a broader framework are often of limited use beyond the study site. Geomorphic evidence and phenomena may be interpreted using a variety of conceptual frameworks (theories, models, laws, methodologies, etc.). The evidence may be, or at least appear to be, consistent with multiple frameworks, even when those constructs are derived from entirely different assumptions or frames of reference. Thus different interpretations and stories can be derived from the same evidence. Our purpose here is to illustrate this phenomenon via a case study from Lockyer Creek, southeast Queensland, Australia. Lockyer Creek is fast becoming one of Australia's most studied catchments, with a wealth of data emerging following two extreme flood events in 2011 and 2013. Whilst the initial objective of the Big Flood project was to provide information on the frequency and magnitude of these extreme events, in essence the project revealed a rich 'story' of river evolution and adjustment which at first glance did not appear to 'fit' many established conceptual frameworks and theories. This presentation tells the tale of Lockyer Creek as it relates to selected key conceptual frameworks and, importantly, how this information can then be used for more effective catchment and flood management.

  5. Benthic macroinvertebrate assemblages and sediment toxicity testing in the Ely Creek watershed restoration project

    International Nuclear Information System (INIS)

    Soucek, D.J.; Currie, R.J.; Cherry, D.S.; Latimer, H.A.

    1998-01-01

    The Ely Creek watershed in Lee County, Virginia, contains an abundance of abandoned mined land (AML) seeps that contaminate the majority of the creek and its confluence with Big Stone Creek. Contaminated sediments had high concentrations of iron (∼10,000 mg/kg), aluminum (∼1,500 mg/kg), magnesium (∼400 mg/kg) and manganese (∼150 mg/kg). Copper and zinc generally ranged from 3 to 20 mg/kg. Benthic macroinvertebrate surveys at six of 20 sites sampled in the watershed yielded no macroinvertebrates, while eight others had total abundances of 1 to 9 organisms. Four reference sites contained ≥100 organisms and at least 14 different taxa. Laboratory, 10-day survival/impairment sediment tests with Daphnia magna did not support the field data. Mortality of 92 to 100% for D. magna occurred in samples collected from six sites. Daphnid reproduction was more sensitive than laboratory test organism survivorship; however, neither daphnid survivorship nor reproduction were good predictors of taxa richness. Laboratory test concerns included the use of a reference diluent water rather than site-specific diluent water

  6. A Peek into 'Alamogordo Creek'

    Science.gov (United States)

    2006-01-01

    [Figures 1-3 removed for brevity, see original site] On its 825th Martian day (May 20, 2006), NASA's Mars Exploration Rover Opportunity stopped for the weekend to place its instrument arm onto the soil target pictured here, dubbed 'Alamogordo Creek.' Two views from the panoramic camera, acquired at about noon local solar time, are at the top. Below them is a close-up view from the microscopic imager. At upper left, a false-color view emphasizes differences among materials in rocks and soil. It combines images taken through the panoramic camera's 753-nanometer, 535-nanometer and 432-nanometer filters. At upper right is an approximately true-color rendering made with the panoramic camera's 600-nanometer, 535-nanometer and 480-nanometer filters. The microscopic-imager frame covers the area outlined by the white boxes in the panoramic-camera views, a rectangle 3 centimeters (1.2 inches) across. As Opportunity traverses to the south, it is analyzing soil and rocks along the way for differences from those seen earlier. At this site, the soil contains abundant small spherical fragments, thought to be hematite-rich concretions, plus finer-grained basaltic sand. Most of the spherical fragments seen in the microscopic image are smaller than those first seen at the rover's landing site in 'Eagle Crater,' some five kilometers (3.1 miles) to the north. However, a few larger spherical fragments and other rock fragments can also be seen in the panoramic-camera images.

  7. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general-purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
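    To illustrate the kind of rule BigDansing targets, here is a minimal sketch (our own, not BigDansing's actual API) of a declarative quality rule, the functional dependency zipcode → city, compiled into a single grouping pass: the group key corresponds to a distributed shuffle key, and the per-group check is the violation detector.

```python
from collections import defaultdict

# Toy table with one violation of the FD zipcode -> city
# (hypothetical data for illustration).
rows = [
    {"id": 1, "zipcode": "10001", "city": "New York"},
    {"id": 2, "zipcode": "10001", "city": "NYC"},      # conflicts with id 1
    {"id": 3, "zipcode": "60601", "city": "Chicago"},
]

def fd_violations(table, lhs, rhs):
    """Return groups of lhs values mapped to conflicting rhs values."""
    groups = defaultdict(set)
    for row in table:          # one shared scan over the data
        groups[row[lhs]].add(row[rhs])
    # a group with more than one distinct rhs value violates the FD
    return {k: sorted(v) for k, v in groups.items() if len(v) > 1}

print(fd_violations(rows, "zipcode", "city"))
# → {'10001': ['NYC', 'New York']}
```

    In a distributed setting the same logic becomes a group-by-key over the `lhs` attribute, which is why expressing the rule declaratively lets the system choose optimizations such as shared scans.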

  8. Elevation - LiDAR Survey Minnehaha Creek, MN Watershed

    Data.gov (United States)

    Army Corps of Engineers, Department of the Army, Department of Defense — LiDAR Bare-Earth Grid - Minnehaha Creek Watershed District. The Minnehaha Creek watershed is located primarily in Hennepin County, Minnesota. The watershed covers...

  9. Preliminary Chemical and Biological Assessment of Ogbe Creek ...

    African Journals Online (AJOL)

    USER

    The study was aimed at assessing the quality of water from the Ogbe Creek ... indicated the impact of the perturbational stress on the organisms inhabiting the creek. ... experiences seasonal flooding which introduces a lot of detritus and ...

  10. Plankton biodiversity of Dharamtar creek adjoining Mumbai harbour

    Digital Repository Service at National Institute of Oceanography (India)

    Tiwari, L.R.; Nair, V.R.

    rich plankton community. However, recent industrial development along the banks of the creek may pose a problem due to waste disposal into this creek system. Losses of marine life diversity are largely the result of conflicting uses, in particular...

  11. Steel Creek water quality: L-Lake/Steel Creek Biological Monitoring Program, November 1985--December 1991

    International Nuclear Information System (INIS)

    Bowers, J.A.; Kretchmer, D.W.; Chimney, M.J.

    1992-04-01

    The Savannah River Site (SRS) encompasses 300 sq mi of the Atlantic Coastal Plain in west-central South Carolina. The Savannah River forms the western boundary of the site. Five major tributaries of the Savannah River -- Upper Three Runs Creek, Four Mile Creek, Pen Branch, Steel Creek, and Lower Three Runs Creek -- drain the site. All but Upper Three Runs Creek receive, or in the past received, thermal effluents from nuclear production reactors. In 1985, L Lake, a 400-hectare cooling reservoir, was built on the upper reaches of Steel Creek to receive effluent from the restart of L-Reactor and to protect the lower reaches from thermal impacts. The Steel Creek Biological Monitoring Program was designed to meet environmental regulatory requirements associated with the restart of L-Reactor and complements the Biological Monitoring Program for L Lake. This extensive program was implemented to address portions of Section 316(a) of the Clean Water Act. The Department of Energy (DOE) must demonstrate that the operation of L-Reactor will not significantly alter the established aquatic ecosystems

  12. Streamflow conditions along Soldier Creek, Northeast Kansas

    Science.gov (United States)

    Juracek, Kyle E.

    2017-11-14

    The availability of adequate water to meet the present (2017) and future needs of humans, fish, and wildlife is a fundamental issue for the Prairie Band Potawatomi Nation in northeast Kansas. Because Soldier Creek flows through the Prairie Band Potawatomi Nation Reservation, it is an important tribal resource. An understanding of historical Soldier Creek streamflow conditions is required for the effective management of tribal water resources, including drought contingency planning. Historical data for six selected U.S. Geological Survey (USGS) streamgages along Soldier Creek were used in an assessment of streamflow characteristics and trends by Juracek (2017). Streamflow data for the period of record at each streamgage were used to compute annual mean streamflow, annual mean base flow, mean monthly flow, annual peak flow, and annual minimum flow. Results of the assessment are summarized in this fact sheet.
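    The flow statistics named above can be illustrated with a short sketch. This is a minimal illustration, not USGS code: the daily discharge values are hypothetical, and annual mean base flow is omitted because it requires a hydrograph-separation method beyond a few lines.

    ```python
    from statistics import mean

    # Hypothetical daily mean discharges for one year at one streamgage:
    # (month, discharge in cubic feet per second)
    daily = [(1, 12.0), (1, 8.0), (2, 30.0), (2, 22.0),
             (6, 95.0), (6, 140.0), (9, 4.0), (9, 3.0)]

    flows = [q for _, q in daily]

    # Annual mean streamflow: average of all daily values in the year.
    annual_mean = mean(flows)

    # Mean monthly flow: daily values averaged by month.
    by_month = {}
    for month, q in daily:
        by_month.setdefault(month, []).append(q)
    mean_monthly = {month: mean(qs) for month, qs in by_month.items()}

    # Annual peak flow and annual minimum flow: extremes of the daily record.
    annual_peak = max(flows)
    annual_min = min(flows)

    print(annual_mean, mean_monthly[6], annual_peak, annual_min)
    # 39.25 117.5 140.0 3.0
    ```

    A real assessment would use the full period-of-record daily series from NWIS rather than a handful of values, but the statistics reduce to exactly these aggregations.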

  13. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  14. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  15. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
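    The supervised/unsupervised distinction the review draws can be shown in a few lines. This is a sketch on hypothetical one-dimensional "expression" values, not code from the review or from any R package: supervised learning uses labels to fit a model (here, nearest-centroid classification), while unsupervised learning groups unlabeled values by structure alone (here, a single split around the overall mean, as a caricature of clustering).

    ```python
    from statistics import mean

    # Hypothetical labeled 1-D values.
    labeled = [(1.0, "low"), (1.2, "low"), (8.0, "high"), (8.5, "high")]

    # Supervised: labels are used to fit per-class centroids,
    # which then classify new, unseen points.
    groups = {}
    for value, label in labeled:
        groups.setdefault(label, []).append(value)
    centroids = {label: mean(vals) for label, vals in groups.items()}

    def classify(x):
        """Nearest-centroid classification."""
        return min(centroids, key=lambda label: abs(x - centroids[label]))

    # Unsupervised: labels are ignored; points are grouped by a
    # one-step split around the overall mean.
    values = [v for v, _ in labeled] + [4.9]
    cut = mean(values)
    clusters = {0: [v for v in values if v < cut],
                1: [v for v in values if v >= cut]}

    print(classify(7.5))  # high
    print(clusters)
    ```

    The contrast is the point: the classifier could not be built without the labels, while the clustering never looks at them.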

  17. Flood-inundation maps for Indian Creek and Tomahawk Creek, Johnson County, Kansas, 2014

    Science.gov (United States)

    Peters, Arin J.; Studley, Seth E.

    2016-01-25

    Digital flood-inundation maps for a 6.4-mile upper reach of Indian Creek from College Boulevard to the confluence with Tomahawk Creek, a 3.9-mile reach of Tomahawk Creek from 127th Street to the confluence with Indian Creek, and a 1.9-mile lower reach of Indian Creek from the confluence with Tomahawk Creek to just beyond the Kansas/Missouri border at State Line Road in Johnson County, Kansas, were created by the U.S. Geological Survey in cooperation with the city of Overland Park, Kansas. The flood-inundation maps, which can be accessed through the U.S. Geological Survey Flood Inundation Mapping Science Web site at http://water.usgs.gov/osw/flood_inundation/, depict estimates of the areal extent and depth of flooding corresponding to selected water levels (stages) at the U.S. Geological Survey streamgages on Indian Creek at Overland Park, Kansas; Indian Creek at State Line Road, Leawood, Kansas; and Tomahawk Creek near Overland Park, Kansas. Near real time stages at these streamgages may be obtained on the Web from the U.S. Geological Survey National Water Information System at http://waterdata.usgs.gov/nwis or the National Weather Service Advanced Hydrologic Prediction Service at http://water.weather.gov/ahps/, which also forecasts flood hydrographs at these sites.Flood profiles were computed for the stream reaches by means of a one-dimensional step-backwater model. The model was calibrated for each reach by using the most current stage-discharge relations at the streamgages. The hydraulic models were then used to determine 15 water-surface profiles for Indian Creek at Overland Park, Kansas; 17 water-surface profiles for Indian Creek at State Line Road, Leawood, Kansas; and 14 water-surface profiles for Tomahawk Creek near Overland Park, Kansas, for flood stages at 1-foot intervals referenced to the streamgage datum and ranging from bankfull to the next interval above the 0.2-percent annual exceedance probability flood level (500-year recurrence interval). The

  18. Blue Creek Winter Range: Wildlife Mitigation Project. Final environmental assessment

    International Nuclear Information System (INIS)

    1994-11-01

    Bonneville Power Administration (BPA) proposes to fund that portion of the Washington Wildlife Agreement pertaining to the Blue Creek Winter Range Wildlife Mitigation Project (Project) in a cooperative effort with the Spokane Tribe, Upper Columbia United Tribes, and the Bureau of Indian Affairs (BIA). If fully implemented, the proposed action would allow the sponsors to protect and enhance 2,631 habitat units of big game winter range and riparian shrub habitat on 2,185 hectares (5,400 acres) of Spokane Tribal trust lands, and to conduct long term wildlife management activities within the Spokane Indian Reservation project area. This Final Environmental Assessment (EA) examines the potential environmental effects of securing land and conducting wildlife habitat enhancement and long term management activities within the boundaries of the Spokane Indian Reservation. Four proposed activities (habitat protection, habitat enhancement, operation and maintenance, and monitoring and evaluation) are analyzed. The proposed action is intended to meet the need for mitigation of wildlife and wildlife habitat adversely affected by the construction of Grand Coulee Dam and its reservoir

  19. CREEK Project's Phytoplankton Pigment Monitoring Database for Eight Creeks in the North Inlet Estuary, South Carolina: 1997-1999

    Data.gov (United States)

    Baruch Institute for Marine and Coastal Sciences, Univ of South Carolina — The CREEK Project began in January of 1996 and was designed to help determine the role of oysters, Crassostrea virginica, in tidal creeks of the North Inlet Estuary,...

  20. 77 FR 5201 - Drawbridge Operation Regulation; Bear Creek, Dundalk, MD

    Science.gov (United States)

    2012-02-02

    ...-AA09 Drawbridge Operation Regulation; Bear Creek, Dundalk, MD AGENCY: Coast Guard, DHS. ACTION: Notice... operation of the Baltimore County highway bridge at Wise Avenue across Bear Creek, mile 3.4, between Dundalk... Avenue across Bear Creek, mile 3.4 between Dundalk and Sparrows Point, MD. This change would require the...

  1. Effects of Abandoned Coal-Mine Drainage on Streamflow and Water Quality in the Mahanoy Creek Basin, Schuylkill, Columbia, and Northumberland Counties, Pennsylvania, 2001

    Science.gov (United States)

    Cravotta, Charles A.

    2004-01-01

    This report assesses the contaminant loading, effects to receiving streams, and possible remedial alternatives for abandoned mine drainage (AMD) within the Mahanoy Creek Basin in east-central Pennsylvania. The Mahanoy Creek Basin encompasses an area of 157 square miles (407 square kilometers) including approximately 42 square miles (109 square kilometers) underlain by the Western Middle Anthracite Field. As a result of more than 150 years of anthracite mining in the basin, ground water, surface water, and streambed sediments have been adversely affected. Leakage from streams to underground mines and elevated concentrations (above background levels) of acidity, metals, and sulfate in the AMD from flooded underground mines and (or) unreclaimed culm (waste rock) degrade the aquatic ecosystem and impair uses of the main stem of Mahanoy Creek from its headwaters to its mouth on the Susquehanna River. Various tributaries also are affected, including North Mahanoy Creek, Waste House Run, Shenandoah Creek, Zerbe Run, and two unnamed tributaries locally called Big Mine Run and Big Run. The Little Mahanoy Creek and Schwaben Creek are the only major tributaries not affected by mining. To assess the current hydrological and chemical characteristics of the AMD and its effect on receiving streams, and to identify possible remedial alternatives, the U.S. Geological Survey (USGS) began a study in 2001, in cooperation with the Pennsylvania Department of Environmental Protection and the Schuylkill Conservation District. Aquatic ecological surveys were conducted by the USGS at five stream sites during low base-flow conditions in October 2001. Twenty species of fish were identified in Schwaben Creek near Red Cross, which drains an unmined area of 22.7 square miles (58.8 square kilometers) in the lower part of the Mahanoy Creek Basin. In contrast, 14 species of fish were identified in Mahanoy Creek near its mouth at Kneass, below Schwaben Creek. The diversity and abundance of fish

  2. Sedimentation Study and Flume Investigation, Mission Creek, Santa Barbara, California; Corte Madera Creek, Marin County, California

    National Research Council Canada - National Science Library

    Copeland, Ronald

    2000-01-01

    .... An existing concrete-lined flood control channel on Corte Madera Creek in Marin County, California lacks a debris basin at its upstream terminus and carries significant bed load through a supercritical flow reach...

  3. CREEK Project's Oyster Biomass Database for Eight Creeks in the North Inlet Estuary, South Carolina

    Data.gov (United States)

    Baruch Institute for Marine and Coastal Sciences, Univ of South Carolina — A group of eight tidal creeks dominated by oysters, Crassostrea virginica, in North Inlet Estuary, South Carolina, USA were studied using a replicated BACI (Before -...

  4. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  5. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  6. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  7. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  8. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  9. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  10. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  11. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they make the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, big bang and big rip singularities are shown to take place twice, one on the positive branch of time and the other on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place in distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions

  12. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  13. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  14. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  15. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  16. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
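    The sorted-arrays-plus-bit-array idea behind inequality joins can be sketched for a self-join with two inequality conditions. This is a simplified illustration on hypothetical rows, not the dissertation's IEJoin: the real algorithm adds permutation and offset arrays and space-efficient bit operations, whereas this sketch scans the bit array directly. Rows are visited in descending-cost order, so when row j is reached, every set bit at a smaller time rank marks a row i satisfying both conditions.

    ```python
    # Hypothetical relation: (id, time, cost); values assumed distinct.
    # Goal: all pairs (i, j) with time[i] < time[j] and cost[i] > cost[j].
    rows = [("r0", 100, 6), ("r1", 140, 11), ("r2", 80, 10), ("r3", 90, 5)]

    # Sorted array 1: rank of each row in ascending-time order.
    by_time = sorted(range(len(rows)), key=lambda i: rows[i][1])
    time_rank = {i: r for r, i in enumerate(by_time)}

    # Sorted array 2: row indices in descending-cost order.
    by_cost_desc = sorted(range(len(rows)), key=lambda i: -rows[i][2])

    # Bit array over time ranks; a set bit marks an already-visited row,
    # i.e. one whose cost exceeds that of every row visited after it.
    bits = [0] * len(rows)

    pairs = []
    for j in by_cost_desc:
        p = time_rank[j]
        # Any set bit at a smaller time rank is a row i with
        # time[i] < time[j] and cost[i] > cost[j].
        for q in range(p):
            if bits[q]:
                pairs.append((rows[by_time[q]][0], rows[j][0]))
        bits[p] = 1

    print(sorted(pairs))  # [('r2', 'r0'), ('r2', 'r3')]
    ```

    The inner scan makes this quadratic in the worst case; the published algorithm replaces it with offset arrays and compact bit counting, but the two sorted orders and the visited-bit array are the same core structures.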

  17. Drywell corrosion stopped at Oyster Creek

    International Nuclear Information System (INIS)

    Lipford, B.L.; Flynn, J.C.

    1993-01-01

    This article describes the detection of corrosion on the drywell containment vessel of Oyster Creek Nuclear Plant and the application of a protective coating to repair the drywell. The topics of the article include drywell design features, identification of the problem, initial action, drywell corrosion, failure of cathodic protection, long-term repair, and repair results

  18. Geology of the Teakettle Creek watersheds

    Science.gov (United States)

    Robert S. LaMotte

    1937-01-01

    The Teakettle Creek Experimental Watersheds lie for the most part on quartzites of probable Triassic age. However one of the triplicate drainages has a considerable acreage developed on weathered granodiorite. Topography is relatively uniform and lends itself to triplicate watershed studies. Locations for dams are suitable if certain engineering precautions...

  19. Tidal mixing in Dahej creek waters

    Digital Repository Service at National Institute of Oceanography (India)

    Swamy, G.N.; Sarma, R.V.

    Mixing characteristics of a tidal inlet near Dahej at the mouth of Narmada River, Gujarat, India are examined in terms of tides, currents and bathymetry. The dilution potential of the Dahej Creek waters during a tidal march for a given rate...

  20. Species status of Mill Creek Elliptio

    Energy Technology Data Exchange (ETDEWEB)

    Davis, G.M. [Academy of Natural Sciences (United States)]; Mulvey, M. [Savannah River Ecology Lab., Aiken, SC (United States)]

    1993-12-31

    This report discusses environmental effects of the Savannah River Plant on aquatic populations in Mill Creek and surrounding tributaries. Of particular concern was the status of Elliptio. Genetics and phenotypic characteristics have shown that the current classification system is not adequate for these populations. The appendices characterize genetic variability at different loci, electrophoretic data, allele frequencies, sympatric species, and anatomical characters.

  1. UTILIZING CREEKS FOR INTEGRATED RURAL COASTAL ...

    African Journals Online (AJOL)

    Osondu

    2013-02-09

    Feb 9, 2013 ... This study examines the Utilization of Creeks for Integrated Coastal Development of Ilaje ... utilization, poor fishing techniques, poor sources of water and navigation routes, and manual ... Ethiopian Journal of Environmental Studies and Management Vol. 6 No.3 .... together, implement, monitor and evaluate.

  2. Collaborative monitoring in Walnut Creek, California

    Science.gov (United States)

    Heidi Ballard; Ralph Kraetsch; Lynn Huntsinger

    2002-01-01

    In 1995 and 2000, a monitoring program was designed and implemented to track oak regeneration and native grass populations in target management areas in the four Open Space Preserves of the City of Walnut Creek, California. The program resulted from a collaboration of scientists at the University of California, Berkeley, a group of interested citizens known as the...

  3. Pine Creek Ranch, FY 2001 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Mark E.

    2001-11-01

    Pine Creek Ranch was purchased in 1999 by the Confederated Tribes of Warm Springs using Bonneville Power Administration Fish and Wildlife Habitat Mitigation funds. The 25,000 acre property will be managed in perpetuity for the benefit of fish and wildlife habitat. Major issues include: (1) Restoring quality spawning and rearing habitat for steelhead. Streams are incised and fish passage barriers exist from culverts and possibly beaver dams. In addition to steelhead habitat, the Tribes are interested in overall riparian recovery in the John Day River system for wildlife habitat, watershed values and other values such as recreation. (2) Future grazing for specific management purposes. Past grazing practices undoubtedly contributed to current unacceptable conditions. The main stem of Pine Creek has already been enrolled in the CREP program administered by the USDA Natural Resources Conservation Service, in part because of the cost-share for vegetation restoration in a buffer portion of old fields and in part because of rental fees that will help the Tribes to pay the property taxes. Grazing is not allowed in the riparian buffer for the term of the contract. (3) Noxious weeds are a major concern. (4) Encroachment by western juniper throughout the watershed is a potential concern for the hydrology of the creek. Mark Berry, Habitat Manager for the Pine Creek Ranch, requested the Team to address the following objectives: (1) Introduce some of the field staff and others to Proper Functioning Condition (PFC) assessments and concepts. (2) Do a PFC assessment on approximately 10 miles of Pine Creek. (3) Offer management recommendations. (4) Provide guidelines for monitoring.

  4. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  5. Pipeline corridors through wetlands - impacts on plant communities: Bayou Grand Cane, De Soto Parish, Louisiana. Topical report, August 1991--July 1993

    Energy Technology Data Exchange (ETDEWEB)

    Shem, L.M.; Zimmerman, R.E.; Hayes, D. [Argonne National Lab., IL (United States)]; Van Dyke, G.D. [Argonne National Lab., IL (United States)]|[Trinity Christian College, Palos Heights, IL (United States)]

    1994-12-01

    The goal of the Gas Research Institute Wetland Corridors Program is to document impacts of existing pipelines on the wetlands they traverse. To accomplish this goal, 12 existing wetland crossings were surveyed. These sites varied in elapsed time since pipeline construction, wetland type, pipeline installation techniques, and right-of-way (ROW) management practices. This report presents the results of a survey conducted over the period of August 12-13, 1991, at the Bayou Grand Cane crossing in De Soto Parish, Louisiana, where a pipeline constructed three years prior to the survey crosses the bayou through mature bottomland hardwoods. The site was not seeded or fertilized after construction activities. At the time of sampling, a dense herb stratum (composed of mostly native species) covered the 20-m-wide ROW, except within drainage channels. As a result of the creation of the ROW, new habitat was created, plant diversity increased, and forest habitat became fragmented. The ROW must be maintained at an early stage of succession to allow access to the pipeline; however, impacts to the wetland were minimized by decreasing the width of the ROW to 20 m and recreating the drainage channels across the ROW. The canopy trees on the ROW's edge shaded part of the ROW, which helped to minimize the effects of the ROW.

  6. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using traditional data-processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, prevent diseases, combat crime, and so on, we require larger data sets than before. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. This paper presents an overview of the Hadoop architecture, the different tools used for big data, and its security issues.

  7. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  8. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects......This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  9. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  10. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  11. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, neglecting how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  12. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  13. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  14. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then moves from fundamentals to applications, from the simple to the complex. The language stays simple, everyday, and readable throughout. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  15. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then moves from fundamentals to applications, from the simple to the complex. The language stays simple, everyday, and readable throughout. Volume 8 presents, in an accessible way, relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  16. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then moves from fundamentals to applications, from the simple to the complex. The language stays simple, everyday, and readable throughout. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  17. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then moves from fundamentals to applications, from the simple to the complex. The language stays simple, everyday, and readable throughout. Volume 7 covers, alongside an introduction, many current aspects of quantum mechanics (e.g. teleportation) and electrodynamics (e.g. electrosmog), as well as the climate problem and chaos theory.

  18. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generica...

  19. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges that we have only just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics, some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths, weaknesses, and risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  20. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 x 10^12 /cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed to a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed to a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects the two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  1. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities for diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patients' care processes and of individual patients' behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. PMID:25910540

  2. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  3. Summer food habits and trophic overlap of roundtail chub and creek chub in Muddy Creek, Wyoming

    Science.gov (United States)

    Quist, M.C.; Bower, M.R.; Hubert, W.A.

    2006-01-01

    Native fishes of the Upper Colorado River Basin have experienced substantial declines in abundance and distribution, and are extirpated from most of Wyoming. Muddy Creek, in south-central Wyoming (Little Snake River watershed), contains sympatric populations of native roundtail chub (Gila robusta), bluehead sucker (Catostomus discobolus), and flannelmouth sucker (C. latipinnis), and represents an area of high conservation concern because it is the only area known to have sympatric populations of all 3 species in Wyoming. However, introduced creek chub (Semotilus atromaculatus) are abundant and might have a negative influence on native fishes. We assessed summer food habits of roundtail chub and creek chub to provide information on the ecology of each species and obtain insight on potential trophic overlap. Roundtail chub and creek chub seemed to be opportunistic generalists that consumed a diverse array of food items. Stomach contents of both species were dominated by plant material, aquatic and terrestrial insects, and fishes, but also included gastropods and mussels. Stomach contents were similar between species, indicating high trophic overlap. No length-related patterns in diet were observed for either species. These results suggest that creek chubs have the potential to adversely influence the roundtail chub population through competition for food and the native fish assemblage through predation.

  4. The macroinvertebrates of Magela Creek, Northern Territory

    International Nuclear Information System (INIS)

    Marchant, R.

    1982-04-01

    The littoral zones of five permanent billabongs in Magela Creek were sampled monthly for macroinvertebrates. The greatest numbers of taxa and individuals were caught in the late wet season and early dry season in the shallow billabongs; in the deep billabongs, seasonal variations were not so marked. These changes appeared to be associated with the development of macrophytes, which offered food and shelter to the invertebrate fauna. The dominant groups were the Chironomidae, Oligochaeta and Ephemeroptera. The seasonal patterns of the catches were sufficiently consistent that future samples can be compared with these initial ones with some confidence that any changes are real. This work is part of a larger study into the biota and water quality of Magela Creek designed to provide data on aquatic communities before mining of the Ranger uranium deposit starts

  5. Mathematical modelling of flooding at Magela Creek

    International Nuclear Information System (INIS)

    Vardavas, I.

    1989-01-01

    The extent and frequency of the flooding at Magela Creek can be predicted from a mathematical/computer model describing the hydrological phases of surface runoff. Surface runoff involves complex water transfer processes over very inhomogeneous terrain. A simple mathematical model of these processes has been developed which includes the interception of rainfall by the plant canopy, evapotranspiration, infiltration of surface water into the soil, the storage of water in surface depressions, and overland and subsurface water flow. The rainfall-runoff model has then been incorporated into a more complex computer model to predict the amount of water that enters and leaves the Magela Creek flood plain, downstream of the mine. 2 figs., ills
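    The chain of hydrological phases named in this abstract (canopy interception, evapotranspiration, infiltration, depression storage, overland flow) can be illustrated with a toy daily water-balance step. This is a hedged sketch of that general model structure, not the Vardavas model itself; all parameter names and default values are assumed for illustration.

```python
def daily_overland_flow(rain_mm,
                        canopy_capacity_mm=1.5,
                        et_mm=5.0,
                        infiltration_capacity_mm=20.0,
                        depression_storage_mm=2.0):
    """Toy daily water balance following the phases in the abstract:
    canopy interception -> evapotranspiration -> infiltration ->
    depression storage -> overland flow. All parameters illustrative."""
    reaching_ground = max(0.0, rain_mm - canopy_capacity_mm)          # canopy fills first
    after_et = max(0.0, reaching_ground - et_mm)                      # evapotranspiration loss
    surface_excess = max(0.0, after_et - infiltration_capacity_mm)    # soil takes its share
    overland = max(0.0, surface_excess - depression_storage_mm)       # depressions fill, rest flows
    return overland

# A 60 mm wet-season storm under the default parameters:
print(daily_overland_flow(60.0))  # 31.5 mm of overland flow
```

    A light 3 mm shower produces no runoff at all under these defaults, which is the qualitative behavior such threshold-style bucket models are built to capture.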

  6. Clean Coal Power at Toms Creek

    International Nuclear Information System (INIS)

    Schmid, M.R.

    1993-01-01

    On October 20, 1992, the US Department of Energy (DOE), through the Morgantown Energy Technology Center, entered into Cooperative Agreement DE-FC-21-93MC92444 with TAMCO Power Partners to implement the Toms Creek Integrated Gasification Combined-Cycle Demonstration Project. The process design is proceeding as scheduled, and a draft Environmental Information Volume has been produced. The overall project schedule, however, may have to be adjusted when the Power Sales Agreement has been finalized

  7. Final Environmental Assessment, Horse Creek Bridge Replacement

    Science.gov (United States)

    2010-10-01

    existing bridge pipes that have failed and replace the failed structure with a new, prefabricated pedestrian bridge within the original bridge footprint...vehicles, nor designed for support of standard passenger vehicle loads. The bridge would be a single prefabricated unit consisting of a steel grate...placed on new concrete abutments built on the existing foundations on the creek banks, and put in place by a crane operating from the vehicle parking

  8. Steel Creek primary producers: Periphyton and seston, L-Lake/Steel Creek Biological Monitoring Program, January 1986--December 1991

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, J.A. [Westinghouse Savannah River Co., Aiken, SC (United States); Toole, M.A.; van Duyn, Y. [Normandeau Associates Inc., New Ellenton, SC (United States)

    1992-02-01

    The Savannah River Site (SRS) encompasses 300 sq mi of the Atlantic Coastal Plain in west-central South Carolina. Five major tributaries of the Savannah River -- Upper Three Runs Creek, Four Mile Creek, Pen Branch, Steel Creek, and Lower Three Runs Creek -- drain the site. In 1985, L Lake, a 400-hectare cooling reservoir, was built on the upper reaches of Steel Creek to receive effluent from the restart of L-Reactor and to protect the lower reaches from thermal impacts. The Steel Creek Biological Monitoring Program was designed to assess various components of the system and identify any changes due to the operation of L-Reactor or discharge from L Lake. An intensive ecological assessment program prior to the construction of the lake provided baseline data with which to compare data accumulated after the lake was filled and began discharging into the creek. The Department of Energy must demonstrate that the operation of L-Reactor will not significantly alter the established aquatic ecosystems. This report summarizes the results of six years' data from Steel Creek under the L-Lake/Steel Creek Monitoring Program. L Lake is discussed separately from Steel Creek in Volumes NAI-SR-138 through NAI-SR-143.

  9. Steel Creek primary producers: Periphyton and seston, L-Lake/Steel Creek Biological Monitoring Program, January 1986--December 1991

    International Nuclear Information System (INIS)

    Bowers, J.A.; Toole, M.A.; van Duyn, Y.

    1992-02-01

    The Savannah River Site (SRS) encompasses 300 sq mi of the Atlantic Coastal Plain in west-central South Carolina. Five major tributaries of the Savannah River -- Upper Three Runs Creek, Four Mile Creek, Pen Branch, Steel Creek, and Lower Three Runs Creek -- drain the site. In 1985, L Lake, a 400-hectare cooling reservoir, was built on the upper reaches of Steel Creek to receive effluent from the restart of L-Reactor and to protect the lower reaches from thermal impacts. The Steel Creek Biological Monitoring Program was designed to assess various components of the system and identify any changes due to the operation of L-Reactor or discharge from L Lake. An intensive ecological assessment program prior to the construction of the lake provided baseline data with which to compare data accumulated after the lake was filled and began discharging into the creek. The Department of Energy must demonstrate that the operation of L-Reactor will not significantly alter the established aquatic ecosystems. This report summarizes the results of six years' data from Steel Creek under the L-Lake/Steel Creek Monitoring Program. L Lake is discussed separately from Steel Creek in Volumes NAI-SR-138 through NAI-SR-143

  10. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research, and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education and critically explores the perils of applying a data-driven approach to education. Despite the claimed value of the...

  12. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  13. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Finally, the methodology and working of a system that will use this data are briefly described.

  14. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data refers to data beyond the storage capacity and processing power of conventional systems. The term is used for data sets so large or complex that traditional data-management tools cannot handle them. The size threshold for big data is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; sources such as social networking sites see the amount of data produced by people growing rapidly every year. Big data is not only data; it has become a complete subject encompassing various tools, techniques and frameworks, and it spans both structured and unstructured data. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex, and of massive scale. Such data are difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. Big data environments are used to capture, organize and analyse these various types of data. In this paper we describe the applications, problems and tools of big data and give an overview of the field.

  15. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, aspects of the general theory of relativity such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  16. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through an online learning environment, the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas, brainstorming solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
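    The measurement described in this abstract reduces to simple geometry: the shadow angle at local noon gives the sun's angle from vertical, and the difference between two sites on roughly the same meridian equals their difference in latitude, so the north-south distance scales up to the full circumference. A minimal sketch of that arithmetic (the stick heights, shadow lengths, and site separation below are made-up example values, not project data):

```python
import math

def shadow_angle(stick_height_m, shadow_length_m):
    """Angle of the sun from vertical at local noon, in degrees."""
    return math.degrees(math.atan(shadow_length_m / stick_height_m))

def earth_circumference_km(angle_a_deg, angle_b_deg, north_south_distance_km):
    """Eratosthenes' proportion: the angle difference is to 360 degrees
    as the surface distance is to the circumference."""
    delta = abs(angle_a_deg - angle_b_deg)
    return 360.0 / delta * north_south_distance_km

# Example: a 1 m stick casts a 0.1763 m shadow at one site (about 10
# degrees) and a 0.2679 m shadow (about 15 degrees) at a site 555 km
# due south; the result comes out near the true ~40,000 km.
a = shadow_angle(1.0, 0.1763)
b = shadow_angle(1.0, 0.2679)
print(earth_circumference_km(a, b, 555.0))
```

    The same two-measurement comparison is what the collaborating classrooms perform with their shared data.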

  17. The Patroon Creek Contamination Migration Investigation

    International Nuclear Information System (INIS)

    Dufek, K.; Zafran, A.; Moore, J.T.

    2006-01-01

    Shaw performed a Site Investigation (SI) for sediment within the Unnamed Tributary of the Patroon Creek, a section of the Patroon Creek, and the Three Mile Reservoir as part of the overall contract with the United States Army Corps of Engineers (USACE) to remediate the Colonie Formerly Utilized Sites Remedial Action Program (FUSRAP) Site. The Unnamed Tributary formerly flowed through the former Patroon Lake, which was located on the main site property and was used as a landfill for radiological and chemical wastes. The objective of the investigation was to determine the absence/presence of radioactive contamination within the three Areas of Concern (AOC). In order to accomplish this objective, Shaw assembled a team to produce a Technical Memorandum that provided an in-depth understanding of the environmental conditions related to the Patroon Creek. Upon completion and analysis of the Technical Memorandum, a Conceptual Site Model (CSM) was constructed and a Technical Planning Program (TPP) was held to develop a Sediment Investigation Work Plan and Sediment Investigation Sampling and Analysis Plan. A total of 32 sample locations were analyzed using on-site direct gamma scans with a Pancake Geiger-Mueller (PGM) instrument for screening purposes and samples were analyzed at on-site and off-site laboratories. The highest interval from each core scan was selected for on-site analysis utilizing a High Purity Germanium (HPGe) detector. Eight of these samples were sent off-site for gamma/alpha spectroscopy confirmation. The data collected during the SI indicated that the U-238 cleanup criterion was exceeded in sediment samples collected from two locations within the Unnamed Tributary but not in downstream sections of Patroon Creek or Three Mile Reservoir. Future actions for impacted sediment in the Unnamed Tributary will be further evaluated. Concentrations of U-238 and Th-232 in all other off-site sediment samples collected from the Unnamed Tributary, Patroon Creek, and

  18. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  19. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

    Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  20. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems, etc. all use piles of data which are further used for creating reports in order to ensure continuity of the services that they have to offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  1. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data rather than at the data itself, realizing that there may be only a few generic processes involved, each one imprinting its very specific structures on the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data and inspired by how the human cortex is known to approach the problem.

  2. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  3. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  4. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  5. 75 FR 16728 - Beaver Creek Landscape Management Project, Ashland Ranger District, Custer National Forest...

    Science.gov (United States)

    2010-04-02

    ... DEPARTMENT OF AGRICULTURE Forest Service Beaver Creek Landscape Management Project, Ashland Ranger... manner that increases resiliency of the Beaver Creek Landscape Management Project area ecosystem to... requirements to require. The Beaver Creek Landscape Management Project includes treatments previously proposed...

  6. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hardware and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  7. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, ³He, ⁴He, and ⁷Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and ⁴He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, ⁷Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed.

  8. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and to everyday observations like cardiac flow sensors or Twitter feeds. Data science methods are emerging to manage and gain insights from big data and to ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  9. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and a search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  10. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it also comes with all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  11. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external conditions and competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized, and more concentrated. Big business will be a dominant development in Danish agriculture - but not the only one...

  12. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information, and used to make 'smart' decisions.

  13. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launches the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, at a major public event. Poster and programme.

  14. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  15. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges), which ultimately seems to be of greater importance than the sheer data volume.

  16. The natural channel of Brandywine Creek, Pennsylvania

    Science.gov (United States)

    Wolman, M.G.

    1955-01-01

    This study of the channel of Brandywine Creek, Pennsylvania, consists of three parts. The first is an analysis of the changes which take place in the width, depth, velocity, slope of the water surface, suspended load, and roughness factor with changing discharge below the bankfull stage at each of several widely separated cross sections of the channel. Expressed as functions of the discharge, it is found that the variables behave systematically. In every section studied, as the discharge increases, the velocity increases to about the 0.6 power, depth to the 0.4, and load to the 2.0 power of the discharge. The roughness decreases to the 0.2 power of the discharge. The relative magnitudes and the direction of these variations are similar to those which have been observed in other rivers in the United States, primarily in the West. Some modifications of the hypotheses applicable to the western rivers are probably required because on Brandywine Creek the difference between the materials on the bed and in the banks is considerably greater than it is on most of the western rivers studied. In the second part of the paper the progressive changes of the same variables in the downstream direction with increasing discharge at a given frequency are described. Despite the disorderly appearance of the stream, it is found that the variables display a progressive, orderly change in the downstream direction when traced from the headwater tributaries through the trunk stream of Brandywine Creek. At a given frequency of flow, width increases with discharge to about the 0.5 power. Depth increases downstream somewhat less rapidly, while the slope and roughness both decrease in the downstream direction. Despite a decrease in the size of the material on the bed, both the mean velocity and the mean bed velocity increase downstream. The rates of change of these variables are in close accord with the changes observed on rivers flowing in alluvium and in stable irrigation canals. These
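The at-a-station relations summarized above are power laws of discharge (velocity ~ Q^0.6, depth ~ Q^0.4, load ~ Q^2.0), so such exponents can be recovered from gauge records by a least-squares fit in log-log space. A minimal sketch, with synthetic discharge/velocity pairs standing in for actual Brandywine Creek measurements (the coefficient and values are hypothetical):

```python
import numpy as np

def hydraulic_exponent(discharge, variable):
    """Fit variable = c * discharge**b by least squares on
    log-transformed values and return the exponent b."""
    b, _log_c = np.polyfit(np.log(discharge), np.log(variable), 1)
    return b

# Synthetic at-a-station data generated with the 0.6 exponent reported
# for velocity; the coefficient 0.8 and discharges are illustrative.
q = np.array([5.0, 12.0, 30.0, 75.0, 160.0])  # discharge
v = 0.8 * q ** 0.6                            # mean velocity

print(round(hydraulic_exponent(q, v), 3))     # → 0.6
```

The same fit applied to depth, load, or roughness series would yield the other exponents quoted in the abstract.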

  17. Subsurface geology of the Cold Creek syncline

    International Nuclear Information System (INIS)

    Meyers, C.W.; Price, S.M.

    1981-07-01

    Bedrock beneath the Hanford Site is being evaluated by the Basalt Waste Isolation Project (BWIP) for possible use by the US Department of Energy as a geologic repository for nuclear waste storage. Initial BWIP geologic and hydrologic studies served to determine that the central Hanford Site contains basalt flows with thick, dense interiors that have low porosities and permeabilities. Furthermore, within the Cold Creek syncline, these flows appear to be nearly flat lying across areas in excess of tens of square kilometers. Such flows have been identified as potential repository host rock candidates. The Umtanum flow, which lies from 900 to 1150 m beneath the surface, is currently considered the leading host rock candidate. Within the west-central Cold Creek syncline, a 47-km² area designated as the reference repository location (RRL) is currently considered the leading candidate site. The specific purpose of this report is to present current knowledge of stratigraphic, lithologic, and structural factors that directly relate to the suitability of the Umtanum flow within the Cold Creek syncline for use as a nuclear waste repository host rock. The BWIP geologic studies have concentrated on factors that might influence groundwater transport of radionuclides from this flow. These factors include: (1) intraflow structures within the interiors of individual lava flows, (2) interflow zones and flow fronts between adjacent lava flows, and (3) bedrock structures. Data have been obtained primarily through coring and geophysical logging of deep boreholes; petrographic, paleomagnetic, and chemical analyses; seismic-reflection, gravity, and magnetic (ground and multilevel airborne) surveys; and surface mapping. Results included in this document comprise baseline data which will be utilized to prepare a Site Characterization Report as specified by the US Nuclear Regulatory Commission

  18. Bear Creek Project. Final environmental statement

    International Nuclear Information System (INIS)

    1977-06-01

    The Bear Creek Project consists of certain mining and milling operations involving uranium ore deposits located in Converse County, Wyoming. Mining of uranium from nine known ore bodies will take place over a period of ten years (estimated); a mill with a nominal capacity of 1000 tons per day of ore will be constructed and operated as long as ore is available. The waste material (tailings) from the mill, also produced at a rate of about 1000 tons per day, will be stored onsite in an impoundment. Environmental impacts and adverse effects are summarized

  19. The big data telescope

    International Nuclear Information System (INIS)

    Finkel, Elizabeth

    2017-01-01

    On a flat, red mulga plain in the outback of Western Australia, preparations are under way to build the most audacious telescope astronomers have ever dreamed of - the Square Kilometre Array (SKA). Next-generation telescopes usually aim to double the performance of their predecessors. The Australian arm of SKA will deliver a 168-fold leap on the best technology available today, to show us the universe as never before. It will tune into signals emitted just a million years after the Big Bang, when the universe was a sea of hydrogen gas, slowly percolating with the first galaxies. Their starlight illuminated the fledgling universe in what is referred to as the “cosmic dawn”.

  20. The Big Optical Array

    International Nuclear Information System (INIS)

    Mozurkewich, D.; Johnston, K.J.; Simon, R.S.

    1990-01-01

    This paper describes the design and the capabilities of the Naval Research Laboratory Big Optical Array (BOA), an interferometric optical array for high-resolution imaging of stars, stellar systems, and other celestial objects. There are four important differences between the BOA design and the design of the Mark III Optical Interferometer on Mount Wilson (California). These include a long passive delay line which will be used in BOA to do most of the delay compensation, so that the fast delay line will have a very short travel; the beam combination in BOA will be done in triplets, to allow measurement of closure phase; the same light will be used for both star and fringe tracking; and the fringe tracker will use several wavelength channels

  1. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  2. Nonstandard big bang models

    International Nuclear Information System (INIS)

    Calvao, M.O.; Lima, J.A.S.

    1989-01-01

    The usual FRW hot big-bang cosmologies have been generalized by considering the equation of state ρ = Anm + (γ-1)⁻¹p, where m is the rest mass of the fluid particles and A is a dimensionless constant. Explicit analytic solutions are given for the flat case (ε = 0). For large cosmological times these extended models behave as the standard Einstein-de Sitter universes regardless of the values of A and γ. Unlike the usual flat FRW case, the deceleration parameter q is a time-dependent function, and its present value, q ≅ 1, obtained from the luminosity distance versus redshift relation, may be fitted by taking, for instance, A = 1 and γ = 5/3 (monatomic relativistic gas with >> k_BT). In all cases the universe cools obeying the same temperature law as the FRW models, and it is shown that the age of the universe is only slightly modified. (author) [pt

  3. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  4. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  5. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  6. 75 FR 8036 - Monitor-Hot Creek Rangeland Project

    Science.gov (United States)

    2010-02-23

    ... DEPARTMENT OF AGRICULTURE Forest Service Monitor-Hot Creek Rangeland Project AGENCY: Forest... Rangeland Project area. The analysis will determine if a change in management direction for livestock grazing is needed to move existing resource conditions within the Monitor-Hot Creek Rangeland Project area...

  7. 75 FR 57766 - Ryckman Creek Resources, LLC; Notice of Petition

    Science.gov (United States)

    2010-09-22

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. CP10-498-000] Ryckman Creek Resources, LLC; Notice of Petition September 15, 2010. Take notice that on September 3, 2010, Ryckman Creek..., a petition for an Exemption of Temporary Acts and Operations and Request for Expedited Approval...

  8. 33 CFR 117.1001 - Cat Point Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Cat Point Creek. 117.1001 Section 117.1001 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Virginia § 117.1001 Cat Point Creek. The draw of the...

  9. 33 CFR 117.800 - Mill Neck Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Mill Neck Creek. 117.800 Section 117.800 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements New York § 117.800 Mill Neck Creek. The draw of the...

  10. 33 CFR 117.705 - Beaver Dam Creek.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Beaver Dam Creek. 117.705 Section 117.705 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements New Jersey § 117.705 Beaver Dam Creek. The draw of the...

  11. Hydrology of Bishop Creek, California: An Isotopic Analysis

    Science.gov (United States)

    Michael L. Space; John W. Hess; Stanley D. Smith

    1989-01-01

    Five power generation plants along an eleven kilometer stretch divert Bishop Creek water for hydro-electric power. Stream diversion may be adversely affecting the riparian vegetation. Stable isotopic analysis is employed to determine surface water/ground-water interactions along the creek. Surface water originates primarily from three headwater lakes. Discharge into...

  12. 78 FR 76750 - Drawbridge Operation Regulation; Chambers Creek, Steilacoom, WA

    Science.gov (United States)

    2013-12-19

    ... operating schedule that governs the Burlington Northern Santa Fe (BNSF) Chambers Creek Railway Bridge across... performing lift bridge maintenance and upgrades for the BNSF Chambers Creek Railway Bridge across Chambers... maintenance and upgrade items to this vertical lift bridge in support of Positive Train Control requirements...

  13. DPF Big One

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark

  14. DPF Big One

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1993-01-15

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  15. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  16. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  17. Hoe Creek 1990 quarterly sampling cumulative report

    Energy Technology Data Exchange (ETDEWEB)

    Crader, S.E.; Huntington, G.S.

    1991-03-01

    Groundwater samples were collected and analyzed for benzene and for total phenols three times during 1990. This report summarizes the results of these sampling events and compares the results with those obtained in previous years. Possible further options for remediation of the Hoe Creek site were addressed. Three underground coal gasification (UCG) burns were performed by Lawrence Livermore National Laboratory for the US Department of Energy in 1976, 1977, and 1979 at the Hoe Creek site, which is about 20 miles south of Gillette, Wyoming. As a result of these burns, there has been considerable contamination of groundwater by various organic compounds. There have been three efforts at remediating this situation. In 1986 and again in 1987, contaminated water was pumped out, treated, and reinjected. In 1989, the water was pumped, treated, and sprayed into the atmosphere. Benzene and total phenols have been monitored at various monitoring wells at the site during 1990. The highest detected benzene concentration in 1990 was 220 µg/L, and the highest total phenols concentration was 430 µg/L. It is apparent that contamination is still above baseline levels, although the concentration of total phenols is far less than immediately after the burns. The burned coal seams are still releasing organic compounds into the groundwater that passes through them.

  18. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  19. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute Engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  20. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample scored higher on empathy and on the Big Five factors, with the exception of the Neuroticism factor. They found associations between empathy and the domains of Openness, Agreeableness, Conscientiousness, and Extraversion. In our data, women score significantly higher on both empathy and the Big Five...

  1. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  2. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article analyzes ‘Big Data’, an issue that has been discussed over the last 10 years. The reasons and factors behind the issue are identified. It is shown that the factors creating the ‘Big Data’ issue have existed for quite a long time and have periodically caused informational barriers. Such barriers were successfully overcome through science and technology. The analysis presented here treats the “Big Data” issue as a form of information barrier. Framed this way, the issue can be solved correctly and encourages the development of scientific and computational methods.

  3. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  4. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  5. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  6. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things, and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move, and how is it analyzed? All these questions will be answered during the talk.

  7. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  8. Sources of baseflow for the Minnehaha Creek Watershed, Minnesota, US

    Science.gov (United States)

    Nieber, J. L.; Moore, T. L.; Gulliver, J. S.; Magner, J. A.; Lahti, L. B.

    2013-12-01

    Minnehaha Creek is among the most valued surface water features in the Minneapolis, MN metro area, with a waterfall as it enters the Minnehaha Creek park. Flow in Minnehaha Creek is heavily dependent on discharge from the stream's origin, Lake Minnetonka, the outlet of which is closed during drought periods to maintain water elevations in the lake, resulting in low- (or no-) flow conditions in the creek. Stormwater runoff entering directly to the creek from the creek's largely urbanized watershed exacerbates extremes in flow conditions. Given the cultural and ecological value of this stream system, there is great interest in enhancing the cultural and ecosystem services provided by Minnehaha Creek through improvements in streamflow regime by reducing flashiness and sustaining increased low-flows. Determining the potential for achieving improvements in flow requires first that the current sources of water contributing to low-flows in the creek be identified and quantified. Work on this source identification has involved a number of different approaches, including analyses of the streamflow record using a hydrologic system model framework, examination of the Quaternary and bedrock geology of the region, estimation of groundwater-surface water exchange rates within the channel using hyporheic zone temperature surveys and flux meter measurements, and analyses of the stable isotopes of oxygen and hydrogen in samples of stream water, groundwater, and rainfall. Analysis of baseflow recessions using the method of Brutsaert and Nieber (1977) indicates that only a small portion of the catchment, probably the riparian zone, contributes to baseflows. This result appears to be supported by the observation that the limestone/shale bedrock layer underlying the surficial aquifer has a non-zero permeability, and in a significant portion of the watershed the layer has been eroded away, leaving the surficial aquifer 'bottomless' and highly susceptible to vertical (down) water loss
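The Brutsaert and Nieber (1977) recession analysis cited above fits the decline of streamflow to −dQ/dt = aQ^b on log-log axes. A minimal sketch under the assumption of evenly spaced daily discharge data; the function name and the synthetic series are illustrative, not from the study:

```python
import math

def recession_parameters(q):
    """Fit -dQ/dt = a * Q^b on log axes, Brutsaert-Nieber style.

    q: daily discharge series restricted to a recession (declining)
    period. Returns (a, b) from ordinary least squares on
    log(-dQ/dt) versus log(Q).
    """
    xs, ys = [], []
    for q0, q1 in zip(q, q[1:]):
        dq = q0 - q1                        # -dQ/dt with dt = 1 day
        if dq <= 0 or q1 <= 0:
            continue                        # keep only declining, positive flows
        xs.append(math.log((q0 + q1) / 2))  # midpoint discharge
        ys.append(math.log(dq))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic recession obeying -dQ/dt = 0.1 * Q exactly (so b should be 1)
q = [10.0 * math.exp(-0.1 * t) for t in range(30)]
a, b = recession_parameters(q)
```

For a linear-reservoir recession such as this synthetic series, the fitted exponent b recovers 1 and a approximates the recession constant 0.1; real daily data require the same positivity filtering shown in the loop.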

  9. Simulation of Water Quality in the Tull Creek and West Neck Creek Watersheds, Currituck Sound Basin, North Carolina and Virginia

    Science.gov (United States)

    Garcia, Ana Maria

    2009-01-01

    A study of the Currituck Sound was initiated in 2005 to evaluate the water chemistry of the Sound and assess the effectiveness of management strategies. As part of this study, the Soil and Water Assessment Tool (SWAT) model was used to simulate current sediment and nutrient loadings for two distinct watersheds in the Currituck Sound basin and to determine the consequences of different water-quality management scenarios. The watersheds studied were (1) Tull Creek watershed, which has extensive row-crop cultivation and artificial drainage, and (2) West Neck Creek watershed, which drains urban areas in and around Virginia Beach, Virginia. The model simulated monthly streamflows with Nash-Sutcliffe model efficiency coefficients of 0.83 and 0.76 for Tull Creek and West Neck Creek, respectively. The daily sediment concentration coefficient of determination was 0.19 for Tull Creek and 0.36 for West Neck Creek. The coefficient of determination for total nitrogen was 0.26 for both watersheds and for dissolved phosphorus was 0.4 for Tull Creek and 0.03 for West Neck Creek. The model was used to estimate current (2006-2007) sediment and nutrient yields for the two watersheds. Total suspended-solids yield was 56 percent lower in the urban watershed than in the agricultural watershed. Total nitrogen export was 45 percent lower, and total phosphorus was 43 percent lower in the urban watershed than in the agricultural watershed. A management scenario with filter strips bordering the main channels was simulated for Tull Creek. The Soil and Water Assessment Tool model estimated a total suspended-solids yield reduction of 54 percent and total nitrogen and total phosphorus reductions of 21 percent and 29 percent, respectively, for the Tull Creek watershed.
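The Nash-Sutcliffe model efficiency coefficients reported above (0.83 and 0.76 for monthly streamflow) are computed as one minus the ratio of squared model error to the variance sum of the observations. A minimal sketch; the function name and sample flows are illustrative, not data from the study:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency.

    1.0 is a perfect fit; 0.0 means the model predicts no better than
    the mean of the observations; negative values are worse than that.
    """
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_obs = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_obs

# Illustrative monthly streamflows (not data from the study)
obs = [12.0, 30.5, 22.1, 8.4, 5.2, 3.9]
sim = [10.5, 28.0, 25.0, 9.1, 4.8, 4.4]
nse = nash_sutcliffe(obs, sim)
```

Because the denominator is the variability of the observations themselves, the coefficient penalizes a model most during high-flow months, which is one reason daily sediment-concentration efficiencies in the study are much lower than the monthly streamflow ones.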

  10. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of three subspaces: Monte Carlo design, method, and measure. The Monte Carlo design describes the data-generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on the basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to
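The bootstrap methods referenced above construct confidence intervals by resampling the data. A minimal percentile-bootstrap sketch; the estimator, sample values, and parameters are illustrative, not from the cited book:

```python
import random

def bootstrap_ci(data, estimator, level=0.95, n_boot=2000, seed=42):
    """Percentile bootstrap confidence interval for an arbitrary estimator."""
    rng = random.Random(seed)
    # Recompute the estimator on n_boot resamples drawn with replacement
    stats = sorted(
        estimator([rng.choice(data) for _ in data])
        for _ in range(n_boot)
    )
    lo = stats[int((1 - level) / 2 * n_boot)]
    hi = stats[int((1 + level) / 2 * n_boot) - 1]
    return lo, hi

mean = lambda xs: sum(xs) / len(xs)
sample = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 13.2, 14.7]
low, high = bootstrap_ci(sample, mean)
```

The percentile interval makes no distributional assumptions, which is what makes bootstrap methods attractive for proxy and archive data whose error structure is poorly known; more refined variants (BCa, block bootstrap for autocorrelated series) follow the same resampling pattern.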

  11. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 billion and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just

  12. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 billion and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in

  13. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  14. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  15. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  16. Sonar atlas of caverns comprising the U.S. Strategic Petroleum Reserve. Volume 2, Big Hill Site, Texas.

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, Christopher Arthur; Lord, Anna Snider

    2007-08-01

    Downhole sonar surveys from the four active U.S. Strategic Petroleum Reserve sites have been modeled and used to generate a four-volume sonar atlas, showing the three-dimensional geometry of each cavern. This volume 2 focuses on the Big Hill SPR site, located in southeastern Texas. Volumes 1, 3, and 4, respectively, present images for the Bayou Choctaw SPR site, Louisiana, the Bryan Mound SPR site, Texas, and the West Hackberry SPR site, Louisiana. The atlas uses a consistent presentation format throughout. The basic geometric measurements provided by the down-cavern surveys have also been used to generate a number of geometric attributes, the values of which have been mapped onto the geometric form of each cavern using a color-shading scheme. The intent of the various geometrical attributes is to highlight deviations of the cavern shape from the idealized cylindrical form of a carefully leached underground storage cavern in salt. The atlas format does not allow interpretation of such geometric deviations and anomalies. However, significant geometric anomalies, not directly related to the leaching history of the cavern, may provide insight into the internal structure of the relevant salt dome.

  17. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  18. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially wide-spread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
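The abstract's point that multiple testing forces evaluation of extreme tail probabilities can be made concrete: controlling family-wise error at level α across m tests via Bonferroni requires per-test tails of order α/m, which for genomic-scale m lies far out in the tail where the approximations studied above degrade. A sketch, assuming a standard-normal test statistic (numbers illustrative):

```python
import math

def normal_two_sided_tail(z):
    """P(|Z| >= z) for a standard normal Z, via the complementary error function."""
    return math.erfc(z / math.sqrt(2))

def bonferroni_z_threshold(alpha, m):
    """Critical |z| giving family-wise error alpha over m tests (bisection)."""
    target = alpha / m
    lo, hi = 0.0, 40.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if normal_two_sided_tail(mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# With one million tests, a 5% family-wise error rate requires per-test
# tail probabilities near 5e-8, i.e. thresholds of more than five sigma.
z = bonferroni_z_threshold(0.05, 1_000_000)
```

At tails this small, even modest relative errors in the approximated sampling distribution translate into large multiplicative errors in the actual per-test error rate, which is exactly the failure mode the Edgeworth-expansion analysis quantifies.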

  19. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  20. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8)×10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  1. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  2. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  3. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  4. 75 FR 3195 - Ochoco National Forest, Lookout Mountain Ranger District; Oregon; Mill Creek; Allotment...

    Science.gov (United States)

    2010-01-20

    ...; Oregon; Mill Creek; Allotment Management Plans EIS AGENCY: Forest Service, USDA. ACTION: Notice of intent... allotments on the Lookout Mountain Ranger District. These four allotments are: Cox, Craig, Mill Creek, and..., Mill Creek and Old Dry Creek allotments. The responsible official will also decide how to mitigate...

  5. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features affect the paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
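The spurious-correlation challenge named above is easy to reproduce: with many independent noise predictors and a modest sample size, the maximum sample correlation with an unrelated response is far from zero. A sketch (dimensions and seed are illustrative):

```python
import random

def max_spurious_correlation(n=50, p=5000, seed=1):
    """Max |corr| between a response and p independent noise predictors."""
    rng = random.Random(seed)
    y = [rng.gauss(0, 1) for _ in range(n)]
    my = sum(y) / n
    yc = [v - my for v in y]
    sy = sum(v * v for v in yc) ** 0.5
    best = 0.0
    for _ in range(p):
        x = [rng.gauss(0, 1) for _ in range(n)]
        mx = sum(x) / n
        xc = [v - mx for v in x]
        sx = sum(v * v for v in xc) ** 0.5
        r = sum(a * b for a, b in zip(xc, yc)) / (sx * sy)
        best = max(best, abs(r))
    return best

# Every predictor is pure noise, yet the best correlation is substantial.
best = max_spurious_correlation()
```

With n = 50 observations the null correlations have standard deviation of roughly 1/√(n−1) ≈ 0.14, so the maximum over 5000 draws is typically above 0.5: variable selection on such data will confidently pick predictors that carry no signal, which is the phenomenon the article warns about.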

  6. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data and examines the peculiarities of applying the proposed model's components. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  7. Bereavement rituals in the Muscogee Creek tribe.

    Science.gov (United States)

    Walker, Andrea C; Balk, David E

    2007-08-01

    A qualitative, collective case study explores bereavement rituals in the Muscogee Creek tribe. Data from interviews with 27 participants, all adult members of the tribe, revealed consensus on participation in certain bereavement rituals. Common rituals included: (a) conducting a wake service the night before burial; (b) never leaving the body alone before burial; (c) enclosing personal items and food in the casket; (d) digging graves by hand; (e) each individual throwing a handful of dirt into the grave before covering, called giving a "farewell handshake"; (f) covering the grave completely by hand; (g) building a house over the grave; (h) waiting 4 days before burial; (i) using medicine/purification; and (j) adhering to a socialized mourning period. Cultural values of family, community, religion, the importance of the number 4, Indian medicine, and the meaning of death contributed to the development of these rituals.

  8. Bear Creek Project. Draft environmental statement

    International Nuclear Information System (INIS)

    1977-01-01

    The Bear Creek Project consists of mining and milling operations involving uranium ore deposits located in Converse County, Wyoming. Mining of uranium from six known ore bodies will take place over ten years; a mill processing 1,000 tons of ore per day will be constructed and operated as long as ore is available. The tailings will be stored onsite in an impoundment. The project would convert 2700 acres from grazing use to mining/milling activities for about ten years. Mining would disturb a total of 1600 acres but, because of reclamation, the maximum acreage disturbed at any one time would be about 1000 acres, the average being about 650 acres. Dose rates were computed for an individual in a ranch house at the nearest ranch. Conditions for the protection of the environment are proposed. Possible environmental impacts evaluated cover air, land, water, soil, vegetation, wildlife, and community. A benefit-cost analysis is made

  9. Hydrogen sulfide concentration in Beaver Dam Creek

    International Nuclear Information System (INIS)

    Kiser, D.L.

    1979-01-01

    Concentration-time profiles calculated with LODIPS for various hypothetical releases of hydrogen sulfide from the heavy water extraction facility predict lethal conditions for swamp fish from releases as small as 568 kg discharged over a period of 30 minutes or from releases of 1818 kg discharged over a period of 6 hours or less. The necessary volatilization and oxidation coefficients for LODIPS were derived from field measurements following planned releases of H2S. Upsets in the operation of the wastewater strippers in the Girdler-Sulfide (GS) heavy water extraction facility in D Area have released significant amounts of dissolved H2S to Beaver Dam Creek. Because H2S is toxic to fish in concentrations as low as 1 mg/liter, the downstream environmental impact of H2S releases from D Area was evaluated
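
    LODIPS itself is a site-specific code and is not reproduced here, but the volatilization and oxidation losses it parameterizes are first-order processes, which can be sketched as follows. All coefficient values below are hypothetical, not those derived in the report:

```python
import math

def downstream_concentration(c0_mg_l, k_vol_per_hr, k_ox_per_hr,
                             velocity_km_hr, distance_km):
    """First-order decay of dissolved H2S during downstream transport:
    C(x) = C0 * exp(-(k_vol + k_ox) * x / u), where x / u is travel time."""
    travel_time_hr = distance_km / velocity_km_hr
    return c0_mg_l * math.exp(-(k_vol_per_hr + k_ox_per_hr) * travel_time_hr)

# Hypothetical scenario: 10 mg/L at the outfall, combined first-order loss
# of 0.5 per hour (0.3 volatilization + 0.2 oxidation), 2 km/hr stream flow.
for x in (0.0, 2.0, 4.0, 8.0):
    c = downstream_concentration(10.0, 0.3, 0.2, 2.0, x)
    flag = "  <- above the ~1 mg/L toxicity level" if c >= 1.0 else ""
    print(f"{x:4.1f} km: {c:6.3f} mg/L{flag}")
```

    With these assumed coefficients the concentration remains above the 1 mg/liter toxicity level for several kilometers, which is why the release size and duration matter in the report's lethality predictions.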

  10. Ground water in Creek County, Oklahoma

    Science.gov (United States)

    Cady, Richard Carlysle

    1937-01-01

    Creek County has been designated as a problem area by the Land Use Planning Section of the Resettlement Administration. Some of the earliest oil fields to be brought into production were situated in and near this county, and new fields have been opened from time to time during the ensuing years. The production of the newer fields, however, has not kept pace with the exhaustion of the older fields, and the county now presents an excellent picture of the problems involved in adjusting a population to lands that are nearly depleted of their mineral wealth. Values of land have been greatly depressed; tax collection is far in arrears; tenancy is widespread; and in addition more people will apparently be forced to depend on the income from agriculture than the land seems capable of supporting. The county as a whole is at best indifferently suitable for general farming. The Land Use Planning Section proposes to study the present and seemingly imminent maladjustments of population to the resources of the land, and make recommendations for their correction. The writer was detailed to the Land Use Planning Section of Region VIII for the purpose of making studies of ground water problems in the region. In Creek County two investigations were made. In September, 1936, the writer spent about ten days investigating the availability of ground water for the irrigation of garden crops during droughts. If it proved feasible to do this generally throughout the county, the Land Use Planning Section might be able to encourage this practice. The second investigation made by the writer was in regard to the extent to which ground water supplies have been damaged by oil well brines. He was in the county for four days late in January 1937, and again in March, 1937. During part of the second field trip he was accompanied by R.M. Dixon, sanitary engineer of the Water Utilization Unit of the Resettlement Administration. (available as photostat copy only)

  11. Tidal flow characteristics at Kasheli (Kalwa/ Bassein creek), Bombay

    Digital Repository Service at National Institute of Oceanography (India)

    Swamy, G.N.; Suryanarayana, A.

    Tidal flow characteristics of waters at Kasheli, connected to the sea through Thane and Bassein Creeks in Bombay, Maharashtra, India are investigated based on tide and current observations carried out in 1980-81. The results establish that the tidal...

  12. Ecology of phytoplankton from Dharmatar Creek, west coast of India

    Digital Repository Service at National Institute of Oceanography (India)

    Tiwari, L.R.; Nair, V.R.

    Phytoplankton pigment, cell count and species diversity were studied at five locations in Dharamtar Creek during September 1984 to November 1985. Chemical parameters indicated a healthy system free of any environmental stress. The water...

  13. Missing link between the Hayward and Rodgers Creek faults.

    Science.gov (United States)

    Watt, Janet; Ponce, David; Parsons, Tom; Hart, Patrick

    2016-10-01

    The next major earthquake to strike the ~7 million residents of the San Francisco Bay Area will most likely result from rupture of the Hayward or Rodgers Creek faults. Until now, the relationship between these two faults beneath San Pablo Bay has been a mystery. Detailed subsurface imaging provides definitive evidence of active faulting along the Hayward fault as it traverses San Pablo Bay and bends ~10° to the right toward the Rodgers Creek fault. Integrated geophysical interpretation and kinematic modeling show that the Hayward and Rodgers Creek faults are directly connected at the surface, a geometric relationship that has significant implications for earthquake dynamics and seismic hazard. A direct link enables simultaneous rupture of the Hayward and Rodgers Creek faults, a scenario that could result in a major earthquake (M = 7.4) that would cause extensive damage and loss of life with global economic impact.

  14. Zooplankton composition in Dharamtar creek adjoining Bombay harbour

    Digital Repository Service at National Institute of Oceanography (India)

    Tiwari, L.R.; Nair, V.R.

    bedoti was the true inhabitant. In general zooplankton production indicated 1.5 fold increase towards the upper reaches of the creek where salinity variations were drastic. A more diversified faunal assemblage of oceanic and neritic species characterised...

  15. Water quality of the Swatara Creek Basin, PA

    Science.gov (United States)

    McCarren, Edward F.; Wark, J.W.; George, J.R.

    1964-01-01

    The Swatara Creek of the Susquehanna River Basin is the farthest downstream sub-basin that drains acid water (pH of 4.5 or less) from anthracite coal mines. The Swatara Creek drainage area includes 567 square miles of parts of Schuylkill, Berks, Lebanon, and Dauphin Counties in Pennsylvania.To learn what environmental factors and dissolved constituents in water were influencing the quality of Swatara Creek, a reconnaissance of the basin was begun during the summer of 1958. Most of the surface streams and the wells adjacent to the principal tributaries of the Creek were sampled for chemical analysis. Effluents from aquifers underlying the basin were chemically analyzed because ground water is the basic source of supply to surface streams in the Swatara Creek basin. When there is little runoff during droughts, ground water has a dominating influence on the quality of surface water. Field tests showed that all ground water in the basin was non-acidic. However, several streams were acidic. Sources of acidity in these streams were traced to the overflow of impounded water in unworked coal mines.Acidic mine effluents and washings from coal breakers were detected downstream in Swatara Creek as far as Harper Tavern, although the pH at Harper Tavern infrequently went below 6.0. Suspended-sediment sampling at this location showed the mean daily concentration ranged from 2 to 500 ppm. The concentration of suspended sediment is influenced by runoff and land use, and at Harper Tavern it consisted of natural sediments and coal wastes. The average daily suspended-sediment discharge there during the period May 8 to September 30, 1959, was 109 tons per day, and the computed annual suspended-sediment load, 450 tons per square mile. Only moderate treatment would be required to restore the quality of Swatara Creek at Harper Tavern for many uses. 
Above Ravine, however, the quality of the Creek is generally acidic and, therefore, of limited usefulness to public supplies, industries and

  16. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  17. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  18. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  19. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relate to its size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  20. Results of the 2000 Creek Plantation Swamp Survey

    International Nuclear Information System (INIS)

    Fledderman, P.D.

    2000-01-01

    This report presents a survey of the Creek Plantation, which lies along the Savannah River and borders the southeast portion of the Savannah River Site. The land is primarily undeveloped and agricultural and is used for equestrian-related operations. A portion of Creek Plantation along the Savannah River is a low-lying swamp, known as the Savannah River Swamp, which is uninhabited and not easily accessible.

  1. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  2. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  3. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  4. A mangrove creek restoration plan utilizing hydraulic modeling.

    Science.gov (United States)

    Marois, Darryl E; Mitsch, William J

    2017-11-01

    Despite the valuable ecosystem services provided by mangrove ecosystems they remain threatened around the globe. Urban development has been a primary cause for mangrove destruction and deterioration in south Florida USA for the last several decades. As a result, the restoration of mangrove forests has become an important topic of research. Using field sampling and remote-sensing we assessed the past and present hydrologic conditions of a mangrove creek and its connected mangrove forest and brackish marsh systems located on the coast of Naples Bay in southwest Florida. We concluded that the hydrology of these connected systems had been significantly altered from its natural state due to urban development. We propose here a mangrove creek restoration plan that would extend the existing creek channel 1.1 km inland through the adjacent mangrove forest and up to an adjacent brackish marsh. We then tested the hydrologic implications using a hydraulic model of the mangrove creek calibrated with tidal data from Naples Bay and water levels measured within the creek. The calibrated model was then used to simulate the resulting hydrology of our proposed restoration plan. Simulation results showed that the proposed creek extension would restore a twice-daily flooding regime to a majority of the adjacent mangrove forest and that there would still be minimal tidal influence on the brackish marsh area, keeping its salinity at an acceptable level. This study demonstrates the utility of combining field data and hydraulic modeling to aid in the design of mangrove restoration plans.
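
    The flooding-regime result can be illustrated with a toy version of such a simulation. The sketch below uses a pure sinusoidal semidiurnal (M2) tide and hypothetical elevations; the study itself used a hydraulic model calibrated with measured tides and creek water levels:

```python
import math

def flooding_events_per_day(ground_elev_m, tide_amp_m=0.5, tide_period_hr=12.42,
                            days=10, dt_hr=0.05):
    """Count distinct flooding intervals per day for ground at a given
    elevation (m above mean sea level) under a sinusoidal M2 tide."""
    steps = int(days * 24 / dt_hr)
    events = 0
    wet = False
    for i in range(steps):
        t = i * dt_hr
        level = tide_amp_m * math.sin(2 * math.pi * t / tide_period_hr)
        if level > ground_elev_m and not wet:
            events += 1          # start of a new flooding interval
        wet = level > ground_elev_m
    return events / days

# A low-lying point floods on essentially every semidiurnal tide,
# while ground above the high-tide mark never floods.
print(flooding_events_per_day(0.0))
print(flooding_events_per_day(0.6))
```

    Restoration questions of the kind the paper addresses then reduce to how a proposed channel change shifts each point's effective tidal forcing, and hence its flooding frequency.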

  5. Surface-water resources of Polecat Creek basin, Oklahoma

    Science.gov (United States)

    Laine, L.L.

    1956-01-01

    A compilation of basic data on surface waters in Polecat Creek basin is presented on a monthly basis for Heyburn Reservoir and for Polecat Creek at Heyburn, Okla. Chemical analyses are shown for five sites in the basin. Correlation of runoff records with those for nearby basins indicates that the average annual runoff of the basin above the gaging station at Heyburn is 325 acre-feet per square mile. Estimated duration curves of daily flow indicate that under natural conditions there would be no flow in Polecat Creek at Heyburn (drainage area, 129 square miles) about 16 percent of the time on average, and that the flow would be less than 3 cubic feet per second half of the time. As there is no significant base flow in the basin, comparable low flows during dry-weather periods may be expected in other parts of the basin. During drought periods Heyburn Reservoir does not sustain a dependable low-water flow in Polecat Creek. Except for possible re-use of the small sewage effluent from the city of Sapulpa, dependable supplies for additional water needs on the main stem will require development of supplemental storage. There has been no regular program for collection of chemical quality data in the basin, but miscellaneous analyses indicate water of suitable quality for municipal and agricultural uses in Heyburn Reservoir and Polecat Creek near Heyburn. One recent chemical analysis indicates the possibility of a salt pollution problem in the Creek near Sapulpa. (available as photostat copy only)
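
    Figures such as "no flow about 16 percent of the time" and "less than 3 cubic feet per second half of the time" come from an empirical flow-duration (exceedance) analysis, which can be sketched as follows. The daily flow record below is synthetic, not the Heyburn gage record:

```python
import numpy as np

def exceedance_percent(daily_flows_cfs, threshold_cfs):
    """Percent of days on which daily mean flow exceeds the threshold."""
    flows = np.asarray(daily_flows_cfs, dtype=float)
    return 100.0 * np.mean(flows > threshold_cfs)

# Synthetic record: roughly 16% zero-flow days, the rest low skewed flows.
rng = np.random.default_rng(1)
flows = rng.lognormal(mean=1.0, sigma=1.2, size=1000)
flows[rng.random(1000) < 0.16] = 0.0

print(f"no flow:        {100 - exceedance_percent(flows, 0.0):.1f}% of days")
print(f"below 3 cfs:    {100 - exceedance_percent(flows, 3.0):.1f}% of days")
```

    Plotting `exceedance_percent` against a range of thresholds gives the duration curve itself; reading it at 0 cfs and 3 cfs reproduces the two statistics quoted in the abstract.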

  6. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. With regard to the big data characteristics, some directions for using suitable and promising open-source distributed data-processing software platforms are given.

  7. 33 CFR 207.170d - Taylor Creek, navigation lock (S-193) across the entrance to Taylor Creek at Lake Okeechobee...

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Taylor Creek, navigation lock (S-193) across the entrance to Taylor Creek at Lake Okeechobee, Okeechobee, Fla.; use, administration..., DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE NAVIGATION REGULATIONS § 207.170d Taylor Creek, navigation lock...

  8. CREEK Project's Nekton Database for Eight Creeks in the North Inlet Estuary, South Carolina: 1997-1998.

    Data.gov (United States)

    Baruch Institute for Marine and Coastal Sciences, Univ of South Carolina — A group of eight intertidal creeks with high densities of oysters, Crassostrea virginica, in North Inlet Estuary, South Carolina, USA were studied using a replicated...

  9. CREEK Project's Microzooplankton Seasonal Monitoring Database for Eight Creeks in the North Inlet Estuary, South Carolina: 1997-1999

    Data.gov (United States)

    Baruch Institute for Marine and Coastal Sciences, Univ of South Carolina — A group of eight intertidal creeks with high densities of oysters, Crassostrea virginica, in North Inlet Estuary, South Carolina, USA were studied using a replicated...

  10. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  11. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  12. Puente Willow Creek en Monterrey, California

    Directory of Open Access Journals (Sweden)

    Editorial, Equipo

    1965-09-01

    Of the 10 awards given every year by the Prestressed Concrete Institute for the most outstanding prestressed concrete projects, two have been awarded in California this year, one of them to the Willow Creek bridge, near Monterey. The prestressed, double-T girders of this bridge were made at a workshop a great distance from the bridge site. They are 24 m long and 1.35 m high, and are stabilized by transversal diaphragms 20 cm in thickness. The deck slab is of reinforced concrete, 8.85 m wide and 20 cm thick. The structure is straightforward and slender, and adapts itself pleasantly to the background. It has seven spans and crosses over a secondary road, in addition to bridging the Willow stream. The supporting piles are hollow, of rectangular cross section, and over them a cross beam carries the five girders and the deck itself. The end abutments consist of vertical reinforced concrete walls and supporting, soil-filled structures. The above information was supplied by the California Road Department.

  13. Tulane/Xavier Center for Bioenvironmental Research; project: hazardous materials in aquatic environments; subproject: biomarkers and risk assessment in Bayou Trepagnier, LA

    International Nuclear Information System (INIS)

    Ide, C.

    1996-01-01

    Tulane and Xavier Universities have singled out the environment as a major strategic focus for research and training for now and beyond the year 2000. The Tulane/Xavier Center for Bioenvironmental Research (CBR) was established in 1989 as the umbrella organization to coordinate environmental research at both universities. CBR projects funded by the DOE under the Hazardous Materials in Aquatic Environments grant are defining the following: (1) the complex interactions that occur during the transport of contaminants through wetlands environments, (2) the actual and potential impact of contaminants on ecological systems and health, (3) the mechanisms and new technologies through which these impacts might be remediated, and (4) new programs aimed at educating and training environmental workers of the future. The subproject described in this report, 'Biomarkers and Risk Assessment in Bayou Trepagnier, LA', is particularly relevant to the US Department of Energy's Environmental Restoration and Waste Management program, which is aimed at solving problems related to hazard monitoring and clean-up prioritization at sites with aquatic pollution problems in the DOE complex

  14. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  15. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term, unfamiliar to many. The research is motivated by the scarcity of prior work on the topic, the complete absence of Finnish-language research, and the potentially essential role that social customer relationship management may play in companies' operations in the future. Studies of big data often concentrate on its technical side rather than on its applicat...

  16. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is fascinating but problematic with respect to causality, atheism, and stereotypes about hunter-gatherers.

  17. Big Data and Social Media

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    A critical analysis of the "keep everything" Big Data era, the impact on our lives of the information, at first glance "convenient for future use" that we make known about ourselves on the network. NB! The lecture will be recorded like all Academic Training lectures. Lecturer's biography: Father of the Internet, see https://internethalloffame.org/inductees/vint-cerf or https://en.wikipedia.org/wiki/Vint_Cerf The video on slide number 9 is from page https://www.gapminder.org/tools/#$state$time$value=2018&value;;&chart-type=bubbles   Keywords: Big Data, Internet, History, Applications, tools, privacy, technology, preservation, surveillance, google, Arpanet, CERN, Web  

  18. Baryon symmetric big bang cosmology

    International Nuclear Information System (INIS)

    Stecker, F.W.

    1978-01-01

    It is stated that the framework of baryon symmetric big bang (BSBB) cosmology offers our greatest potential for deducing the evolution of the Universe because its physical laws and processes involve the minimum number of arbitrary assumptions about initial conditions in the big bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the Universe and how galaxies and galaxy clusters are formed. BSBB cosmology also provides the only acceptable explanation at present for the origin of the cosmic γ-ray background radiation. (author)

  19. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  20. Small quarks make big nuggets

    International Nuclear Information System (INIS)

    Deligeorges, S.

    1985-01-01

    After a brief review of the classification of subatomic particles, this paper deals with quark nuggets: particles with more than three quarks, up to a big bag called a ''nuclearite''. Neutron stars, in fact, are big sacks of quarks, gigantic nuggets. Physicists are now trying to calculate which type of nugget of strange quark matter is stable, and what influence quark nuggets have had on primordial nucleosynthesis. At present, one can say that if these ''nuggets'' exist, and in a large proportion, they may be candidates for the missing mass. [fr]

  1. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information created in recent years. New conclusions can be drawn and new services can be developed by connecting, processing, and analysing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, effective use of these opportunities has several preconditions: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions toward their solution are also presented.

  2. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  3. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  4. Big Cities, Big Problems: Reason for the Elderly to Move?

    NARCIS (Netherlands)

    Fokkema, T.; de Jong-Gierveld, J.; Nijkamp, P.

    1996-01-01

    In many European countries, data on geographical patterns of internal elderly migration show that the elderly (55+) are more likely to leave than to move to the big cities. Besides emphasising the attractive features of the destination areas (pull factors), it is often assumed that this negative

  5. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  6. Regional geology of the Pine Creek Geosyncline

    International Nuclear Information System (INIS)

    Needham, R.S.; Crick, I.H.; Stuart-Smith, P.G.

    1980-01-01

    The Pine Creek Geosyncline comprises about 14 km of chronostratigraphic mainly pelitic and psammitic Lower Proterozoic sediments with interlayered tuff units, resting on granitic late Archaean complexes exposed as three small domes. Sedimentation took place in one basin, and most stratigraphic units are represented throughout the basin. The sediments were regionally deformed and metamorphosed at 1800 Ma. Tightly folded greenschist facies strata in the centre grade into isoclinally deformed amphibolite facies metamorphics in the west and northeast. Pre- and post-orogenic continental tholeiites, and post-orogenic granite diapirs, intrude the Lower Proterozoic metasediments, and the granites are surrounded by hornfels zones up to 10 km wide in the greenschist facies terrane. Cover rocks of Carpentarian (Middle Proterozoic) and younger ages rest on all these rocks unconformably and conceal the original basin margins. The Lower Proterozoic metasediments are mainly pelites (about 75 percent), which are commonly carbonaceous, lesser psammites and carbonates (about 10 percent each), and minor rudites (about 5 percent). Volcanic rocks make up about 10 percent of the total sequence. The environment of deposition ranges from shallow-marine to supratidal and fluviatile for most of the sequence, and to flysch in the topmost part. Poor exposure and deep weathering over much of the area hamper correlation of rock units; the correlation preferred by the authors is presented, and possible alternatives are discussed. Regional geological observations pertinent to uranium ore genesis are described. (author)

  7. Pine Creek Geosyncline, N.T

    International Nuclear Information System (INIS)

    Ewers, G.R.; Ferguson, J.; Needham, R.S.; Donnelly, T.H.

    1984-01-01

    The Pine Creek Geosyncline comprises about 14 km of chronostratigraphic mainly pelitic and psammitic Early Proterozoic sediments with interlayered tuff units, resting on granitic late Archaean complexes exposed as small domes. Sedimentation took place in one basin, and most stratigraphic units are represented throughout the basin. The sediments were regionally deformed and metamorphosed at 1800 Ma. Tightly folded greenschist facies strata in the centre grade into isoclinally deformed amphibolite facies metamorphics in the west and northeast; granulites are present in the extreme northeast. Pre- and post-orogenic continental tholeiites, and post-orogenic granite diapirs, intrude the Early Proterozoic metasediments, and the granites are surrounded by hornfels zones up to 10 km wide in the greenschist facies terrane. Cover rocks of Carpentarian (Middle Proterozoic) and younger ages rest on all these rocks unconformably and conceal the original basin margins. The uranium deposits post-date the approx. 1800 Ma regional metamorphic event; isotopic dating of uraninite and galena in the ore bodies indicates ages of mineralisation at approx. 1600 Ma, approx. 900 Ma and approx. 500 Ma. The ore bodies are stratabound, located within breccia zones, are of shallow depth, and occur immediately below the Early/Middle Proterozoic unconformity

  8. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data ... Concept drifting: concept drifting means the classifier ... transactions generated by a prefix-tree structure. EstDec ...

  9. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  10. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  11. Report on the Watershed Monitoring Program at the Paducah Site January-December 1998

    Energy Technology Data Exchange (ETDEWEB)

    Kszos, L.A.; Peterson, M.J.; Ryon, M.G.; Southworth, G.R.

    1999-03-01

    Watershed Monitoring of Big Bayou and Little Bayou creeks has been conducted since 1987. The monitoring was conducted by the University of Kentucky between 1987 and 1991 and by staff of the Environmental Sciences Division (ESD) at Oak Ridge National Laboratory (ORNL) from 1991 to present. The goals of monitoring are to (1) demonstrate that the effluent limitations established for DOE protect and maintain the use of Little Bayou and Big Bayou creeks for growth and propagation of fish and other aquatic life, (2) characterize potential environmental impacts, and (3) document the effects of pollution abatement facilities on stream biota. The watershed (biological) monitoring discussed in this report was conducted under DOE Order 5400.1, General Environmental Protection Program. Future monitoring will be conducted as required by the Kentucky Pollutant Discharge Elimination System (KPDES) permit issued to the Department of Energy (DOE) in March 1998. A draft Watershed Monitoring Program plan was approved by the Kentucky Division of Water and will be finalized in 1999. The DOE permit also requires toxicity monitoring of one continuous outfall and of three intermittent outfalls on a quarterly basis. The Watershed Monitoring Program for the Paducah Site during calendar year 1998 consisted of three major tasks: (1) effluent toxicity monitoring, (2) bioaccumulation studies, and (3) ecological surveys of fish communities. This report focuses on ESD activities occurring from January 1998 to December 1998, although activities conducted outside this time period are included as appropriate.

  12. Ecological effects of contaminants and remedial actions in Bear Creek

    Energy Technology Data Exchange (ETDEWEB)

    Southworth, G.R.; Loar, J.M.; Ryon, M.G.; Smith, J.G.; Stewart, A.J. (Oak Ridge National Lab., TN (United States)); Burris, J.A. (C. E. Environmental, Inc., Tallahassee, FL (United States))

    1992-01-01

    Ecological studies of the Bear Creek watershed, which drains the area surrounding several Oak Ridge Y-12 Plant waste disposal facilities, were initiated in May 1984 and are continuing at present. These studies consisted of an initial, detailed characterization of the benthic invertebrate and fish communities in Bear Creek, and they were followed by a presently ongoing monitoring phase that involves reduced sampling intensities. The characterization phase utilized two approaches: (1) instream sampling of benthic invertebrate and fish communities in Bear Creek to identify spatial and temporal patterns in distribution and abundance and (2) laboratory bioassays on water samples from Bear Creek and selected tributaries to identify potential sources of toxicity to biota. The monitoring phase of the ecological program relates to the long-term goals of identifying and prioritizing contaminant sources and assessing the effectiveness of remedial actions. It continues activities of the characterization phase at less frequent intervals. The Bear Creek Valley is a watershed that drains the area surrounding several closed Oak Ridge Y-12 Plant waste disposal facilities. Past waste disposal practices in Bear Creek Valley resulted in contamination of Bear Creek and consequent ecological damage. Extensive remedial actions have been proposed at waste sites, and some of these have been implemented or are now underway. The proposed study plan consists of an initial, detailed characterization of the benthic invertebrate and fish communities in Bear Creek in the first year followed by a reduction in sampling intensity during the monitoring phase of the plan. The results of sampling conducted from May 1984 through early 1989 are presented in this report.

  13. Ecological effects of contaminants and remedial actions in Bear Creek

    International Nuclear Information System (INIS)

    Southworth, G.R.; Loar, J.M.; Ryon, M.G.; Smith, J.G.; Stewart, A.J.; Burris, J.A.

    1992-01-01

    Ecological studies of the Bear Creek watershed, which drains the area surrounding several Oak Ridge Y-12 Plant waste disposal facilities, were initiated in May 1984 and are continuing at present. These studies consisted of an initial, detailed characterization of the benthic invertebrate and fish communities in Bear Creek, and they were followed by a presently ongoing monitoring phase that involves reduced sampling intensities. The characterization phase utilized two approaches: (1) instream sampling of benthic invertebrate and fish communities in Bear Creek to identify spatial and temporal patterns in distribution and abundance and (2) laboratory bioassays on water samples from Bear Creek and selected tributaries to identify potential sources of toxicity to biota. The monitoring phase of the ecological program relates to the long-term goals of identifying and prioritizing contaminant sources and assessing the effectiveness of remedial actions. It continues activities of the characterization phase at less frequent intervals. The Bear Creek Valley is a watershed that drains the area surrounding several closed Oak Ridge Y-12 Plant waste disposal facilities. Past waste disposal practices in Bear Creek Valley resulted in contamination of Bear Creek and consequent ecological damage. Extensive remedial actions have been proposed at waste sites, and some of these have been implemented or are now underway. The proposed study plan consists of an initial, detailed characterization of the benthic invertebrate and fish communities in Bear Creek in the first year followed by a reduction in sampling intensity during the monitoring phase of the plan. The results of sampling conducted from May 1984 through early 1989 are presented in this report

  14. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  15. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which present challenges as to how the city’s functions are seen and

  16. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  17. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first examine the principal basic concepts of equal treatment and discrimination (section 6.2). We then look at the Dutch and European legal frameworks on non-discrimination (sections 6.3-6.5) and at how those rules should be applied to big

  18. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  19. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presented the basic concepts of Big Data and the new field to which it gave rise, Data Science. Within Data Science, the notion of reducing the dimensionality of data was discussed and exemplified.

  20. The Case for "Big History."

    Science.gov (United States)

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  1. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  2. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller (Lincoln Laboratory) ... process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and ... These include edge sampling methods, where edges are selected by predetermined criteria, and snowball sampling methods, where algorithms start ...

  3. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  4. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  5. Big Math for Little Kids

    Science.gov (United States)

    Greenes, Carole; Ginsburg, Herbert P.; Balfanz, Robert

    2004-01-01

    "Big Math for Little Kids," a comprehensive program for 4- and 5-year-olds, develops and expands on the mathematics that children know and are capable of doing. The program uses activities and stories to develop ideas about number, shape, pattern, logical reasoning, measurement, operations on numbers, and space. The activities introduce the…

  6. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    In recent years, dealing with a lot of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and thereby generate market advantages. So, the companies that turn to Big Data have a competitive advantage over other firms. Looking from the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects regarding the Big Data concept, the principles to build, organize and analyse huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. Also, the article refers to the graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  7. From Big Bang to Eternity?

    Indian Academy of Sciences (India)

    at different distances (that is, at different epochs in the past) to come to this ... that the expansion started billions of years ago from an explosive Big Bang. Recent research sheds new light on the key cosmological question about the distant ...

  8. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  9. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  10. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is the subject of many publications. Some pay attention to the many possibilities of big data, others warn us of their consequences. This special

  11. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  12. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  13. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  14. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  15. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  16. FIDDLER CREEK POLYMER AUGMENTATION PROJECT; TOPICAL

    International Nuclear Information System (INIS)

    Lyle A. Johnson, Jr.

    2001-01-01

    The Fiddler Creek field is in Weston County, Wyoming, and was discovered in 1948. Secondary waterflooding recovery was started in 1955 and terminated in the mid-1980s with a fieldwide recovery of approximately 40%. The West Fiddler Creek Unit, the focus of this project, had a lower recovery and therefore has the most remaining oil. Before the project this unit was producing approximately 85 bbl of oil per day from 20 pumping wells and 17 swab wells. The recovery process planned for this project involved adapting two independent processes, the injection of polymer as a channel blocker or as a deep-penetrating permeability modifier, and the stabilization of clays and reduction of the residual oil saturation in the near-wellbore area around the injection wells. Clay stabilization was not conducted because long-term fresh water injection had not severely reduced the injectivity. It was determined that future polymer injection would not be affected by the clay. For the project, two adjoining project patterns were selected on the basis of prior reservoir studies and current well availability and production. The primary injection well of Pattern 1 was treated with a small batch of MARCIT gel to create channel blocking. The long-term test was designed for three phases: (1) 77 days of injection of a 300-mg/l cationic polyacrylamide, (2) 15 days of injection of a 300-mg/l anionic polymer to ensure injectivity of the polymer, and (3) 369 days of injection of the 300-mg/l anionic polymer and a 30:1 mix of the crosslinker. Phases 1 and 2 were conducted as planned. Phase 3 was started in late March 1999 and terminated in May 2001. In this phase, a crosslinker was added with the anionic polymer. Total injection for Phase 3 was 709,064 bbl. To maintain the desired injection rate, the injection pressure was slowly increased from 1,400 psig to 2,100 psig. 
Early in the application of the polymer, it appeared that the sweep improvement program was having a positive effect on Pattern 1.

  17. Investigating the Maya Polity at Lower Barton Creek Cayo, Belize

    Science.gov (United States)

    Kollias, George Van, III

    The objectives of this research are to determine the importance of Lower Barton Creek in both time and space in relation to other settlements along the Belize River Valley. Material evidence recovered from field excavations and spatial information developed from LiDAR data were employed in determining the socio-political nature and importance of this settlement, so as to situate it within the context of ancient socio-political dynamics in the Belize River Valley. Before the investigations detailed in this thesis, no archaeological research had been conducted in the area; the site of Lower Barton Creek itself was only recently identified via the 2013 West-Central Belize LiDAR Survey (WCBLS 2013). Previously, the southern extent of the Barton Creek area represented a major break in our knowledge not only of the Barton Creek area but also of the southern extent of the Belize River Valley. Conducting research at Lower Barton Creek has led to the determination of the polity's temporal existence and allowed for a greater and more complex understanding of the Belize River Valley's interaction with the regions abutting it.

  18. Statistical tables and charts showing geochemical variation in the Mesoproterozoic Big Creek, Apple Creek, and Gunsight formations, Lemhi group, Salmon River Mountains and Lemhi Range, central Idaho

    Science.gov (United States)

    Lindsey, David A.; Tysdal, Russell G.; Taggart, Joseph E.

    2002-01-01

    The principal purpose of this report is to provide a reference archive for results of a statistical analysis of geochemical data for metasedimentary rocks of Mesoproterozoic age of the Salmon River Mountains and Lemhi Range, central Idaho. Descriptions of geochemical data sets, statistical methods, rationale for interpretations, and references to the literature are provided. Three methods of analysis are used: R-mode factor analysis of major oxide and trace element data for identifying petrochemical processes, analysis of variance for effects of rock type and stratigraphic position on chemical composition, and major-oxide ratio plots for comparison with the chemical composition of common clastic sedimentary rocks.
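    The R-mode factor analysis named above operates on correlations among variables (here, oxides and trace elements) rather than among samples. As a minimal, hedged sketch of that idea, and not the report's actual procedure, the following computes factor loadings by eigendecomposition of a correlation matrix; the data, element names, and number of retained factors are invented for illustration.

```python
import numpy as np

# Hypothetical geochemical data: 50 samples, 5 analytes (not from the survey).
rng = np.random.default_rng(0)
n_samples, elements = 50, ["SiO2", "Al2O3", "Fe2O3", "K2O", "Ba"]
X = rng.normal(size=(n_samples, len(elements)))
X[:, 1] += 0.8 * X[:, 0]          # induce one correlated pair of variables

# Standardize, then form the correlation matrix (R-mode: variables, not samples).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
R = (Z.T @ Z) / n_samples

# Eigendecomposition; loadings scale eigenvectors by the square root of
# their eigenvalues, sorted from largest to smallest factor.
vals, vecs = np.linalg.eigh(R)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]
k = 2                              # retain two factors for the sketch
loadings = vecs[:, :k] * np.sqrt(vals[:k])

for name, row in zip(elements, loadings):
    print(f"{name:6s} factor loadings: {np.round(row, 2)}")
```

High loadings of several elements on one factor are what suggest a shared petrochemical process in this style of analysis.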

  19. Sherman Creek Hatchery, annual report 2000

    International Nuclear Information System (INIS)

    2001-01-01

    The Sherman Creek Hatchery (SCH) was designed to rear 1.7 million kokanee fry for acclimation and imprinting during the spring and early summer. Additionally, it was designed to trap all available returning adult kokanee during the fall for broodstock operations and evaluations. Since the start of this program, the operations on Lake Roosevelt have been modified to better achieve program goals. These strategic changes have been the result of recommendations through the Lake Roosevelt Hatcheries Coordination Team (LRHCT) and were made to enhance imprinting, improve survival, and operate the two kokanee facilities more effectively. The primary changes have been to replace the kokanee fingerling program with a yearling (post-smolt) program of up to 1,000,000 fish and to construct and operate twenty net pens to handle the increased production. The second significant change was to rear 200,000 rainbow trout fingerlings at SCH from July through October for stocking into the volunteer net pens. This enables the Spokane Tribal Hatchery (STH) to rear additional kokanee to further the enhancement efforts on Lake Roosevelt. Monitoring and evaluation is performed by the Lake Roosevelt Fisheries Monitoring Program. From 1988 to 1998, the principal sport fishery on Lake Roosevelt shifted from walleye to include rainbow trout and kokanee salmon (Underwood et al. 1997, Tilson and Scholz 1997). Angler use, harvest rates for rainbow trout and kokanee, and the economic value of the fishery have increased substantially during this 10-year period. The most recent information from the monitoring program also suggests that the hatchery and net-pen rearing programs have been beneficial in enhancing the Lake Roosevelt fishery while not negatively impacting wild and native stocks within the lake.

  20. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  1. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  2. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  3. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  4. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  5. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  6. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data and publicly funded data-analytic tools that rival Climate Corp. in complexity and innovation for use in the public domain.

  7. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  8. Nowcasting using news topics Big Data versus big bank

    OpenAIRE

    Thorsrud, Leif Anders

    2016-01-01

    The agents in the economy use a plethora of high-frequency information, including news media, to guide their actions and thereby shape aggregate economic fluctuations. Traditional nowcasting approaches have made relatively little use of such information. In this paper, I show how unstructured textual information in a business newspaper can be decomposed into daily news topics and used to nowcast quarterly GDP growth. Compared with a big bank of experts, here represented by official c...

  9. NPDES Permit for Soap Creek Associates Wastewater Treatment Facility in Montana

    Science.gov (United States)

    Under National Pollutant Discharge Elimination System permit number MT-0023183, Soap Creek Associates, Inc. is authorized to discharge from its wastewater treatment facility located in West, Bighorn County, Montana, to Soap Creek.

  10. 75 FR 66077 - Mahoning Creek Hydroelectric Company, LLC; Notice of Availability of Supplemental Environmental...

    Science.gov (United States)

    2010-10-27

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 12555-004-PA] Mahoning Creek Hydroelectric Company, LLC; Notice of Availability of Supplemental Environmental Assessment... Energy Projects has reviewed the application for an original license for the Mahoning Creek Hydroelectric...

  11. Marine ecological habitat: A case study on projected thermal power plant around Dharamtar creek, India

    Digital Repository Service at National Institute of Oceanography (India)

    Kulkarni, V.A.; Naidu, V.S.; Jagtap, T.G.

    Estuaries and tidal creeks, harboring mangroves particularly, face tremendous anthropogenic pressures. Expansion of mega cities and the thermal power plants are generally proposed in the vicinity of estuaries and creek, due to the feasibility...

  12. 76 FR 8728 - Bear Creek Hydro Associates, LLC; Notice of Preliminary Permit Application Accepted for Filing...

    Science.gov (United States)

    2011-02-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Project No. 13951-000] Bear Creek Hydro..., Motions To Intervene, and Competing Applications On December 22, 2010, the Bear Creek Hydro Associates... (FPA), proposing to study the [[Page 8729

  13. Phytoplankton characteristics in a polluted Bombay Harbour-Thana-Bassein creek estuarine complex

    Digital Repository Service at National Institute of Oceanography (India)

    Ramaiah, Neelam; Ramaiah, N.; Nair, V.R.

    Annual variations in phytoplankton characteristics were studied from the Bombay Harbour-Thana creek-Bassein creek (BHTCBC) estuarine confluence to assess the levels of pigment concentration, productivity, and the qualitative and quantitative nature...

  14. 78 FR 26063 - Central Utah Project Completion Act; East Hobble Creek Restoration Project Final Environmental...

    Science.gov (United States)

    2013-05-03

    ...-100-00-0-0, CUPCA00] Central Utah Project Completion Act; East Hobble Creek Restoration Project Final... Creek Restoration Project. These two agencies have determined that the proposed [[Page 26064

  15. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Objectives: To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  16. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of text. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  17. Stream sediment detailed geochemical survey for Date Creek Basin, Arizona

    International Nuclear Information System (INIS)

    Butz, T.R.; Tieman, D.J.; Grimes, J.G.; Bard, C.S.; Helgerson, R.N.; Pritz, P.M.; Wolf, D.A.

    1981-01-01

    The purpose of the Date Creek Supplement is to characterize the chemistry of sediment samples representing stream basins in which the Anderson Mine (and related prospects) occur. Once characterized, the chemistry is then used to delineate other areas within the Date Creek Basin where stream sediment chemistry resembles that of the Anderson Mine area. This supplementary report examines more closely the data from sediment samples taken in 239 stream basins collected over a total area of approximately 900 km² (350 mi²). Cluster and discriminant analyses are used to characterize the geochemistry of the stream sediment samples collected in the Date Creek Basin. Cluster and discriminant analysis plots are used to delineate areas having a potential for uranium mineralization similar to that of the Anderson Mine.
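    As an illustrative sketch of the clustering step described above, the following groups hypothetical stream-sediment samples by element concentration with a bare-bones k-means. The concentration values, element choices, and number of clusters are invented for the example; the survey's actual cluster and discriminant procedure is not reproduced here.

```python
import numpy as np

# Hypothetical sediment chemistry: 40 background samples plus 10 samples
# with elevated concentrations, two analytes each (U ppm, Th ppm).
rng = np.random.default_rng(1)
background = rng.normal([3.0, 20.0], [0.5, 3.0], size=(40, 2))
anomalous = rng.normal([9.0, 35.0], [1.0, 4.0], size=(10, 2))
samples = np.vstack([background, anomalous])

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means: assign each sample to its nearest center, then
    recompute centers as cluster means; empty clusters keep their center."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(samples, k=2)
print("cluster centers (U ppm, Th ppm):")
print(np.round(centers, 1))
```

In this style of analysis, a cluster whose center resembles the chemistry of a known mineralized area flags its member basins for follow-up.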

  18. Simulation of effects of wastewater discharges on Sand Creek and lower Caddo Creek near Ardmore, Oklahoma

    Science.gov (United States)

    Wesolowski, Edwin A.

    1999-01-01

    A streamflow and water-quality model was developed for reaches of Sand and Caddo Creeks in south-central Oklahoma to simulate the effects of wastewater discharge from a refinery and a municipal treatment plant.The purpose of the model was to simulate conditions during low streamflow when the conditions controlling dissolved-oxygen concentrations are most severe. Data collected to calibrate and verify the streamflow and water-quality model include continuously monitored streamflow and water-quality data at two gaging stations and three temporary monitoring stations; wastewater discharge from two wastewater plants; two sets each of five water-quality samples at nine sites during a 24-hour period; dye and propane samples; periphyton samples; and sediment oxygen demand measurements. The water-quality sampling, at a 6-hour frequency, was based on a Lagrangian reference frame in which the same volume of water was sampled at each site. To represent the unsteady streamflows and the dynamic water-quality conditions, a transport modeling system was used that included both a model to route streamflow and a model to transport dissolved conservative constituents with linkage to reaction kinetics similar to the U.S. Environmental Protection Agency QUAL2E model to simulate nonconservative constituents. These model codes are the Diffusion Analogy Streamflow Routing Model (DAFLOW) and the branched Lagrangian transport model (BLTM) and BLTM/QUAL2E that, collectively, as calibrated models, are referred to as the Ardmore Water-Quality Model.The Ardmore DAFLOW model was calibrated with three sets of streamflows that collectively ranged from 16 to 3,456 cubic feet per second. The model uses only one set of calibrated coefficients and exponents to simulate streamflow over this range. The Ardmore BLTM was calibrated for transport by simulating dye concentrations collected during a tracer study when streamflows ranged from 16 to 23 cubic feet per second. 
Therefore, the model is expected to
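    The dissolved-oxygen kinetics that QUAL2E-style models simulate can be illustrated, in highly simplified form, by the classic Streeter-Phelps oxygen-sag equation for a reach below a wastewater outfall. All coefficients below are invented for the sketch; they are not calibrated values from the Ardmore Water-Quality Model.

```python
import numpy as np

# Streeter-Phelps oxygen-sag sketch with hypothetical coefficients.
kd, ka = 0.3, 0.6          # deoxygenation / reaeration rates (1/day)
L0, D0 = 12.0, 1.0         # initial BOD and initial DO deficit (mg/L)
do_sat = 8.5               # saturation dissolved oxygen (mg/L)

# DO deficit as a function of travel time t below the outfall:
#   D(t) = kd*L0/(ka-kd) * (exp(-kd*t) - exp(-ka*t)) + D0*exp(-ka*t)
t = np.linspace(0.0, 10.0, 101)                     # travel time (days)
deficit = (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
          + D0 * np.exp(-ka * t)
do = do_sat - deficit

t_crit = t[np.argmax(deficit)]                      # location of the DO sag
print(f"minimum DO {do.min():.2f} mg/L at ~{t_crit:.1f} days of travel time")
```

The sag minimum is the critical point such low-flow simulations are designed to resolve, since that is where dissolved-oxygen standards are most likely to be violated.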

  19. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  20. Big Data hvor N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the use of 'big data' in health care has only just begun and may, in time, become a great help in organizing more personalized and holistic health care for patients with multiple chronic conditions. Personal health technology, briefly presented in this chapter, holds great potential for carrying out 'big data' analyses for the individual person, that is, where N=1. There are major technological challenges in developing technologies and methods for collecting and handling personal data that can be shared across systems in a standardized, responsible, robust, secure, and not...

  1. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  2. Did the Big Bang begin?

    International Nuclear Information System (INIS)

    Levy-Leblond, J.

    1990-01-01

    It is argued that the age of the universe may well be numerically finite (20 billion years or so) and conceptually infinite. A new and natural time scale is defined on a physical basis using group-theoretical arguments. An additive notion of time is obtained according to which the age of the universe is indeed infinite. In other words, never did the Big Bang begin. This new time scale is not supposed to replace the ordinary cosmic time scale, but to supplement it (in the same way as rapidity has taken a place by the side of velocity in Einsteinian relativity). The question is discussed within the framework of conventional (big-bang) and classical (nonquantum) cosmology, but could easily be extended to more elaborate views, as the purpose is not so much to modify present theories as to reach a deeper understanding of their meaning.

  3. Big Data in Drug Discovery.

    Science.gov (United States)

    Brown, Nathan; Cambruzzi, Jean; Cox, Peter J; Davies, Mark; Dunbar, James; Plumbley, Dean; Sellwood, Matthew A; Sim, Aaron; Williams-Jones, Bryn I; Zwierzyna, Magdalena; Sheppard, David W

    2018-01-01

    Interpretation of Big Data in the drug discovery community should enhance project timelines and reduce clinical attrition through improved early decision making. The issues we encounter start with the sheer volume of data and how we first ingest it before building an infrastructure to house it to make use of the data in an efficient and productive way. There are many problems associated with the data itself including general reproducibility, but often, it is the context surrounding an experiment that is critical to success. Help, in the form of artificial intelligence (AI), is required to understand and translate the context. On the back of natural language processing pipelines, AI is also used to prospectively generate new hypotheses by linking data together. We explain Big Data from the context of biology, chemistry and clinical trials, showcasing some of the impressive public domain sources and initiatives now available for interrogation. © 2018 Elsevier B.V. All rights reserved.

  4. Identification and characterization of wetlands in the Bear Creek watershed

    International Nuclear Information System (INIS)

    Rosensteel, B.A.; Trettin, C.C.

    1993-10-01

    The primary objective of this study was to identify, characterize, and map the wetlands in the Bear Creek watershed. A preliminary wetland categorization system based on the Cowardin classification system (Cowardin et al. 1979) with additional site-specific topographic, vegetation, and disturbance characteristic modifiers was developed to characterize the type of wetlands that exist in the Bear Creek watershed. An additional objective was to detect possible relationships among site soils, hydrology, and the occurrence of wetlands in the watershed through a comparison of existing data with the field survey. Research needs are discussed in the context of wetland functions and values and regulatory requirements for wetland impact assessment and compensatory mitigation

  5. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  6. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the ''energy dominance'' of the energy density of vacuum fluctuations in curved space-time and the presence of singularity is discussed. It is pointed out that a de Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedman solution may describe the history of the Universe before the Big Bang. (P.L.)

  7. Big bang is not needed

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.D.

    1976-02-01

    Recent computer simulations indicate that a system of n gravitating masses breaks up, even when the total energy is negative. As a result, almost any initial phase-space distribution results in a universe that eventually expands under the Hubble law. Hence Hubble expansion implies little regarding an initial cosmic state. In particular, it does not imply the singularly dense superpositioned state used in the big bang model.

  8. 76 FR 62758 - Wallowa-Whitman and Umatilla National Forests, Oregon Granite Creek Watershed Mining Plans

    Science.gov (United States)

    2011-10-11

    ... environmental analyses for proposed mining Plans in the portions of the Granite Creek Watershed under their... Granite Creek Watershed Mining Plans analysis area that meets the Purpose of and Need for Action. It is... Granite Creek Watershed Mining Plans AGENCY: Forest Service, USDA. ACTION: Notice of intent to prepare an...

  9. 78 FR 25484 - License Amendment for Anadarko Petroleum Corporation, Bear Creek Facility, Converse County, Wyoming

    Science.gov (United States)

    2013-05-01

    ... Petroleum Corporation, Bear Creek Facility, Converse County, Wyoming AGENCY: Nuclear Regulatory Commission.... 47 for its Bear Creek Uranium Mill facility in Converse County, Wyoming. The NRC has prepared an... INFORMATION: I. Background The Bear Creek Uranium Mill operated from September 1977 until January 1986, and...

  10. 76 FR 13344 - Beaver Creek Landscape Management Project, Ashland Ranger District, Custer National Forest...

    Science.gov (United States)

    2011-03-11

    ... DEPARTMENT OF AGRICULTURE Forest Service Beaver Creek Landscape Management Project, Ashland Ranger... Impact Statement for the Beaver Creek Landscape Management Project was published in the Federal Register... Responsible Official for the Beaver Creek Landscape Management Project. DATES: The Final Environmental Impact...

  11. 76 FR 65118 - Drawbridge Operation Regulation; Bear Creek, Sparrows Point, MD

    Science.gov (United States)

    2011-10-20

    ...-AA09 Drawbridge Operation Regulation; Bear Creek, Sparrows Point, MD AGENCY: Coast Guard, DHS. ACTION... regulation. The Baltimore County Revenue Authority (Dundalk Avenue) highway toll drawbridge across Bear Creek... applicable or necessary. Basis and Purpose The drawbridge across Bear Creek, mile 1.5 was removed and...

  12. 75 FR 31418 - Intermountain Region, Payette National Forest, Council Ranger District; Idaho; Mill Creek-Council...

    Science.gov (United States)

    2010-06-03

    ... Ranger District; Idaho; Mill Creek--Council Mountain Landscape Restoration Project AGENCY: Forest Service... the Mill Creek--Council Mountain Landscape Restoration Project. The approximate 51,900 acre project area is located about two miles east of Council, Idaho. The Mill Creek--Council Mountain Landscape...

  13. 75 FR 68780 - Cedar Creek Wind Energy, LLC; Notice of Filing

    Science.gov (United States)

    2010-11-09

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RC11-1-000] Cedar Creek Wind Energy, LLC; Notice of Filing November 2, 2010. Take notice that on October 27, 2010, Cedar Creek Wind Energy, LLC (Cedar Creek) filed an appeal with the Federal Energy Regulatory Commission (Commission) of...

  14. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  15. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting both the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment and the challenges involved. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  16. Surface-water and ground-water quality in the Powell Creek and Armstrong Creek Watersheds, Dauphin County, Pennsylvania, July-September 2001

    Science.gov (United States)

    Galeone, Daniel G.; Low, Dennis J.

    2003-01-01

    Powell Creek and Armstrong Creek Watersheds are in Dauphin County, north of Harrisburg, Pa. The completion of the Dauphin Bypass Transportation Project in 2001 helped to alleviate traffic congestion from these watersheds to Harrisburg. However, increased development in Powell Creek and Armstrong Creek Watersheds is expected. The purpose of this study was to establish a baseline for future projects in the watersheds so that the effects of land-use changes on water quality can be documented. The Pennsylvania Department of Environmental Protection (PADEP) (2002) indicates that surface water generally is good in the 71 perennial stream miles in the watersheds. PADEP lists 11.1 stream miles within the Armstrong Creek and 3.2 stream miles within the Powell Creek Watersheds as impaired or not meeting water-quality standards. Siltation from agricultural sources and removal of vegetation along stream channels are cited by PADEP as likely factors causing this impairment.

  17. Turning big bang into big bounce. I. Classical dynamics

    Science.gov (United States)

    Dzierżak, Piotr; Małkiewicz, Przemysław; Piechocki, Włodzimierz

    2009-11-01

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.

  18. Report on the Biological Monitoring Program at Paducah Gaseous Diffusion Plant December 1992--December 1993

    Energy Technology Data Exchange (ETDEWEB)

    Kszos, L.A.; Hinzman, R.L.; Peterson, M.J.; Ryon, M.G.; Smith, J.G.; Southworth, G.R.

    1995-06-01

    On September 24, 1987, the Commonwealth of Kentucky Natural Resources and Environmental Protection Cabinet issued an Agreed Order that required the development of a Biological Monitoring Program (BMP) for the Paducah Gaseous Diffusion Plant (PGDP). The goals of the BMP are to demonstrate that the effluent limitations established for PGDP protect and maintain the use of Little Bayou and Big Bayou creeks for growth and propagation of fish and other aquatic life, characterize potential health and environmental impacts, document the effects of pollution abatement facilities on stream biota, and recommend any program improvements that would increase effluent treatability. The BMP for PGDP consists of three major tasks: effluent and ambient toxicity monitoring, bioaccumulation studies, and ecological surveys of stream communities (i.e., benthic macroinvertebrates and fish). This report includes ESD activities occurring from December 1992 to December 1993, although activities conducted outside this time period are included as appropriate.

  19. Streamflow characteristics and trends along Soldier Creek, Northeast Kansas

    Science.gov (United States)

    Juracek, Kyle E.

    2017-08-16

    Historical data for six selected U.S. Geological Survey streamgages along Soldier Creek in northeast Kansas were used in an assessment of streamflow characteristics and trends. This information is required by the Prairie Band Potawatomi Nation for the effective management of tribal water resources, including drought contingency planning. Streamflow data for the period of record at each streamgage were used to assess annual mean streamflow, annual mean base flow, mean monthly flow, annual peak flow, and annual minimum flow. Annual mean streamflows along Soldier Creek were characterized by substantial year-to-year variability with no pronounced long-term trends. On average, annual mean base flow accounted for about 20 percent of annual mean streamflow. Mean monthly flows followed a general seasonal pattern that included peak values in spring and low values in winter. Annual peak flows, which were characterized by considerable year-to-year variability, were most likely to occur in May and June and least likely to occur during November through February. With the exception of a weak yet statistically significant increasing trend at the Soldier Creek near Topeka, Kansas, streamgage, there were no pronounced long-term trends in annual peak flows. Annual 1-day, 30-day, and 90-day mean minimum flows were characterized by considerable year-to-year variability with no pronounced long-term trend. During an extreme drought, as was the case in the mid-1950s, there may be zero flow in Soldier Creek continuously for a period of one to several months.
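
The summary statistics named in this record (annual mean streamflow, n-day mean minimum flows) can be sketched from a daily discharge series. The values below are invented for illustration; they are not actual Soldier Creek streamgage data.

```python
from statistics import mean

# Illustrative daily discharge values (cfs) -- not real gage records.
daily = [12.0, 15.5, 9.8, 30.2, 22.1, 8.4, 5.1, 7.3, 44.0, 18.6]

def annual_mean(flows):
    """Annual mean streamflow: the average of the year's daily values."""
    return mean(flows)

def n_day_min_mean(flows, n):
    """Annual n-day mean minimum flow: the lowest mean discharge over any
    n consecutive days (n = 1, 30, or 90 in the assessment above)."""
    return min(mean(flows[i:i + n]) for i in range(len(flows) - n + 1))

print(annual_mean(daily))        # 17.3
print(n_day_min_mean(daily, 2))  # 6.2 (the window covering days 5.1 and 7.3)
```

With a full period of record, the same functions would be applied year by year to produce the time series whose trends the assessment examines.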

  20. Large woody debris budgets in the Caspar Creek Experimental Watersheds

    Science.gov (United States)

    Sue Hilton

    2012-01-01

    Monitoring of large woody debris (LWD) in the two mainstem channels of the Caspar Creek Experimental Watersheds since 1998, combined with older data from other work in the watersheds, gives estimates of channel wood input rates, survival, and outputs in intermediate-sized channels in coastal redwood forests. Input rates from standing trees for the two reaches over a 15...

  1. Preliminary investigations on the Ichthyodiversity of Kilifi Creek, Kenya

    African Journals Online (AJOL)

    (Smith, 1939) off the Kenyan coast at Malindi only. 50 km north of ... communities, river fed creek, upstream and the bay proper, in Gazi ... habitat degradation: pollution, overfishing, ..... exploitable fishes from a marine park and its effect on the ...

  2. 78 FR 67084 - Drawbridge Operation Regulation; Broad Creek, Laurel, DE

    Science.gov (United States)

    2013-11-08

    ...-AA09 Drawbridge Operation Regulation; Broad Creek, Laurel, DE AGENCY: Coast Guard, DHS. ACTION: Notice....25, both at Laurel, DE. The proposed new rule would change the current regulation by requiring a..., mile 8.2, all at Laurel, shall open on signal if at least 48 hours notice is given. Previous regulation...

  3. Short notes and reviews The fossil fauna of Mazon Creek

    NARCIS (Netherlands)

    Schultze, Hans-Peter

    1998-01-01

    Review of: Richardson’s Guide to the Fossil Fauna of Mazon Creek, edited by Charles W. Shabica & Andrew A. Hay. Northeastern Illinois University, Chicago, Illinois, 1997: XVIII + 308 pp., 385 figs., 4 tables, 1 faunal list; $75.00 (hard cover) ISBN 0-925065-21-8. Since the last century, the area

  4. Forest Creeks Research Natural Area: guidebook supplement 39

    Science.gov (United States)

    Reid Schuller; Ron Halvorson

    2010-01-01

    This guidebook describes Forest Creeks Research Natural Area, a 164-ha (405-ac) area comprising two geographically distinct canyons and associated drainages. The two units have been established as examples of first- to third-order streams originating within a ponderosa pine (Pinus ponderosa) zone. The two riparian areas also represent examples of...

  5. Copepod composition, abundance and diversity in Makupa Creek ...

    African Journals Online (AJOL)

    Evenness (J) was, however, relatively constant (0.67 to 0.84) during the entire sampling period. These results point to suppressed copepod diversity and abundance in Makupa Creek, and possible reasons for this, which may include environmental degradation caused by pollution, are presented. Western Indian Ocean ...

  6. Cherry Creek Research Natural Area: guidebook supplement 41

    Science.gov (United States)

    Reid Schuller; Jennie Sperling; Tim Rodenkirk

    2011-01-01

    This guidebook describes Cherry Creek Research Natural Area, a 239-ha (590-ac) area that supports old-growth Douglas-fir-western hemlock (Pseudotsuga menziesii- Tsuga heterophylla) forest occurring on sedimentary materials in the southern Oregon Coast Range. Major plant associations present within the area include the western hemlock/Oregon oxalis...

  7. Fish Creek Rim Research Natural Area: guidebook supplement 50

    Science.gov (United States)

    Reid Schuller; Ian Grinter

    2016-01-01

    This guidebook describes major biological and physical attributes of the 3531-ha (8,725-ac) Fish Creek Rim Research Natural Area located within the Northern Basin and Range ecoregion and managed by the Bureau of Land Management, Lakeview District (USDI BLM 2003).

  8. WARM SPRINGS CREEK GEOTHERMAL STUDY, BLAIN COUNTY IDAHO, 1987

    Science.gov (United States)

    In the Warm Springs Creek drainage near Ketchum, Idaho (17040219), a leaking pipeline conveys geothermal water through the valley to heat nearby homes as well as to supply a resort's swimming pool. Several domestic wells in close proximity to this line have exhibited increasing fl...

  9. Tillman Creek Mitigation Site As-Built Report.

    Energy Technology Data Exchange (ETDEWEB)

    Gresham, Doug [Otak, Inc.

    2009-05-29

    This as-built report describes site conditions at the Tillman Creek mitigation site in South Cle Elum, Washington. This mitigation site was constructed in 2006-2007 to compensate for wetland impacts from the Yakama Nation hatchery. This as-built report provides information on the construction sequence, as-built survey, and establishment of baseline monitoring stations.

  10. 78 FR 47427 - AUC, LLC Reno Creek, In Situ

    Science.gov (United States)

    2013-08-05

    ... NUCLEAR REGULATORY COMMISSION [Docket No. 040-09092; NRC-2013-0164] AUC, LLC Reno Creek, In Situ... October 3, 2012, AUC submitted a license application to the U.S. Nuclear Regulatory Commission (NRC... provided the first time that a document is referenced. The AUC License Application request and additional...

  11. 75 FR 43915 - Basin Electric Power Cooperative: Deer Creek Station

    Science.gov (United States)

    2010-07-27

    ... factors that could be affected by the proposed Project were evaluated in detail in the EIS. These issues... DEPARTMENT OF AGRICULTURE Rural Utilities Service Basin Electric Power Cooperative: Deer Creek... Energy Facility project (Project) in Brookings and Deuel Counties, South Dakota. The Administrator of RUS...

  12. EAARL topography-Potato Creek watershed, Georgia, 2010

    Science.gov (United States)

    Bonisteel-Cormier, J.M.; Nayegandhi, Amar; Fredericks, Xan; Jones, J.W.; Wright, C.W.; Brock, J.C.; Nagle, D.B.

    2011-01-01

    This DVD contains lidar-derived first-surface (FS) and bare-earth (BE) topography GIS datasets of a portion of the Potato Creek watershed in the Apalachicola-Chattahoochee-Flint River basin, Georgia. These datasets were acquired on February 27, 2010.

  13. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities for Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  14. A baseline and watershed assessment in the Lynx Creek, Brenot Creek, and Portage Creek watersheds near Hudson's Hope, BC : summary report

    International Nuclear Information System (INIS)

    Matscha, G.; Sutherland, D.

    2005-06-01

    This report summarizes a baseline monitoring program for the Lynx Creek, Brenot Creek, and Portage Creek watersheds located near Hudson's Hope, British Columbia (BC). The monitoring program was designed to more accurately determine the effects of potential coalbed gas developments in the region, to assess the extent of agricultural and forest-harvesting activity, and to assess the impacts of current land use activities on water quantity and quality. Water quality was sampled at 18 sites during 5 different flow regimes, including summer and fall low flows; ice cover; spring run-off; and high flows after a heavy summer rain event. Sample sites were located up and downstream of both forest and agricultural activities. The water samples were analyzed for 70 contaminants including ions, nutrients, metals, hydrocarbons, and hydrocarbon fractions. Results showed that while many analyzed parameters met current BC water quality guidelines, total organic carbon, manganese, cadmium, E. coli, fecal coliforms, and fecal streptococci often exceeded recommended guidelines. Aluminum and cobalt values exceeded drinking water guidelines. The samples also had a slightly alkaline pH and showed high conductance. A multiple barrier approach was recommended to reduce potential risks of contamination from the watersheds. It was concluded that a more refined bacteria source tracking method is needed to determine whether fecal pollution has emanated from human, livestock or wildlife sources. 1 tab., 9 figs

  15. Modeling canopy-level productivity: is the "big-leaf" simplification acceptable?

    Science.gov (United States)

    Sprintsin, M.; Chen, J. M.

    2009-05-01

    The "big-leaf" approach to calculating the carbon balance of plant canopies assumes that canopy carbon fluxes have the same relative responses to the environment as any single unshaded leaf in the upper canopy. Widely used light use efficiency models are essentially simplified versions of the big-leaf model. Despite its wide acceptance, subsequent developments in the modeling of leaf photosynthesis and measurements of canopy physiology have brought into question the assumptions behind this approach, showing that the big-leaf approximation is inadequate for simulating canopy photosynthesis because of the additional leaf-internal control on carbon assimilation, the non-linear response of photosynthesis to leaf nitrogen and absorbed light, and changes in leaf microenvironment with canopy depth. To avoid this problem, a sunlit/shaded leaf separation approach, within which the vegetation is treated as two big leaves under different illumination conditions, is gradually replacing the "big-leaf" strategy for applications at local and regional scales. Such separation is now widely accepted as a more accurate and physiologically based approach for modeling canopy photosynthesis. Here we compare both strategies for Gross Primary Production (GPP) modeling using the Boreal Ecosystem Productivity Simulator (BEPS) at local (tower footprint) scale for different land cover types spread over North America: two broadleaf forests (Harvard, Massachusetts and Missouri Ozark, Missouri); two coniferous forests (Howland, Maine and Old Black Spruce, Saskatchewan); the Lost Creek shrubland site (Wisconsin); and the Mer Bleue peatland (Ontario). BEPS calculates carbon fixation by scaling Farquhar's leaf biochemical model up to canopy level, with stomatal conductance estimated by a modified version of the Ball-Woodrow-Berry model. The "big-leaf" approach was parameterized using derived leaf-level parameters scaled up to canopy level by means of Leaf Area Index. The influence of sunlit
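
Why the non-linear light response matters can be shown in a few lines. The sketch below uses a generic concave light-response curve with invented parameter values (not fitted BEPS parameters): because the curve is concave, applying it to the canopy-mean absorbed light (big-leaf) overestimates the area-weighted sum over sunlit and shaded classes (two-leaf), by Jensen's inequality.

```python
def photosynthesis(par, p_max=20.0, k=400.0):
    """Concave light-response curve (rectangular hyperbola), umol m-2 s-1.
    p_max and k are illustrative values, not fitted model parameters."""
    return p_max * par / (par + k)

# Two leaf classes with very different absorbed PAR (umol m-2 s-1); assumed values.
par_sunlit, par_shaded = 1500.0, 100.0
f_sunlit = 0.4  # sunlit fraction of leaf area (assumed)

# "Big-leaf": apply the curve once, to the canopy-mean absorbed light.
par_mean = f_sunlit * par_sunlit + (1 - f_sunlit) * par_shaded
gpp_bigleaf = photosynthesis(par_mean)

# Sunlit/shaded: evaluate each class separately, then area-weight.
gpp_twoleaf = (f_sunlit * photosynthesis(par_sunlit)
               + (1 - f_sunlit) * photosynthesis(par_shaded))

# Concavity implies P(mean I) >= mean of P(I): big-leaf is biased high here.
print(gpp_bigleaf > gpp_twoleaf)  # True
```

With these numbers the big-leaf estimate is roughly 12.5 versus about 8.7 for the two-leaf sum, which is the kind of discrepancy the sunlit/shaded separation is meant to remove.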

  16. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. Analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents new methods to reverse this trend and improve profitability. The benefits of Big Data and ...

  17. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints to the use of big data include cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  18. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches.

  19. 78 FR 2990 - Bear Creek Storage Company, L.L.C.; Notice of Request Under Blanket Authorization

    Science.gov (United States)

    2013-01-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. CP13-34-000] Bear Creek..., 2012, Bear Creek Storage Company, L.L.C. (Bear Creek), 569 Brookwood Village, Suite 749, Birmingham....208, 157.213 and 157.216 of the Commission's Regulations under the Natural Gas Act, and Bear Creek's...

  20. Ordovician and Silurian Phi Kappa and Trail Creek formations, Pioneer Mountains, central Idaho; stratigraphic and structural revisions, and new data on graptolite faunas

    Science.gov (United States)

    Dover, James H.; Berry, William B.N.; Ross, Reuben James

    1980-01-01

    clastic rocks reported in previously measured sections of the Phi Kappa, as well as the sequence along Phi Kappa Creek from which the name originates, are excluded from the Phi Kappa as revised and are reassigned to two structural plates of Mississippian Copper Basin Formation; other strata now excluded from the formation are reassigned to the Trail Creek Formation and to an unnamed Silurian and Devonian unit. As redefined, the Phi Kappa Formation is only about 240 m thick, compared with the 3,860 m originally estimated, and it occupies only about 25 percent of the outcrop area previously mapped in 1930 by H. G. Westgate and C. P. Ross. Despite this drastic reduction in thickness and the exclusion of the rocks along Phi Kappa Creek, the name Phi Kappa is retained because of widely accepted prior usage to denote the Ordovician graptolitic shale facies of central Idaho, and because the Phi Kappa Formation as revised is present in thrust slices on Phi Kappa Mountain, at the head of Phi Kappa Creek. The lithic and faunal consistency of this unit throughout the area precludes the necessity for major facies telescoping along individual faults within the outcrop belt. However, tens of kilometers of tectonic shortening seems required to juxtapose the imbricated Phi Kappa shale facies with the Middle Ordovician part of the carbonate and quartzite shale sequence of east central Idaho. The shelf rocks are exposed in the Wildhorse structural window of the northeastern Pioneer Mountains, and attain a thickness of at least 1,500 m throughout the region north and east of the Pioneer Mountains. The Phi Kappa is in direct thrust contact on intensely deformed medium- to high-grade metamorphic equivalents of the same shelf sequence in the Pioneer window at the south end of the Phi Kappa-Trail Creek outcrop belt. 
Along East Pass, Big Lake, and Pine Creeks, north of the Pioneer Mountains, some rocks previously mapped as Ramshorn Slate are lithologically and faunally equivalent to the P

  1. Pine Creek Ranch, FY 2001 annual report

    International Nuclear Information System (INIS)

    Berry, Mark E.

    2001-01-01

    Pine Creek Ranch was purchased in 1999 by the Confederated Tribes of Warm Springs using Bonneville Power Administration Fish and Wildlife Habitat Mitigation funds. The 25,000-acre property will be managed in perpetuity for the benefit of fish and wildlife habitat. Major issues include: (1) Restoring quality spawning and rearing habitat for steelhead. Streams are incised, and fish passage barriers exist from culverts and possibly beaver dams. In addition to steelhead habitat, the Tribes are interested in overall riparian recovery in the John Day River system for wildlife habitat, watershed values, and other values such as recreation. (2) Future grazing for specific management purposes. Past grazing practices undoubtedly contributed to current unacceptable conditions. The main stem of Pine Creek has already been enrolled in the CREP program administered by the USDA Natural Resources Conservation Service, in part because of the cost-share for vegetation restoration in a buffer portion of old fields and in part because of rental fees that will help the Tribes to pay the property taxes. Grazing is not allowed in the riparian buffer for the term of the contract. (3) Noxious weeds are a major concern. (4) Encroachment by western juniper throughout the watershed is a potential concern for the hydrology of the creek. Mark Berry, Habitat Manager for the Pine Creek Ranch, requested the Team to address the following objectives: (1) Introduce some of the field staff and others to Proper Functioning Condition (PFC) assessments and concepts. (2) Do a PFC assessment on approximately 10 miles of Pine Creek. (3) Offer management recommendations. (4) Provide guidelines for monitoring.

  2. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as some applications in big data optimization, for both interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, have been explored in this book.

  3. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing. Big data focuses on temporal stability of the association rather than on causal relationships, and underlying probability-distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  4. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and enables more intelligent and personalized traffic information services for traffic information users.

  5. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing. Big data focuses on temporal stability of the association rather than on causal relationships, and underlying probability-distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
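
The propensity score analysis mentioned in this abstract is often applied via stratification: units are binned on their estimated propensity score, and within-stratum treated-minus-control differences are averaged. A minimal sketch follows; the cohort, score values, and bin edges are invented for illustration, and the scores are assumed to come from a previously fitted model.

```python
from statistics import mean

# Toy cohort of (propensity_score, treated, outcome) tuples -- fabricated data.
cohort = [
    (0.2, 0, 5.0), (0.2, 1, 6.0), (0.3, 0, 5.5), (0.3, 1, 6.4),
    (0.7, 0, 8.0), (0.7, 1, 9.1), (0.8, 0, 8.5), (0.8, 1, 9.4),
]

def stratified_effect(rows, edges=(0.0, 0.5, 1.0)):
    """Propensity-score stratification: estimate the treatment effect as the
    size-weighted mean of within-stratum treated-minus-control differences."""
    total, est = len(rows), 0.0
    for lo, hi in zip(edges, edges[1:]):
        stratum = [r for r in rows if lo <= r[0] < hi
                   or (hi == edges[-1] and r[0] == hi)]
        treated = [y for s, t, y in stratum if t == 1]
        control = [y for s, t, y in stratum if t == 0]
        if treated and control:  # skip strata lacking either group
            est += (len(stratum) / total) * (mean(treated) - mean(control))
    return est

print(stratified_effect(cohort))  # 0.975
```

Comparing within strata of similar propensity is what removes the confounding that a naive treated-versus-control comparison would absorb.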

  6. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and enables more intelligent and personalized traffic information services for traffic information users

  7. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  8. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have a basic knowledge of R.

  9. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unreached. Big data is generally characterized by several factors: volume, velocity, and variety. These three factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  10. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  11. The faces of Big Science.

    Science.gov (United States)

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  12. Big Data and central banks

    Directory of Open Access Journals (Sweden)

    David Bholat

    2015-04-01

    Full Text Available This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  13. Inhomogeneous Big Bang Nucleosynthesis Revisited

    OpenAIRE

    Lara, J. F.; Kajino, T.; Mathews, G. J.

    2006-01-01

    We reanalyze the allowed parameters for inhomogeneous big bang nucleosynthesis in light of the WMAP constraints on the baryon-to-photon ratio and a recent measurement which has set the neutron lifetime to be 878.5 +/- 0.7 +/- 0.3 seconds. For a set baryon-to-photon ratio the new lifetime reduces the mass fraction of He4 by 0.0015 but does not significantly change the abundances of other isotopes. This enlarges the region of concordance between He4 and deuterium in the parameter space of the b...

  14. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  15. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  16. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions bound up with big data. In this first part I will try to set out a number of points concerning Big Data theory and

  17. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  18. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  19. Assessment of hydrology, water quality, and trace elements in selected placer-mined creeks in the birch creek watershed near central, Alaska, 2001-05

    Science.gov (United States)

    Kennedy, Ben W.; Langley, Dustin E.

    2007-01-01

    Executive Summary: The U.S. Geological Survey, in cooperation with the Bureau of Land Management, completed an assessment of hydrology, water quality, and trace-element concentrations in streambed sediment of the upper Birch Creek watershed near Central, Alaska. The assessment covered one site on upper Birch Creek and paired sites, upstream and downstream from mined areas, on Frying Pan Creek and Harrison Creek. Stream-discharge and suspended-sediment concentration data collected at other selected mined and unmined sites helped characterize conditions in the upper Birch Creek watershed. The purpose of the project was to provide the Bureau of Land Management with baseline information to evaluate watershed water quality and plan reclamation efforts. Data collection began in September 2001 and ended in September 2005. There were substantial geomorphic disturbances in the stream channel and flood plain along several miles of Harrison Creek. Placer mining has physically altered the natural stream channel morphology and removed streamside vegetation. There has been little or no effort to re-contour waste rock piles. During high-flow events, the abandoned placer-mine areas on Harrison Creek will likely contribute large quantities of sediment downstream unless the mined areas are reclaimed. During 2004 and 2005, no substantial changes in nutrient or major-ion concentrations were detected in water samples collected upstream from mined areas compared with water samples collected downstream from mined areas on Frying Pan Creek and Harrison Creek that could not be attributed to natural variation. This also was true for dissolved oxygen, pH, and specific conductance (a measure of total dissolved solids). Sample sites downstream from mined areas on Harrison Creek and Frying Pan Creek had higher median suspended-sediment concentrations, by a few milligrams per liter, than respective upstream sites. However, it is difficult to attach much importance to the small downstream increase

  20. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  1. Provenance of radioactive placers, Big Meadow area, Valley and Boise Counties, Idaho

    International Nuclear Information System (INIS)

    Truesdell, D.; Wegrzyn, R.; Dixon, M.

    1977-02-01

    For many years, radioactive black-sand placers have been known to be present in the Bear Valley area of west-central Idaho. The largest of these is in Big Meadow, near the head of Bear Valley Creek. Presence of these placers suggests that low-grade uranium deposits might occur in rocks of the Idaho Batholith, adjacent to Bear Valley. This study was undertaken to locate the provenance of the radioactive minerals and to identify problems that need to be solved before undertaking further investigations. The principal radioactive minerals in these placers are monazite and euxenite. Other minerals include columbite, samarskite, fergusonite, xenotime, zircon, allanite, sphene, and brannerite. Only brannerite is a uranium mineral; the others contain uranium as an impurity in crystal lattices. Radiometric determinations of the concentration of uranium in stream sediments strongly indicate that the radioactive materials originate in an area drained by Casner and Howard Creeks. Equivalent uranium levels in bedrock are highest on the divide between Casner and Howard Creeks. However, this area is not known to contain low-grade uranium occurrences. Euxenite, brannerite, columbite-tantalite, samarskite, and allanite are the principal radioactive minerals that were identified in rock samples. These minerals were found in granite pegmatites, granites, and quartz monzonites. Appreciably higher equivalent uranium concentrations were also found within these rock types. The major problem encountered in this study was the difficulty in mapping bedrock because of extensive soil and glacial mantle. A partial solution to this problem might be the application of radon emanometry so that radiometric measurements would not be limited to the sparse bedrock samples

  2. Geochemical results of a hydrothermally altered area at Baker Creek, Blaine County, Idaho

    Science.gov (United States)

    Erdman, James A.; Moye, Falma J.; Theobald, Paul K.; McCafferty, Anne E.; Larsen, Richard K.

    2001-01-01

    The area immediately east of Baker Creek, Blaine County, Idaho, is underlain by a thick section of mafic to intermediate lava flows of the Eocene Challis Volcanic Group. Widespread propylitic alteration surrounds a zone of argillic alteration and an inner core of phyllic alteration. Silicified breccia is present along an east-trending fault within the zone of phyllic alteration. As part of a reconnaissance geochemical survey, soils and plants were sampled. Several species of plants (Douglas-fir [Pseudotsuga menziesii], mountain big sagebrush [Artemisia tridentata ssp. vaseyana], and elk sedge [Carex geyerii]) were collected from 10 upland localities, and stream sediments, panned concentrates, and aquatic mosses were collected from 16 drainage-basin localities, all of which were generally within the area of alteration. Geochemical results yielded anomalous concentrations of molybdenum, zinc, silver, and lead in at least half of the seven different sample media, and of gold, thallium, arsenic, antimony, manganese, boron, cadmium, bismuth, copper, and beryllium in from one to four of the various media. Part of this suite of elements (silver, gold, arsenic, antimony, thallium, and manganese) suggests that the mineralization in the area is epithermal. Barite and pyrite (commonly botryoidal-framboidal) are widespread throughout the area sampled. Visible gold and pyromorphite (a secondary lead mineral) were identified in only one small drainage basin, but high levels of gold were detected in aquatic mosses over a larger area. Data from the upland and stream sampling indicate two possible mineralized areas. The first mineralized area was identified by a grab sample from an outcrop of quartz stockwork that contained 50 ppb Au, 1.5 ppm Ag, and 50 ppm Mo. Although the soil and plant species that were sampled in the area indicated mineralized bedrock, the Douglas-fir samples were the best indicators of the silver anomaly. The second possible mineralized area centers on the

  3. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high-temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  4. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    George and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their super-computer, and to the Large Hadron Collider created by Eric, Annie's father, they will at last be able to answer the essential question: why do we exist? But George and Annie discover that a diabolical plot is brewing. Worse, all of scientific research is in peril! Drawn into incredible adventures, George will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and today's greatest scientists.

  5. Astronomical Surveys and Big Data

    Directory of Open Access Journals (Sweden)

    Mickaelian Areg M.

    2016-03-01

    Full Text Available Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including such as Fermi-GLAST and INTEGRAL in γ-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc., proper motions (Tycho, USNO, Gaia, variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS, and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA. An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.
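
    Comparisons between surveys of the kind reviewed above typically begin with positional cross-matching of catalogued sources across wavelengths. The sketch below is a minimal brute-force matcher with invented toy coordinates (not drawn from any of the surveys named); production pipelines use spatial indexing instead.

```python
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees between two sky positions."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(d1) * math.sin(d2)
               + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def cross_match(cat_a, cat_b, radius_arcsec=2.0):
    """Nearest-neighbour match of catalogue A against catalogue B.

    cat_a, cat_b: lists of (id, ra_deg, dec_deg). O(N*M) brute force,
    fine for toy lists; real survey work uses spatial indexing.
    """
    radius_deg = radius_arcsec / 3600.0
    matches = []
    for ida, ra_a, dec_a in cat_a:
        best = min(cat_b, key=lambda s: ang_sep_deg(ra_a, dec_a, s[1], s[2]))
        sep = ang_sep_deg(ra_a, dec_a, best[1], best[2])
        if sep <= radius_deg:
            matches.append((ida, best[0], sep * 3600.0))  # separation in arcsec
    return matches

# Toy optical and infrared source lists (coordinates are invented).
optical = [("opt1", 150.0010, 2.2000), ("opt2", 150.0500, 2.2500)]
infrared = [("ir1", 150.0011, 2.2001), ("ir2", 150.0800, 2.3000)]
print(cross_match(optical, infrared))  # only the opt1/ir1 pair falls within 2"
```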

  6. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2017-06-01

    Cancer is a complex disease and unfortunately understanding how the components of the cancer system work does not help understand the behavior of the system as a whole. In the words of the Greek philosopher Aristotle "the whole is greater than the sum of parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand and a large part of the data are in the form of medical images. The opportunity is now to draw insight on the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is at the beginning but several useful applications can be envisaged including development of imaging biomarkers to predict disease outcome, assessing the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimizing patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact on the diagnostic pathway of the oncologic patient.

  7. Water quality and amphibian health in the Big Bend region of the Rio Grande Basin

    Science.gov (United States)

    Sharma, Bibek; Hu, F.; Carr, J.A.; Patino, Reynaldo

    2011-01-01

    Male and female Rio Grande leopard frogs (Rana berlandieri) were collected in May 2005 from the main stem and tributaries of the Rio Grande in the Big Bend region of Texas. Frogs were examined for (1) incidence of testicular ovarian follicles in males; (2) thyroid epithelial cell height, a potential index of exposure to thyroid-disrupting contaminants; and (3) incidence of liver melanomacrophage aggregates, a general index of exposure to contaminants. Standard parameters of surface water quality and concentrations of selected elements, including heavy metals, were determined at each frog collection site. Heavy metals also were measured in whole-frog composite extracts. Water cadmium concentrations in most sites and chloride concentrations in the main stem exceeded federal criteria for freshwater aquatic life. Mercury was detected in frogs from the two collection sites in Terlingua Creek. There was a seventeen percent incidence of testicular ovarian follicles in male frogs. Mean thyroid epithelial cell height was greater in frogs from one of the Terlingua Creek sites (Terlingua Abajo). No differences were observed in the incidence of hepatic macrophage aggregates among sites. In conclusion, although potential cause-effect relationships between indices of habitat quality and amphibian health could not be established, the results of this study raise concerns about the general quality of the aquatic habitat and the potential long-term consequences to the aquatic biota of the Big Bend region. The presence of ovarian follicles in male frogs is noteworthy but further study is necessary to determine whether this phenomenon is natural or anthropogenically induced.

  8. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  9. Geophysical Characterization of the Hilton Creek Fault System

    Science.gov (United States)

    Lacy, A. K.; Macy, K. P.; De Cristofaro, J. L.; Polet, J.

    2016-12-01

    The Long Valley Caldera straddles the eastern edge of the Sierra Nevada Batholith and the western edge of the Basin and Range Province, and represents one of the largest caldera complexes on Earth. The caldera is intersected by numerous fault systems, including the Hartley Springs Fault System, the Round Valley Fault System, the Long Valley Ring Fault System, and the Hilton Creek Fault System, which is our main region of interest. The Hilton Creek Fault System appears as a single NW-striking fault, dipping to the NE, from Davis Lake in the south to the southern rim of the Long Valley Caldera. Inside the caldera, it splays into numerous parallel faults that extend toward the resurgent dome. Seismicity in the area increased significantly in May 1980, following a series of large earthquakes in the vicinity of the caldera and a subsequent large earthquake swarm which has been suggested to be the result of magma migration. A large portion of the earthquake swarms in the Long Valley Caldera occurs on or around the Hilton Creek Fault splays. We are conducting an interdisciplinary geophysical study of the Hilton Creek Fault System from just south of the onset of splay faulting to its extension into the dome of the caldera. Our investigation includes ground-based magnetic field measurements, high-resolution total station elevation profiles, Structure-From-Motion derived topography and an analysis of earthquake focal mechanisms and statistics. Preliminary analysis of topographic profiles, of approximately 1 km in length, reveals the presence of at least three distinct fault splays within the caldera with vertical offsets of 0.5 to 1.0 meters. More detailed topographic mapping is expected to highlight smaller structures. We are also generating maps of the variation in b-value along different portions of the Hilton Creek system to determine whether we can detect any transition to more swarm-like behavior towards the North. We will show maps of magnetic anomalies, topography
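
    The b-value mapping mentioned in this record rests on the Gutenberg-Richter relation log10 N = a - bM. A standard estimator is Aki's (1965) maximum-likelihood formula; the sketch below applies it to a synthetic catalogue (the magnitudes are simulated, not data from this study).

```python
import math
import random

def b_value_mle(mags, m_c, bin_width=0.0):
    """Aki (1965) maximum-likelihood b-value.

    mags: event magnitudes; only events at or above the completeness
    magnitude m_c are used. For catalogues binned at bin_width,
    Utsu's correction replaces m_c by m_c - bin_width / 2.
    """
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_c - bin_width / 2.0))

# Synthetic Gutenberg-Richter catalogue with a true b-value of 1.0:
# P(M >= m) = 10 ** (-b * (m - m_c)), i.e. exponential excess magnitudes.
random.seed(42)
b_true, m_c = 1.0, 1.5
beta = b_true * math.log(10.0)
mags = [m_c + random.expovariate(beta) for _ in range(5000)]

b_hat = b_value_mle(mags, m_c)
print(round(b_hat, 2))  # close to 1.0 for a sample this large
```

    Mapping b-value variation along a fault then amounts to repeating this estimate in a sliding spatial window of events.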

  10. Simulation of water quality for Salt Creek in northeastern Illinois

    Science.gov (United States)

    Melching, Charles S.; Chang, T.J.

    1996-01-01

    Water-quality processes in the Salt Creek watershed in northeastern Illinois were simulated with a computer model. Selected waste-load scenarios for 7-day, 10-year low-flow conditions were simulated in the stream system. The model development involved the calibration of the U.S. Environmental Protection Agency QUAL2E model to water-quality constituent concentration data collected by the Illinois Environmental Protection Agency (IEPA) for a diel survey on August 29-30, 1995, and the verification of this model with water-quality constituent concentration data collected by the IEPA for a diel survey on June 27-28, 1995. In-stream measurements of sediment oxygen demand rates and carbonaceous biochemical oxygen demand (CBOD) decay rates by the IEPA and traveltime and reaeration-rate coefficients by the U.S. Geological Survey facilitated the development of a model for simulation of water quality in the Salt Creek watershed. In general, the verification of the calibrated model increased confidence in the utility of the model for water-quality planning in the Salt Creek watershed. However, the model was adjusted to better simulate constituent concentrations measured during the June 27-28, 1995, diel survey. Two versions of the QUAL2E model were utilized to simulate dissolved oxygen (DO) concentrations in the Salt Creek watershed for selected effluent discharge and concentration scenarios for water-quality planning: (1) the QUAL2E model calibrated to the August 29-30, 1995, diel survey, and (2) the QUAL2E model adjusted to the June 27-28, 1995, diel survey. The results of these simulations indicated that the QUAL2E model adjusted to the June 27-28, 1995, diel survey simulates reliable information for water-quality planning. The results of these simulations also indicated that to maintain DO concentrations greater than 5 milligrams per liter (mg/L) throughout most of Salt Creek for 7-day, 10-year low-flow conditions, the sewage-treatment plants (STP's) must discharge
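
    QUAL2E's dissolved-oxygen routines build on the classic Streeter-Phelps oxygen-sag formulation, which links CBOD decay (kd) and reaeration (ka) to the DO deficit along a reach. The sketch below illustrates that calculation with invented coefficients; it is not the Salt Creek calibration.

```python
import math

def do_sag(t_days, L0, D0, kd, ka, do_sat):
    """Streeter-Phelps dissolved-oxygen sag.

    L0: initial ultimate CBOD (mg/L); D0: initial DO deficit (mg/L);
    kd: CBOD decay rate (1/day); ka: reaeration rate (1/day).
    Returns the DO concentration (mg/L) at travel time t_days.
    """
    deficit = ((kd * L0 / (ka - kd))
               * (math.exp(-kd * t_days) - math.exp(-ka * t_days))
               + D0 * math.exp(-ka * t_days))
    return do_sat - deficit

# Illustrative (invented) coefficients, not the Salt Creek calibration:
kd, ka, L0, D0, do_sat = 0.3, 0.9, 12.0, 1.0, 8.5

# Travel time to the critical (minimum-DO) point of the sag curve.
critical_t = (1.0 / (ka - kd)) * math.log(
    (ka / kd) * (1 - D0 * (ka - kd) / (kd * L0)))
print(round(critical_t, 2), round(do_sag(critical_t, L0, D0, kd, ka, do_sat), 2))
```

    A planning check of the kind described in the abstract asks whether this minimum stays above the 5 mg/L criterion under low-flow loading.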

  11. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses; "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  12. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing, from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  13. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.
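
    The minimax criterion described above can be made concrete: for each candidate cluster, compute its multiple correlation with Big Five marker factors (to be minimized) and its internal-consistency reliability (to be maximized). The sketch below uses synthetic data and Cronbach's alpha as the reliability index; it illustrates the criterion, not the authors' actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic, uncorrelated Big Five marker factor scores.
big_five = rng.standard_normal((n, 5))

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_obs, k_items) item matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def multiple_r(y, X):
    """Multiple correlation of y with the columns of X (OLS R)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    r2 = 1 - resid.var() / y.var()
    return float(np.sqrt(max(r2, 0.0)))

# Two toy adjective clusters of 4 items each: 'religiousness' driven
# by a factor independent of the Big Five, 'sociability' driven
# mostly by the first Big Five factor.
latent = rng.standard_normal(n)
religiousness = latent[:, None] + 0.5 * rng.standard_normal((n, 4))
sociability = big_five[:, [0]] + 0.5 * rng.standard_normal((n, 4))

for name, cluster in [("religiousness", religiousness), ("sociability", sociability)]:
    score = cluster.mean(axis=1)
    print(name, round(multiple_r(score, big_five), 2),
          round(cronbach_alpha(cluster), 2))
```

    Under the minimax criterion, the 'religiousness' cluster (low multiple R, high alpha) would count as beyond the Big Five, while the 'sociability' cluster would not.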

  14. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    Full Text Available The term Big Data has been coined to refer to the extensive heave of data that cannot be managed by traditional data-handling methods or techniques. The field of Big Data plays an indispensable role in many areas, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care, and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, previously unseen relationships, and other important information that can be utilized to make better decisions. There has been a perpetually expanding interest in big data because of its rapid growth and because it covers so many areas of application. The Apache Hadoop open-source technology, written in Java and running on the Linux operating system, was used. The primary contribution of this research is to present an effective and free solution for big data applications in a distributed environment, with its advantages, and to demonstrate its ease of use. There thus appears to be a need for an analytical review of new developments in big data technology. Healthcare is one of the greatest concerns of the world. Big data in healthcare refers to electronic health data sets related to patient health care and well-being. Data in the healthcare sector are growing beyond the managing capacity of healthcare organizations and are expected to increase significantly in the coming years.
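
    The Hadoop processing model this record credits is usually introduced through a word-count job. The sketch below mimics the map, sort (shuffle), and reduce stages of a Hadoop Streaming job in plain Python so it runs locally; a real job would ship `mapper` and `reducer` as separate scripts to the streaming interface.

```python
import itertools

def mapper(lines):
    """Map phase: emit one tab-separated (word, 1) pair per token."""
    for line in lines:
        for word in line.strip().lower().split():
            yield f"{word}\t1"

def reducer(pairs):
    """Reduce phase: sum counts per word. Input must be sorted by key,
    which is exactly what Hadoop's shuffle guarantees."""
    keyed = (p.split("\t") for p in pairs)
    for word, group in itertools.groupby(keyed, key=lambda kv: kv[0]):
        yield word, sum(int(count) for _, count in group)

# Local stand-in for the streaming pipeline: map -> sort (shuffle) -> reduce.
docs = ["big data on big clusters", "big data analytics"]
counts = dict(reducer(sorted(mapper(docs))))
print(counts)  # {'analytics': 1, 'big': 3, 'clusters': 1, 'data': 2, 'on': 1}
```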

  15. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

    Big Data analytics and Software-Defined Networking (SDN) are helping to manage the extraordinary growth in data usage and computer processing power provided by Cloud Data Centres (CDCs). This new book investigates areas where Big Data and SDN can help each other in delivering more efficient services.

  16. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  17. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently the Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  18. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  19. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Veneziano, G.

    2000-01-01

    Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non-singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era.

  20. Starting Small, Thinking Big - Continuum Magazine | NREL

    Science.gov (United States)

    Recoverable fragments from the magazine web page: NREL helps agencies target new federal sustainability goals; solar power in the territory (photo by Don Buchanan, VIEO); NREL's cross-organizational work supports efforts to optimize energy use.

  1. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  2. Practice variation in Big-4 transparency reports

    NARCIS (Netherlands)

    Girdhar, Sakshi; Jeppesen, K.K.

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a

  3. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

    In this report we describe results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a show case. This includes both

  4. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Cees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  5. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  6. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; bench-marking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  7. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  8. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws...... on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports...... is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  9. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  10. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  11. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: The big bang is replaced by a quantum bounce; the `horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in the presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  12. Big Creek Flood Control Project, Cleveland, Ohio. Phase II. General Design Memorandum. Appendix A. Soils, Geology, and Construction Materials.

    Science.gov (United States)

    1979-02-01

    maximum thickness of about 4 feet of shale excavation will be required. The upper 1 foot of shale is weathered and easily rippable. A44. From Station...weathered and easily rippable. A50. From Station 75+60M to 83+50M, maximum excavation in shale is about 3 feet. Most of the excavation in this section will be

  13. Travel time analysis for a subsurface drained sub-watershed in Upper Big Walnut Creek Watershed, Ohio

    Science.gov (United States)

    Runoff travel time, which is a function of watershed and storm characteristics, is an important parameter affecting the prediction accuracy of hydrologic models. Although time of concentration (tc) is the most widely used time parameter, it has multiple conceptual and computational definitions. Most ...
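    One widely used computational definition of tc, which this abstract alludes to, is the Kirpich (1940) empirical formula. The sketch below is purely illustrative of why different tc definitions can yield different values; it is not taken from the study itself, and the sample watershed numbers are invented.

    ```python
    def tc_kirpich(length_ft, slope):
        """Time of concentration in minutes via the Kirpich formula.

        length_ft: longest flow path length in feet.
        slope: average channel slope in ft/ft (dimensionless).
        """
        return 0.0078 * length_ft ** 0.77 * slope ** -0.385

    # Illustrative watershed: 5000-ft flow path, 1% slope -> roughly half an hour.
    tc_minutes = tc_kirpich(5000, 0.01)
    ```

    Other definitions (e.g., lag-based or velocity-method estimates) would return different values for the same watershed, which is the computational ambiguity the study highlights.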

  14. White Oak Creek embayment sediment retention structure design and construction

    International Nuclear Information System (INIS)

    Van Hoesen, S.D.; Kimmell, B.L.; Page, D.G.; Wilkerson, R.B.; Hudson, G.R.; Kauschinger, J.L.; Zocolla, M.

    1994-01-01

    White Oak Creek is the major surface water drainage through the Department of Energy (DOE) Oak Ridge National Laboratory (ORNL). Samples taken from the lower portion of the creek revealed high levels of cesium-137 and lower levels of cobalt-60 in near-surface sediment. Other contaminants present in the sediment included lead, mercury, chromium, and PCBs. In October 1990, DOE, the US Environmental Protection Agency (EPA), and the Tennessee Department of Environment and Conservation (TDEC) agreed to initiate a time-critical removal action in accordance with the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) to prevent the transport of the contaminated sediments into the Clinch River system. This paper discusses the environmental, regulatory, design, and construction issues that were encountered in conducting the remediation work.

  15. Hydrologic data for North Creek, Trinity River basin, Texas, 1976

    Science.gov (United States)

    Kidwell, C.C.

    1978-01-01

    This report contains rainfall and runoff data collected during the 1976 water year for a 21.6-square mile area above the stream-gaging station on North Creek near Jacksboro, Texas. A continuous water-stage recording gage was installed at one representative floodwater-retarding structure (site 28-A) on Oct. 5, 1972. The data are used to compute the contents, surface area, inflow, and outflow at this site. The stream-gaging station on North Creek near Jacksboro continuously records the water level which, with measurements of streamflow, is used to compute the runoff from the study area. Streamflow records at this gage began on Aug. 8, 1956. Detailed rainfall-runoff computations, including hydrographs and mass curves, are included for two storm periods during the 1976 water year at the stream-gaging station. (Woodard-USGS)
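    The computation this record describes, converting a continuously recorded water level into runoff via streamflow measurements, can be sketched as a stage-discharge rating curve followed by integration over time. The rating-table values and reading interval below are invented for illustration and are not from the report.

    ```python
    # Illustrative rating table: stage (ft) -> discharge (cfs)
    RATING = {1.0: 5.0, 2.0: 40.0, 3.0: 160.0, 4.0: 420.0}

    def stage_to_discharge(stage_ft, rating):
        """Linearly interpolate discharge (cfs) from a stage (ft) rating table."""
        stages = sorted(rating)
        for lo, hi in zip(stages, stages[1:]):
            if lo <= stage_ft <= hi:
                frac = (stage_ft - lo) / (hi - lo)
                return rating[lo] + frac * (rating[hi] - rating[lo])
        raise ValueError("stage outside rating table")

    def runoff_volume(stage_series_ft, interval_s, rating=RATING):
        """Integrate interval_s-spaced stage readings into runoff (cubic feet)."""
        return sum(stage_to_discharge(s, rating) * interval_s
                   for s in stage_series_ft)
    ```

    In practice a USGS rating curve is developed from repeated field measurements of streamflow at known stages, exactly the pairing of "water level ... with measurements of streamflow" the abstract mentions.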

  16. Hydrologic data for North Creek, Trinity River basin, Texas, 1979

    Science.gov (United States)

    Kidwell, C.C.

    1981-01-01

    This report contains rainfall and runoff data collected during the 1979 water year for the 21.6-square mile area above the stream-gaging station North Creek near Jacksboro, Texas. A continuous water-stage recording gage was installed at one representative floodwater-retarding structure (site 28-A) on Oct. 5, 1972. The data are collected to compute the contents, surface area, inflow, and outflow at this site. The stream-gaging station on North Creek near Jacksboro continuously records the water level which, with measurements of streamflow, is used to compute the runoff from the study area. Streamflow records at this gage began on Aug. 8, 1956. Detailed rainfall-runoff computations are included for one storm during the 1979 water year at the stream-gaging station. (USGS)

  17. Retran simulation of Oyster Creek generator trip startup test

    International Nuclear Information System (INIS)

    Alammar, M.A.

    1987-01-01

    RETRAN simulation of the Oyster Creek generator trip startup test was carried out as part of the Oyster Creek RETRAN model qualification program for reload licensing applications. The objective of the simulation was to qualify the turbine model and its interface with the control valve and bypass systems under severe transients. The test was carried out by opening the main breakers at rated power. The turbine speed governor closed the control valves and the pressure regulator opened the bypass valves within 0.5 sec. The stop valves were closed by a no-load turbine trip before the 10 percent overspeed trip was reached, and the reactor scrammed on high APRM neutron flux. The simulation resulted in qualifying a normalized hydraulic torque for the turbine model and a 0.3-sec delay block for the bypass model to account for the different delays in the hydraulic linkages present in the system. One-dimensional kinetics was used in this simulation.

  18. Water quality monitoring report for the White Oak Creek Embayment

    International Nuclear Information System (INIS)

    Ford, C.J.; Wefer, M.T.

    1993-01-01

    Water quality monitoring activities that focused on the detection of resuspended sediments in the Clinch River were conducted in conjunction with the White Oak Creek Embayment (WOCE) time-critical Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) removal action to construct a sediment-retention structure at the mouth of White Oak Creek (WOC). Samples were collected by use of a 24-h composite sampler and through real-time grab sampling of sediment plumes generated by the construction activities. Sampling stations were established both at the WOC mouth, immediately adjacent to the construction site, and at K-1513, the Oak Ridge K-25 Site drinking water intake approximately 9.6 km downstream in the Clinch River. Results are described.

  19. Intelligent search in Big Data

    Science.gov (United States)

    Birialtsev, E.; Bukharaev, N.; Gusenkov, A.

    2017-10-01

    An approach to data integration, aimed at ontology-based intelligent search in Big Data, is considered for the case in which information objects are represented as relational databases (RDBs), structurally marked by their schemes. The source of information both for constructing the ontology and, later on, for organizing the search is natural-language text, treated as semi-structured data; for the RDBs, these are the comments on the names of tables and their attributes. A formal definition of the RDB integration model in terms of ontologies is given. Within the framework of the model, a universal RDB representation ontology, an oil-production subject-domain ontology, and a linguistic thesaurus of the subject-domain language are built. A technique for automatically generating SQL queries for subject-domain specialists is proposed. On this basis, an information system for the RDBs of the TATNEFT oil-producing company was implemented. Operation of the system showed good relevance for the majority of queries.
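    The core idea of the abstract, a thesaurus that maps subject-domain terms to tables and attributes (recovered from schema comments) and then rewrites a domain-term query into SQL, can be sketched minimally as below. All table, column, and domain-term names here are invented for illustration; the actual TATNEFT ontology and query language are far richer.

    ```python
    # Hypothetical thesaurus: domain term -> (table, column); None = whole table.
    THESAURUS = {
        "well": ("wells", None),
        "daily oil rate": ("wells", "oil_rate_tpd"),
        "field": ("wells", "field_name"),
    }

    def generate_sql(select_term, filter_term, filter_value):
        """Build a SELECT for one domain attribute filtered by another."""
        table, column = THESAURUS[select_term]
        _, filter_col = THESAURUS[filter_term]
        return (f"SELECT {column} FROM {table} "
                f"WHERE {filter_col} = '{filter_value}'")
    ```

    A specialist could then ask for a "daily oil rate" restricted by "field" without knowing the underlying schema, which is the kind of automatic SQL generation for non-programmers the abstract proposes.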

  20. Big Data in Transport Geography

    DEFF Research Database (Denmark)

    Reinau, Kristian Hegner; Agerholm, Niels; Lahrmann, Harry Spaabæk

    for studies that explicitly compare the quality of this new type of data to traditional data sources. With the current focus on Big Data in the transport field, public transport planners are increasingly looking towards smart card data to analyze and optimize flows of passengers. However, in many cases...... it is not all public transport passengers in a city, region or country with a smart card system that use the system, and in such cases, it is important to know what biases smart card data has in relation to giving a complete view upon passenger flows. This paper therefore analyses the quality and biases...... of smart card data in Denmark, where public transport passengers may use a smart card, may pay with cash for individual trips or may hold a season ticket for a certain route. By analyzing smart card data collected in Denmark in relation to data on sales of cash tickets, sales of season tickets, manual...