WorldWideScience

Sample records for program big spring

  1. 76 FR 63714 - Big Spring Rail System, Inc.; Operation Exemption; Transport Handling Specialists, Inc.

    Science.gov (United States)

    2011-10-13

    ...] Big Spring Rail System, Inc.; Operation Exemption; Transport Handling Specialists, Inc. Big Spring Rail... Howard County, Tex., owned by the City of Big Spring, Tex. (City). BSRS will be operating the line for...

  2. Spatial-Temporal Analysis on Spring Festival Travel Rush in China Based on Multisource Big Data

    National Research Council Canada - National Science Library

    Li, Jiwei; Ye, Qingqing; Deng, Xuankai; Liu, Yaolin; Liu, Yanfang

    2016-01-01

    .... This study investigates the spatial-temporal characteristics of Spring Festival travel rush in 2015 through time series analysis and complex network analysis based on multisource big travel data...

  3. 78 FR 37792 - Mario Julian Martinez-Bernache, Inmate Number #95749-279, CI Big Spring, Corrections Institution...

    Science.gov (United States)

    2013-06-24

    ... Bureau of Industry and Security Mario Julian Martinez-Bernache, Inmate Number 95749-279, CI Big Spring, Corrections Institution, 2001 Rickabaugh Drive, Big Spring, TX 79720; Order Denying Export Privileges On March... this provision may be for a period of up to 10 years from the date of the conviction. 15 CFR 766.25(d...

  4. Spatial-Temporal Analysis on Spring Festival Travel Rush in China Based on Multisource Big Data

    Directory of Open Access Journals (Sweden)

    Jiwei Li

    2016-11-01

    Full Text Available Spring Festival travel rush is a phenomenon in China in which population travel surges intensively over a short period around the Chinese Spring Festival. This phenomenon, unique to China's urbanization process, imposes a heavy traffic burden and causes various social problems, prompting widespread public concern. This study investigates the spatial-temporal characteristics of the 2015 Spring Festival travel rush through time series analysis and complex network analysis based on multisource big travel data derived from Baidu, Tencent, and Qihoo. The main results are as follows: First, the Baidu and Tencent travel data, obtained from location-based services, appear more accurate and reliable than the Qihoo data. Second, two travel peaks appeared five days before and six days after the Spring Festival, respectively, and the travel valley fell on the Spring Festival itself. The Spring Festival travel network at the provincial scale exhibited neither small-world nor scale-free characteristics; instead, it showed a multicenter structure and significant geographic clustering, and some travel path chains played a leading role in the network. Third, economic and social factors had more influence on the travel network than geographic location. The problem of Spring Festival travel rush will not be effectively alleviated in the short term because of unbalanced urban-rural and regional development. However, the development of modern high-speed transport systems and modern information and communication technology can ease the problems it brings. We suggest that a unified real-time traffic platform for Spring Festival travel rush be established through the government's integration of mobile big data with the official authority data of the transportation department.

  5. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  6. 78 FR 18967 - Walla Walla Basin Spring Chinook Hatchery Program

    Science.gov (United States)

    2013-03-28

    ... Bonneville Power Administration Walla Walla Basin Spring Chinook Hatchery Program AGENCY: Bonneville Power... Tribes of the Umatilla Indian Reservation's (CTUIR) proposal to construct and operate a hatchery for spring Chinook salmon in the Walla Walla River basin. The hatchery would expand facilities at the site of...

  7. Evolution of the Air Toxics under the Big Sky Program

    Science.gov (United States)

    Marra, Nancy; Vanek, Diana; Hester, Carolyn; Holian, Andrij; Ward, Tony; Adams, Earle; Knuth, Randy

    2011-01-01

    As a yearlong exploration of air quality and its relation to respiratory health, the "Air Toxics Under the Big Sky" program offers opportunities for students to learn and apply science process skills through self-designed inquiry-based research projects conducted within their communities. The program follows a systematic scope and sequence…

  8. Instructional Program Review Guidelines, Spring 2001.

    Science.gov (United States)

    Peralta Community Coll. System, Oakland, CA. Office of Educational Services.

    This document presents guidelines for program review at Peralta Community College District's (PCCD) (California) institutions. The primary objective of the program review process is to assure that PCCD's educational programs reflect student needs and encourage student success. The review process consists of five stages: (1) a discipline self-study…

  9. Technology Evaluation for the Big Spring Water Treatment System at the Y-12 National Security Complex, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Bechtel Jacobs Company LLC

    2002-11-01

    The Y-12 National Security Complex (Y-12 Complex) is an active manufacturing and developmental engineering facility that is located on the U.S. Department of Energy (DOE) Oak Ridge Reservation. Building 9201-2 was one of the first process buildings constructed at the Y-12 Complex. Construction involved relocating and straightening of the Upper East Fork Poplar Creek (UEFPC) channel, adding large quantities of fill material to level areas along the creek, and pumping of concrete into sinkholes and solution cavities present within the limestone bedrock. Flow from a large natural spring designated as ''Big Spring'' on the original 1943 Stone & Webster Building 9201-2 Field Sketch FS6003 was captured and directed to UEFPC through a drainpipe designated Outfall 51. The building was used from 1953 to 1955 for pilot plant operations for an industrial process that involved the use of large quantities of elemental mercury. Past operations at the Y-12 Complex led to the release of mercury to the environment. Significant environmental media at the site were contaminated by accidental releases of mercury from the building process facilities piping and sumps associated with Y-12 Complex mercury handling facilities. Releases to the soil surrounding the buildings have resulted in significant levels of mercury in these areas of contamination, which is ultimately transported to UEFPC, its streambed, and off-site. Bechtel Jacobs Company LLC (BJC) is the DOE-Oak Ridge Operations prime contractor responsible for conducting environmental restoration activities at the Y-12 Complex. In order to mitigate the mercury being released to UEFPC, the Big Spring Water Treatment System will be designed and constructed as a Comprehensive Environmental Response, Compensation, and Liability Act action. This facility will treat the combined flow from Big Spring feeding Outfall 51 and the inflow now being processed at the East End Mercury Treatment System (EEMTS). Both discharge to

  10. Big Bayou Creek and Little Bayou Creek Watershed Monitoring Program

    Energy Technology Data Exchange (ETDEWEB)

    Kszos, L.A.; Peterson, M.J.; Ryon; Smith, J.G.

    1999-03-01

    Biological monitoring of Little Bayou and Big Bayou creeks, which border the Paducah Site, has been conducted since 1987. Biological monitoring was conducted by the University of Kentucky from 1987 to 1991 and by staff of the Environmental Sciences Division (ESD) at Oak Ridge National Laboratory (ORNL) from 1991 through March 1999. In March 1998, renewed Kentucky Pollutant Discharge Elimination System (KPDES) permits were issued to the US Department of Energy (DOE) and US Enrichment Corporation. The renewed DOE permit requires that a watershed monitoring program be developed for the Paducah Site within 90 days of the effective date of the renewed permit. This plan outlines the sampling and analysis that will be conducted for the watershed monitoring program. The objectives of the watershed monitoring are to (1) determine whether discharges from the Paducah Site and the Solid Waste Management Units (SWMUs) associated with the Paducah Site are adversely affecting instream fauna, (2) assess the ecological health of Little Bayou and Big Bayou creeks, (3) assess the degree to which abatement actions ecologically benefit Big Bayou Creek and Little Bayou Creek, (4) provide guidance for remediation, (5) provide an evaluation of changes in potential human health concerns, and (6) provide data that could be used to assess the impact of inadvertent spills or fish kills. The cleanup is expected to result in these watersheds [Big Bayou and Little Bayou creeks] achieving compliance with the applicable water quality criteria.

  11. Alcohol Fuels Program technical review, Spring 1984

    Energy Technology Data Exchange (ETDEWEB)

    1984-10-01

    The alcohol fuels program consists of in-house and subcontracted research for the conversion of lignocellulosic biomass into fuel alcohols via thermoconversion and bioconversion technologies. In the thermoconversion area, the SERI gasifier has been operated on a one-ton-per-day scale and produces a clean, medium-Btu gas that can be used to manufacture methanol with a relatively small water-gas shift reaction requirement. Recent research has produced catalysts that make methanol and a mixture of higher alcohols from the biomass-derived synthesis gas. Three hydrolysis processes have emerged as candidates for more focused research. They are: a high-temperature, dilute-acid, plug-flow approach based on the Dartmouth reactor; steam explosion pretreatment followed by hydrolysis using the RUT-C30 fungal organism; and direct microbial conversion of the cellulose to ethanol using bacteria in a single or mixed culture. Modeling studies, including parametric and sensitivity analyses, have recently been completed. The results of these studies will lead to a better definition of the present state-of-the-art for these processes and provide a framework for establishing the research and process engineering issues that still need resolution. In addition to these modeling studies, economic feasibility studies are being carried out by commercial engineering firms. Their results will supplement and add commercial validity to the program results. The feasibility contractors will provide input at two levels: technical and economic assessment of the current state-of-the-art in alcohol production from lignocellulosic biomass via thermoconversion to produce methanol and higher alcohol mixtures and bioconversion to produce ethanol; and identification of research areas having the potential to significantly reduce the cost of production of alcohols.

  12. Big Data: Are Biomedical and Health Informatics Training Programs Ready?

    Science.gov (United States)

    Hersh, W.; Ganesh, A. U. Jai

    2014-01-01

    Summary Objectives The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? Methods We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. Results The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one’s area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Conclusions Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in “deep analytical talent” as well as those who need knowledge to support such individuals. PMID:25123740

  13. Are youth mentoring programs good value-for-money? An evaluation of the Big Brothers Big Sisters Melbourne Program.

    Science.gov (United States)

    Moodie, Marjory L; Fisher, Jane

    2009-01-30

    The Big Brothers Big Sisters (BBBS) program matches vulnerable young people with a trained, supervised adult volunteer as mentor. The young people are typically seriously disadvantaged, with multiple psychosocial problems. Threshold analysis was undertaken to determine whether investment in the program was a worthwhile use of limited public funds. The potential cost savings were based on US estimates of lifetime costs associated with high-risk youth who drop out of school and become adult criminals. The intervention was modelled for children aged 10-14 years residing in Melbourne in 2004. If the program serviced 2,208 of the most vulnerable young people, it would cost AUD 39.5 M. Assuming 50% were high-risk, the associated costs of their adult criminality would be AUD 3.3 billion. To break even, the program would need to avert high-risk behaviours in only 1.3% (14/1,104) of participants. This indicative evaluation suggests that the BBBS program represents excellent 'value for money'.
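The break-even figures quoted in this abstract can be reproduced directly. A minimal sketch, using only the numbers stated above (variable names are illustrative, not from the study):

```python
import math

# Break-even (threshold) analysis for the BBBS Melbourne program,
# reproducing the figures quoted in the abstract above.
program_cost = 39.5e6            # AUD, to serve 2,208 participants
participants = 2208
high_risk = participants // 2    # abstract assumes 50% are high-risk -> 1,104
lifetime_cost_total = 3.3e9      # AUD, adult criminality costs for that half
cost_per_high_risk = lifetime_cost_total / high_risk  # ~AUD 3.0M per person

# Whole participants whose high-risk trajectory must be averted to recoup cost
break_even_n = math.ceil(program_cost / cost_per_high_risk)
break_even_pct = round(break_even_n / high_risk * 100, 1)

print(break_even_n, break_even_pct)  # 14 participants, 1.3% of 1,104
```

The result matches the abstract's 1.3% (14/1,104) threshold.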

  14. Big George to Carter Mountain 115-kV transmission line project, Park and Hot Springs Counties, Wyoming. Environmental Assessment

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-01

    The Western Area Power Administration (Western) is proposing to rebuild, operate, and maintain a 115-kilovolt (kV) transmission line between the Big George and Carter Mountain Substations in northwest Wyoming (Park and Hot Springs Counties). This environmental assessment (EA) was prepared in compliance with the National Environmental Policy Act (NEPA) and the regulations of the Council on Environmental Quality (CEQ) and the Department of Energy (DOE). The existing Big George to Carter Mountain 69-kV transmission line was constructed in 1941 by the US Department of Interior, Bureau of Reclamation, with 1/0 copper conductor on wood-pole H-frame structures without an overhead ground wire. The line should be replaced because of the deteriorated condition of the wood-pole H-frame structures. Because the line lacks an overhead ground wire, it is subject to numerous outages caused by lightning. The line will be 54 years old in 1995, which is the target date for line replacement. The normal service life of a wood-pole line is 45 years. Under the No Action Alternative, no new transmission lines would be built in the project area. The existing 69-kV transmission line would continue to operate with routine maintenance, with no provisions made for replacement.

  15. Big Data: Big Confusion? Big Challenges?

    Science.gov (United States)

    2015-05-01

    12th Annual Acquisition Research Symposium. Big Data: Big Confusion? Big Challenges? Mary Maureen... Report date: May 2015.

  16. Certificate program in meeting and event planning to be offered spring 2009

    OpenAIRE

    Felker, Susan B.

    2009-01-01

    Virginia Tech Continuing and Professional Education will again offer a four-weekend-long certificate program in meeting and event planning in spring 2009. This is the second time the program will be available since its inaugural offering in 2008.

  17. Annual Big Game Hunting Program : Parker River National Wildlife Refuge : CY 1993

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This 1993 Annual Big Game Hunting Program outlines the reasons and regulations for white-tailed deer hunting on Parker River National Wildlife Refuge. The...

  18. Annual Big Game Hunting Program : Parker River National Wildlife Refuge : CY 1990

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This 1990 Annual Big Game Hunting Program outlines the reasons and regulations for white-tailed deer hunting on Parker River National Wildlife Refuge. The...

  19. Virginia Tech to launch certificate program in meeting and event planning in spring 2008

    OpenAIRE

    Felker, Susan B.

    2008-01-01

    Virginia Tech Continuing and Professional Education, in partnership with the Virginia Tech Hampton Roads Center, will offer a four-weekend-long certificate program in meeting and event planning in spring 2008.

  20. Measuring the efficacy of a wildfire education program in Colorado Springs.

    Science.gov (United States)

    G.H. Donovan; P.A. Champ; D.T. Butry

    2007-01-01

    We examine an innovative wildfire risk education program in Colorado Springs, which rated the wildfire risk of 35,000 homes in the city's wildland urban interface. Evidence from home sales before and after the program's implementation suggests that the program was successful at changing homebuyers' attitudes toward wildfire risk, particularly preferences...

  1. Our World of Water. A Spring Program for Fifth Graders.

    Science.gov (United States)

    Jackson Community Coll., MI. Dahlem Environmental Education Center.

    This instructional packet is one of 14 school environmental education programs developed for use in the classroom and at the Dahlem Environmental Education Center (DEEC) of the Jackson Community College (Michigan). Provided in the packet are pre-trip activities, field trip activities, and post-trip activities which focus on water in the built and…

  2. Peak discharge, flood frequency, and peak stage of floods on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado, and Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado, 2016

    Science.gov (United States)

    Kohn, Michael S.; Stevens, Michael R.; Mommandi, Amanullah; Khan, Aziz R.

    2017-12-14

    The U.S. Geological Survey (USGS), in cooperation with the Colorado Department of Transportation, determined the peak discharge, annual exceedance probability (flood frequency), and peak stage of two floods that took place on Big Cottonwood Creek at U.S. Highway 50 near Coaldale, Colorado (hereafter referred to as “Big Cottonwood Creek site”), on August 23, 2016, and on Fountain Creek below U.S. Highway 24 in Colorado Springs, Colorado (hereafter referred to as “Fountain Creek site”), on August 29, 2016. A one-dimensional hydraulic model was used to estimate the peak discharge. To define the flood frequency of each flood, peak-streamflow regional-regression equations or statistical analyses of USGS streamgage records were used to estimate annual exceedance probability of the peak discharge. A survey of the high-water mark profile was used to determine the peak stage, and the limitations and accuracy of each component also are presented in this report. Collection and computation of flood data, such as peak discharge, annual exceedance probability, and peak stage at structures critical to Colorado’s infrastructure are an important addition to the flood data collected annually by the USGS. The peak discharge of the August 23, 2016, flood at the Big Cottonwood Creek site was 917 cubic feet per second (ft3/s) with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The peak discharge of the August 29, 2016, flood at the Fountain Creek site was 5,970 ft3/s with a measurement quality of poor (uncertainty plus or minus 25 percent or greater). The August 23, 2016, flood at the Big Cottonwood Creek site had an annual exceedance probability of less than 0.01 (return period greater than the 100-year flood) and had an annual exceedance probability of greater than 0.005 (return period less than the 200-year flood). The August 23, 2016, flood event was caused by a precipitation event having an annual exceedance probability of 1.0 (return
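The abstract's pairing of annual exceedance probability (AEP) with return period follows the standard reciprocal relationship T = 1/AEP. A minimal sketch (function name is illustrative):

```python
# Standard conversion between annual exceedance probability (AEP)
# and return period, as used in USGS flood-frequency reporting.
def return_period(aep: float) -> float:
    """Return period in years for a given annual exceedance probability."""
    return 1.0 / aep

# Big Cottonwood Creek flood: AEP between 0.005 and 0.01, i.e. between
# the 100-year and 200-year flood, as stated in the abstract.
print(return_period(0.01))   # 100.0 -> 100-year flood
print(return_period(0.005))  # 200.0 -> 200-year flood
```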

  3. Sustaining Employability: A Process for Introducing Cloud Computing, Big Data, Social Networks, Mobile Programming and Cybersecurity into Academic Curricula

    National Research Council Canada - National Science Library

    Razvan Bologa; Ana-Ramona Lupu; Catalin Boja; Tiberiu Marian Georgescu

    2017-01-01

    ... curricula of business students: cloud computing, big data, mobile programming, and social networks and cybersecurity (CAMSS). The results are useful for those trying to implement similar curricular reforms and for companies that need to manage talent pipelines.

  4. Tucannon River Spring Chinook Salmon Captive Brood Program, FY 2000 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bumgarner, Joseph D.; Gallinat, Michael P.

    2001-06-01

    This report summarizes the objectives, tasks, and accomplishments of the Tucannon River spring chinook captive brood program from program inception (1997) through April 2001. The WDFW initiated a captive broodstock program in 1997. The overall goal of the Tucannon River captive broodstock program is the short-term, and eventually long-term, rebuilding of the Tucannon River spring chinook salmon run, with the hope that natural production will eventually sustain itself. The project goal is to rear captive salmon to adults, spawn them, rear their progeny, and release approximately 150,000 smolts annually into the Tucannon River between 2003-2007. These smolt releases, in combination with the current hatchery supplementation program (132,000 smolts) and wild production, are expected to produce 600-700 returning adult spring chinook to the Tucannon River each year from 2005-2010. The Master Plan, Environmental Assessment, and most facility modifications at LFH were completed for the Tucannon River spring chinook captive broodstock program during FY2000 and FY2001. DNA samples collected since 1997 have been sent to the WDFW genetics lab in Olympia for baseline DNA analysis. Results from the genetic analysis are not available at this time. The captive broodstock program is planned to collect fish from five (1997-2001) brood years (BY). The captive broodstock program was initiated with 1997 BY juveniles, and the 2000 BY fish have been selected. As of April 30, 2001, WDFW has 172 BY 1997, 262 BY 1998, 407 BY 1999, and approximately 1,190 BY 2000 fish on hand at LFH. Twelve of 13 mature 1997 BY females were spawned in 2000. Total eggtake was 14,813. Mean fecundity was 1,298 eggs/female based on 11 fully spawned females. Egg survival to eye-up was 47.3%. This low survival was expected for three-year-old captive broodstock females. As of April 30, 2001, WDFW has 4,211 captive broodstock progeny on hand. These fish will be tagged with blank wire tag without fin clips and

  5. Tucannon River Spring Chinook Captive Broodstock Program Final Environmental Assessment and Finding of No Significant Impact

    Energy Technology Data Exchange (ETDEWEB)

    N/A

    2000-05-24

    Bonneville Power Administration (BPA) is proposing to fund the Tucannon River Spring Chinook Captive Broodstock Program, a small-scale production initiative designed to increase numbers of a weak but potentially recoverable population of spring chinook salmon in the Tucannon River in the State of Washington. BPA has prepared an Environmental Assessment (EA) (DOE/EA-l326) evaluating the proposed project. Based on the analysis in the EA, BPA has determined that the proposed action is not a major Federal action significantly affecting the quality of the human environment, within the meaning of the National Environmental Policy Act (NEPA) of 1969. Therefore, the preparation of an Environmental Impact Statement (EIS) is not required, and BPA is issuing this Finding of No Significant Impact (FONSI).

  6. LSVT LOUD and LSVT BIG: Behavioral Treatment Programs for Speech and Body Movement in Parkinson Disease

    Directory of Open Access Journals (Sweden)

    Cynthia Fox

    2012-01-01

    Full Text Available Recent advances in neuroscience have suggested that exercise-based behavioral treatments may improve function and possibly slow progression of motor symptoms in individuals with Parkinson disease (PD). The LSVT (Lee Silverman Voice Treatment) Programs for individuals with PD have been developed and researched over the past 20 years, beginning with a focus on the speech motor system (LSVT LOUD) and more recently extended to address limb motor systems (LSVT BIG). The unique aspects of the LSVT Programs include the combination of (a) an exclusive target on increasing amplitude (loudness in the speech motor system; bigger movements in the limb motor system), (b) a focus on sensory recalibration to help patients recognize that movements with increased amplitude are within normal limits, even if they feel “too loud” or “too big,” and (c) training self-cueing and attention to action to facilitate long-term maintenance of treatment outcomes. In addition, the intensive mode of delivery is consistent with principles that drive activity-dependent neuroplasticity and motor learning. The purpose of this paper is to provide an integrative discussion of the LSVT Programs, including the rationale for their fundamentals, a summary of efficacy data, and a discussion of limitations and future directions for research.

  7. Hatcheries, Harvest and Wild Fish: An Integrated Program at Warm Springs National Fish Hatchery, Oregon

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Warm Springs National Fish Hatchery is operated by the U.S. Fish and Wildlife Service and is located on the Warm Springs River within the Warm Springs Indian...

  8. Tucannon River Spring Chinook Salmon Captive Broodstock Program, Annual Report 2001.

    Energy Technology Data Exchange (ETDEWEB)

    Gallinat, Michael P.; Bumgarner, Joseph D.

    2002-05-01

    This report summarizes the objectives, tasks, and accomplishments of the Tucannon River spring chinook captive broodstock program during 2001. The WDFW initiated a captive broodstock program in 1997. The overall goal of the Tucannon River captive broodstock program is the short-term, and eventually long-term, rebuilding of the Tucannon River spring chinook salmon run, with the hope that natural production will sustain itself. The project goal is to rear captive salmon selected from the supplementation program to adults, spawn them, rear their progeny, and release approximately 150,000 smolts annually into the Tucannon River between 2003-2007. These smolt releases, in combination with the current hatchery supplementation program (132,000 smolts) and wild production, are expected to produce 600-700 returning adult spring chinook to the Tucannon River each year from 2005-2010. The captive broodstock program will collect fish from five (1997-2001) brood years (BY). The captive broodstock program was initiated with 1997 BY juveniles, and the 2001 BY fish have been selected. As of Jan 1, 2002, WDFW has 17 BY 1997, 159 BY 1998, 316 BY 1999, 448 BY 2000, and approximately 1,200 BY 2001 fish on hand at LFH. The 2001 eggtake from the 1997 brood year (Age 4) was 233,894 eggs from 125 ripe females. Egg survival was 69%. Mean fecundity based on the 105 fully spawned females was 1,990 eggs/female. The 2001 eggtake from the 1998 brood year (Age 3) was 47,409 eggs from 41 ripe females. Egg survival was 81%. Mean fecundity based on the 39 fully spawned females was 1,160 eggs/female. The total 2001 eggtake from the captive brood program was 281,303 eggs. As of May 1, 2002, we have 171,495 BY 2001 captive brood progeny on hand. A total of 20,592 excess fish were marked as parr (AD/CWT) and will be released during early May 2002 into the Tucannon River (rkm 40-45). This will allow us to stay within our maximum allowed number (150,000) of smolts released. During April 2002, WDFW volitionally

  9. Supporting Imagers' VOICE: A National Training Program in Comparative Effectiveness Research and Big Data Analytics.

    Science.gov (United States)

    Kang, Stella K; Rawson, James V; Recht, Michael P

    2017-12-05

    Provided with methodologic training, more imagers can contribute to the evidence basis on improved health outcomes and value in diagnostic imaging. The Value of Imaging Through Comparative Effectiveness Research Program was developed to provide hands-on, practical training in five core areas for comparative effectiveness and big biomedical data research: decision analysis, cost-effectiveness analysis, evidence synthesis, big data principles, and applications of big data analytics. The program's mixed format consists of web-based modules for asynchronous learning as well as in-person sessions for practical skills and group discussion. Seven diagnostic radiology subspecialties and cardiology are represented in the first group of program participants, showing the collective potential for greater depth of comparative effectiveness research in the imaging community. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  10. Comparing and evaluating terminology services application programming interfaces: RxNav, UMLSKS and LexBIG.

    Science.gov (United States)

    Pathak, Jyotishman; Peters, Lee; Chute, Christopher G; Bodenreider, Olivier

    2010-01-01

    To facilitate the integration of terminologies into applications, various terminology services application programming interfaces (API) have been developed in the recent past. In this study, three publicly available terminology services API, RxNav, UMLSKS and LexBIG, are compared and functionally evaluated with respect to the retrieval of information from one biomedical terminology, RxNorm, to which all three services provide access. A list of queries is established covering a wide spectrum of terminology services functionalities such as finding RxNorm concepts by their name, or navigating different types of relationships. Test data were generated from the RxNorm dataset to evaluate the implementation of the functionalities in the three API. The results revealed issues with various aspects of the API implementation (eg, handling of obsolete terms by LexBIG) and documentation (eg, navigational paths used in RxNav) that were subsequently addressed by the development teams of the three API investigated. Knowledge about such discrepancies helps inform the choice of an API for a given use case.

  11. Grande Ronde Endemic Spring Chinook Salmon Supplementation Program: Facility Operation and Maintenance and Monitoring and Evaluation, 2001 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Boe, Stephen J.; Ogburn, Parker N. (Confederated Tribes of the Umatilla Indian Reservation, Department of Natural Resources, Pendleton, OR)

    2003-03-01

    This is the second annual report of a multi-year project to operate adult collection and juvenile acclimation facilities on Catherine Creek and the upper Grande Ronde River for Snake River spring chinook salmon. These two streams have historically supported populations that provided significant tribal and non-tribal fisheries. Supplementation using conventional and captive broodstock techniques is being used to restore fisheries in these streams. Statement of Work Objectives for 2001: (1) Participate in implementation of the comprehensive multiyear operations plan for the Grande Ronde Endemic Spring Chinook Supplementation Program (GRESCP). (2) Plan detailed GRESCP Monitoring and Evaluation for future years. (3) Ensure proper construction and trial operation of semi-permanent adult and juvenile facilities for use in 2001. (4) Plan for data collection needs for bull trout. (5) Collect summer steelhead. (6) Monitor adult endemic spring chinook salmon populations and collect broodstock. (7) Acclimate juvenile spring chinook salmon prior to release into the upper Grande Ronde River and Catherine Creek. (8) Monitor adult population abundance and characteristics of Grande Ronde River spring chinook salmon populations. (9) Monitor condition, movement, and mortality of spring chinook salmon acclimated at remote facilities. (10) Participate in Monitoring and Evaluation of the captive brood component of the Program to document its contribution to the Program. (11) Monitor water quality at facilities. (12) Document accomplishments and needs to permitters, comanagers, and funding agencies. (13) Communicate Project results to the scientific community.

  12. Grande Ronde Endemic Spring Chinook Salmon Supplementation Program: Facility Operation and Maintenance and Monitoring and Evaluation, 2000 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Boe, Stephen J.; Lofy, Peter T. (Confederated Tribes of the Umatilla Indian Reservation, Pendleton, OR)

    2003-03-01

    This is the third annual report of a multi-year project to operate adult collection and juvenile acclimation facilities on Catherine Creek and the upper Grande Ronde River for Snake River spring chinook salmon. These two streams have historically supported populations that provided significant tribal and non-tribal fisheries. Supplementation using conventional and captive broodstock techniques is being used to restore fisheries in these streams. Statement of Work Objectives for 2000: (1) Participate in implementation of the comprehensive multiyear operations plan for the Grande Ronde Endemic Spring Chinook Supplementation Program (GRESCP). (2) Plan for recovery of endemic summer steelhead populations in Catherine Creek and the upper Grande Ronde River. (3) Ensure proper construction and trial operation of semi-permanent adult and juvenile facilities for use in 2000. (4) Collect summer steelhead. (5) Collect adult endemic spring chinook salmon broodstock. (6) Acclimate juvenile spring chinook salmon prior to release into the upper Grande Ronde River and Catherine Creek. (7) Document accomplishments and needs to permitters, comanagers, and funding agency. (8) Communicate project results to the scientific community. (9) Plan detailed GRESCP Monitoring and Evaluation for future years. (10) Monitor adult population abundance and characteristics of Grande Ronde River spring chinook salmon populations and incidentally-caught summer steelhead and bull trout. (11) Monitor condition, movement, and mortality of spring chinook salmon acclimated at remote facilities. (12) Monitor water quality at facilities. (13) Participate in Monitoring & Evaluation of the captive brood component of the Program to document contribution to the Program.

  13. Tucannon River Spring Chinook Salmon Captive Broodstock Program, Annual Report 2002.

    Energy Technology Data Exchange (ETDEWEB)

    Gallinat, Michael; Varney, Michelle

    2003-05-01

    This report summarizes the objectives, tasks, and accomplishments of the Tucannon River Spring Chinook Captive Broodstock Program during 2002. The WDFW initiated a captive broodstock program in 1997. The overall goal of the Tucannon River captive broodstock program is for the short-term, and eventually long-term, rebuilding of the Tucannon River spring chinook salmon run, with the hope that natural production will sustain itself. The project goal is to rear captive salmon selected from the supplementation program to adults, spawn them, rear their progeny, and release approximately 150,000 smolts annually into the Tucannon River from 2003 to 2007. These smolt releases, in combination with the current hatchery supplementation program (132,000 smolts) and wild production, are expected to produce 600-700 returning adult spring chinook to the Tucannon River each year from 2005-2010. The captive broodstock program collected fish from five (1997-2001) brood years (BY). As of January 1, 2003, WDFW has approximately 11 BY 1998, 194 BY 1999, 314 BY 2000, 447 BY 2001, and 300 BY 2002 (for extra males) fish on hand at LFH. The 2002 eggtake from the 1997 brood year (Age 5) was 13,176 eggs from 10 ripe females. Egg survival was 22%. Mean fecundity based on the 5 fully spawned females was 1,803 eggs/female. The 2002 eggtake from the 1998 brood year (Age 4) was 143,709 eggs from 93 ripe females. Egg survival was 29%. Mean fecundity based on the 81 fully spawned females was 1,650 eggs/female. The 2002 eggtake from the 1999 brood year (Age 3) was 19,659 eggs from 18 ripe females. Egg survival was 55%. Mean fecundity based on the 18 fully spawned fish was 1,092 eggs/female. The total 2002 eggtake from the captive brood program was 176,544 eggs. A total of 120,833 dead eggs (68%) were removed with 55,711 live eggs remaining for the program. As of May 1, 2003, we had 46,417 BY 2002 captive brood progeny on hand. A total of 20,592 excess BY 01 fish were marked as parr (AD/CWT) and

  14. Grande Ronde Endemic Spring Chinook Salmon Supplementation Program : Facility Operation and Maintenance Facilities, Annual Report 2003.

    Energy Technology Data Exchange (ETDEWEB)

    McLean, Michael L.; Seeger, Ryan; Hewitt, Laurie (Confederated Tribes of the Umatilla Indian Reservation, Department of Natural Resources, Pendleton, OR)

    2004-01-01

    Anadromous salmonid stocks have declined in both the Grande Ronde River Basin (Lower Snake River Compensation Plan (LSRCP) Status Review Symposium 1998) and in the entire Snake River Basin (Nehlsen et al. 1991), many to the point of extinction. The Grande Ronde River Basin historically supported large populations of fall and spring chinook (Oncorhynchus tshawytscha), sockeye (O. nerka), and coho (O. kisutch) salmon and steelhead trout (O. mykiss) (Nehlsen et al. 1991). The decline of chinook salmon and steelhead populations and extirpation of coho and sockeye salmon in the Grande Ronde River Basin was, in part, a result of construction and operation of hydroelectric facilities, overfishing, and loss and degradation of critical spawning and rearing habitat in the Columbia and Snake River basins (Nehlsen et al. 1991). Hatcheries were built in Oregon, Washington and Idaho under the Lower Snake River Compensation Plan (LSRCP) to compensate for losses of anadromous salmonids due to the construction and operation of the lower four Snake River dams. Lookingglass Hatchery (LGH) on Lookingglass Creek, a tributary of the Grande Ronde River, was completed under LSRCP in 1982 and has served as the main incubation and rearing site for chinook salmon programs for the Grande Ronde and Imnaha rivers in Oregon. Despite these hatchery programs, natural spring chinook populations continued to decline, resulting in the National Marine Fisheries Service (NMFS) listing Snake River spring/summer chinook salmon as "threatened" under the federal Endangered Species Act (1973) on 22 April 1992. Continuing poor escapement levels and declining population trends indicated that Grande Ronde River basin spring chinook salmon were in imminent danger of extinction. These continuing trends led fisheries co-managers in the basin to initiate the Grande Ronde Endemic Spring Chinook Salmon Supplementation Program (GRESCSSP) in order to prevent extinction and preserve options for use of

  15. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends

    Science.gov (United States)

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by

  16. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    Science.gov (United States)

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by
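The Map and Reduce tasks named in the abstract can be illustrated with a plain-Java word-count sketch. The class and method names below are invented for illustration; this is not Hadoop's actual Mapper/Reducer API, which runs the same two phases distributed across HDFS data nodes:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MapReduceSketch {

    // Map task: for each input record, emit (key, value) pairs -
    // here, (word, 1) for every word in the record.
    static Stream<Map.Entry<String, Integer>> map(String record) {
        return Arrays.stream(record.toLowerCase().split("\\s+"))
                     .map(word -> Map.entry(word, 1));
    }

    // Reduce task: merge all values emitted for the same key (sum the counts).
    static Map<String, Integer> reduce(Stream<Map.Entry<String, Integer>> pairs) {
        return pairs.collect(Collectors.toMap(
                Map.Entry::getKey, Map.Entry::getValue, Integer::sum));
    }

    public static void main(String[] args) {
        // In Hadoop these records would be read from HDFS blocks spread
        // across the cluster; a small in-memory list stands in here.
        List<String> records = List.of("big data", "clinical big data");
        Map<String, Integer> counts =
                reduce(records.stream().flatMap(MapReduceSketch::map));
        System.out.println(counts.get("big") + " " + counts.get("data")); // 2 2
    }
}
```

In a real cluster the map outputs are shuffled so that each reducer receives all values for one key; the single-process merge above collapses that shuffle step.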

  17. Big Data Meets Physics Education Research: From MOOCs to University-Led High School Programs

    Science.gov (United States)

    Seaton, Daniel

    2017-01-01

    The Massive Open Online Course (MOOC) movement has catalyzed discussions of digital learning on campuses around the world and highlighted the increasingly large, complex datasets related to learning. Physics Education Research can and should play a key role in measuring outcomes of this most recent wave of digital education. In this talk, I will discuss big data and learning analytics through multiple modes of teaching and learning enabled by the open-source edX platform: open-online, flipped, and blended. Open-Online learning will be described through analysis of MOOC offerings from Harvard and MIT, where 2.5 million unique users have led to 9 million enrollments across nearly 300 courses. Flipped instruction will be discussed through an Advanced Placement program at Davidson College that empowers high school teachers to use AP aligned, MOOC content directly in their classrooms with only their students. Analysis of this program will be highlighted, including results from a pilot study showing a positive correlation between content usage and externally validated AP exam scores. Lastly, blended learning will be discussed through specific residential use cases at Davidson College and MIT, highlighting unique course models that blend open-online and residential experiences. My hope for this talk is that listeners will better understand the current wave of digital education and the opportunities it provides for data-driven teaching and learning.

  18. Grande Ronde Endemic Spring Chinook Salmon Supplementation Program; Satellite Facilities Operation and Maintenance, 2005 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    McLean, Michael L.; Seeger, Ryan; Hewitt, Laurie (Confederated Tribes of the Umatilla Indian Reservation, Department of Natural Resources, Pendleton, OR)

    2006-01-01

    There were 2 acclimation periods at the Catherine Creek Acclimation Facility (CCAF) in 2005. During the early acclimation period, 130,748 smolts were delivered from Lookingglass Hatchery (LGH) on 7 March. This group contained progeny of both the captive (53%) and conventional broodstock programs. The size of the fish at delivery was 23.9 fish/lb. Volitional releases began 14 March 2005 and ended 27 March with an estimated total (based on PIT tag detections of 3,187) of 29,402 fish leaving the raceways. This was 22.5% of the total fish delivered. Fish remaining in the raceways after volitional release were forced out. Hourly detections of PIT-tagged fish showed that most of the fish left around 1900 hours. The size of the fish just before the volitional release was 23.9 fish/lb and the size of the fish remaining just before the forced release was 23.2 fish/lb. The total mortality for the acclimation period was 204 (0.16%). The total number of fish released from the acclimation facility during the early period was 130,544. During the second acclimation period 59,100 smolts were delivered from LGH on 28 March. This group was composed entirely of progeny from the conventional broodstock program. The size of the fish at delivery was 21.8 fish/lb. Volitional releases began 3 April 2005 and ended with a force-out emergency release on 7 April. The size of the fish just before the volitional release was 21.8 fish/lb. The total mortality for the acclimation period was 64 (0.11%). The total number of fish released from the acclimation facility during the late period was 59,036. There was only 1 planned acclimation period at the Upper Grande Ronde Acclimation Facility (UGRAF) in 2005. During the early acclimation period 105,418 smolts were delivered from LGH on 8 March. This group was composed entirely of progeny from the conventional broodstock program. The size of the fish at delivery was 21.0 fish/lb. There was no volitional release in 2005 due to freezing air and water conditions.

  19. Big-bird programs: Effect of strain, sex, and debone time on meat quality of broilers.

    Science.gov (United States)

    Brewer, V B; Kuttappan, V A; Emmert, J L; Meullenet, J-F C; Owens, C M

    2012-01-01

    The industry trend toward early deboning of chickens has led to the need to explore the effect on meat quality, including the effects of strain and sex. An experiment was conducted using broilers of 4 different high-yielding commercial strains chosen because of their common use in big-bird production. Of each strain, 360 birds were commercially processed at 59, 61, and 63 d of age in 2 replicates per day. Breast fillets were harvested at 2, 4, and 6 h postmortem (PM). Muscle pH and instrumental color (L*, a*, and b*) were measured at the time of deboning and at 24 h PM. Fillets were cooked to 76°C and cook loss was calculated, followed by Meullenet-Owens razor shear (MORS) analysis. Muscle pH significantly decreased over time as aging before deboning increased. Furthermore, L* values significantly increased as aging time increased, with the fillets deboned at 6 h PM having the highest L* value, followed by 4 h, and then 2 h PM. After 24 h, the fillets deboned at 6 h still had the highest L* compared with those deboned at 2 or 4 h PM. Fillets from strain B had the highest L* values. Fillets deboned at 2 h PM had significantly higher cook losses and MORS energy (indicating tougher fillets) than fillets deboned at 4 or 6 h PM, but there was no difference in cook loss due to strain at any deboning time. Fillets deboned at 4 h PM also had higher MORS energy than fillets deboned at 6 h PM, and differences in MORS energy among the strains were observed at 4 h PM. There was no difference in instrumental color values or cook loss due to sex. However, fillets of males had significantly greater MORS energy (tougher fillets) when deboned at 2, 4, and 6 h PM than those of females. Results of this study suggest that deboning time, sex, and strain can affect meat quality in big-bird market programs.

  20. Grande Ronde Basin Spring Chinook Salmon Captive Broodstock Program, 1995-2002 Summary Report.

    Energy Technology Data Exchange (ETDEWEB)

    Hoffnagle, Timothy; Carmichael, Richard; Noll, William

    2003-12-01

    survey areas in 1995 from as high as 1,205 redds in the same area in 1969 (Table 1). All streams reached low points (0-6 redds in the index areas) in the 1990's, except those in which no redds were found for several years and surveys were discontinued, such as Spring, Sheep and Indian creeks which had a total of 109 redds in 1969. The Minam and Wenaha rivers are tributaries of the Grande Ronde River located primarily in wilderness areas. Chinook salmon numbers in these two streams (based on redd counts) also decreased dramatically beginning in the early 1970's (Table 1). Since then there have been a few years of increasing numbers of redds but counts have generally been 25-40% of the number seen in the 1960's. No hatchery fish have been released into either of these streams and we monitor them during spawning ground surveys for the presence of hatchery strays. These populations will be used as a type of control for evaluating our supplementation efforts in Catherine Creek, upper Grande Ronde River and Lostine River. In this way, we can attempt to filter out the effects of downstream variables, over which we have no control, when we interpret the results of the captive broodstock program as the F1 and F2 generations spawn and complete their life cycles in the wild. The Grande Ronde Basin Captive Broodstock Program was initiated because these chinook salmon populations had reached critical levels where dramatic and unprecedented efforts were needed to prevent extinction and preserve any future options for use of endemic fish for artificial propagation programs for recovery and mitigation. This program was designed to quickly increase numbers of returning adults, while maintaining the genetic integrity of each endemic population.

  1. Spring 5 & reactive streams

    CERN Document Server

    CERN. Geneva; Clozel, Brian

    2017-01-01

    Spring is a framework widely used by the world-wide Java community, and it is also extensively used at CERN. The accelerator control system is constituted of 10 million lines of Java code, spread across more than 1000 projects (jars) developed by 160 software engineers. Around half of this (all server-side Java code) is based on the Spring framework. Warning: the speakers will assume that people attending the seminar are familiar with Java and Spring’s basic concepts. Spring 5.0 and Spring Boot 2.0 updates (45 min) This talk will cover the big ticket items in the 5.0 release of Spring (including Kotlin support, @Nullable and JDK9) and provide an update on Spring Boot 2.0, which is scheduled for the end of the year. Reactive Spring (1h) Spring Framework 5.0 has been released - and it now supports reactive applications in the Spring ecosystem. During this presentation, we'll talk about the reactive foundations of Spring Framework with the Reactor project and the reactive streams specification. We'll al...
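For readers unfamiliar with the reactive streams specification the talk builds on: its four interfaces (Publisher, Subscriber, Subscription, Processor) have shipped in the JDK itself since Java 9 as `java.util.concurrent.Flow`, which Reactor and Spring's reactive stack interoperate with. Below is a minimal sketch (the class name and sample values are invented here) of a publisher/subscriber pair where the subscriber signals demand one item at a time, i.e. backpressure:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class ReactiveSketch {

    public static List<String> run() throws InterruptedException {
        List<String> received = new CopyOnWriteArrayList<>();
        CountDownLatch done = new CountDownLatch(1);

        // SubmissionPublisher is the JDK's reference Flow.Publisher implementation.
        try (SubmissionPublisher<String> publisher = new SubmissionPublisher<>()) {
            publisher.subscribe(new Flow.Subscriber<String>() {
                private Flow.Subscription subscription;

                @Override public void onSubscribe(Flow.Subscription s) {
                    subscription = s;
                    s.request(1);            // backpressure: ask for one item
                }
                @Override public void onNext(String item) {
                    received.add(item);
                    subscription.request(1); // demand the next item only now
                }
                @Override public void onError(Throwable t) { done.countDown(); }
                @Override public void onComplete() { done.countDown(); }
            });
            publisher.submit("tick");
            publisher.submit("tock");
        } // closing the publisher signals onComplete to subscribers
        done.await();
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run()); // [tick, tock]
    }
}
```

Reactor's `Flux` and `Mono` layer operators (map, filter, merge, etc.) on top of exactly this publish/subscribe/request contract.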

  2. DNA Radiation Environments Program - Spring 1990 2-meter box experiments and analyses

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.T. (Oak Ridge National Lab., TN (United States)); Whitaker, S.Y. (Clark Atlanta Univ., GA (United States))

    1992-09-01

    This report summarizes the Spring 1990 2-m Box Experiments performed at the Army Pulse Radiation Facility (APRF) at Aberdeen Proving Ground, Maryland. These studies were sponsored by the Defense Nuclear Agency (DNA) under the Radiation Environments Program to obtain measured data for benchmarking the Adjoint Monte Carlo Code System, MASH, Version 1.0. MASH was developed as the Department of Defense and NATO code system for calculating neutron and gamma-ray radiation fields and shielding protection factors for armored vehicles and military structures against nuclear weapon radiation. In the experiments, neutron and gamma-ray dose and reduction factors were measured in the free-field and as a function of position on an anthropomorphic phantom that was placed outside and inside the steel-walled 2-m box. The data were acquired at a distance of 400-m from the APRF reactor. The measurements were performed by APRF, Bubble Technology Industries, the Defence Research Establishment Ottawa, Établissement Technique Central de l'Armement, and Harry Diamond Laboratory. Calculations were carried out by the Oak Ridge National Laboratory and Science Applications International Corporation. The purpose of these experiments was to measure the neutron and gamma-ray dose as a function of detector location on the phantom for cases when the phantom was standing in the free-field and inside of the box. Neutron measurements were made using a BD-100R bubble detector and gamma-ray measurements were made using thermoluminescent detectors (TLD). Calculated and measured data were compared in terms of the C/M ratio. DNA mandated that C/M values of ±20% define the acceptable limits for the comparison of the dose and reduction factor data and for qualifying the MASH code in replicating integral parameters.

  3. DNA Radiation Environments Program Spring 1991 2-meter box experiments and analyses

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.T. [Oak Ridge National Lab., TN (United States); Whitaker, S.Y. [Clark Atlanta Univ., GA (United States)

    1993-03-01

    This report summarizes the Spring 1991 2-m Box experiments that were performed at the Army Pulse Radiation Facility (APRF) at Aberdeen Proving Ground. These studies were sponsored by the Defense Nuclear Agency (DNA) under the Radiation Environments Program to obtain measured data for benchmarking the Adjoint Monte Carlo Code System, MASH, Version 1.0. The MASH code system was developed for the Department of Defense and NATO for calculating neutron and gamma-ray radiation fields and shielding protection factors for armored vehicles and military structures against nuclear weapon radiation. In the 2-m Box experiments, neutron and gamma-ray dose rates and reduction factors were measured in the free-field and as a function of position on an anthropomorphic phantom that was placed outside and inside a borated polyethylene lined steel-walled 2-m box. The data were acquired at a distance of 400-m from the APRF reactor. The purpose of these experiments was to measure the neutron and gamma-ray dose rates as a function of detector location on the phantom for cases when the phantom was in the free-field and inside of the box. Neutron measurements were made using a BD-100R bubble detector and gamma-ray measurements were made using thermoluminescent detectors (TLD). Calculated and measured data were compared in terms of the C/M ratio. The calculated and measured neutron and gamma-ray dose rates and reduction factors agreed on the average within the ±20% limits mandated by DNA and demonstrate the capability of the MASH code system in reproducing measured data in nominally shielded assemblies.

  4. DNA Radiation Environments Program Spring 1991 2-meter box experiments and analyses. [Defense Nuclear Agency (DNA)]

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.T. (Oak Ridge National Lab., TN (United States)); Whitaker, S.Y. (Clark Atlanta Univ., GA (United States))

    1993-03-01

    This report summarizes the Spring 1991 2-m Box experiments that were performed at the Army Pulse Radiation Facility (APRF) at Aberdeen Proving Ground. These studies were sponsored by the Defense Nuclear Agency (DNA) under the Radiation Environments Program to obtain measured data for benchmarking the Adjoint Monte Carlo Code System, MASH, Version 1.0. The MASH code system was developed for the Department of Defense and NATO for calculating neutron and gamma-ray radiation fields and shielding protection factors for armored vehicles and military structures against nuclear weapon radiation. In the 2-m Box experiments, neutron and gamma-ray dose rates and reduction factors were measured in the free-field and as a function of position on an anthropomorphic phantom that was placed outside and inside a borated polyethylene lined steel-walled 2-m box. The data were acquired at a distance of 400-m from the APRF reactor. The purpose of these experiments was to measure the neutron and gamma-ray dose rates as a function of detector location on the phantom for cases when the phantom was in the free-field and inside of the box. Neutron measurements were made using a BD-100R bubble detector and gamma-ray measurements were made using thermoluminescent detectors (TLD). Calculated and measured data were compared in terms of the C/M ratio. The calculated and measured neutron and gamma-ray dose rates and reduction factors agreed on the average within the ±20% limits mandated by DNA and demonstrate the capability of the MASH code system in reproducing measured data in nominally shielded assemblies.

  5. DNA Radiation Environments Program - Spring 1990 2-meter box experiments and analyses

    Energy Technology Data Exchange (ETDEWEB)

    Santoro, R.T. [Oak Ridge National Lab., TN (United States); Whitaker, S.Y. [Clark Atlanta Univ., GA (United States)

    1992-09-01

    This report summarizes the Spring 1990 2-m Box Experiments performed at the Army Pulse Radiation Facility (APRF) at Aberdeen Proving Ground, Maryland. These studies were sponsored by the Defense Nuclear Agency (DNA) under the Radiation Environments Program to obtain measured data for benchmarking the Adjoint Monte Carlo Code System, MASH, Version 1.0. MASH was developed as the Department of Defense and NATO code system for calculating neutron and gamma-ray radiation fields and shielding protection factors for armored vehicles and military structures against nuclear weapon radiation. In the experiments, neutron and gamma-ray dose and reduction factors were measured in the free-field and as a function of position on an anthropomorphic phantom that was placed outside and inside the steel-walled 2-m box. The data were acquired at a distance of 400-m from the APRF reactor. The measurements were performed by APRF, Bubble Technology Industries, the Defence Research Establishment Ottawa, Établissement Technique Central de l'Armement, and Harry Diamond Laboratory. Calculations were carried out by the Oak Ridge National Laboratory and Science Applications International Corporation. The purpose of these experiments was to measure the neutron and gamma-ray dose as a function of detector location on the phantom for cases when the phantom was standing in the free-field and inside of the box. Neutron measurements were made using a BD-100R bubble detector and gamma-ray measurements were made using thermoluminescent detectors (TLD). Calculated and measured data were compared in terms of the C/M ratio. DNA mandated that C/M values of ±20% define the acceptable limits for the comparison of the dose and reduction factor data and for qualifying the MASH code in replicating integral parameters.

  6. Grande Ronde Basin Spring Chinook Salmon Captive Broodstock Program, 2008 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Hoffnagle, Timothy L.; Hair, Donald; Gee, Sally

    2009-03-31

    The Grande Ronde Basin Spring Chinook Salmon Captive Broodstock Program is designed to rapidly increase numbers of Chinook salmon in stocks that are in imminent danger of extirpation in Catherine Creek (CC), Lostine River (LR) and upper Grande Ronde River (GR). Natural parr are captured and reared to adulthood in captivity, spawned (within stocks) and their progeny reared to smoltification before being released into the natal stream of their parents. This program is co-managed by ODFW, National Marine Fisheries Service, Nez Perce Tribe and Confederated Tribes of the Umatilla Indian Reservation. Presmolt rearing was initially conducted at Lookingglass Fish Hatchery (LFH) but parr collected in 2003 and later were reared at Wallowa Fish Hatchery (WFH). Post-smolt rearing is conducted at Bonneville Fish Hatchery (BOH - freshwater) and at Manchester Research Station (MRS - saltwater). The CC and LR programs are being terminated, as these populations have achieved the goal of a consistent return of 150 naturally spawning adults, so the 2005 brood year was the last brood year collected for these populations. The Grande Ronde River program continued with 300 fish collected each year. Currently, we are attempting to collect 150 natural parr and incorporate 150 parr collected as eggs from females with low ELISA levels from the upper Grande Ronde River Conventional Hatchery Program. This is part of a comparison of two methods of obtaining fish for a captive broodstock program: natural fish vs. those spawned in captivity. In August 2007, we collected 152 parr (BY 2006) from the upper Grande Ronde River and also have 155 Grande Ronde River parr (BY 2006) that were hatched from eyed eggs at LFH. During 2008, we were unable to collect natural parr from the upper Grande Ronde River. Therefore, we obtained 300 fish from low ELISA females from the upper Grande Ronde River Conventional Program. In October 2008 we obtained 170 eyed eggs from the upper Grande Ronde River Conventional

  7. Corporate Social Responsibility programs of Big Food in Australia: a content analysis of industry documents.

    Science.gov (United States)

    Richards, Zoe; Thomas, Samantha L; Randle, Melanie; Pettigrew, Simone

    2015-12-01

    To examine Corporate Social Responsibility (CSR) tactics by identifying the key characteristics of CSR strategies as described in the corporate documents of selected 'Big Food' companies. A mixed methods content analysis was used to analyse the information contained on Australian Big Food company websites. Data sources included company CSR reports and web-based content that related to CSR initiatives employed in Australia. A total of 256 CSR activities were identified across six organisations. Of these, the majority related to the categories of environment (30.5%), responsibility to consumers (25.0%) or community (19.5%). Big Food companies appear to be using CSR activities to: 1) build brand image through initiatives associated with the environment and responsibility to consumers; 2) target parents and children through community activities; and 3) align themselves with respected organisations and events in an effort to transfer their positive image attributes to their own brands. Results highlight the type of CSR strategies Big Food companies are employing. These findings serve as a guide to mapping and monitoring CSR as a specific form of marketing. © 2015 Public Health Association of Australia.

  8. Weldon Spring Site Remedial Action Project: Report from the DOE voluntary protection program onsite review, November 17--21, 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-28

    This report summarizes the Department of Energy Voluntary Protection Program (DOE-VPP) Review Team's findings from the five-day onsite evaluation of the Weldon Spring Site Remedial Action Project (WSSRAP), conducted November 17--21, 1997. The site was evaluated against the program requirements contained in "US Department of Energy Voluntary Protection Program, Part 1: Program Elements" to determine its success in implementing the five tenets of DOE-VPP. DOE-VPP consists of three programs, with names and functions similar to those in OSHA's VPP. These programs are STAR, MERIT, and DEMONSTRATION. The STAR program is the core of DOE-VPP. The program is aimed at truly outstanding protectors of employee safety and health. The MERIT program is a steppingstone for contractors and subcontractors that have good safety and health programs but need time and DOE guidance to achieve STAR status. The DEMONSTRATION program is rarely used; it allows DOE to recognize achievements in unusual situations about which DOE needs to learn more before determining approval requirements for the STAR status.

  9. Installation Restoration Program Preliminary Assessment, Big Mountain Radio Relay Station, Alaska

    Science.gov (United States)

    1989-04-01

    are not as abundant as tuffs or lava flows. Soil formation and development largely dates from the close of the Wisconsin Glaciation, when glaciers...environmental contamination that may have an adverse impact on public health or the environment and to select a remedial action through preparation of...Telephone Switching Station (ATSS-4A) capabilities were added to Big Mountain RRS, Kalakaket Creek RRS, Pedro Dome RRS, and Neklasson Lake RRS. These

  10. Pro Spring Batch

    CERN Document Server

    Minella, Michael T

    2011-01-01

    Since its release, Spring Framework has transformed virtually every aspect of Java development including web applications, security, aspect-oriented programming, persistence, and messaging. Spring Batch, one of its newer additions, now brings the same familiar Spring idioms to batch processing. Spring Batch addresses the needs of any batch process, from the complex calculations performed in the biggest financial institutions to simple data migrations that occur with many software development projects. Pro Spring Batch is intended to answer three questions: What? What is batch processing? What
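Spring Batch's central idiom is the chunk-oriented step: read items one at a time, process each, and write them out in chunks at a commit boundary. The following plain-Java sketch illustrates that loop only; it deliberately uses no Spring Batch APIs, and all names (`runStep`, `ChunkLoopSketch`) are invented for illustration.

```java
import java.util.*;
import java.util.function.*;

public class ChunkLoopSketch {
    // Chunk-oriented step: read items until the chunk is full (or input ends),
    // process each item, then write the whole chunk in one go.
    static <I, O> List<O> runStep(Iterator<I> reader, Function<I, O> processor,
                                  Consumer<List<O>> writer, int chunkSize) {
        List<O> written = new ArrayList<>();
        while (reader.hasNext()) {
            List<O> chunk = new ArrayList<>();
            while (chunk.size() < chunkSize && reader.hasNext()) {
                chunk.add(processor.apply(reader.next()));
            }
            writer.accept(chunk); // in Spring Batch this is also the transaction commit boundary
            written.addAll(chunk);
        }
        return written;
    }

    public static void main(String[] args) {
        // prints [A, B] then [C, D] then [E], then the full list of written items
        List<String> out = runStep(List.of("a", "b", "c", "d", "e").iterator(),
                                   String::toUpperCase,
                                   chunk -> System.out.println(chunk), 2);
        System.out.println(out);
    }
}
```

The chunk size trades memory use against transaction overhead, which is the tuning knob the book's batch scenarios revolve around.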

  11. Big Society, Big Deal?

    Science.gov (United States)

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  12. Big data, big governance

    NARCIS (Netherlands)

    drs. Frans van den Reep

    2016-01-01

    "Of course it is nice that my fridge orders milk on its own on the basis of data-related patterns. Deep learning based on big data holds great promise," says Frans van der Reep of Inholland. No wonder that this will be a main theme during the Wissenstag of ScienceGuide at the Hannover Messe

  13. Programs for Engagement and Enhancement. Professional File. Article 131, Spring 2013

    Science.gov (United States)

    Crisp, Gloria; Palacios, Lisa; Kaulfus, John

    2013-01-01

    The following article describes programs used by universities and colleges to engage students; these programs include mentoring, learning communities, and first-year success courses and programs. We begin with a brief overview of student development theory, program descriptions and citations, and article summaries for key references. Next, we…

  14. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  15. Evaluation of NASA SPoRT's Pseudo-Geostationary Lightning Mapper Products in the 2011 Spring Program

    Science.gov (United States)

    Stano, Geoffrey T.; Carcione, Brian; Siewert, Christopher; Kuhlman, Kristin M.

    2012-01-01

    NASA's Short-term Prediction Research and Transition (SPoRT) program is a contributing partner with the GOES-R Proving Ground (PG) preparing forecasters to understand and utilize the unique products that will be available in the GOES-R era. This presentation emphasizes SPoRT's actions to prepare the end user community for the Geostationary Lightning Mapper (GLM). This preparation is a collaborative effort with SPoRT's National Weather Service partners, the National Severe Storms Laboratory (NSSL), and the Hazardous Weather Testbed's Spring Program. SPoRT continues to use its effective paradigm of matching capabilities to forecast problems through collaborations with our end users and working with the developers at NSSL to create effective evaluations and visualizations. Furthermore, SPoRT continues to develop software plug-ins so that these products will be available to forecasters in their own decision support system, AWIPS and eventually AWIPS II. In 2009, the SPoRT program developed the original pseudo-geostationary lightning mapper (PGLM) flash extent product to demonstrate what forecasters may see with GLM. The PGLM replaced the previous GLM product and serves as a stepping-stone until the AWG's official GLM proxy is ready. The PGLM algorithm is simple and can be applied to any ground-based total lightning network. For 2011, the PGLM used observations from four ground-based networks (North Alabama, Kennedy Space Center, Oklahoma, and Washington D.C.). While the PGLM is not a true proxy product, it is intended as a tool to train forecasters about total lightning as well as foster discussions on product visualizations and incorporating GLM-resolution data into forecast operations. The PGLM has been used in 2010 and 2011 and is likely to remain the primary lightning training tool for the GOES-R program for the near future. This presentation will emphasize the feedback received during the 2011 Spring Program. This will discuss several topics. Based on feedback
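The abstract notes that the PGLM flash extent algorithm is simple enough to apply to any ground-based total lightning network. As an illustration only (this is not the SPoRT/NSSL code; the grid size, coordinates, and data layout are invented), a flash extent count can be sketched as gridded counts where each flash increments a cell at most once, no matter how many of its sources fall there:

```java
import java.util.*;

public class FlashExtentSketch {
    // Count flashes per grid cell: each flash contributes at most once per cell,
    // the usual definition of flash extent density.
    // flashes[i] is one flash; each element of it is a source {lat, lon} in degrees.
    static Map<String, Integer> flashExtent(double[][][] flashes, double cellDeg) {
        Map<String, Integer> grid = new HashMap<>();
        for (double[][] flash : flashes) {
            Set<String> touched = new HashSet<>();
            for (double[] src : flash) {
                int row = (int) Math.floor(src[0] / cellDeg);
                int col = (int) Math.floor(src[1] / cellDeg);
                touched.add(row + "," + col);   // cell key; counted once per flash
            }
            for (String cell : touched) grid.merge(cell, 1, Integer::sum);
        }
        return grid;
    }

    public static void main(String[] args) {
        double[][][] flashes = {
            {{34.70, -86.60}, {34.73, -86.58}},  // flash 1, two sources in two cells
            {{34.71, -86.59}},                   // flash 2, one source
        };
        System.out.println(flashExtent(flashes, 0.08));
    }
}
```

Real products would use a fixed geographic grid and time window, but the once-per-flash-per-cell counting is the core of the idea.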

  16. Sustaining Employability: A Process for Introducing Cloud Computing, Big Data, Social Networks, Mobile Programming and Cybersecurity into Academic Curricula

    Directory of Open Access Journals (Sweden)

    Razvan Bologa

    2017-12-01

    Full Text Available This article describes a process for introducing modern technological subjects into the academic curricula of non-technical universities. The process described may increase and contribute to social sustainability by enabling non-technical students’ access to the field of the Internet of Things and the broader Industry 4.0. The process has been defined and tested during a curricular reform project that took place in two large universities in Eastern Europe. In this article, the authors describe the results and impact, over multiple years, of a project financed by the European Union that aimed to introduce the following subjects into the academic curricula of business students: cloud computing, big data, mobile programming, and social networks and cybersecurity (CAMSS). The results are useful for those trying to implement similar curricular reforms, or to companies that need to manage talent pipelines.

  17. Spring in the Arab Spring

    NARCIS (Netherlands)

    Borg, G.J.A.

    2011-01-01

    Column by Gert Borg | Spring in the Arab Spring, by Dr. Gert Borg, researcher in Islam and Arabic at Radboud University Nijmegen and former director of the Netherlands-Flemish Institute in Cairo. Spring: If, in Google, you type "Arab Spring" and hit the button, you get more than

  18. Curriculum Development Based on the Big Picture Assessment of the Mechanical Engineering Program

    Science.gov (United States)

    Sabri, Mohd Anas Mohd; Khamis, Nor Kamaliana; Tahir, Mohd Faizal Mat; Wahid, Zaliha; Kamal, Ahmad; Ihsan, Ariffin Mohd; Sulong, Abu Bakar; Abdullah, Shahrum

    2013-01-01

    One of the major concerns of the Engineering Accreditation Council (EAC) is the need for an effective monitoring and evaluation of program outcome domains that can be associated with courses taught under the Mechanical Engineering program. However, an effective monitoring method that can determine the results of each program outcome using Bloom's…

  19. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  20. Whistling swan dyeing program, Malheur National Wildlife Refuge, fall, 1961-spring, 1962

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report covers the whistling swan dyeing program on Malheur National Wildlife Refuge, in the fall of 1961 to trap birds which, owing to the absence of food in...

  1. Creswell's Energy Efficient Construction Program: A Big Project for a Small School.

    Science.gov (United States)

    Kelsh, Bruce

    1982-01-01

    In Creswell (Oregon) High School's award winning vocational education program, students study energy efficient construction along with basic building skills. Part of the program has been the active recruitment of female, minority, disadvantaged, and handicapped students into the vocational area. Students have assembled solar hot water collectors,…

  2. "I Am Not a Big Man": Evaluation of the Issue Investigation Program

    Science.gov (United States)

    Cincera, Jan; Simonova, Petra

    2017-01-01

    The article evaluates a Czech environmental education program focused on developing competence in issue investigation. In the evaluation, a simple quasi-experimental design with experimental (N = 200) and control groups was used. The results suggest that the program had a greater impact on girls than on boys, and that it increased their internal…

  3. The Design and Redesign of a Clinical Ladder Program: Thinking Big and Overcoming Challenges.

    Science.gov (United States)

    Warman, Geri-Anne; Williams, Faye; Herrero, Ashlea; Fazeli, Pariya; White-Williams, Connie

    Clinical Ladder Programs or Clinical Advancement Programs (CAPs) are an essential component of staff nurse professional development, satisfaction, and retention. There is a need for more evidence regarding developing CAPs. CAP initially launched in 2004. Nurses accomplished tasks in four main areas: clinical, education, leadership, and research, which reflected and incorporated the 14 Forces of Magnetism. In February 2012, the newly revised program was launched and renamed Professional Nursing Development Program. The new program was based on the 5 Magnet® model components, the Synergy Professional Practice Model, and a point system which enabled nurses to utilize activities in many areas, thereby allowing them to capitalize on their strengths. The purpose of this article is to discuss the development, revision, implementation, and lessons learned in creating and revising CAP.

  4. Learning Spring application development

    CERN Document Server

    Soni, Ravi Kant

    2015-01-01

    This book is intended for those who are interested in learning the core features of the Spring Framework. Prior knowledge of Java programming and web development concepts with basic XML knowledge is expected.

  5. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night and real-time analysis is often desired. Thus, modern astronomy requires big data know-how; in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results, discuss main challenges...

  6. Working with and Visualizing Big Data Efficiently with Python for the DARPA XDATA Program

    Science.gov (United States)

    2017-08-01

    program focused on computational techniques and software tools for analyzing large volumes of data, both semi-structured (e.g. tabular, relational...effort performed was to support the DARPA XDATA program by developing computational techniques and software tools for analyzing large volumes of data...serve up a mixture of CSV files, SQL databases, and MongoDB databases using a common interface. In addition, any client computation done on Blaze

  7. Framework Spring

    OpenAIRE

    Bobkov, Pavel

    2010-01-01

    The aim of the thesis is to introduce the reader to the Spring framework and describe it as a convenient tool for rapid application development and for launching projects. It is necessary to place Spring in a broader context, so the thesis also notes all the relevant technologies that are closely related to Spring or on which Spring is based. The first step to understanding Spring is a basic knowledge of Java EE. The thesis presents the architecture of Java EE while discussing its flaws...

  8. Hood River Production Program Monitoring and Evaluation (M&E) - Confederated Tribes of Warm Springs : Annual Report For Fiscal Year, October 2007 – September 2008.

    Energy Technology Data Exchange (ETDEWEB)

    Gerstenberger, Ryan [Confederated Tribes of Warm Springs Reservation

    2009-07-27

    This progress report describes work performed by the Confederated Tribes of Warm Springs (CTWSRO) portion of the Hood River Production Program Monitoring and Evaluation Project (HRPP) during the 2008 fiscal year. A total of 64,736 hatchery winter steelhead, 12,108 hatchery summer steelhead, and 68,426 hatchery spring Chinook salmon smolts were acclimated and released in the Hood River basin during the spring. The HRPP exceeded the program goal for a release of 50,000 winter steelhead but fell short of the release goals of 30,000 summer steelhead and 75,000 spring Chinook in 2008. Passive Integrated Transponder (PIT) tags were implanted in 6,652 hatchery winter steelhead and 1,196 hatchery summer steelhead to compare migratory attributes and survival rates of hatchery fish released into the Hood River. Water temperatures were recorded at six locations within the Hood River subbasin to monitor for compliance with Oregon Department of Environmental Quality water quality standards. A preseason spring Chinook salmon adult run forecast was generated, which predicted an abundant return adequate to meet escapement goal and brood stock needs. As a result, the tribal and sport fisheries were opened. A tribal creel was conducted from May 22 to July 18, during which an estimated 172 spring Chinook were harvested. One hundred sixteen spring Chinook salmon redds were observed and 72 carcasses were inspected on 19.4 miles of spawning grounds throughout the Hood River Basin during 2008. Annual salvage operations were completed in two irrigation canals, resulting in the liberation of 1,641 fish back to the Hood River.

  9. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  10. Big Bang! An Evaluation of NASA's Space School Musical Program for Elementary and Middle School Learners

    Science.gov (United States)

    Haden, C.; Styers, M.; Asplund, S.

    2015-12-01

    Music and the performing arts can be a powerful way to engage students in learning about science. Research suggests that content-rich songs enhance student understanding of science concepts by helping students develop content-based vocabulary, by providing examples and explanations of concepts, and connecting to personal and situational interest in a topic. Building on the role of music in engaging students in learning, and on best practices in out-of-school time learning, the NASA Discovery and New Frontiers program in association with Jet Propulsion Laboratory, Marshall Space Flight Center, and KidTribe developed Space School Musical. Space School Musical consists of a set of nine songs and 36 educational activities to teach elementary and middle school learners about the solar system and space science through an engaging storyline and the opportunity for active learning. In 2014, NASA's Jet Propulsion Laboratory contracted with Magnolia Consulting, LLC to conduct an evaluation of Space School Musical. Evaluators used a mixed methods approach to address evaluation questions related to educator professional development experiences, program implementation and perceptions, and impacts on participating students. Measures included a professional development feedback survey, facilitator follow-up survey, facilitator interviews, and a student survey. Evaluation results showed that educators were able to use the program in a variety of contexts and in different ways to best meet their instructional needs. They noted that the program worked well for diverse learners and helped to build excitement for science through engaging all learners in the musical. Students and educators reported positive personal and academic benefits to participating students. We present findings from the evaluation and lessons learned about integration of the arts into STEM education.

  11. Net Zero Pilot Program Lights the Path to Big Savings in Guam

    Energy Technology Data Exchange (ETDEWEB)

    PNNL

    2016-11-03

    Case study describes how the Army Reserve 9th Mission Support Command (MSC) reduced lighting energy consumption by 62% for a total savings of 125,000 kWh and more than $50,000 per year by replacing over 400 fluorescent troffers with 36 W LED troffers. This project was part of the Army Reserve Net Zero Pilot Program, initiated in 2013, to reduce energy and water consumption, waste generation, and utility costs.
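The figures in the case study can be cross-checked with straightforward arithmetic: if 125,000 kWh/yr is a 62% reduction, the pre-retrofit lighting load and the implied electricity rate follow directly. In the sketch below the $/kWh rate is an assumption chosen to match the reported dollar savings (Guam's rates are unusually high), not a number from the case study.

```java
public class LedRetrofitMath {
    // Pre-retrofit consumption implied by the saved energy and the reduction fraction.
    static double baselineKwh(double savedKwh, double reductionFraction) {
        return savedKwh / reductionFraction;
    }

    // Annual dollar savings at a given electricity rate.
    static double dollarSavings(double savedKwh, double ratePerKwh) {
        return savedKwh * ratePerKwh;
    }

    public static void main(String[] args) {
        double saved = 125_000;                       // kWh/yr saved (from the case study)
        double baseline = baselineKwh(saved, 0.62);   // implied pre-retrofit load, ~201,600 kWh/yr
        double dollars = dollarSavings(saved, 0.40);  // $0.40/kWh is an assumed Guam rate
        System.out.printf("baseline ~%.0f kWh/yr, savings ~$%.0f/yr%n", baseline, dollars);
    }
}
```

An assumed rate of roughly $0.40/kWh reproduces the "more than $50,000 per year" figure, which is consistent with Guam's high utility costs.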

  12. Job schedulers for Big data processing in Hadoop environment: testing real-life schedulers using benchmark programs

    Directory of Open Access Journals (Sweden)

    Mohd Usama

    2017-11-01

    Full Text Available At present, big data is very popular because it has proven successful in many fields, such as social media and e-commerce transactions. Big data describes the tools and technologies needed to capture, manage, store, distribute, and analyze petabyte or larger-sized datasets having different structures with high speed. Big data can be structured, unstructured, or semi-structured. Hadoop is an open source framework that is used to process large amounts of data in an inexpensive and efficient way, and job scheduling is a key factor for achieving high performance in big data processing. This paper gives an overview of big data and highlights the problems and challenges in big data. It then highlights the Hadoop Distributed File System (HDFS), Hadoop MapReduce, and various parameters that affect the performance of job scheduling algorithms in big data, such as Job Tracker, Task Tracker, Name Node, Data Node, etc. The primary purpose of this paper is to present a comparative study of job scheduling algorithms along with their experimental results in a Hadoop environment. In addition, this paper describes the advantages, disadvantages, features, and drawbacks of various Hadoop job schedulers such as FIFO, Fair, Capacity, Deadline Constraints, Delay, LATE, and Resource Aware, and provides a comparative study among these schedulers.
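The difference between the FIFO and Fair schedulers compared in the paper can be caricatured with a single-slot toy simulation. This sketch is not Hadoop's actual implementation (which allocates map/reduce slots across pools); it only shows why fair sharing helps short jobs that arrive behind a long one, with invented job durations:

```java
import java.util.*;

public class SchedulerSketch {
    // FIFO: jobs run strictly in arrival order; returns each job's completion time.
    static int[] fifoCompletion(int[] durations) {
        int[] done = new int[durations.length];
        int t = 0;
        for (int i = 0; i < durations.length; i++) { t += durations[i]; done[i] = t; }
        return done;
    }

    // "Fair" as round-robin, one time unit per turn: a crude stand-in for
    // Hadoop's Fair Scheduler giving every running job an equal share.
    static int[] fairCompletion(int[] durations) {
        int n = durations.length;
        int[] remaining = durations.clone();
        int[] done = new int[n];
        int t = 0, left = n;
        while (left > 0) {
            for (int i = 0; i < n; i++) {
                if (remaining[i] > 0) {
                    t++;
                    if (--remaining[i] == 0) { done[i] = t; left--; }
                }
            }
        }
        return done;
    }

    public static void main(String[] args) {
        int[] jobs = {8, 1, 1};  // one long job arrives first, two short jobs behind it
        System.out.println("FIFO: " + Arrays.toString(fifoCompletion(jobs))); // [8, 9, 10]
        System.out.println("Fair: " + Arrays.toString(fairCompletion(jobs))); // [10, 2, 3]
    }
}
```

Under FIFO the short jobs wait for the long one (finishing at t=9 and t=10); under fair sharing they finish at t=2 and t=3 while the long job is only slightly delayed, which is the trade-off the surveyed schedulers navigate.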

  13. Grande Ronde Endemic Spring Chinook Salmon Supplementation Program : Facility Operations and Maintenance, 2004 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    McLean, Michael L.; Seeger, Ryan; Hewitt, Laurie (Confederated Tribes of the Umatilla Indian Reservation, Department of Natural Resources, Pendleton, OR)

    2005-02-01

    There were 2 acclimation periods at the Catherine Creek Acclimation Facility (CCAF) in 2004. During the early acclimation period, 92,475 smolts were delivered from Lookingglass Hatchery (LGH) on 8 March. This group was comprised entirely of progeny from the captive broodstock program. The size of the fish at delivery was 23.1 fish/lb. Volitional releases began 15 March 2004 and ended 22 March with an estimated total (based on PIT tag detections of 1,475) of 8,785 fish leaving the raceways. This was 9.5% of the total fish delivered. Fish remaining in the raceways after volitional release were forced out. Hourly detections of PIT-tagged fish showed that most of the fish left between 1200 and 2000 hours which was similar to the hourly temperature profile. The size of the fish just before the volitional release was 23.1 and the size of the fish remaining just before the forced release was 23.5 fish/lb. The total mortality for the acclimation period was 62 (0.07 %). The total number of fish released from the acclimation facility during the early period was 92,413. During the second acclimation period 70,977 smolts were delivered from LGH on 24 March. This group was comprised entirely of progeny from the conventional broodstock program. The size of the fish at delivery was 23.4 fish/lb. Volitional releases began 30 March 2004 and ended 12 April with an estimated total (based on PIT tag detections of 3,632) of 49,147 fish leaving the raceways. This was 69.2% of the total fish delivered. Fish remaining in the raceways after volitional release were forced out. Hourly detections of PIT-tagged fish showed that most of the fish left between 1200 and 2000 hours which was similar to the hourly temperature profile. The size of the fish just before the volitional release was 23.4 and the size of the fish remaining just before the forced release was 23.9 fish/lb. The total mortality for the acclimation period was 18 (0.03 %). 
The total number of fish released from the acclimation

  14. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic pronouncements are many, research on Big Data in the public sector...

  15. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  16. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  17. Beginning Spring

    CERN Document Server

    Caliskan, Mert

    2015-01-01

    Get up to speed quickly with this comprehensive guide to Spring. Beginning Spring is the complete beginner's guide to Java's most popular framework. Written with an eye toward real-world enterprises, the book covers all aspects of application development within the Spring Framework. Extensive samples within each chapter allow developers to get up to speed quickly by providing concrete references for experimentation, building a skillset that drives successful application development by exploiting the full capabilities of Java's latest advances. Spring provides the exact toolset required to build an ent

  18. Just Spring

    CERN Document Server

    Konda, Madhusudhan

    2011-01-01

    Get a concise introduction to Spring, the increasingly popular open source framework for building lightweight enterprise applications on the Java platform. This example-driven book for Java developers delves into the framework's basic features, as well as advanced concepts such as containers. You'll learn how Spring makes Java Messaging Service easier to work with, and how its support for Hibernate helps you work with data persistence and retrieval. Throughout Just Spring, you'll get your hands deep into sample code, beginning with a problem that illustrates dependency injection, Spring's co
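The dependency injection that both of these introductory Spring books revolve around can be illustrated without the framework at all. The plain-Java sketch below (all class names are invented for illustration) shows constructor injection: the service receives its collaborator from outside instead of constructing it, which is exactly the wiring a Spring container performs declaratively.

```java
// A dependency expressed as an interface, so implementations can be swapped.
interface MessageSender {
    String send(String msg);
}

class LoggingSender implements MessageSender {
    public String send(String msg) { return "sent: " + msg; }
}

class OrderService {
    private final MessageSender sender;   // dependency supplied from outside

    OrderService(MessageSender sender) { this.sender = sender; }

    String confirm(String orderId) {
        return sender.send("order " + orderId + " confirmed");
    }
}

public class DiSketch {
    public static void main(String[] args) {
        // What a Spring container does from configuration, done by hand:
        OrderService service = new OrderService(new LoggingSender());
        System.out.println(service.confirm("42")); // prints "sent: order 42 confirmed"
    }
}
```

Because `OrderService` never names a concrete sender, a test can inject a stub `MessageSender` with no framework involved; Spring's contribution is automating this wiring at scale.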

  19. [Advances in early childhood development: from neurons to big scale programs].

    Science.gov (United States)

    Pérez-Escamilla, Rafael; Rizzoli-Córdoba, Antonio; Alonso-Cuevas, Aranzazú; Reyes-Morales, Hortensia

    Early childhood development (ECD) is the basis of countries' economic and social development and their ability to meet the Sustainable Development Goals (SDGs). Gestation and the first three years of life are critical for children to have adequate physical, psychosocial, emotional and cognitive development for the rest of their lives. Nurturing care and protection of children during gestation and early childhood are necessary for the development of trillions of neurons and trillions of synapses necessary for development. ECD requires access to good nutrition and health services from gestation, responsive caregiving according to the child's developmental stage, social protection and child welfare, and early stimulation and learning opportunities. Six actions are recommended to improve national ECD programs: expand political will and funding; create a supportive, evidence-based policy environment; build capacity through inter-sectoral coordination; ensure fair and transparent governance of programs and services; increase support for multidisciplinary research; and promote the development of leaders. Mexico has made significant progress under the leadership of the Health Ministry, but still faces significant challenges. The recent creation of a national inter-sectoral framework to enable ECD with support of international organizations and the participation of civil society organizations can help overcome these challenges. Copyright © 2017 Hospital Infantil de México Federico Gómez. Publicado por Masson Doyma México S.A. All rights reserved.

  20. Big Data: Are Biomedical and Health Informatics Training Programs Ready? Contribution of the IMIA Working Group for Health and Medical Informatics Education.

    Science.gov (United States)

    Otero, P; Hersh, W; Jai Ganesh, A U

    2014-08-15

    The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals.

  1. Environmental assessment for an experimental skunk removal program to increase duck production on Big Stone National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The primary goal of Big Stone National Wildlife Refuge is duck production; specifically, waterfowl maintenance, preservation and enhancement of diversity of...

  2. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  3. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions and locations, having a diverse knowledge set and capable of tackling more and more complex problems. This poses the question of whether Big Egos continue to dominate in this rising paradigm of big science. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize a stochastic actor oriented model (SAOM) to analyze both network endogenous mechanisms and individual agency driving the collaboration network, and further whether being a Big Ego in Big Science translates to increasing performance. Our findings suggest that the selection of collaborators is not based

  4. Not Just for Big Dogs: the NSF Career Program from AN Undergraduate College Perspective

    Science.gov (United States)

    Harpp, K. S.

    2011-12-01

    Relatively few NSF CAREER grants are awarded to faculty at undergraduate colleges, leading to a perception that the program is geared for major research institutions. The goal of this presentation is to dispel this misconception by describing a CAREER grant at a small, liberal arts institution. Because high quality instruction is the primary mission of undergraduate colleges, the career development plan for this proposal was designed to use research as a teaching tool. Instead of distinct sets of objectives for the research and education components, the proposal's research and teaching plans were integrated across the curriculum to maximize opportunities for undergraduate engagement. The driving philosophy was that students learn science by doing it. The proposal plan therefore created opportunities for students to be involved in hands-on, research-driven projects from their first through senior years. The other guiding principle was that students become engaged in science when they experience its real life applications. Stage 1 of the project provided mechanisms to draw students into science in two ways. The first was development of an inquiry-based curriculum for introductory classes, emphasizing practical applications and hands-on learning. The goal was to energize, generate confidence, and provide momentum for early science students to pursue advanced courses. The second mechanism was the development of a science outreach program for area K-9 schools, designed and implemented by undergraduates, an alternative path for students to discover science. Stages 2 and 3 consisted of increasingly advanced project-based courses, with in-depth training in research skills. The courses were designed along chemical, geological, and environmental themes, to capture the most student interest. The students planned their projects within a set of constraints designed to lead them to fundamental concepts and centered on questions of importance to the local community, thereby

  5. Big Buildings Meet Big Data

    National Research Council Canada - National Science Library

    Paul Ehrlich

    2013-01-01

Big data comprises government and business servers that are collecting and analyzing massive amounts of data on everything from weather to web browsing, shopping habits to emails, on to phone calls...

  6. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
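As a toy illustration of the kind of declaratively expressed quality rule the abstract describes (a sequential sketch, not BigDansing's actual interface), here is a functional-dependency check that enumerates pairs of tuples:

```python
from itertools import combinations

# Illustrative sketch: a data quality rule as a predicate over tuple pairs,
# the kind of rule a cleansing system detects violations of in parallel.
records = [
    {"zip": "10001", "city": "New York"},
    {"zip": "10001", "city": "NYC"},      # violates zip -> city
    {"zip": "60601", "city": "Chicago"},
]

def fd_violations(data, lhs, rhs):
    """Return tuple-pair violations of the functional dependency lhs -> rhs."""
    return [(a, b) for a, b in combinations(data, 2)
            if a[lhs] == b[lhs] and a[rhs] != b[rhs]]

print(fd_violations(records, "zip", "city"))  # one violating pair
```

Enumerating all pairs is quadratic, which is exactly why scaling this to big datasets requires the shared scans and specialized joins the paper describes.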

  7. Putting Big Data to Work: Community Colleges Use Detailed Reports to Design Smarter Workforce Training and Education Programs

    Science.gov (United States)

    Woods, Bob

    2013-01-01

    In this article, Bob Woods reports that "Big data" is all the rage on college campuses, and it makes sense that administrators would use what they know to boost student outcomes. Woods points out that community colleges around the country are using the data: (1) to guide the systematic expansion of its curriculum, providing targeted…

  8. Spring performance tester for miniature extension springs

    Science.gov (United States)

    Salzbrenner, Bradley; Boyce, Brad

    2017-05-16

A spring performance tester and method of testing a spring are disclosed that have improved accuracy and precision over prior art spring testers. The tester can perform static and cyclic testing. The spring tester can provide validation for product acceptance as well as test for cyclic degradation of springs, such as the change in the spring rate and fatigue failure.
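The cyclic-degradation quantity mentioned above, a change in spring rate, can be sketched with Hooke's law (rate k = F/x); the force and deflection numbers below are hypothetical, not from the patent:

```python
# Hypothetical sketch: spring rate as force over deflection, and the kind of
# cyclic-degradation check such a tester quantifies -- a drop in that rate.
def spring_rate(force_n: float, deflection_mm: float) -> float:
    return force_n / deflection_mm

k_initial = spring_rate(2.0, 4.0)   # rate before cycling, N/mm
k_after   = spring_rate(1.8, 4.0)   # rate after cyclic testing, N/mm
degradation = (k_initial - k_after) / k_initial
print(round(degradation * 100, 1))  # percent loss in spring rate
```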

  9. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  10. Energy Matters - Spring 2002

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-03-01

Quarterly newsletter from DOE's Industrial Technologies Program to promote the use of energy-efficient industrial systems. The Spring 2002 issue of Energy Matters focuses on premium energy efficiency systems, with articles on new gas technologies, steam efficiency, the Augusta Newsprint Showcase, and more.

  11. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  12. Instant Spring security starter

    CERN Document Server

    Jagielski, Piotr

    2013-01-01

Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow format following the Starter guide approach.This book is for people who have not used Spring Security before and want to learn how to use it effectively in a short amount of time. It is assumed that readers know both Java and HTTP protocol at the level of basic web programming. The reader should also be familiar with Inversion-of-Control/Dependency Injection, preferably with the Spring framework itself.

  13. Big Data and Big Science

    CERN Document Server

    Di Meglio, Alberto

    2014-04-14

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  14. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  15. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  16. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  17. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
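The core idea behind sort-based inequality joins can be sketched for a single condition R.a < S.b (a simplification for illustration, not the full IEJoin algorithm with its bit-arrays and multi-condition handling):

```python
from bisect import bisect_right

# Sketch: for the inequality join R.a < S.b, sorting S once lets every
# R-tuple locate all of its matches with a single binary search, instead
# of scanning S in full -- the search-space reduction the abstract describes.
R = [3, 8, 1]
S = [2, 9, 5]

s_sorted = sorted(S)
pairs = []
for a in R:
    idx = bisect_right(s_sorted, a)        # first S value strictly greater than a
    pairs.extend((a, b) for b in s_sorted[idx:])

print(sorted(pairs))
```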

  18. NASA thunderstorm overflight program: Atmospheric electricity research. An overview report on the optical lightning detection experiment for spring and summer 1983

    Science.gov (United States)

    Vaughan, O. H., Jr.

    1984-01-01

    This report presents an overview of the NASA Thunderstorm Overflight Program (TOP)/Optical Lightning Experiment (OLDE) being conducted by the Marshall Space Flight Center and university researchers in atmospheric electricity. Discussed in this report are the various instruments flown on the NASA U-2 aircraft, as well as the ground instrumentation used in 1983 to collect optical and electronic signatures from the lightning events. Samples of some of the photographic and electronic signatures are presented. Approximately 4132 electronic data samples of optical pulses were collected and are being analyzed by the NASA and university researchers. A number of research reports are being prepared for future publication. These reports will provide more detailed data analysis and results from the 1983 spring and summer program.

  19. Big queues

    CERN Document Server

    Ganesh, Ayalvadi; Wischik, Damon

    2004-01-01

    Big Queues aims to give a simple and elegant account of how large deviations theory can be applied to queueing problems. Large deviations theory is a collection of powerful results and general techniques for studying rare events, and has been applied to queueing problems in a variety of ways. The strengths of large deviations theory are these: it is powerful enough that one can answer many questions which are hard to answer otherwise, and it is general enough that one can draw broad conclusions without relying on special case calculations.

  20. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  1. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using conventional data handling applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, conflicts, etc., we require bigger data sets than the smaller ones used before. Big data is hard to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we give an overview of the Hadoop architecture, different tools used for big data, and its security issues.

  2. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  3. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector is...... so far limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Dansk AlmenMedicinsk Database (DAMD......). The analysis shows that (re)use of data in new contexts is a multi-faceted trade-off not only between economic rationales and quality considerations, but also control over sensitive personal data and ethical implications for the citizen. In the DAMD case, data is, on the one hand, used 'in service of the good cause' to...

  4. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportunities.

  5. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  6. Assessment of High Rates of Precocious Male Maturation in a Spring Chinook Salmon Supplementation Hatchery Program, Annual Report 2002-2003.

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Donald; Beckman, Brian; Cooper, Kathleen

    2003-08-01

    The Yakima River Spring Chinook Salmon Supplementation Project in Washington State is currently one of the most ambitious efforts to enhance a natural salmon population in the United States. Over the past five years we have conducted research to characterize the developmental physiology of naturally- and hatchery-reared wild progeny spring chinook salmon (Oncorhynchus tshawytscha) in the Yakima River basin. Fish were sampled at the main hatchery in Cle Elum, at remote acclimation sites and, during smolt migration, at downstream dams. Throughout these studies the maturational state of all fish was characterized using combinations of visual and histological analysis of testes, gonadosomatic index (GSI), and measurement of plasma 11-ketotestosterone (11-KT). We established that a plasma 11-KT threshold of 0.8 ng/ml could be used to designate male fish as either immature or precociously maturing approximately 8 months prior to final maturation (1-2 months prior to release as 'smolts'). Our analyses revealed that 37-49% of the hatchery-reared males from this program undergo precocious maturation at 2 years of age and a proportion of these fish appear to residualize in the upper Yakima River basin throughout the summer. An unnaturally high incidence of precocious male maturation may result in loss of potential returning anadromous adults, skewing of female: male sex ratios, ecological, and genetic impacts on wild populations and other native species. Precocious male maturation is significantly influenced by growth rate at specific times of year and future studies will be conducted to alter maturation rates through seasonal growth rate manipulations.
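The 0.8 ng/ml plasma 11-KT threshold described above amounts to a simple classification rule; the sample values in this sketch are hypothetical, not measurements from the study:

```python
# Sketch of the classification rule from the abstract: a plasma
# 11-ketotestosterone (11-KT) level above 0.8 ng/ml designates a male as
# precociously maturing; at or below it, immature. Sample values are made up.
THRESHOLD_NG_ML = 0.8

def classify_male(kt_ng_ml: float) -> str:
    return "precociously maturing" if kt_ng_ml > THRESHOLD_NG_ML else "immature"

samples = [0.2, 0.79, 1.5, 3.1]
print([classify_male(v) for v in samples])
```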

  7. Genomic Selection for Processing and End-Use Quality Traits in the CIMMYT Spring Bread Wheat Breeding Program

    Directory of Open Access Journals (Sweden)

    Sarah D. Battenfield

    2016-07-01

Full Text Available Wheat (Triticum aestivum L.) cultivars must possess suitable end-use quality for release and consumer acceptability. However, breeding for quality traits is often considered a secondary target relative to yield largely because of amount of seed needed and expense. Without testing and selection, many undesirable materials are advanced, expending additional resources. Here, we develop and validate whole-genome prediction models for end-use quality phenotypes in the CIMMYT bread wheat breeding program. Model accuracy was tested using forward prediction on breeding lines (n = 5520) tested in unbalanced yield trials from 2009 to 2015 at Ciudad Obregon, Sonora, Mexico. Quality parameters included test weight, 1000-kernel weight, hardness, grain and flour protein, flour yield, sodium dodecyl sulfate sedimentation, Mixograph and Alveograph performance, and loaf volume. In general, prediction accuracy substantially increased over time as more data was available to train the model. Reflecting practical implementation of genomic selection (GS) in the breeding program, forward prediction accuracies for quality parameters were assessed in 2015 and ranged from 0.32 (grain hardness) to 0.62 (mixing time). Increased selection intensity was possible with GS since more entries can be genotyped than phenotyped, and expected genetic gain was 1.4 to 2.7 times higher across all traits than phenotypic selection. Given the limitations in measuring many lines for quality, we conclude that GS is a powerful tool to facilitate early generation selection for end-use quality in wheat, leaving larger populations for selection on yield during advanced testing and leading to better gain for both quality and yield in bread wheat breeding programs.

  8. Hunting Plan : Big Stone National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Big Stone National Wildlife Refuge Hunting Plan provides guidance for the management of hunting on the refuge. Hunting program objectives include providing a...

  9. Big Brothers/Big Sisters: A Study of Volunteer Recruitment and Screening.

    Science.gov (United States)

    Roaf, Phoebe A.; And Others

    Since 1988, Public/Private Ventures of Philadelphia (Pennsylvania) has been conducting a series of studies of mentoring programs for at-risk youth. As part of this effort, the recruitment and screening procedures used by Big Brother/Big Sister (BB/BS) agencies were studied in eight cities. Recruitment for the high-profile BB/BS agencies is not as…

  10. Final Program Report for 2010-2012: Monitoring and evaluation for conserving biological resources of the Spring Mountains National Recreation Area

    Science.gov (United States)

    Stephen J. Solem; Burton K. Pendleton; Casey Giffen; Marc Coles-Ritchie; Jeri Ledbetter; Kevin S. McKelvey; Joy Berg; Jim Menlove; Carly K. Woodlief; Luke A. Boehnke

    2013-01-01

    The Spring Mountains National Recreation Area (SMNRA) includes approximately 316,000 acres of National Forest System (NFS) lands managed by the Humboldt-Toiyabe National Forest in Clark and Nye Counties, Nevada (see fig. 1-1). The Spring Mountains have long been recognized as an island of endemism, harboring flora and fauna found nowhere else in the world. Conservation...

  11. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  12. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

Much has been written on the benefits of big data for healthcare such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. However, more recently, healthcare researchers are exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare, in general, and specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to more improved healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  13. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  14. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  15. Big Data Analytics

    Indian Academy of Sciences (India)

But analysing massive amounts of data available in the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what is big data, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics ...

  16. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  17. Thermal springs of Wyoming

    Energy Technology Data Exchange (ETDEWEB)

    Breckenridge, R.M.; Hinckley, B.S.

    1978-01-01

    This bulletin attempts, first, to provide a comprehensive inventory of the thermal springs of Wyoming; second, to explore the geologic and hydrologic factors producing these springs; and, third, to analyze the springs collectively as an indicator of the geothermal resources of the state. A general discussion of the state's geology and the mechanisms of thermal spring production, along with a brief comparison of Wyoming's springs with worldwide thermal features are included. A discussion of geothermal energy resources, a guide for visitors, and an analysis of the flora of Wyoming's springs follow the spring inventory. The listing and analysis of Wyoming's thermal springs are arranged alphabetically by county. Tabulated data are given on elevation, ownership, access, water temperature, and flow rate. Each spring system is described and its history, general characteristics and uses, geology, hydrology, and chemistry are discussed. (MHR)

  18. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  19. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  20. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well-defined sections for the different parts of Spring. Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  1. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which is related to storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which addresses the aspects of big data management from both a technological and a business perspective.

  2. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
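
    The MapReduce parallel processing schema mentioned above can be reduced to three phases: map, shuffle, and reduce. The sketch below is an illustrative plain-Python reduction of the model (the function and variable names are my own, not Hadoop's API), using the classic word-count task:

```python
from collections import defaultdict

def map_phase(split):
    # Map: each input split independently emits (word, 1) pairs.
    return [(word.lower(), 1) for word in split.split()]

def shuffle(pairs):
    # Shuffle: group the intermediate pairs by key before reduction.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the grouped values for each key.
    return {key: sum(values) for key, values in groups.items()}

splits = ["big data needs big tools", "spark extends the mapreduce model"]
intermediate = [pair for s in splits for pair in map_phase(s)]
counts = reduce_phase(shuffle(intermediate))
print(counts["big"])  # 2
```

    Because the map phase touches each split independently and the reduce phase only sees grouped keys, a framework such as Hadoop or Spark can distribute both phases across a cluster without changing the program's logic.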

  3. Water Treatment Technology - Springs.

    Science.gov (United States)

    Ross-Harrington, Melinda; Kincaid, G. David

    One of twelve water treatment technology units, this student manual on springs provides instructional materials for two competencies. (The twelve units are designed for a continuing education training course for public water supply operators.) The competencies focus on spring basin construction and spring protection. For each competency, student…

  4. Big fundamental groups: generalizing homotopy and big homotopy

    OpenAIRE

    Penrod, Keith

    2014-01-01

    The concept of big homotopy theory was introduced by J. Cannon and G. Conner using big intervals of arbitrarily large cardinality to detect big loops. We find, for each space, a canonical cardinal that is sufficient to detect all big loops and all big homotopies in the space.

  5. Executive summary: Weldon Spring Site Environmental Report for calendar year 1992. Weldon Spring Site Remedial Action Project, Weldon Spring, Missouri

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    This report has been prepared to provide information about the public safety and environmental protection programs conducted by the Weldon Spring Site Remedial Action Project. The Weldon Spring site is located in southern St. Charles County, Missouri, approximately 48 km (30 mi) west of St. Louis. The site consists of two main areas, the Weldon Spring Chemical Plant and raffinate pits and the Weldon Spring Quarry. The objectives of the Site Environmental Report are to present a summary of data from the environmental monitoring program, to characterize trends and environmental conditions at the site, and to confirm compliance with environmental and health protection standards and requirements. The report also presents the status of remedial activities and the results of monitoring these activities to assess their impacts on the public and environment. The scope of the environmental monitoring program at the Weldon Spring site has changed since it was initiated. Previously, the program focused on investigations of the extent and level of contaminants in the groundwater, surface waters, buildings, and air at the site. In 1992, the level of remedial activities required monitoring for potential impacts of those activities, particularly on surface water runoff and airborne effluents. This report includes monitoring data from routine radiological and nonradiological sampling activities. These data include estimates of dose to the public from the Weldon Spring site; estimates of effluent releases; and trends in groundwater contaminant levels. Also, applicable compliance requirements, quality assurance programs, and special studies conducted in 1992 to support environmental protection programs are reviewed.

  6. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  7. Matrix Big Brunch

    OpenAIRE

    Bedford, J; Papageorgakis, C.; Rodriguez-Gomez, D.; Ward, J.

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  8. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = {lambda}/{Delta}{lambda} = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k{sub max} = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k{sub max} = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  9. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schonberg and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on the fact that we identify patterns in the data rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  10. Big Data Components for Business Process Optimization

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2016-01-01

    Full Text Available In these days, more and more people talk about Big Data, Hadoop, noSQL and so on, but very few technical people have the necessary expertise and knowledge to work with those concepts and technologies. The present issue explains one of the concepts that stands behind two of those keywords: the MapReduce concept. The MapReduce model is the one that makes Big Data and Hadoop so powerful, fast, and diverse for business process optimization. MapReduce is a programming model with an implementation built to process and generate large data sets. In addition, the benefits of integrating Hadoop in the context of Business Intelligence and Data Warehousing applications are presented. The concepts and technologies behind big data let organizations reach a variety of objectives. Like other new information technologies, the most important objective of big data technology is to bring dramatic cost reduction.
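
    As a rough illustration of why the MapReduce model parallelizes so well, the sketch below (plain Python with hypothetical names, not Hadoop's actual Java API) runs independent map tasks concurrently and merges their partial results in a reduce step:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def map_task(split):
    # Map tasks share no state, so a framework can run one per
    # input split on any node of the cluster.
    return Counter(split.lower().split())

def reduce_task(partials):
    # Reduce merges the partial counts produced by the map tasks.
    total = Counter()
    for partial in partials:
        total += partial
    return total

splits = [
    "hadoop makes big data processing scalable",
    "mapreduce keeps the processing model simple",
]
with ThreadPoolExecutor(max_workers=2) as pool:
    totals = reduce_task(pool.map(map_task, splits))
print(totals["processing"])  # 2
```

    Here threads stand in for cluster nodes; the point is only that the map tasks need no coordination, which is the property Hadoop exploits at scale.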

  11. Spring Framework 5: Themes & Trends

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Spring Framework 5.0/5.1, scheduled for release in early/late 2017, focuses on several key themes: reactive web applications based on Reactive Streams, comprehensive support for JDK 9 and HTTP/2, as well as the latest API generations in the Enterprise Java ecosystem. This talk presents the overall story in the context of wider industry trends, highlighting Spring’s unique programming model strategy.

  12. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of registration, and IT-competent employees and customers, which make a leading position possible, but only if companies get ready for the next big data wave.

  13. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative inquiry, such as video ethnography, ethnovideo, performance documentation, anthropology and multimodal interaction analysis. That is why we put forward, half-jokingly at first, a Big Video manifesto to spur innovation in the Digital Humanities.

  14. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  15. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  16. Framework Spring MVC

    OpenAIRE

    Jindráček, Petr

    2011-01-01

    The topic of this bachelor thesis is the web application framework Spring MVC, which is an integral part of the Spring platform; as such, it offers many options for customization and support for other significant technologies. The aim is to introduce the basic principles of this framework on a theoretical level and subsequently examine them on a real example application. The thesis is divided into three main parts. The first part is focused on the Spring framework in general to introduce basic princip...

  17. Spring integration essentials

    CERN Document Server

    Pandey, Chandan

    2015-01-01

    This book is intended for developers who are either already involved with enterprise integration or planning to venture into the domain. Basic knowledge of Java and Spring is expected. For newer users, this book can be used to understand an integration scenario, what the challenges are, and how Spring Integration can be used to solve it. Prior experience of Spring Integration is not expected as this book will walk you through all the code examples.

  18. Pro Spring Integration

    CERN Document Server

    Lui, M; Chan, Andy; Long, Josh

    2011-01-01

    Pro Spring Integration is an authoritative book from the experts that guides you through the vast world of enterprise application integration (EAI) and application of the Spring Integration framework towards solving integration problems. The book is: * An introduction to the concepts of enterprise application integration * A reference on building event-driven applications using Spring Integration * A guide to solving common integration problems using Spring Integration. What makes this book unique is its coverage of contemporary technologies and real-world information, with a focus on common p

  19. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  20. Further evidence of Mirafiori lettuce big-vein virus but not of Lettuce big-vein associated virus with big-vein disease in lettuce.

    Science.gov (United States)

    Sasaya, Takahide; Fujii, Hiroya; Ishikawa, Koichi; Koganezawa, Hiroki

    2008-04-01

    Mirafiori lettuce big-vein virus (MLBVV) and Lettuce big-vein associated virus (LBVaV) are found in association with big-vein disease of lettuce. Discrimination between the two viruses is critical for elucidating the etiology of big-vein disease. Using specific antibodies to MLBVV and LBVaV for western blotting and exploiting differences between MLBVV and LBVaV in host reaction of cucumber and temperature dependence in lettuce, we separated the two viruses by transferring each virus from doubly infected lettuce plants to cucumber or lettuce plants. A virus-free fungal isolate was allowed to acquire the two viruses individually or together. To confirm the separation, zoospores from MLBVV-, LBVaV-, and dually infected lettuce plants were used for serial inoculations of lettuce seedlings 12 successive times. Lettuce seedlings were infected at each transfer either with MLBVV alone, LBVaV alone, or both viruses together, depending on the virus carried by the vector. Lettuce seedlings infected with MLBVV alone developed the big-vein symptoms, while those infected with LBVaV alone developed no symptoms. In field surveys, MLBVV was consistently detected in lettuce plants from big-vein-affected fields, whereas LBVaV was detected in lettuce plants not only from big-vein-affected fields but also from big-vein-free fields. LBVaV occurred widely at high rates in winter-spring lettuce-growing regions irrespective of the presence of MLBVV and, hence, of the presence of the big-vein disease.

  1. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon in our society that can no longer be ignored. It has moved past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  2. Hydrogeologic assessment—Figeh Spring, Damascus, Syria

    Science.gov (United States)

    Lamoreaux, P. E.; Hughes, Travis H.; Memon, Bashir A.; Lineback, Neal

    1989-03-01

    Hydrogeological studies at Figeh Springs were directed to determine groundwater flow paths; recharge, storage, and discharge units; and the maximum reliable yield. The project was designed to provide information upon which to base pumpage to augment low-season flows from the spring, which is the major water supply for the city of Damascus, Syria. As a basis for conclusions and recommendations, work included extensive surface geologic mapping, air photographic interpretation, a detailed well and spring inventory, and a water quality sampling program. Geologic structural work included mapping of jointing, faulting, and folding, and an analysis of their impact on groundwater movement.

  3. BigDog

    Science.gov (United States)

    Playter, R.; Buehler, M.; Raibert, M.

    2006-05-01

    BigDog's goal is to be the world's most advanced quadruped robot for outdoor applications. BigDog is aimed at the mission of a mechanical mule - a category with few competitors to date: power autonomous quadrupeds capable of carrying significant payloads, operating outdoors, with static and dynamic mobility, and fully integrated sensing. BigDog is about 1 m tall, 1 m long and 0.3 m wide, and weighs about 90 kg. BigDog has demonstrated walking and trotting gaits, as well as standing up and sitting down. Since its creation in the fall of 2004, BigDog has logged tens of hours of walking, climbing and running time. It has walked up and down 25 & 35 degree inclines and trotted at speeds up to 1.8 m/s. BigDog has walked at 0.7 m/s over loose rock beds and carried over 50 kg of payload. We are currently working to expand BigDog's rough terrain mobility through the creation of robust locomotion strategies and terrain sensing capabilities.

  4. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  5. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles. However, Big Data is recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.
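
    To make the units mentioned above concrete, here is a small sketch that converts between byte units (decimal SI prefixes are assumed, i.e. each step is a factor of 1000; the function names are my own):

```python
# Decimal (SI) byte units; each step up is a factor of 1000.
UNITS = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def to_bytes(value, unit):
    # Convert a value in the given unit to a raw byte count.
    return value * 1000 ** UNITS.index(unit)

def humanize(num_bytes):
    # Pick the largest unit that keeps the numeric part >= 1.
    unit = "B"
    for candidate in UNITS:
        if num_bytes < 1000 ** UNITS.index(candidate):
            break
        unit = candidate
    return num_bytes / 1000 ** UNITS.index(unit), unit

print(to_bytes(1, "ZB"))    # 10**21 bytes in one zettabyte
print(humanize(3.2e21)[1])  # 'ZB'
```

    One zettabyte is thus a thousand exabytes, and a yottabyte a thousand zettabytes, which is why each new wave of global data growth has required coining a new prefix.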

  6. Draft Supplement to the Environmental Statement Fiscal Year 1976 Proposed Program : Facility Location Evaluation for Pebble Springs-Marion 500-KV Line Study Area 75-B.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1974-10-22

    Proposed is construction of an approximately 160 mile long, 500-kV, double-circuit transmission line from the proposed Pebble Springs Substation located southeast of Arlington, Oregon, to the existing Marion Substation, 11 miles west of Maupin, Oregon. Development is also proposed of a major switching complex, Pebble Springs Substation, near Arlington, Oregon. Depending on the final route chosen, from 45 to 71 miles of parallel, 9 to 42 miles of new, and 74 miles of existing right-of-way will be required. New access road requirements will range from 45 to 90 miles. Land use affected by the proposed facilities includes 800 to 855 acres of forestland removed from timber production. In addition, 50 miles of cropland, primarily wheat, and approximately 35 miles of grassland will be crossed. Disturbance to wildlife during construction will occur and habitat associated with the above land uses will be eliminated. Soil erosion and siltation, primarily during and immediately after construction will also occur. Visual impacts will occur near several highways, lakes, rivers, and recreation areas. Disturbances to nearby residents will occur during construction. An additional 45 acres of rangeland will be required for the proposed Pebble Springs Substation.

  7. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  8. Big Data ethics

    Directory of Open Access Journals (Sweden)

    Andrej Zwitter

    2014-11-01

    Full Text Available The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with specific and knowable outcomes, towards actions by many unaware that they may have taken actions with unintended consequences for anyone. Responses will require a rethinking of ethical choices, the lack thereof and how this will guide scientists, governments, and corporate agencies in handling Big Data. This essay elaborates on the ways Big Data impacts on ethical conceptions.

  9. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  10. Big Data Provenance: Challenges, State of the Art and Opportunities.

    Science.gov (United States)

    Wang, Jianwu; Crawl, Daniel; Purawat, Shweta; Nguyen, Mai; Altintas, Ilkay

    2015-01-01

    Ability to track provenance is a key feature of scientific workflows to support data lineage and reproducibility. The challenges that are introduced by the volume, variety and velocity of Big Data, also pose related challenges for provenance and quality of Big Data, defined as veracity. The increasing size and variety of distributed Big Data provenance information bring new technical challenges and opportunities throughout the provenance lifecycle including recording, querying, sharing and utilization. This paper discusses the challenges and opportunities of Big Data provenance related to the veracity of the datasets themselves and the provenance of the analytical processes that analyze these datasets. It also explains our current efforts towards tracking and utilizing Big Data provenance using workflows as a programming model to analyze Big Data.
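
    The idea of recording provenance as a workflow executes can be sketched as follows; the step/lineage API here is hypothetical and purely illustrative, and real workflow systems capture far richer metadata (parameters, timestamps, execution environment):

```python
provenance = {}  # dataset name -> (step name, names of its inputs)

def step(name, func, inputs, output, data):
    # Run one workflow step and record which inputs produced which output.
    data[output] = func(*(data[i] for i in inputs))
    provenance[output] = (name, tuple(inputs))
    return data[output]

def lineage(output):
    # Walk the recorded graph back to the original raw inputs.
    if output not in provenance:
        return [output]
    _, inputs = provenance[output]
    chain = [output]
    for i in inputs:
        chain.extend(lineage(i))
    return chain

data = {"raw": [3, 1, 2]}
step("clean", sorted, ["raw"], "cleaned", data)
step("summarize", sum, ["cleaned"], "summary", data)
print(lineage("summary"))  # ['summary', 'cleaned', 'raw']
```

    Because every derived dataset is linked to the step and inputs that produced it, any result can later be traced back to its sources, which is the basis for the reproducibility and veracity checks discussed above.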

  11. Spring A Developer's Notebook

    CERN Document Server

    Tate, Bruce A

    2009-01-01

    This no-nonsense book quickly gets you up to speed on the new Spring open source framework. Favoring examples and practical application over theory, Spring: A Developer's Notebook features 10 code-intensive labs that'll reveal the many assets of this revolutionary, lightweight architecture. In the end, you'll understand how to produce simple, clean, and effective applications.

  12. Mockito for Spring

    CERN Document Server

    Acharya, Sujoy

    2015-01-01

    If you are an application developer with some experience in software testing and want to learn more about testing frameworks, then this book is for you. Mockito for Spring will be perfect as your next step towards becoming a competent software tester with Spring and Mockito.

  13. Masters of the springs

    DEFF Research Database (Denmark)

    Laursen, Steffen

    2010-01-01

    flanked by villages that relied on these water resources for agricultural production. The springs emerged in the zone separating the cemeteries from the settlements. The freshwater springs were actively incorporated into the religious landscape of the dead, by consistently erecting mounds of a particular

  14. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  15. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  16. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  17. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; we therefore compare and contrast the two geometries throughout.

  18. Sharing big biomedical data.

    Science.gov (United States)

    Toga, Arthur W; Dinov, Ivo D

    The promise of Big Biomedical Data may be offset by the enormous challenges in handling, analyzing, and sharing it. In this paper, we provide a framework for developing practical and reasonable data sharing policies that incorporate the sociological, financial, technical and scientific requirements of a sustainable Big Data dependent scientific community. Many biomedical and healthcare studies may be significantly impacted by using large, heterogeneous and incongruent datasets; however, there are significant technical, social, regulatory, and institutional barriers that need to be overcome to ensure that the power of Big Data is not undermined by these detrimental factors. Pragmatic policies that demand extensive sharing of data, promote data fusion, provenance and interoperability, and balance security with the protection of personal information are critical for the long-term impact of translational Big Data analytics.

  19. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  20. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today, Big data is an emerging topic: the quantity of information grows exponentially, laying the foundation for its main challenge, the value of the information. That value is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in innovative ways using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the required output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on processes worldwide.

  1. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  2. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data......’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD......) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  3. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution...... and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim...... that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data....

  4. Sharing big biomedical data

    OpenAIRE

    Toga, Arthur W.; Ivo D Dinov

    2015-01-01

    Background The promise of Big Biomedical Data may be offset by the enormous challenges in handling, analyzing, and sharing it. In this paper, we provide a framework for developing practical and reasonable data sharing policies that incorporate the sociological, financial, technical and scientific requirements of a sustainable Big Data dependent scientific community. Findings Many biomedical and healthcare studies may be significantly impacted by using large, heterogeneous and incongruent data...

  5. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; finally, it sought to identify the most relevant characteristics of Big Data management, so as to cover everything concerning the central topic of the research. The methodology consisted of reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; introducing some NoSQL databases, which make it possible to process data in unstructured formats; and describing data models and data-analysis technologies, closing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables were manipulated, and exploratory, because this research is a first look at the Big Data landscape.

  6. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  7. Big data need big theory too

    Science.gov (United States)

    Dougherty, Edward R.; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035

  8. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2017-12-13

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  9. A Proposed Concentration Curriculum Design for Big Data Analytics for Information Systems Students

    Science.gov (United States)

    Molluzzo, John C.; Lawler, James P.

    2015-01-01

    Big Data is becoming a critical component of the Information Systems curriculum. Educators are enhancing gradually the concentration curriculum for Big Data in schools of computer science and information systems. This paper proposes a creative curriculum design for Big Data Analytics for a program at a major metropolitan university. The design…

  10. Pre-big bang cosmology end of a myth?

    CERN Document Server

    Veneziano, Gabriele

    1999-01-01

    The myth according to which the Universe, and time itself, started with (or near) a big bang singularity is questioned. Superstring theory, through its duality symmetries, favours a pre- (rather than a post-) big bang solution to standard cosmology's puzzles. Sufficiently homogeneous, flat, and hot baby universes naturally spring out of asymptotically trivial (but otherwise generic) initial conditions, after a long period of dilaton-driven inflation. Several characteristic observable consequences should soon provide stringent tests of this new cosmological scenario. (30 refs).

  11. Draft Supplement to the Environmental Statement Fiscal Year 1977 Proposed Program : Facility Location Evaluation for Hot Springs-Bell 500-KV Line Study Area 76-6.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1975-09-16

    Proposed construction of between 146 and 165 miles of 500-kV transmission line between Hot Springs, Montana, and Bell Substation, immediately north of Spokane, Washington. Depending upon the final route location chosen, between approximately 146 and 165 miles of 500-kV transmission line between Hot Springs Substation and Bell Substation over parallel and new right-of-way would be required. Between 15 and 70 miles of new access road would also be required. Land use affected would include clearing from 2153 to 2503 acres of timber. Depending upon the route location chosen, between 3 and 4 acres of farmland would be removed from production and between 110 and 165 acres temporarily disrupted. Other impacts would include the removal of wildlife habitat associated with the above-mentioned right-of-way requirements. Disturbance to wildlife would occur. Visual impacts would result from clearing right-of-way through heavily forested areas. Noise and other disturbances to residents will occur, primarily during construction. 15 figs. 2 tabs.

  12. Spring Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The standardized NEFSC Spring Bottom Trawl Survey was initiated in 1968 and covered an area from Cape Hatteras, NC, to Nova Scotia, Canada, at depths >27m....

  13. The High Cost of Big-Time Football

    Science.gov (United States)

    Weiner, Jay

    1973-01-01

    From facilities to travel to operations, the cost of intercollegiate football is prompting questioning, on individual campuses and even within the NCAA, of the purpose and even the necessity of big-time programs. (Editor)

  14. Cropland Management Plan : Big Stone National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The cropland program on Big Stone NWR will be accomplished each year with a combination of: (1) force account farming of permanent farm units where there is no...

  15. The Springs at Pipe Spring National Monument, Arizona (pisp_springs)

    Data.gov (United States)

    National Park Service, Department of the Interior — This is an Arc/Info coverage consisting of 5 points representing the springs, natural and man-made, at Pipe Spring National Monument, Arizona. The springs were...

  16. Survey of Cyber Crime in Big Data

    Science.gov (United States)

    Rajeswari, C.; Soni, Krishna; Tandon, Rajat

    2017-11-01

    Big data involves performing computation and database operations over large amounts of data, drawn automatically from the data owner's business. Since big data offers strategic access to information from numerous and varied areas, security and privacy will play an imperative part in big data research and innovation. The limits of standard IT security practices are well known: malicious software incorporated into applications and operating systems during software development is a real and growing threat, and a difficult one to counter. Its impact spreads even faster with big data. One central issue, then, is whether security and privacy technology is sufficient to provide controlled assurance for very large numbers of direct accesses. For effective use of large-scale information, access to the data of a given domain, or of any other domain, must be authorized. For a long time, trustworthy systems development has produced a rich set of proven security concepts for dealing with determined adversaries, but this approach has largely been dismissed by vendors as "needless excess". In this survey, the essential issues involved in bringing this mature security and privacy technology to bear on big data are examined, and the remaining research challenges are explored.

  17. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at CBS......) have developed a research-based capability mapping tool, entitled DataProfit, which the public business consultants can use to upgrade their tool kit to enable data-driven growth in manufacturing organizations. Benefit: The DataProfit model/tool comprises insights of an extensive research project...

  18. Big Data and Peacebuilding

    Directory of Open Access Journals (Sweden)

    Sanjana Hattotuwa

    2013-11-01

    Full Text Available Any peace process is an exercise in the negotiation of big data. From centuries-old communal hagiography to the reams of official texts, media coverage and social media updates, peace negotiations generate data. Peacebuilding and peacekeeping today are informed by, and often respond and contribute to, big data. This is no easy task. As recently as a few years ago, before the term big data embraced the virtual on the web, what informed peace process design and implementation was in the physical domain, from contested borders and resources to background information in the form of text. The move from analogue, face-to-face negotiations to online, asynchronous, web-mediated negotiations, which can still include real world meetings, has profound implications for how peace is strengthened in fragile democracies.

  19. Big Data Refinement

    Directory of Open Access Journals (Sweden)

    Eerke A. Boiten

    2016-06-01

    Full Text Available "Big data" has become a major area of research and associated funding, as well as a focus of utopian thinking. In the still growing research community, one of the favourite optimistic analogies for data processing is that of the oil refinery, extracting the essence out of the raw data. Pessimists look for their imagery to the other end of the petrol cycle, and talk about the "data exhausts" of our society. Obviously, the refinement community knows how to do "refining". This paper explores the extent to which notions of refinement and data in the formal methods community relate to the core concepts in "big data". In particular, can the data refinement paradigm be used to explain aspects of big data processing?

  20. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  1. Walking with springs

    Science.gov (United States)

    Sugar, Thomas G.; Hollander, Kevin W.; Hitt, Joseph K.

    2011-04-01

    Developing bionic ankles poses great challenges due to the large moment, power, and energy that are required at the ankle. Researchers have added springs in series with a motor to reduce the peak power and energy requirements of a robotic ankle. We developed a "robotic tendon" that reduces the peak power by altering the required motor speed. By changing the required speed, the spring acts as a "load variable transmission." If a simple motor/gearbox solution is used, one walking step would require 38.8 J and a peak motor power of 257 W. Using an optimized robotic tendon, the energy required is 21.2 J and the peak motor power is reduced to 96.6 W. We show that adding a passive spring in parallel with the robotic tendon reduces peak loads, but the power and energy increase. Adding a passive spring in series with the robotic tendon reduces the energy requirements. We have built a prosthetic ankle, SPARKy (Spring Ankle with Regenerative Kinetics), that allows a user to walk forwards and backwards, ascend and descend stairs, walk up and down slopes, and jog.
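    The trade-off the abstract describes, in which a series spring acts as a "load variable transmission" by changing the required motor speed, can be sketched numerically. The model below is a minimal illustration, not the SPARKy design: it assumes a sinusoidal gait angle, a spring-like ankle load, and made-up parameter values, and compares the peak motor power of a rigid drive against one with a series spring.

```python
import math

# Illustrative sketch (not the SPARKy data): a series spring as a
# "load variable transmission". Model the ankle load as spring-like,
# tau = -K_LOAD * theta, with a sinusoidal gait angle theta(t).
K_LOAD = 300.0              # effective joint stiffness, N*m/rad (assumed)
A = 0.3                     # ankle excursion amplitude, rad (assumed)
OMEGA = 2 * math.pi * 1.0   # one stride per second (assumed)

def peak_motor_power(k_series=None, steps=2000):
    """Peak |motor power| over one gait cycle.

    k_series=None models a rigid drive (motor angle == joint angle).
    With a series spring of stiffness k, the motor angle is
    x_m = theta + tau/k, so the motor speed is theta_dot + tau_dot/k:
    the spring deflection absorbs part of the required motion.
    """
    peak = 0.0
    for i in range(steps):
        t = i / steps * 2 * math.pi / OMEGA
        theta_dot = A * OMEGA * math.cos(OMEGA * t)
        tau = -K_LOAD * A * math.sin(OMEGA * t)
        tau_dot = -K_LOAD * A * OMEGA * math.cos(OMEGA * t)
        if k_series is None:
            motor_speed = theta_dot               # rigid: track the joint
        else:
            motor_speed = theta_dot + tau_dot / k_series
        peak = max(peak, abs(tau * motor_speed))
    return peak

rigid = peak_motor_power()                  # direct motor/gearbox drive
tendon = peak_motor_power(k_series=600.0)   # series spring, k > K_LOAD
print(f"rigid: {rigid:.1f} W, with series spring: {tendon:.1f} W")
```

    For this idealized spring-like load, the series spring cancels part of the required motor speed during the high-torque phase, so the peak motor power drops, mirroring (in toy form) the 257 W to 96.6 W reduction the abstract reports.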

  2. Commentary: Epidemiology in the era of big data.

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-05-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called "three V's": variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field's future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future.

  3. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Book chapter for Big Data: Storage, Sharing, and Security (3S), by Ariel Hamlin, Nabil... Distribution A: Public Release. From the summary: the chapter does not address the privacy implications of big data collection and processing; its focus is on keeping the data and...

  4. Damper Spring For Omega Seal

    Science.gov (United States)

    Maclaughlin, Scott T.; Montgomery, Stuart K.

    1993-01-01

    Damper spring reduces deflections of omega-cross-section seal, reducing probability of failure and extending life of seal. Spring is split ring with U-shaped cross section. Placed inside omega seal and inserted with seal into seal cavity. As omega seal compressed into cavity, spring and seal make contact near convolution of seal, and spring becomes compressed also. During operation, when seal dynamically loaded, spring limits deflection of seal, reducing stress on seal.

  5. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  6. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  7. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  8. Small Places, Big Stakes

    DEFF Research Database (Denmark)

    Garsten, Christina; Sörbom, Adrienne

    left of much of ‘what is really going on', and ‘what people are really up to.' Meetings, however, as organized and ritualized social events, may provide the ethnographer with a loupe through which key tenets of larger social groups and organizations, and big issues, may be carefully observed. In formal...

  9. Big Data and Cycling

    NARCIS (Netherlands)

    Romanillos, Gustavo; Zaltz Austwick, Martin; Ettema, Dick; De Kruijf, Joost

    2016-01-01

    Big Data has begun to create significant impacts in urban and transport planning. This paper covers the explosion in data-driven research on cycling, most of which has occurred in the last ten years. We review the techniques, objectives and findings of a growing number of studies we have classified

  10. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  11. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  12. Governing Big Data

    Directory of Open Access Journals (Sweden)

    Andrej J. Zwitter

    2014-04-01

    Full Text Available 2.5 quintillion bytes of data are created every day through pictures, messages, gps-data, etc. "Big Data" is seen simultaneously as the new Philosophers Stone and Pandora's box: a source of great knowledge and power, but equally, the root of serious problems.

  13. Draft Supplement to the Environmental Statement, Fiscal Year 1976 Proposed Program : Facility Location Evaluation for Ashe-Pebble Springs 500-KV Line, Study Area 75-3A.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1975-03-07

    Proposed is construction of an approximately 80-mile long, 500-kV single-circuit transmission line from Ashe Substation in the northeastern portion of the AEC Hanford Reservation to a proposed Pebble Springs Substation near Arlington, Oregon. In the eastern sector of the proposed facility from 49 to 58 miles of new right-of-way would be required depending on the final route chosen and approximately 15 miles of new access road. Land uses which could be affected by the proposed facility include 12 to 26 miles of wheatland, 4 to 10 miles of irrigated cropland, and 22 to 34 miles of rangeland crossed. The western sector would require approximately 25 miles of new right-of-way and 9 to 16 miles of new access road. Disturbance to wildlife during construction will occur, and habitat associated with the above rangeland and agricultural land will be eliminated. Soil erosion and siltation, primarily during and immediately after construction, will also occur. Visual impacts will occur primarily near several highways, recreation areas, and at river crossings particularly the Columbia River where extremely tall towers may be required. Disturbance to nearby residents will occur during construction.

  14. Spring of women?

    Directory of Open Access Journals (Sweden)

    Mónica Castillo

    2012-12-01

    Full Text Available Terms such as “Islamic feminism” and “women’s movement” refer to social movements of women that seek to assert their rights in Islamic societies. This brief study focuses on these social movements and presents an overview of the role and participation of women in the Arab Spring, examining news, events, press articles, and opinions in order to contextualize the participation of women and feminists in the Arab Spring from the perspective of social networking as an apparent driver of the revolution.

  15. Pro Spring security

    CERN Document Server

    Scarioni, Carlo

    2013-01-01

    Security is a key element in the development of any non-trivial application. The Spring Security framework provides a comprehensive set of functionalities to implement industry-standard authentication and authorization mechanisms for Java applications. Pro Spring Security is both a reference and an advanced tutorial: it guides you through the implementation of security features for a Java web application, with consistent examples built from the ground up, and demonstrates the different authentication and authorization methods used to secure enterprise-level applications.

  16. A Quadratic Spring Equation

    Science.gov (United States)

    Fay, Temple H.

    2010-01-01

    Through numerical investigations, we study examples of the forced quadratic spring equation [image omitted]. By performing trial-and-error numerical experiments, we demonstrate the existence of stability boundaries in the phase plane indicating initial conditions yielding bounded solutions, investigate the resonance boundary in the [omega]…
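
    The paper's equation itself appears only as an omitted image, so the sketch below assumes a commonly studied form of a forced quadratic spring equation, x'' + x + x^2 = F cos(omega t); the parameter values and the unforced test case are illustrative only.

```python
# Numerical sketch of a forced quadratic spring equation (assumed form:
#   x'' + x + x^2 = F*cos(omega*t),
# where the quadratic term makes the restoring force asymmetric).
import math

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def quadratic_spring(F=0.0, omega=1.0):
    def f(t, y):
        x, v = y
        return [v, -x - x*x + F*math.cos(omega*t)]
    return f

def simulate(x0, v0, F=0.0, omega=1.0, t_end=50.0, h=0.01):
    """Integrate from (x0, v0) and return the position history."""
    f = quadratic_spring(F, omega)
    y, t, xs = [x0, v0], 0.0, [x0]
    while t < t_end:
        y = rk4_step(f, t, y, h)
        t += h
        xs.append(y[0])
    return xs

# Small-amplitude, unforced motion stays inside the potential well
# V(x) = x^2/2 + x^3/3 and therefore remains bounded.
xs = simulate(x0=0.1, v0=0.0)
print(max(abs(x) for x in xs))
```

    Larger initial conditions or forcing can push trajectories out of the well, which is the origin of the stability boundaries the paper maps by trial-and-error numerical experiment.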

  17. Editors' Spring Picks

    Science.gov (United States)

    Library Journal, 2011

    2011-01-01

    While they do not represent the rainbow of reading tastes American public libraries accommodate, Book Review editors are a wildly eclectic bunch. One look at their bedside tables and ereaders would reveal very little crossover. This article highlights an eclectic array of spring offerings ranging from print books to an audiobook to ebook apps. It…

  18. Spring batch essentials

    CERN Document Server

    Rao, P Raja Malleswara

    2015-01-01

    If you are a Java developer with basic knowledge of Spring and some experience in the development of enterprise applications, and want to learn about batch application development in detail, then this book is ideal for you. This book will be perfect as your next step towards building simple yet powerful batch applications on a Java-based platform.

  19. Springing of ships in waves

    NARCIS (Netherlands)

    Van Gunsteren, F.F.

    1978-01-01

    This thesis is the result of an investigation of the assumptions underlying the generally applied method for the calculation of the springing of ships in waves, which was proposed by the author about a decade ago. It has been found that, contrary to the general practice in seakeeping research, the

  20. Pengembangan Aplikasi Antarmuka Layanan Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Gede Karya

    2017-11-01

    Full Text Available In the 2016 Higher Competitive Grants Research (Hibah Bersaing Dikti) we successfully developed models, infrastructure, and application modules for Hadoop-based big data analysis. We also developed a virtual private network (VPN) that allows the infrastructure to be integrated with and accessed from outside the FTIS Computer Laboratory. These infrastructure and analysis modules are now to be offered as services to small and medium enterprises (SMEs) in Indonesia. This research aims to develop a big data analysis service interface application integrated with the Hadoop cluster. The research began by identifying appropriate methods and techniques for scheduling jobs, for invoking ready-made Java MapReduce (MR) application modules, and for tunneling input/output and constructing the metadata of service requests (input) and service outputs. These methods and techniques were then developed into a web-based service application, together with an executable module that runs in a Java- and J2EE-based programming environment and can access the Hadoop cluster in the FTIS Computer Lab. The resulting application can be accessed by the public through the site http://bigdata.unpar.ac.id. Based on the test results, the application functions well in accordance with its specifications and can be used to perform big data analysis. Keywords: web-based service, big data analysis, Hadoop, J2EE. Abstract: In the 2016 Hibah Bersaing Dikti research we successfully developed models, infrastructure, and application modules for Hadoop-based big data analysis, along with a virtual private network (VPN) that allows integration with and access to that infrastructure from outside the FTIS Computer Laboratory. The infrastructure and analysis modules are now to be offered as services to small and medium enterprises (SMEs) in Indonesia.
    This research aims to develop

  1. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  2. Fish Springs molluscan studies: House and Percy Springs

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes the findings of a limited survey of House and Percy Springs molluscan fauna within Fish Springs National Wildlife Refuge. Various...

  3. Reconciliation and interpretation of the Big Bend National Park light extinction source apportionment: results from the Big Bend Regional Aerosol and Visibility Observational Study--part II.

    Science.gov (United States)

    Pitchford, Marc L; Schichtel, Bret A; Gebhart, Kristi A; Barna, Michael G; Malm, William C; Tombach, Ivar H; Knipping, Eladio M

    2005-11-01

    The recently completed Big Bend Regional Aerosol and Visibility Observational (BRAVO) Study focused on particulate sulfate source attribution for a 4-month period from July through October 1999. A companion paper in this issue by Schichtel et al. describes the methods evaluation and results reconciliation of the BRAVO Study sulfate attribution approaches. This paper summarizes the BRAVO Study extinction budget assessment and interprets the attribution results in the context of annual and multiyear causes of haze by drawing on long-term aerosol monitoring data and regional transport climatology, as well as results from other investigations. Particulate sulfates, organic carbon, and coarse mass are responsible for most of the haze at Big Bend National Park, whereas fine particles composed of light-absorbing carbon, fine soils, and nitrates are relatively minor contributors. Spring and late summer through fall are the two periods of high-haze levels at Big Bend. Particulate sulfate and carbonaceous compounds contribute in a similar magnitude to the spring haze period, whereas sulfates are the primary cause of haze during the late summer and fall period. Atmospheric transport patterns to Big Bend vary throughout the year, resulting in a seasonal cycle of different upwind source regions contributing to its haze levels. Important sources and source regions for haze at Big Bend include biomass smoke from Mexico and Central America in the spring and African dust during the summer. Sources of sulfur dioxide (SO2) emissions in Mexico, Texas, and in the Eastern United States all contribute to Big Bend haze in varying amounts over different times of the year, with a higher contribution from Mexican sources in the spring and early summer, and a higher contribution from U.S. sources during late summer and fall. Some multiple-day haze episodes result from the influence of several source regions, whereas others are primarily because of emissions from a single source region.

  4. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  5. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of the international development agenda to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development policies, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  6. Leaf spring, and electromagnetic actuator provided with a leaf spring

    NARCIS (Netherlands)

    Berkhoff, Arthur P.; Lemmen, Remco Louis Christiaan

    2002-01-01

    The invention relates to a leaf spring for an electromagnetic actuator and to such an electromagnetic actuator. The leaf spring is formed as a whole from a disc of plate-shaped, resilient material. The leaf spring comprises a central fastening part, an outer fastening part extending therearound and

  7. Studying Springs in Series Using a Single Spring

    Science.gov (United States)

    Serna, Juan D.; Joshi, Amitabh

    2011-01-01

    Springs are used for a wide range of applications in physics and engineering. Possibly, one of their most common uses is to study the nature of restoring forces in oscillatory systems. While experiments that verify Hooke's law using springs are abundant in the physics literature, those that explore the combination of several springs together are…
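
    The combination rule that such experiments probe is compact: springs in series carry the same force, so their compliances (1/k) add. A minimal sketch:

```python
# Effective spring constant of Hookean springs combined in series:
# each spring carries the same force, so compliances (1/k) add:
#   1/k_eff = sum(1/k_i)
def series_k(*ks):
    """Effective spring constant of springs connected in series."""
    return 1.0 / sum(1.0 / k for k in ks)

# Cutting a uniform spring into n equal pieces is the inverse operation:
# each piece is n times stiffer, and recombining the pieces in series
# recovers the original constant.
print(series_k(100.0, 100.0))         # two 100 N/m springs -> 50 N/m
print(series_k(300.0, 300.0, 300.0))  # three 300 N/m springs -> 100 N/m
```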

  8. Getting started with Greenplum for big data analytics

    CERN Document Server

    Gollapudi, Sunila

    2013-01-01

    A standard, tutorial-based approach. "Getting Started with Greenplum for Big Data Analytics" is great for data scientists and data analysts with a basic knowledge of Data Warehousing and Business Intelligence platforms who are new to Big Data and who are looking to get a good grounding in how to use the Greenplum platform. It's assumed that you will have some experience with database design and programming, as well as familiarity with analytics tools like R and Weka.

  9. Big Data i nyhedsformidling

    OpenAIRE

    Schjelde, Emil Kristian Kjølhede; Rosendahl, Rasmus

    2016-01-01

    This thesis evolves around the role of big data in scientific and journalistic knowledge production. We take a perspective on knowledge production through an analysis of discourses in contemporary discussions of epistemology in general sciences and journalism. Our empirical material here consists of a mixture of research articles, books and internet articles. The main objective of this analysis is, through the theoretical works of Ernesto Laclau and Chantal Mouffe, to outlin...

  10. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday, and literary in style. Volume 5 RG covers the basics (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  11. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday, and literary in style. Volume 8 conveys, in an accessible way, relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  12. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday, and literary in style. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  13. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday, and literary in style. Volume 7 treats, besides an introduction, many current aspects of quantum mechanics (e.g., quantum teleportation) and electrodynamics (e.g., electrosmog), as well as climate change and chaos theory.

  14. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  15. Proceedings of Management of Risk and Uncertainty in the Acquisition of Major Programs, U. S. Air Force Academy, Colorado Springs, February 9-11, 1981

    Science.gov (United States)

    1981-05-01

    organizational Perspective. Unpublished Master’s thesis. Naval Postgraduate School, Monterey, CA. March 1975. LD-36971A. 33. Lev, Baruch ...such as the Apollo Space Program with a 200% overrun, and the Space Shuttle Program, are added to this list, the types and levels of technologies...as how they apply to the acquisition process. Lev (33) defines risk as the condition where each outcome of the decision maker leads to one of a

  16. Big Bang Tumor Growth and Clonal Evolution.

    Science.gov (United States)

    Sun, Ruping; Hu, Zheng; Curtis, Christina

    2017-07-14

    The advent and application of next-generation sequencing (NGS) technologies to tumor genomes has reinvigorated efforts to understand clonal evolution. Although tumor progression has traditionally been viewed as a gradual stepwise process, recent studies suggest that evolutionary rates in tumors can be variable with periods of punctuated mutational bursts and relative stasis. For example, Big Bang dynamics have been reported, wherein after transformation, growth occurs in the absence of stringent selection, consistent with effectively neutral evolution. Although first noted in colorectal tumors, effective neutrality may be relatively common. Additionally, punctuated evolution resulting from mutational bursts and cataclysmic genomic alterations have been described. In this review, we contrast these findings with the conventional gradualist view of clonal evolution and describe potential clinical and therapeutic implications of different evolutionary modes and tempos. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  17. Hydrogeologic and geothermal investigation of Pagosa Springs, Colorado

    Energy Technology Data Exchange (ETDEWEB)

    Galloway, M.J.

    1980-01-01

    The following topics are covered: geology; geophysical surveys; geothermal wells, springs, and heat flow; hydrology; drilling program, well testing, and mineralogical and petrographic studies of samples from geothermal wells. (MHR)

  18. Spring magnet films.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, S. D.; Fullerton, E. E.; Gornakov, V. S.; Inomata, A.; Jiang, J. S.; Nikitenko, V. I.; Shapiro, A. J.; Shull, R. D.; Sowers, C. H.

    1999-03-29

    The properties of exchange-spring-coupled bilayer and superlattice films are highlighted for Sm-Co hard magnet and Fe or Co soft magnet layers. The hexagonal Sm-Co is grown via magnetron sputtering in a- and b-axis epitaxial orientations. In both cases the c-axis, in the film plane, is the easy axis of magnetization. Trends in coercivity with film thickness are established and related to the respective microstructure of the two orientations. The magnetization reversal process for the bilayers is examined by magnetometry and magneto-optical imaging, as well as by simulations that utilize a one-dimensional model to provide the spin configuration for each atomic layer. The Fe magnetization is pinned to that of the Sm-Co at the interface, and reversal proceeds via a progressive twisting of the Fe magnetization. The Fe demagnetization curves are reversible, as expected for a spring magnet. Comparison of experiment and simulations indicates that the spring magnet behavior can be understood from the intrinsic properties of the hard and soft layers. Estimates are made of the ultimate gain in performance that can potentially be realized in this system.
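
    The one-dimensional model mentioned in the abstract can be sketched, in heavily simplified form, as a chain of layer magnetization angles theta_i with nearest-neighbor exchange J, uniaxial anisotropy K_i (large in the hard Sm-Co layers, small in the soft Fe layers), and a Zeeman term for a field applied opposite the easy axis. This is an illustrative reconstruction, not the authors' code, and all parameter values are invented:

```python
import math

def energy(thetas, J, K, H):
    """Energy of a 1D spin chain: exchange + uniaxial anisotropy + Zeeman
    term for a field applied along theta = pi (opposite the easy axis)."""
    e = 0.0
    for a, b in zip(thetas, thetas[1:]):
        e -= J * math.cos(b - a)            # nearest-neighbor exchange
    for th, k in zip(thetas, K):
        e -= k * math.cos(th) ** 2          # anisotropy, easy axis at theta = 0
        e -= H * math.cos(th - math.pi)     # Zeeman term
    return e

def relax(thetas, J, K, H, lr=0.01, steps=5000):
    """Plain gradient descent on the energy; spin 0 (deep inside the hard
    layer) is held fixed, mimicking pinning at the hard-magnet side."""
    th = list(thetas)
    n = len(th)
    for _ in range(steps):
        grad = [0.0] * n
        for i in range(n):
            if i > 0:
                grad[i] += J * math.sin(th[i] - th[i - 1])
            if i < n - 1:
                grad[i] += J * math.sin(th[i] - th[i + 1])
            grad[i] += K[i] * math.sin(2 * th[i])     # d/dθ of -K cos²θ
            grad[i] += H * math.sin(th[i] - math.pi)  # d/dθ of -H cos(θ-π)
        for i in range(1, n):                         # spin 0 stays pinned
            th[i] -= lr * grad[i]
    return th

# Demo: 4 hard layers (large K) followed by 8 soft layers (small K) in a
# reversing field.  The soft magnetization twists progressively toward the
# field while the hard end stays put -- the "exchange spring".
K = [5.0] * 4 + [0.05] * 8
th = relax([0.01] * 12, J=1.0, K=K, H=0.5)
print([round(t, 2) for t in th])
```

    The signatures of spring-magnet behavior in this toy model are a monotonic twist profile across the soft layers and a lower relaxed energy; removing the field lets the exchange term unwind the twist, which is why the soft-layer demagnetization curve is reversible.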

  19. Big data science: A literature review of nursing research exemplars.

    Science.gov (United States)

    Westra, Bonnie L; Sylvia, Martha; Weinfurter, Elizabeth F; Pruinelli, Lisiane; Park, Jung In; Dodd, Dianna; Keenan, Gail M; Senk, Patricia; Richesson, Rachel L; Baukner, Vicki; Cruz, Christopher; Gao, Grace; Whittenburg, Luann; Delaney, Connie W

    Big data and cutting-edge analytic methods in nursing research challenge nurse scientists to extend the data sources and analytic methods used for discovering and translating knowledge. The purpose of this study was to identify, analyze, and synthesize exemplars of big data nursing research applied to practice and disseminated in key nursing informatics, general biomedical informatics, and nursing research journals. A literature review of studies published between 2009 and 2015 was conducted. There were 650 journal articles identified in 17 key nursing informatics, general biomedical informatics, and nursing research journals in the Web of Science database. After screening for inclusion and exclusion criteria, 17 studies published in 18 articles were identified as big data nursing research applied to practice. Nurses clearly are beginning to conduct big data research applied to practice. These studies represent multiple data sources and settings. Although numerous analytic methods were used, the fundamental issue remains to define the types of analyses consistent with big data analytic methods. There is a need to increase the visibility of big data and data science research conducted by nurse scientists, to further examine the use of the state of the science in data analytics, and to continue to expand the availability and use of a variety of scientific, governmental, and industry data resources. A major implication of this literature review is whether nursing faculty and the preparation of future scientists (PhD programs) are prepared for big data and data science. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Native bunchgrass response to prescribed fire in ungrazed Mountain Big Sagebrush ecosystems

    Science.gov (United States)

    Lisa M. Ellsworth; J. Boone Kauffman

    2010-01-01

    Fire was historically a dominant ecological process throughout mountain big sagebrush (Artemisia tridentata Nutt. ssp. vaseyana [Rydb.] Beetle) ecosystems of western North America, and the native biota have developed many adaptations to persist in a regime typified by frequent fires. Following spring and fall prescribed fires...

  1. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    ...big data in Section II, followed by a description of the analytic environment D4M in Section III. We then describe the types of sampling methods and... signal reconstruction steps are used to do these operations. Big Data analytics, often characterized by analytics applied to datasets that strain available... Sampling Operations on Big Data. Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller, Lincoln
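
    The fragment above names sampling methods for big data only in passing. As a hedged illustration (not necessarily one of the paper's methods), reservoir sampling (Algorithm R) draws a uniform random sample of fixed size k from a stream whose total length is unknown in advance, exactly the regime where a dataset strains available memory:

```python
import random

def reservoir_sample(stream, k, rng=random):
    """Algorithm R: keep the first k items, then replace a random slot
    with decreasing probability, so every item ends up in the sample
    with probability k/n for a stream of length n."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = rng.randint(0, i)      # uniform over 0..i, inclusive
            if j < k:
                reservoir[j] = item
    return reservoir

# Sample 10 items from a million-element stream in one pass, O(k) memory.
sample = reservoir_sample(range(10**6), 10)
print(len(sample))  # always 10
```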

  3. Supplement Analysis for the Transmission System Vegetation Management Program (DOE/EIS-0285/SA-113-1) Updates 9/27/02 SA-113 - Big Eddy-Ostrander Transmission Corridor

    Energy Technology Data Exchange (ETDEWEB)

    Hutchinson, Kenneth [Bonneville Power Administration (BPA), Portland, OR (United States)

    2002-12-02

    To perform remedial vegetation management for keeping vegetation a safe distance away from electric power facilities and controlling noxious weeds within a section of BPA's Big Eddy-Ostrander Transmission Corridor. During a site review conducted in late fall of 2001, the inspector observed various species of hardwood trees resprouted from stumps. The new vegetative growth encroached on the required “Minimum Safe Distance” between the top of vegetation and the conductor cables. The management action is necessary to reduce the current and potential future hazards that tall-growing vegetation poses to transmission conductors. In addition, BPA will include weed control as part of their remedial vegetation management action. Noxious weeds occur within the corridor. Under a 1999 Executive Order, all federal agencies are required to detect and control noxious weeds. In addition, BPA is required under the 1990 amendment to the Noxious Weed Act (7 USC 2801-2814) to manage undesirable plants on federal land. Also, the Bonneville Power Administration (BPA) has responsibility to manage noxious weeds under the Transmission System Vegetation Management Program Final Environmental Impact Statement (FEIS).1 State statutes and regulations also mandate action by BPA and the USFS to control noxious weeds. The Oregon Department of Agriculture (ODA) has requested that agencies aggressively control these weeds before additional spread occurs.

  4. From the bush to the big smoke--development of a hybrid urban community based medical education program in the Northern Territory, Australia.

    Science.gov (United States)

    Morgan, S; Smedts, A; Campbell, N; Sager, R; Lowe, M; Strasser, S

    2009-01-01

    The Northern Territory (NT) of Australia is a unique setting for training medical students. This learning environment is characterised by Aboriginal health and an emphasis on rural and remote primary care practice. For over a decade the NT Clinical School (NTCS) of Flinders University has been teaching undergraduate medical students in the NT. Community based medical education (CBME) has been demonstrated to be an effective method of learning medicine, particularly in rural settings. As a result, it is rapidly gaining popularity in Australia and other countries. The NTCS adopted this model some years ago with the implementation of its Rural Clinical School; however, urban models of CBME are much less well developed than those in rural areas. There is considerable pressure to better incorporate CBME into medical student teaching environment, particularly because of the projected massive increase in student numbers over the next few years. To date, the community setting of urban Darwin, the NT capital city, has not been well utilised for medical student training. In 2008, the NTCS enrolled its first cohort of students in a new hybrid CBME program based in urban Darwin. This report describes the process and challenges involved in development of the program, including justification for a hybrid model and the adaptation of a rural model to an urban setting. Relationships were established and formalised with key partners and stakeholders, including GPs and general practices, Aboriginal medical services, community based healthcare providers and other general practice and community organisations. Other significant issues included curriculum development and review, development of learning materials and the establishment of robust evaluation methods. Development of the CBME model in Darwin posed a number of key challenges. Although the experience of past rural programs was useful, a number of distinct differences were evident in the urban setting. 
Change leadership and inter

  5. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences, and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Finally, the methodology and working of a system that will use this data are briefly described.

  6. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a known distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through the online learning environment of the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database that collects data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
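
    Eratosthenes' geometry reduces to a single proportion: the difference between the noon shadow angles at two sites on (roughly) the same meridian equals the angle the sites subtend at Earth's center, so the circumference scales the inter-site distance by 360 degrees over that angle. A minimal worked version with the classic textbook numbers:

```python
def circumference(angle_difference_deg, distance_km):
    """Eratosthenes' proportion: the shadow-angle difference between two
    sites on the same meridian is to 360 degrees as the distance between
    them is to the full circumference of the Earth."""
    return 360.0 / angle_difference_deg * distance_km

# A ~7.2 degree shadow-angle difference over ~800 km (Alexandria to
# Syene) gives the familiar ~40,000 km estimate.
print(circumference(7.2, 800.0))
```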

  7. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  8. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems, and others all use piles of data which are further used for creating reports in order to ensure the continuity of the services they offer. The process behind the results that these entities request represents a challenge for software developers and for the companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  9. Adapting bioinformatics curricula for big data

    Science.gov (United States)

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  10. BigDog-inspired studies in the locomotion of goats and dogs.

    Science.gov (United States)

    Lee, David V; Biewener, Andrew A

    2011-07-01

    Collision-based expenditure of mechanical energy and the compliance and geometry of the leg are fundamental, interrelated considerations in the mechanical design of legged runners. This article provides a basic context and rationale for experiments designed to inform each of these key areas in Boston Dynamics' BigDog robot. Although these principles have been investigated throughout the past few decades within different academic disciplines, BigDog required that they be considered together and in concert with an impressive set of control algorithms that are not discussed here. Although collision reduction is an important strategy for reducing mechanical cost of transport in the slowest and fastest quadrupedal gaits, walking and galloping, BigDog employed an intermediate-speed trotting gait without collision reduction. Trotting, instead, uses a spring-loaded inverted pendulum mechanism with potential for storage and return of elastic strain energy in appropriately compliant structures. Rather than tuning BigDog's built-in leg springs according to a spring-mass model-based virtual leg-spring constant, a much stiffer distal leg spring together with actuation of the adjacent joint provided good trotting dynamics and avoided functional limitations that might have been imposed by too much compliance in real-world terrain. Adjusting the directional compliance of the legs by adopting a knee-forward, elbow-back geometry led to more robust trotting dynamics by reducing perturbations about the pitch axis of the robot's center of mass (CoM). BigDog is the most successful large-scale, all-terrain trotting machine built to date and it continues to stimulate our understanding of legged locomotion in comparative biomechanics as well as in robotics.
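
    The spring-mass behavior underlying the trotting gait described above can be illustrated with a one-dimensional bounce: a body mass rides a Hookean leg spring through one stance phase. This is a simplified sketch with illustrative parameters, not BigDog's actual leg model or control code:

    ```python
    def stance_phase(m=80.0, k=20000.0, l0=1.0, v0=-1.0, dt=1e-4):
        """Integrate a vertical spring-mass 'leg' through one stance phase.

        m: body mass (kg); k: leg-spring stiffness (N/m); l0: rest leg
        length (m); v0: downward touchdown velocity (m/s). All values are
        hypothetical. Returns (stance duration, takeoff velocity).
        """
        g = 9.81
        y, v, t = l0, v0, 0.0
        while y <= l0:                    # in contact while spring is loaded
            f_spring = k * (l0 - y)       # Hooke's-law leg force
            a = f_spring / m - g          # net vertical acceleration
            v += a * dt                   # semi-implicit Euler step
            y += v * dt
            t += dt
        return t, v

    t_stance, v_takeoff = stance_phase()
    ```

    Because the spring is conservative, the takeoff speed nearly equals the touchdown speed; energy "storage and return of elastic strain energy" in the abstract is exactly this exchange between kinetic energy and spring compression.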

  11. Work Term Assignment Spring 2017

    Science.gov (United States)

    Sico, Mallory

    2017-01-01

    My tour in the Engineering Robotics directorate exceeded my expectations. I learned lessons about Creo, manufacturing and assembly, collaboration, and troubleshooting. During my first tour, last spring, I used Creo on a smaller project, but had limited experience with it before starting in the Dynamic Systems Test branch this spring. I gained valuable experience learning assembly design, sheet metal design and designing with intent for manufacturing and assembly. These skills came from working both on the hatch and the floor. I also learned to understand the intent of other designers on models I worked with. While redesigning the floor, I was modifying an existing part and worked to understand what the previous designer had done to make it fit with the new model. Through working with the machine shop and in the mock-up, I learned much more about manufacturing and assembly. I used a Dremel, rivet gun, belt sander, and countersink for the first time. Through taking multiple safety training for different machine shops, I learned new machine shop safety skills specific to each one. This semester also gave me new collaborative opportunities. I collaborated with engineers within my branch as well as with Human Factors and the building 10 machine shop. This experience helped me learn how to design for functionality and assembly, not only for what would be easiest in my designs. In addition to these experiences, I learned many lessons in troubleshooting. I was the first person in my office to use a Windows 10 computer. This caused unexpected issues with NASA services and programs, such as the Digital Data Management Server (DDMS). Because of this, I gained experience finding solutions to lockout and freeze issues as well as Creo specific settings. These will be useful skills to have in the future and will be implemented in future rotations. This co-op tour has motivated me more to finish my degree and pursue my academic goals. I intend to take a machining Career Gateway

  12. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  13. Fish survey, fishing duration, and other data from net and otter trawls from the BIG VALLEY as part of Outer Continental Shelf Environmental Assessment Program (OCSEAP) from 20 May 1976 to 30 June 1976 (NODC Accession 7601547)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Fish survey, fishing duration, and other data were collected from net and otter trawls from the BIG VALLEY from 20 May 1976 to 30 June 1976. Data were collected by...

  14. Benthic organism and other data from otter trawls from the Gulf of Alaska from the BIG VALLEY as part of the Outer Continental Shelf Environmental Assessment Program (OCSEAP) from 17 June 1976 to 18 March 1977 (NODC Accession 7700849)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Benthic organism and other data were collected from otter trawls in the Gulf of Alaska from the BIG VALLEY by University of Alaska; Institute of Marine Science...

  15. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only given rise to challenging technical questions; it is also accompanied by all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  16. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    Le 2 juin 2013, le CERN inaugure le projet Passeport Big Bang lors d'un grand événement public. Affiche et programme. On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and Programme.

  17. Think Big, Bigger ... and Smaller

    Science.gov (United States)

    Nisbett, Richard E.

    2010-01-01

    One important principle of social psychology, writes Nisbett, is that some big-seeming interventions have little or no effect. This article discusses a number of cases from the field of education that confirm this principle. For example, Head Start seems like a big intervention, but research has indicated that its effects on academic achievement…

  18. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO₂ utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern.
Research is

  19. Spring viremia of carp

    Science.gov (United States)

    Ahne, W.; Bjorklund, H.V.; Essbauer, S.; Fijan, N.; Kurath, G.; Winton, J.R.

    2002-01-01

    Spring viremia of carp (SVC) is an important disease affecting cyprinids, mainly common carp Cyprinus carpio. The disease is widespread in European carp culture, where it causes significant morbidity and mortality. Designated a notifiable disease by the Office International des Epizooties, SVC is caused by a rhabdovirus, spring viremia of carp virus (SVCV). Affected fish show destruction of tissues in the kidney, spleen and liver, leading to hemorrhage, loss of water-salt balance and impairment of immune response. High mortality occurs at water temperatures of 10 to 17°C, typically in spring. At higher temperatures, infected carp develop humoral antibodies that can neutralize the spread of virus and such carp are protected against re-infection by solid immunity. The virus is shed mostly with the feces and urine of clinically infected fish and by carriers. Waterborne transmission is believed to be the primary route of infection, but bloodsucking parasites like leeches and the carp louse may serve as mechanical vectors of SVCV. The genome of SVCV is composed of a single molecule of linear, negative-sense, single-stranded RNA containing 5 genes in the order 3'-NPMGL-5' coding for the viral nucleoprotein, phosphoprotein, matrix protein, glycoprotein, and polymerase, respectively. Polyacrylamide gel electrophoresis of the viral proteins, and sequence homologies between the genes and gene junctions of SVCV and vesicular stomatitis viruses, have led to the placement of the virus as a tentative member of the genus Vesiculovirus in the family Rhabdoviridae. These methods also revealed that SVCV is not related to fish rhabdoviruses of the genus Novirhabdovirus. In vitro replication of SVCV takes place in the cytoplasm of cultured cells of fish, bird and mammalian origin at temperatures of 4 to 31°C, with an optimum of about 20°C. Spring viremia of carp can be diagnosed by clinical signs, isolation of virus in cell culture and molecular methods. Antibodies directed

  20. Developing bulk exchange spring magnets

    Energy Technology Data Exchange (ETDEWEB)

    Mccall, Scott K.; Kuntz, Joshua D.

    2017-06-27

    A method of making a bulk exchange spring magnet by providing a magnetically soft material, providing a hard magnetic material, and producing a composite of said magnetically soft material and said hard magnetic material to make the bulk exchange spring magnet. The step of producing a composite of magnetically soft material and hard magnetic material is accomplished by electrophoretic deposition of the magnetically soft material and the hard magnetic material to make the bulk exchange spring magnet.

  1. Spring security 3.x cookbook

    CERN Document Server

    Mankale, Anjana

    2013-01-01

    This book follows a cookbook style exploring various security solutions provided by Spring Security for various vulnerabilities and threat scenarios that web applications may be exposed to at the authentication and session level layers.This book is for all Spring-based application developers as well as Java web developers who wish to implement robust security mechanisms into web application development using Spring Security.Readers are assumed to have a working knowledge of Java web application development, a basic understanding of the Spring framework, and some knowledge of the fundamentals o

  2. Big Data Aesthetics

    DEFF Research Database (Denmark)

    Bjørnsten, Thomas

    2016-01-01

    This article discusses artistic practices and artifacts that are occupied with exploring data through visualization and sonification strategies as well as with translating data into materially solid formats and embodied processes. By means of these examples the overall aim of the article is to critically question how and whether such artistic practices can eventually lead to the experience and production of knowledge that could not otherwise be obtained via more traditional ways of data representation. The article, thus, addresses both the problems and possibilities entailed in extending the use of large data sets – or Big Data – into the sphere of art and the aesthetic. Central to the discussion here is the analysis of how different structuring principles of data and the discourses that surround these principles shape our perception of data. This discussion involves considerations on various…

  3. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption that bigger is always better, and the many legacy decisions and rules that ultimately govern how audiences are ‘made’ in commercial measurement companies. As such, the paper extends the discussions of a previous empirical study (Lai 2016) on how media organizations imagine their audiences (Ang 1991; Napoli 2010; Webster 2014). This study evolved around industry stakeholders resisting and negotiating changes, as they are happening, in media consumption dynamics and measurement standards, which inevitably reconceptualize future institutionally effective audiences (Ettema & Whitney 1994). With digital…

  4. Urban Big Data and Sustainable Development Goals: Challenges and Opportunities

    Directory of Open Access Journals (Sweden)

    Ali Kharrazi

    2016-12-01

    Full Text Available Cities are perhaps one of the most challenging and yet enabling arenas for sustainable development goals. The Sustainable Development Goals (SDGs) emphasize the need to monitor each goal through objective targets and indicators based on common denominators in the ability of countries to collect and maintain relevant standardized data. While this approach is aimed at harmonizing the SDGs at the national level, it presents unique challenges and opportunities for the development of innovative urban-level metrics through big data innovations. In this article, we make the case for advancing more innovative targets and indicators relevant to the SDGs through the emergence of urban big data. We believe that urban policy-makers are faced with unique opportunities to develop, experiment, and advance big data practices relevant to sustainable development. This can be achieved by situating the application of big data innovations through developing mayoral institutions for the governance of urban big data, advancing the culture and common skill sets for applying urban big data, and investing in specialized research and education programs.

  5. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    Full Text Available The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for the most companies just a cool keyword. The paper also describes the level of the actual big data development and the things it can do, and also the things that can be done in the near future. The paper focuses on explaining to nontechnical and non-database related technical specialists what basically is big data, presents the three most important V's, as well as the new ones, the most important solutions used by companies like Google or Amazon, as well as some interesting perceptions based on this subject.

  6. Big Data and Ambulatory Care

    Science.gov (United States)

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2015-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an overview of the legal framework governing health information, dispels misconceptions about privacy regulations, and highlights how ambulatory care providers in particular can maximize the utility of big data to improve care. PMID:25401945

  7. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    Full Text Available We are now in the Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from that complex data which can’t be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining of large datasets using distributed algorithms.

  8. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  9. A genetic algorithm-based job scheduling model for big data analytics.

    Science.gov (United States)

    Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei

    Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework, which implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. The existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes high energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
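
    As a rough illustration of this kind of approach (not the authors' actual model), a permutation-encoded genetic algorithm can search over job orderings that a greedy dispatcher assigns to parallel workers; makespan serves as the fitness here, standing in for the paper's cluster-performance estimation module. All parameters and job durations below are hypothetical:

    ```python
    import random

    def makespan(order, durations, n_workers=3):
        """Greedy list-scheduling cost: dispatch jobs in `order` to the
        least-loaded worker; return the overall finishing time."""
        loads = [0.0] * n_workers
        for j in order:
            i = loads.index(min(loads))
            loads[i] += durations[j]
        return max(loads)

    def ga_schedule(durations, pop=30, gens=100, seed=0):
        """Evolve job orderings that minimize makespan."""
        rng = random.Random(seed)
        n = len(durations)
        population = [rng.sample(range(n), n) for _ in range(pop)]
        for _ in range(gens):
            population.sort(key=lambda o: makespan(o, durations))
            survivors = population[: pop // 2]          # elitist selection
            children = []
            while len(survivors) + len(children) < pop:
                a, b = rng.sample(survivors, 2)
                cut = rng.randrange(1, n)               # order crossover
                head = a[:cut]
                child = head + [j for j in b if j not in head]
                i, k = rng.sample(range(n), 2)          # swap mutation
                child[i], child[k] = child[k], child[i]
                children.append(child)
            population = survivors + children
        return min(population, key=lambda o: makespan(o, durations))

    jobs = [5, 3, 8, 2, 7, 4, 6, 1]   # hypothetical job durations
    best = ga_schedule(jobs)
    ```

    In a real BDA setting the `durations` would come from a performance-prediction module rather than being known constants, which is precisely the role the paper's estimation module plays.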

  10. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant grow of data generation in different fields such as social networks, internet of things and laboratories like CERN. How is CERN making use of such technologies? How machine learning is applied at CERN with Big Data technologies? How much data we move and how it is analyzed? All these questions will be answered during the talk.

  11. Fatigue behaviour of technical springs

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, B.; Berger, C. [Institut fuer Werkstoffkunde und Staatliche Materialpruefungsanstalt Darmstadt, Technische Universitaet Darmstadt, Grafenstrasse 2, 64283 Darmstadt (Germany)

    2005-11-01

    Technical Springs belong to the components with the highest cyclic and superposed static load. Nevertheless they have to fulfill the requirements of lightweight constructions. This is only possible, if high strength materials with special properties are carefully manufactured to well designed springs and special additional treatments are carried out, which impose an advantageous residual stress profile in the surface layers of the springs. After a short historical view on the spring research activities in Professor Thums area, some aspects of the fatigue behaviour of hot formed parabolic leaf springs are presented. Then the fatigue properties of cold formed helical compression springs made of different steel spring wires are discussed. Finally some results on the fatigue behaviour of helical springs at a very high number of load cycles are reported. (Abstract Copyright [2005], Wiley Periodicals, Inc.) [German] Technische Federn gehoeren zu den am hoechsten zyklisch beanspruchten Bauteilen mit gleichzeitig hoher Vorspannung bzw. Mittelspannung. Trotzdem sollen sie den Forderungen des Leichtbaus genuegen. Realisierbar ist dies nur durch hochfeste Federwerkstoffe mit speziellen Eigenschaften, die sorgfaeltig zu ueberlegt konstruierten Federn verarbeitet werden, und wenn zusaetzliche Behandlungen durchgefuehrt werden, die einen guenstigen Eigenspannungszustand in der Federrandschicht erzeugen. Nach einem kurzen historischen Rueckblick auf die Forschungsaktivitaeten zur Zeit von August Thum wird zunaechst das Ermuedungsverhalten warm geformter Parabelfedern behandelt. Der folgende Abschnitt befasst sich mit den Schwingfestigkeitseigenschaften kalt geformter Schraubendruckfedern aus verschiedenen Federstahldraehten. Abschliessend wird ueber das Ermuedungsverhalten von Schraubenfedern bei sehr hohen Schwingspielzahlen berichtet. (Abstract Copyright [2005], Wiley Periodicals, Inc.)

  12. Experimenting with Inexpensive Plastic Springs

    Science.gov (United States)

    Perez, Leander; Marques, Adriana; Sánchez, Iván

    2014-01-01

    A common undergraduate laboratory experience is the determination of the elastic constant of a spring, whether studying the elongation under a static load or studying the damped harmonic motion of the spring with a suspended mass. An alternative approach to this laboratory experience has been suggested by Menezes et al., aimed at studying the…
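
    For the static-load variant of this classic lab, the elastic constant is the slope of force versus elongation (Hooke's law, F = kx). A short sketch of the through-the-origin least-squares fit; the bench data below are hypothetical, not from the article:

    ```python
    def spring_constant(masses_kg, extensions_m, g=9.81):
        """Least-squares slope of F = k*x through the origin, with
        F = m*g from the suspended masses. Returns k in N/m."""
        forces = [m * g for m in masses_kg]
        num = sum(f * x for f, x in zip(forces, extensions_m))
        den = sum(x * x for x in extensions_m)
        return num / den

    # Hypothetical data: 50 g steps stretching a soft plastic spring.
    masses = [0.05, 0.10, 0.15, 0.20]
    exts = [0.024, 0.049, 0.076, 0.099]
    k = spring_constant(masses, exts)   # roughly 20 N/m for these data
    ```

    The same constant can be cross-checked dynamically from the oscillation period of a suspended mass, since T = 2π√(m/k).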

  13. Big data for bipolar disorder

    National Research Council Canada - National Science Library

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-01-01

    .... The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events...

  14. Big Data and Perioperative Nursing.

    Science.gov (United States)

    Westra, Bonnie L; Peterson, Jessica J

    2016-10-01

    Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data has the potential to affect care far beyond the original patient. Copyright © 2016 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  15. Big Lake Dam Inspection Report

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes an inspection of the Big Lake Dam that was done in September of 1983. The inspection did not reveal any conditions that constitute and...

  16. Arab Spring Impact on Executive Education in Egypt

    Science.gov (United States)

    Wafa, Dina

    2015-01-01

    Purpose: The purpose of this paper is to study the impact of the Arab Spring on public administration programs in Egypt, with a special focus on executive education programs. Design/Methodology/Approach: The study draws on stakeholder analysis, and uses both primary and secondary data. Findings: The author describes the impact of the Arab Spring…

  17. Clean Cities Now: Vol. 18, No. 1, Spring 2014 (Newsletter)

    Energy Technology Data Exchange (ETDEWEB)

    2014-04-01

    Spring 2014 edition of the biannual newsletter of the U.S. Department of Energy's Clean Cities program. Each issue contains program news, success stories, and information about tools and resources to assist in the deployment of alternative fuels, advanced vehicles, idle reduction, fuel efficiency improvements, and other measures to cut petroleum use in transportation.

  18. A bountiful spring harvest

    CERN Multimedia

    2013-01-01

    Although we recently put the clocks forward and spring has officially begun, the view from my window looks more autumnal – befitting of the season of mists and mellow fruitfulness, rather than that of sowing seeds for the future. Which, in a way is appropriate. With the LHC paused, we are reaping a kind of harvest in the form of recognition for our efforts.   Two weeks ago, I was in Edinburgh, on behalf of everyone at CERN, to collect the Edinburgh medal, which we shared with Peter Higgs. I particularly like the citation for this honour: “The Edinburgh Medal is awarded each year to men and women of science and technology whose professional achievements are judged to have made a significant contribution to the understanding and well-being of humanity.” I like this, because it underlines a fact that needs to be shouted louder – that fundamental science does more than build the sum of human knowledge, it is also the foundation of human well-being. A few d...

  19. Spring comes for ATLAS

    CERN Multimedia

    Butin, F.

    2004-01-01

    (First published in the CERN weekly bulletin 24/2004, 7 June 2004.) A short while ago the ATLAS cavern underwent a spring clean, marking the end of the installation of the detector's support structures and the cavern's general infrastructure. The list of infrastructure to be installed in the ATLAS cavern from September 2003 was long: a thousand tonnes of mechanical structures spread over 13 storeys, two lifts, two 65-tonne overhead travelling cranes 25 metres above cavern floor, with a telescopic boom and cradle to access the remaining 10 metres of the cavern, a ventilation system for the 55 000 cubic metre cavern, a drainage system, a standard sprinkler system and an innovative foam fire-extinguishing system, as well as the external cryogenic system for the superconducting magnets and the liquid argon calorimeters (comprising, amongst other things, two helium refrigeration units, a nitrogen refrigeration unit and 5 km of piping for gaseous or liquid helium and nitrogen), not to mention the handling eq...

  20. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO₂ utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  1. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  2. Big Data and Ambulatory Care

    OpenAIRE

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2014-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an ov...

  3. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, comprises research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of carbon sources and sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings, the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to ensure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  4. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) evaluation of carbon sources and sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) development of a GIS-based reporting framework that links with national networks; (3) design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which complements the ongoing DOE research agenda in carbon sequestration. The geology of the Big Sky Carbon Sequestration Partnership region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}.
Overall every sedimentary formation investigated

  5. Aircraft Survivability. Spring 2009

    Science.gov (United States)

    2009-01-01

    Conclusions and Points to Remember.” Program Features An opening video plays when the program is first launched. After the video, the introduction...two years, producing facility systems integration layouts for a Titan missile launch complex. Walt then spent three years at the US Air Force...military services, and the DoD. Of particular note, Walt became an expert at assembling movies (and later videos) of test results and using some of his

  6. Powering Big Data for Nursing Through Partnership.

    Science.gov (United States)

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  7. Stalin’s Big-Fleet Program

    Science.gov (United States)

    2004-01-01

    of the nuclear submarine Kursk with its entire crew—the Russian navy still remains nuclear and the second most powerful in the world. It overreached...Soviet Battleship Construction], Marine-Rundschau 71 (1974), pp. 461–79; S. Breyer, “Sowjetischer Schlachtschiffbau,” Marine-Rundschau 72 (1975

  8. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and, consequently, wrong scientific conclusions.
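The spurious-correlation phenomenon the abstract describes is easy to reproduce. The sketch below (a minimal NumPy illustration, not code from the paper) draws a response and 1,000 features that are all mutually independent noise; with only 50 samples, some feature nevertheless appears strongly correlated with the response simply because the maximum is taken over so many candidates:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 1000

# y and X are generated independently: every feature is pure noise,
# so every population correlation with y is exactly zero.
y = rng.standard_normal(n)
X = rng.standard_normal((n, d))

# Absolute sample correlation between y and each of the d features.
yc = y - y.mean()
Xc = X - X.mean(axis=0)
corr = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))

print(f"max |corr| over first 10 features: {corr[:10].max():.2f}")
print(f"max |corr| over all {d} features:  {corr.max():.2f}")
```

Scanning more features can only raise the observed maximum, so the largest spurious correlation grows with dimensionality even though every true correlation is zero: exactly the risk the authors flag for variable selection in high dimensions.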

  9. Learning big data with Amazon Elastic MapReduce

    CERN Document Server

    Singh, Amarkant

    2014-01-01

    This book is aimed at developers and system administrators who want to learn about Big Data analysis using Amazon Elastic MapReduce. Basic Java programming knowledge is required. You should be comfortable with using command-line tools. Prior knowledge of AWS, API, and CLI tools is not assumed. Also, no exposure to Hadoop and MapReduce is expected.

  10. Holy springs and holy water: underestimated sources of illness?

    Science.gov (United States)

    Kirschner, Alexander K T; Atteneder, Michael; Schmidhuber, Angelika; Knetsch, Sonja; Farnleitner, Andreas H; Sommer, Regina

    2012-09-01

    Use of holy springs and holy water is inherent in religious activities. Holy spring water is also used extensively for personal drinking water, although not assessed according to drinking water standards. Holy water in churches and chapels may cause infections via wetting of lips and sprinkling on persons. Our aim was to assess the microbiological and chemical water quality of holy springs and holy water in churches and hospital chapels. Of the holy springs investigated, only 14% met the microbiological and chemical requirements of national drinking water regulations. Considering results from sanitary inspections of the water catchments, no spring was assessed as a reliable drinking water source. All holy water samples from churches and hospital chapels showed extremely high concentrations of HPC; fecal indicators, Pseudomonas aeruginosa and Staphylococcus aureus occurred only in the most frequently visited churches. We conclude that it is highly necessary to include holy springs in programs for assessment and management of water quality. Public awareness has to be raised to perceive holy springs as potential sources of illness. Holy water can be another source of infection, especially in hospital chapels and frequently visited churches. Recommendations are made for proper water quality management of both water types.

  11. Evaluating connection of aquifers to springs and streams, Great Basin National Park and vicinity, Nevada

    Science.gov (United States)

    Prudic, David E.; Sweetkind, Donald S.; Jackson, Tracie R.; Dotson, K. Elaine; Plume, Russell W.; Hatch, Christine E.; Halford, Keith J.

    2015-12-22

    Federal agencies that oversee land management for much of the Snake Range in eastern Nevada, including the management of Great Basin National Park by the National Park Service, need to understand the potential extent of adverse effects to federally managed lands from nearby groundwater development. As a result, this study was developed (1) to attain a better understanding of aquifers controlling groundwater flow on the eastern side of the southern part of the Snake Range and their connection with aquifers in the valleys, (2) to evaluate the relation between surface water and groundwater along the piedmont slopes, (3) to evaluate sources for Big Springs and Rowland Spring, and (4) to assess groundwater flow from southern Spring Valley into northern Hamlin Valley. The study focused on two areas—the first, a northern area along the east side of Great Basin National Park that included Baker, Lehman, and Snake Creeks, and a second southern area that is the potential source area for Big Springs. Data collected specifically for this study included the following: (1) geologic field mapping; (2) drilling, testing, and water quality sampling from 7 test wells; (3) measuring discharge and water chemistry of selected creeks and springs; (4) measuring streambed hydraulic gradients and seepage rates from 18 shallow piezometers installed into the creeks; and (5) monitoring stream temperature along selected reaches to identify places of groundwater inflow.

  12. Collaborative Approaches Needed to Close the Big Data Skills Gap

    Directory of Open Access Journals (Sweden)

    Steven Miller

    2014-04-01

    The big data and analytics talent discussion has largely focused on a single role – the data scientist. However, the need is much broader than data scientists. Data has become a strategic business asset. Every professional occupation must adapt to this new mindset. Universities in partnership with industry must move quickly to ensure that the graduates they produce have the required skills for the age of big data. Existing curricula should be reviewed and adapted to ensure relevance. New curricula and degree programs are needed to meet the needs of industry.

  13. Envisioning the future of 'big data' biomedicine.

    Science.gov (United States)

    Bui, Alex A T; Van Horn, John Darrell

    2017-05-01

    Through the increasing availability of more efficient data collection procedures, biomedical scientists now confront ever larger sets of data, often struggling to process and interpret what they have gathered even as still more data accumulates. This torrent of biomedical information necessitates creative thinking about how the data are generated, how they might best be managed and analyzed, and how they can eventually be transformed into further scientific understanding for improving patient care. Recognizing this as a major challenge, the National Institutes of Health (NIH) has spearheaded the "Big Data to Knowledge" (BD2K) program - the agency's most ambitious biomedical informatics effort to date. In this commentary, we describe how the NIH has taken on "big data" science head-on, how a consortium of leading research centers is developing the means for handling large-scale data, and how such activities are being marshalled for the training of a new generation of biomedical data scientists. All in all, the NIH BD2K program seeks to position data science at the heart of 21st-century biomedical research.

  14. A Portuguese Spring

    Science.gov (United States)

    The author was in Portugal on a two-month Embassy Science Fellows Program [http://www.state.gov/g/oes/rls/fs/2003/17863.htm]. The U.S. State Department draws upon other federal agencies to provide scientific and technical expertise to American embassies around the world. Portugal...

  15. Fish Springs weather CY 2010

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Weather data for calendar year 2010 at Fish Springs National Wildlife Refuge. Data is provided for each month and includes maximum temperature, minimum temperature,...

  16. Steller's Eider spring migration surveys

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Annual spring aerial surveys were conducted most years from 1992 to 2008, to monitor the population status and habitat use of Steller's eiders (Polysticta stelleri)...

  17. Big Results From a Smaller Gearbox

    Science.gov (United States)

    2005-01-01

    Many people will be sad to see the Hubble Space Telescope go, as it was the first instrument of its kind to provide us with such a wealth of imagery and information about the galaxy. The telescope has served us well since its launch in spring of 1990, but it is nearly time for its retirement. The science, however, will continue, as NASA plans the launch of a new, more modern orbiting telescope, the James Webb Space Telescope. Named after the man who ran NASA from 1961 to 1968, years fraught with the anxiety and uncertainty of the Space Race, the scope is scheduled for launch in fall of 2011. It is designed to study the earliest galaxies and some of the first stars formed after the Big Bang. NASA scientists at the Goddard Space Flight Center are busy developing the technologies to build this new machine. Many of the new technologies are available for commercial licensing and development. For example, the NASA Planetary Gear System technology developed to give precise nanometer positioning capabilities for the James Webb Space Telescope is now being employed by Turnkey Design Services, LLC (TDS), of Blue Island, Illinois, to improve electric motors. This revolutionary piece of technology allows more efficient operation of the motors, and is more cost- effective than traditional gearbox designs.

  18. Considerations on Geospatial Big Data

    Science.gov (United States)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. As we enter the big data era, fully understanding the changing research paradigm associated with geospatial data will benefit future research on big data. In this paper, we examine these issues in depth by analyzing the components and features of geospatial big data, reviewing relevant scientific research methodologies, and tracing the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’, and that it is important to explore the generality of growing geospatial data ‘from bottom to top’. In particular, four research areas that most clearly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future, and some challenges and thoughts are raised for future discussion.

  19. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  20. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  1. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  2. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  3. A practical guide to big data research in psychology.

    Science.gov (United States)

    Chen, Eric Evan; Wojcik, Sean P

    2016-12-01

    The massive volume of data that now covers a wide variety of human behaviors offers researchers in psychology an unprecedented opportunity to conduct innovative theory- and data-driven field research. This article is a practical guide to conducting big data research, covering data management, acquisition, processing, and analytics (including key supervised and unsupervised learning data mining methods). It is accompanied by walkthrough tutorials on data acquisition, text analysis with latent Dirichlet allocation topic modeling, and classification with support vector machines. Big data practitioners in academia, industry, and the community have built a comprehensive base of tools and knowledge that makes big data research accessible to researchers in a broad range of fields. However, big data research does require knowledge of software programming and a different analytical mindset. For those willing to acquire the requisite skills, innovative analyses of unexpected or previously untapped data sources can offer fresh ways to develop, test, and extend theories. When conducted with care and respect, big data research can become an essential complement to traditional research.
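As a flavor of the classification step mentioned above, here is a minimal sketch (not the article's accompanying tutorial code) of a linear support vector machine trained by subgradient descent on the regularized hinge loss, written in plain NumPy so it stays self-contained; real projects would typically use a library implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy binary classification data: two Gaussian clouds, labels in {-1, +1}.
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

# Linear SVM: minimize  lam/2 * ||w||^2 + mean(max(0, 1 - y * (X @ w + b)))
w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for _ in range(200):
    margins = y * (X @ w + b)
    active = margins < 1.0                     # margin violators drive the update
    grad_w = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(y)
    grad_b = -y[active].sum() / len(y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = (np.sign(X @ w + b) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On well-separated clouds like these, the learned hyperplane separates the classes almost perfectly; the hinge loss ignores points already beyond the margin, which is what makes the solution sparse in support vectors.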

  4. The Rise of Big Data in Neurorehabilitation.

    Science.gov (United States)

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  5. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of carbon sources and sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research agenda in carbon sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to ensure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  6. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  7. Spatial big data for disaster management

    Science.gov (United States)

    Shalini, R.; Jayapratha, K.; Ayeshabanu, S.; Chemmalar Selvi, G.

    2017-11-01

    Big data refers to data sets so large and complex that traditional data-processing application software is inadequate to deal with them. Big data is now a widely known domain in research, academia, and industry, used to store large amounts of information in a single centralized repository. Its challenges include capture, storage, analysis, data accuracy, visualization, sharing, transfer, querying, updating, and information privacy. In this digital world, storing data and retrieving information are enormous tasks for large organizations, and data can be lost through distributed storage; to address this, organizations consolidate all company-related data into one large database, known as big data. Remote sensing is the science of acquiring information to detect objects or analyze an area from a distance; with sensors, objects can be located easily. Because remote sensing generates geographic information from satellite and sensor data, this paper analyzes the architectures used for remote sensing in big data, how these architectures differ from one another, and how they relate to our studies. The paper describes how disasters occur and computes results over a data set, applying a seismic data set to estimate earthquake disaster impact using classification and clustering methods. The classical data-mining algorithms used for classification are k-nearest neighbors, naive Bayes, and decision table; for clustering, hierarchical, density-based, and simple k-means, using the XLMiner and WEKA tools. 
This paper also helps to predict over the spatial data set by applying the XLMiner and WEKA tools and
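As a small illustration of the clustering side of such an analysis, the sketch below (a generic NumPy implementation, not code from the paper or from the WEKA/XLMiner tools it uses) runs simple k-means on synthetic two-dimensional event coordinates and recovers two spatial clusters:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic event coordinates: two well-separated spatial clusters.
pts = np.vstack([rng.normal((0.0, 0.0), 0.5, (40, 2)),
                 rng.normal((5.0, 5.0), 0.5, (40, 2))])

def kmeans(X, k, iters=20):
    # Initialize centroids with k distinct data points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: nearest centroid for every point.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(pts, k=2)
print("centroids:", centroids.round(1))
```

Library implementations add refinements (k-means++ seeding, empty-cluster handling, convergence checks), but the two-step assign/update loop above is the core of the algorithm the paper applies.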

  8. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  9. Worse than a big rip?

    Energy Technology Data Exchange (ETDEWEB)

    Bouhmadi-Lopez, Mariam [Centro Multidisciplinar de Astrofisica, CENTRA, Departamento de Fisica, Instituto Superior Tecnico, Av. Rovisco Pais 1, 1049-001 Lisbon (Portugal); Departamento de Fisica, Universidade da Beira Interior, R. Marques d'Avila e Bolama, 6201-001 Covilha (Portugal); Institute of Cosmology and Gravitation, University of Portsmouth, Mercantile House, Hampshire Terrace, Portsmouth, PO1 2EG (United Kingdom)], E-mail: mariam.bouhmadi@fisica.ist.utl.pt; Gonzalez-Diaz, Pedro F. [Colina de los Chopos, Centro de Fisica 'Miguel A. Catalan', Instituto de Matematicas y Fisica Fundamental, Consejo Superior de Investigaciones Cientificas, Serrano 121, 28006 Madrid (Spain)], E-mail: p.gonzalezdiaz@imaff.cfmac.csic.es; Martin-Moruno, Prado [Colina de los Chopos, Centro de Fisica 'Miguel A. Catalan', Instituto de Matematicas y Fisica Fundamental, Consejo Superior de Investigaciones Cientificas, Serrano 121, 28006 Madrid (Spain)], E-mail: pra@imaff.cfmac.csic.es

    2008-01-17

    We show that a generalised phantom Chaplygin gas can present a future singularity in a finite future cosmic time. Unlike the big rip singularity, this singularity happens for a finite scale factor, but like the big rip singularity, it would also take place at a finite future cosmic time. In addition, we define a dual of the generalised phantom Chaplygin gas which satisfies the null energy condition. Then, in a Randall-Sundrum 1 brane-world scenario, we show that the same kind of singularity at a finite scale factor arises for a brane filled with a dual of the generalised phantom Chaplygin gas.

  10. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information created in recent times. New conclusions can be drawn and new services developed by connecting, processing and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, effective use of these opportunities has several preconditions: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues, and the current actions toward their solution, are also presented.

  11. Little Science to Big Science: Big Scientists to Little Scientists?

    Science.gov (United States)

    Simonton, Dean Keith

    2010-01-01

    This article presents the author's response to Hisham B. Ghassib's essay entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Professor Ghassib's (2010) essay presents a provocative portrait of how the little science of the Babylonians, Greeks, and Arabs became the Big Science of the modern industrial…

  12. Big Cities, Big Problems: Reason for the Elderly to Move?

    NARCIS (Netherlands)

    Fokkema, T.; de Jong-Gierveld, J.; Nijkamp, P.

    1996-01-01

    In many European countries, data on geographical patterns of internal elderly migration show that the elderly (55+) are more likely to leave than to move to the big cities. Besides emphasising the attractive features of the destination areas (pull factors), it is often assumed that this negative

  13. Aquatic biological communities and associated habitats at selected sites in the Big Wood River Watershed, south-central Idaho, 2014

    Science.gov (United States)

    MacCoy, Dorene E.; Short, Terry M.

    2016-09-28

    Assessments of streamflow (discharge) parameters, water quality, physical habitat, and biological communities were completed between May and September 2014 as part of a monitoring program in the Big Wood River watershed of south-central Idaho. The sampling was conducted by the U.S. Geological Survey in cooperation with Blaine County, Trout Unlimited, the Nature Conservancy, and the Wood River Land Trust to help identify the status of aquatic resources at selected locations in the watershed. Information in this report provides a basis with which to evaluate and monitor the long-term health of the Big Wood River and its major tributaries. Sampling sites were co-located with existing U.S. Geological Survey streamgaging stations: three on the main stem Big Wood River and four on the North Fork Big Wood River (North Fork), Warm Springs Creek (Warm Sp), Trail Creek (Trail Ck), and East Fork Big Wood River (East Fork) tributaries. The analytical results and quality-assurance information for water quality, physical habitat, and biological community samples collected at study sites during 2 weeks in September 2014 are summarized. Water-quality data include concentrations of major nutrients, suspended sediment, dissolved oxygen, and fecal-coliform bacteria. To assess the potential effects of nutrient enrichment on algal growth, concentrations of periphyton biomass (chlorophyll-a and ash free dry weight) in riffle habitats were determined at each site. Physical habitat parameters include stream channel morphology, habitat volume, instream structure, substrate composition, and riparian vegetative cover. Biological data include taxa richness, abundance, and stream-health indicator metrics for macroinvertebrate and fish communities.
    Statistical summaries of the water-quality, habitat, and biological data are provided along with discussion of how these findings relate to the health of aquatic resources in the Big Wood River watershed. Seasonal discharge patterns using statistical

  14. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  15. Big Data: Implications for Health System Pharmacy

    OpenAIRE

    Stokes, Laura B.; Rogers, Joseph W.; Hertig, John B.; Weber, Robert J.

    2016-01-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this artic...

  16. Big Data Analytics and Its Applications

    OpenAIRE

    Memon, Mashooque Ahmed; Soomro, Safeeullah; Jumani, Awais Khan; Kartio, Muneer Ahmed

    2017-01-01

    The term, Big Data, has been authored to refer to the extensive heave of data that can't be managed by traditional data handling methods or techniques. The field of Big Data plays an indispensable role in various fields, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care stocks. Big data analytics is the method for looking at big data to reveal hidden patterns, incomprehensible relationship and other important data that can be uti...

  17. Weldon Spring Site Environmental Report for Calendar Year 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    This Weldon Spring Site Environmental Report for Calendar Year 1995 has been prepared to provide information about the public safety and environmental protection programs conducted by the Weldon Spring Site Remedial Action Project (WSSRAP). The Weldon Spring site is located in southern St. Charles County, Missouri, approximately 48 km (30 mi) west of St. Louis. The site consists of two main areas, the Weldon Spring Chemical Plant and raffinate pits and the Weldon Spring Quarry. The chemical plant, raffinate pits, and quarry are located on Missouri State Route 94, southwest of U.S. Route 40/61. The objectives of the Site Environmental Report are to present a summary of data from the environmental monitoring program, to characterize trends and environmental conditions at the site, and to confirm compliance with environmental and health protection standards and requirements. The report also presents the status of remedial activities and the results of monitoring these activities to assess their impacts on the public and environment. This report includes monitoring data from routine radiological and nonradiological sampling activities. These data include estimates of dose to the public from the Weldon Spring site, estimates of effluent releases, and trends in groundwater contaminant levels. Additionally, applicable compliance requirements, quality assurance programs, and special studies conducted in 1995 to support environmental protection programs are discussed. Dose estimates presented in this report are based on hypothetical exposure scenarios for public use of areas near the site. In addition, release estimates have been calculated on the basis of 1995 National Pollutant Discharge Elimination System (NPDES) and air monitoring data. Effluent discharges from the site under routine NPDES and National Emission Standards for Hazardous Air Pollutants (NESHAPs) monitoring were below permitted levels.

  18. Big sagebrush seed bank densities following wildfires

    Science.gov (United States)

    Big sagebrush (Artemisia spp.) is a critical shrub to many wildlife species including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires and big sagebrush seed is generally short-lived and does not s...

  19. Judging Big Deals: Challenges, Outcomes, and Advice

    Science.gov (United States)

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good deals…

  20. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  1. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  2. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, Vice-President of the European Commission, speaks of the 'Big Data

  3. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  4. A survey of big data research

    OpenAIRE

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang,Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions.

  5. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is the subject of many publications. Some pay attention to the many possibilities of big data; others warn us of its consequences. This special

  6. A survey of big data research.

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions.

  7. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which presents challenges as to how the city’s functions are seen and

  8. The Big European Bubble Chamber

    CERN Multimedia

    1977-01-01

    The 3.70 metre Big European Bubble Chamber (BEBC), dismantled on 9 August 1984. During operation it was one of the biggest detectors in the world, producing direct visual recordings of particle tracks. 6.3 million photos of interactions were taken with the chamber in the course of its existence.

  9. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  10. The Case for "Big History."

    Science.gov (United States)

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  11. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  12. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
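The abstract does not spell out the paper's extractor construction. As a hedged illustration of the basic idea of randomness extraction, here is the classical von Neumann extractor (a textbook baseline, not the paper's method), which turns a stream of independent but biased bits into unbiased ones:

```python
def von_neumann_extract(bits):
    """Von Neumann extractor: read the input in pairs; emit the first bit
    of each unequal pair (1,0)->1, (0,1)->0; discard equal pairs.
    For independent bits with fixed bias p, both outputs occur with
    probability p*(1-p), so the output is unbiased."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

# A biased stream (mostly 1s) still yields unbiased output bits.
sample = [1, 1, 1, 0, 0, 1, 1, 0, 1, 1]
print(von_neumann_extract(sample))  # -> [1, 0, 1]
```

The price of this simplicity is throughput: equal pairs are thrown away, which is exactly the efficiency problem that modern extractors for very large ("big") sources are designed to overcome.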

  13. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  14. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first examine the main basic concepts of equal treatment and discrimination (section 6.2). We then look at the Dutch and European legal framework on non-discrimination (sections 6.3-6.5) and at how those rules should be applied to big

  15. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    In recent years, dealing with large volumes of data originating from social media sites and mobile communications, along with data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, and to provide a real image of supply and demand, thereby generating market advantages. Companies that turn to Big Data thus have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper discusses aspects of the Big Data concept and the principles for building, organizing, and analysing huge datasets in the business environment, offering a three-layer architecture based on current software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  16. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting which will showcase the architecture of the GIS framework and initial results for sources and sinks, discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. 
The Partnership recognizes the critical importance of measurement, monitoring, and verification

  17. Instant Spring for Android starter

    CERN Document Server

    Dahanne, Anthony

    2013-01-01

    Packt Instant Starter: get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks.This is a Starter which gives you an introduction to Spring for Android with plenty of well-explained practical code examples.If you are an Android developer who wants to learn about RESTful web services and OAuth authentication and authorization, and you also want to know how to speed up your development involving those architectures using Spring for Android abstractions, then this book is for you.But core Java developers

  18. User News. Volume 17, Number 1 -- Spring 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This is a newsletter for users of the DOE-2, PowerDOE, SPARK, and BLAST building energy simulation programs. The topics for the Spring 1996 issue include the SPARK simulation environment, DOE-2 validation, listing of free fenestration software from LBNL, Web sites for building energy efficiency, the heat balance method of calculating building heating and cooling loads.

  19. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species ofWild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  20. Big advance towards the LHC upgrade

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    The LHC is currently the world’s most powerful accelerator. With its technical achievements it has already set world records. However, big science looks very far ahead in time and is already preparing for the LHC’s magnet upgrade, which should involve a 10-fold increase of the collision rates toward the end of the next decade. The new magnet technology involves the use of an advanced superconducting material that has just started to show its potential.   The first Long Quadrupole Shell (LQS01) model during assembly at Fermilab. The first important step in the qualification of the new technology for use in the LHC was achieved at the beginning of December when the US LHC Accelerator Research Program (LARP) – a consortium of Brookhaven National Laboratory, Fermilab, Lawrence Berkeley National Laboratory and the SLAC National Accelerator Laboratory founded by US Department Of Energy (DOE) in 2003 – successfully tested the first long focussing magnet th...

  1. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  2. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Nowadays companies are starting to realize the importance of using more data in order to support decisions for their strategies. It has been said, and proved through case studies, that “more data usually beats better algorithms”. With this in mind, companies have started to realize that they can choose to invest more in processing larger sets of data rather than in expensive algorithms. A large quantity of data is better used as a whole because of the possible correlations across the larger amount, correlations that can never be found if the data is analyzed in separate or smaller sets. A larger amount of data gives better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  3. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  4. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  5. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  6. BigSUR

    KAUST Repository

    Kelly, Tom

    2017-11-22

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data-sources are typically noisy, and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.

  7. Research Synopsis: Spring 1983 Retention.

    Science.gov (United States)

    Peralta Community Coll. District, Oakland, CA. Office of Research, Planning and Development.

    An analysis of spring 1983 retention rates and grade distributions within the Peralta Community College District (PCCD) revealed: (1) College of Alameda had the highest successful retention rate in the PCCD, defined as the total of all students who completed the term with a grade of A, B, C, D, or CR (credit); (2) the PCCD's successful retention…

  8. Spring for It: First Novels

    Science.gov (United States)

    Hoffert, Barbara

    2010-01-01

    How do publishers describe the first novels they will be releasing this spring and summer? "Amazing," "fabulous," and "unique" are words that pop up frequently, though hats off to one publicist forthright or cheeky enough to call a work "weird Western/horror." The proof of such praise is in the reading, but why not check out this preview of first…

  9. A Laboratory of Spring. Introduction

    Directory of Open Access Journals (Sweden)

    Witold Wachowski

    2013-12-01

    Full Text Available Introduction to a special issue published on the occasion of the 100th anniversary of the premiere of 'The Rite of Spring' by Igor Stravinsky. The articles cover the field of musicology as well as history, philosophy, psychology, sociology, ethnography and cognitive science of music.

  10. Finding Spring on Planet X

    Science.gov (United States)

    Simoson, Andrew J.

    2007-01-01

    For a given orbital period and eccentricity, we determine the maximum time lapse between the winter solstice and the spring equinox on a planet. In addition, given an axial precession path, we determine the effects on the seasons. This material can be used at various levels to illustrate ideas such as periodicity, eccentricity, polar coordinates,…

  11. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  12. Optimization of holddown spring in KOFA

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Jung Sik; Lee, Jae Kyung; Sohn, Dong Sung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-08-01

    As part of establishing the capability for design change and optimization of fuel components, the analysis method for the holddown spring was improved and an optimization capability for the holddown spring was established. With the improved analysis method, the characteristics of the 17 x 17 KOFA holddown spring and of a straight-type holddown spring were analyzed, and the results were compared with test results. This showed that the improved analysis method predicts the spring characteristics in good agreement with the test results. As an example of optimization, the thickness of the straight-type holddown spring was optimized and the characteristics at this thickness were discussed. (Author) 39 figs., 2 tabs.

  13. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  15. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  16. Big Data hvor N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the application of 'big data' in health has only just begun, and may in time become a great help in organizing a more personal and holistic health effort for people with multiple chronic conditions. Personal health technology, which is briefly presented in this chapter, holds great potential for carrying out 'big data' analyses for the individual person, that is, where N=1. There are major technological challenges in building technologies and methods to collect and handle personal data that can be shared across systems in a standardized, responsible, robust, secure and not...

  17. Big Data in Transport Geography

    DEFF Research Database (Denmark)

    Reinau, Kristian Hegner; Agerholm, Niels; Lahrmann, Harry Spaabæk

    The emergence of new tracking technologies and Big Data has caused a transformation of the transport geography field in recent years. One new datatype, which is starting to play a significant role in public transport, is smart card data. Despite the growing focus on smart card data, there is a need for studies that explicitly compare the quality of this new type of data to traditional data sources. With the current focus on Big Data in the transport field, public transport planners are increasingly looking towards smart card data to analyze and optimize flows of passengers. However, in many cases not all public transport passengers in a city, region or country with a smart card system actually use the system, and in such cases it is important to know what biases smart card data has in relation to giving a complete view of passenger flows. This paper therefore analyses the quality and biases...

  18. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  19. Geohydrologic Investigations and Landscape Characteristics of Areas Contributing Water to Springs, the Current River, and Jacks Fork, Ozark National Scenic Riverways, Missouri

    Science.gov (United States)

    Mugel, Douglas N.; Richards, Joseph M.; Schumacher, John G.

    2009-01-01

    The Ozark National Scenic Riverways (ONSR) is a narrow corridor that stretches for approximately 134 miles along the Current River and Jacks Fork in southern Missouri. Most of the water flowing in the Current River and Jacks Fork is discharged to the rivers from springs within the ONSR, and most of the recharge area of these springs is outside the ONSR. This report describes geohydrologic investigations and landscape characteristics of areas contributing water to springs and the Current River and Jacks Fork in the ONSR. The potentiometric-surface map of the study area for 2000-07 shows that the groundwater divide extends beyond the surface-water divide in some places, notably along Logan Creek and the northeastern part of the study area, indicating interbasin transfer of groundwater between surface-water basins. A low hydraulic gradient occurs in much of the upland area west of the Current River associated with areas of high sinkhole density, which indicates the presence of a network of subsurface karst conduits. The results of a low base-flow seepage run indicate that most of the discharge in the Current River and Jacks Fork was from identified springs, and a smaller amount was from tributaries whose discharge probably originated as spring discharge, or from springs or diffuse groundwater discharge in the streambed. Results of a temperature profile conducted on an 85-mile reach of the Current River indicate that the lowest average temperatures were within or downstream from inflows of springs. A mass-balance on heat calculation of the discharge of Bass Rock Spring, a previously undescribed spring, resulted in an estimated discharge of 34.1 cubic feet per second (ft3/s), making it the sixth largest spring in the Current River Basin. The 13 springs in the study area for which recharge areas have been estimated accounted for 82 percent (867 ft3/s of 1,060 ft3/s) of the discharge of the Current River at Big Spring during the 2006 seepage run. 
Including discharge from
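
    The mass-balance-on-heat calculation mentioned for Bass Rock Spring can be sketched as a simple mixing computation (the numbers below are illustrative, not the report's field data): river water upstream of the spring (discharge Q_up at temperature T_up) mixes with spring water (unknown discharge Q_spring at T_spring) to produce the measured downstream temperature T_down.

```python
# Hedged sketch of a mass-balance-on-heat discharge estimate. Assumes
# complete mixing and negligible heat exchange over the reach; the function
# name and example values are illustrative, not taken from the report.
def spring_discharge(q_up, t_up, t_spring, t_down):
    """Solve Q_up*T_up + Q_s*T_spring = (Q_up + Q_s)*T_down for Q_s."""
    return q_up * (t_down - t_up) / (t_spring - t_down)

# Example: 100 ft3/s of 25 degC river water cooled to 22 degC by 14 degC
# spring inflow implies a spring discharge of 37.5 ft3/s.
q = spring_discharge(100.0, 25.0, 14.0, 22.0)  # 37.5
```

    Checking the result: (100 x 25 + 37.5 x 14) / 137.5 = 22 degC, the observed downstream temperature.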

  20. Relationship Between Big Five Personality Traits, Emotional Intelligence and Self-esteem Among College Students

    OpenAIRE

    Fauzia Nazir, Anam Azam, Muhammad Rafiq, Sobia Nazir, Sophia Nazir, Shazia Tasleem

    2015-01-01

    The current research study examined the “Relationship between Big Five Personality Traits, Emotional Intelligence and Self-esteem among College Students”. This work is based on a cross-sectional survey research design. A convenience sample of 170 female students was used, drawn from the 3rd-year and 4th-year degree program at Government College Kotla Arab Ali Khan, Gujrat, Pakistan. The study variables were measured using the Big Five Inventory Scale by Goldberg (1993), Emotional Intell...

  1. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    OpenAIRE

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computati...
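
    The split/analyze/meta-analyze workflow named in the title can be illustrated with a minimal sketch (the function names and the fixed-effect pooling choice are assumptions for illustration, not the authors' exact procedure): split a large sample into manageable chunks, estimate an effect per chunk, then pool the estimates with inverse-variance weights.

```python
# Sketch of a split/analyze/meta-analyze pipeline on simulated data.
import random
import statistics

def analyze(chunk):
    """Analyze step: return (mean, squared standard error) for one chunk."""
    m = statistics.mean(chunk)
    se2 = statistics.variance(chunk) / len(chunk)
    return m, se2

def meta_analyze(results):
    """Meta-analyze step: fixed-effect, inverse-variance weighted mean."""
    weights = [1.0 / se2 for _, se2 in results]
    return sum(w * m for (m, _), w in zip(results, weights)) / sum(weights)

random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(100_000)]           # "big" sample
chunks = [data[i:i + 10_000] for i in range(0, len(data), 10_000)]  # split
estimate = meta_analyze([analyze(c) for c in chunks])               # pool
```

    Because each chunk estimate is weighted by its precision, the pooled value recovers essentially the same answer as analyzing the full dataset at once, while each chunk stays small enough for conventional tools.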

  2. Big Data in Cancer Genomics

    OpenAIRE

    Chin, Suet Feung; Maia, AT; Jacinta-Fernandes, A; Sammut, Stephen John

    2017-01-01

    Advances in genomic technologies in the last decade have revolutionised the field of medicine, especially in cancer, by producing a large amount of genetic information, often referred to as Big Data. The identification of genetic predisposition changes, prognostic signatures, and cancer driver genes, which when mutated can act as genetic biomarkers for both targeted treatments and disease monitoring, has greatly advanced our understanding of cancer. However, there are still many challenges, s...

  3. Portrait of a Geothermal Spring, Hunter's Hot Springs, Oregon.

    Science.gov (United States)

    Castenholz, Richard W

    2015-01-27

    Although alkaline Hunter's Hot Springs in southeastern Oregon has been studied extensively for over 40 years, most of these studies and the subsequent publications predate the advent of molecular methods. However, there are many field observations and laboratory experiments that reveal the major aspects of the phototrophic species composition within various physical and chemical gradients of these springs. Relatively constant temperature boundaries demarcate the upper limit of the unicellular cyanobacterium Synechococcus at 73-74 °C (the world-wide upper limit for photosynthesis) and the upper limit for Chloroflexus at 68-70 °C. The upper limit for the cover of the filamentous cyanobacterium Geitlerinema (Oscillatoria) is at 54-55 °C, and the in situ lower limit for all three of these phototrophs is at 47-48 °C, set by the upper temperature limit for the grazing ostracod, Thermopsis. The in situ upper limit for the cyanobacteria Pleurocapsa and Calothrix is at ~47-48 °C; these taxa are more grazer-resistant and grazer-dependent. All of these demarcations are easily visible in the field. In addition, there is biosulfide production in some sections of the springs that has a large impact on the microbiology. Most of the temperature and chemical limits have been explained by field and laboratory experiments.

  4. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  5. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  6. Invisible Roles of Doctoral Program Specialists

    Science.gov (United States)

    Bachman, Eva Burns; Grady, Marilyn L.

    2016-01-01

    The purpose of this study was to investigate the roles of doctoral program specialists in Big Ten universities. Face-to-face interviews with 20 doctoral program specialists employed in institutions in the Big Ten were conducted. Participants were asked to describe their roles within their work place. The doctoral program specialists reported their…

  7. Weldon Spring Site Environmental Report for calendar year 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-01

    This report for Calendar Year 1994 has been prepared to provide information about the public safety and environmental protection programs conducted by the Weldon Spring Site Remedial Action Project (WSSRAP). The Weldon Spring site is located in southern St. Charles County, Missouri, approximately 48 km (30 mi) west of St. Louis. The site consists of two main areas, the Weldon Spring Chemical Plant and raffinate pits and the Weldon Spring Quarry. The chemical plant, raffinate pits, and quarry are located on Missouri State Route 94, southwest of US Route 40/61. The objectives of the Site Environmental Report are to present a summary of data from the environmental monitoring program, to characterize trends and environmental conditions at the site, and to confirm compliance with environmental and health protection standards and requirements. The report also presents the status of remedial activities and the results of monitoring these activities to assess their impacts on the public and environment. This report includes monitoring data from routine radiological and nonradiological sampling activities. These data include estimates of dose to the public from the Weldon Spring site, estimates of effluent releases, and trends in groundwater contaminant levels. Additionally, applicable compliance requirements, quality assurance programs, and special studies conducted in 1994 to support environmental protection programs are discussed. Dose estimates presented in this report are based on hypothetical exposure scenarios of public use of areas near the site. In addition, release estimates have been calculated on the basis of 1994 National Pollutant Discharge Elimination System (NPDES) and air monitoring data. Effluent discharges from the site under routine NPDES and National Emission Standards for Hazardous Air Pollutants (NESHAPS) monitoring were below permitted levels.

  8. Triangular springs for modeling nonlinear membranes.

    Science.gov (United States)

    Delingette, Hervé

    2008-01-01

    This paper provides a formal connection between springs and continuum mechanics in the context of one-dimensional and two-dimensional elasticity. In a first stage, the equivalence between tensile springs and the finite element discretization of stretching energy on planar curves is established. Furthermore, when considering a quadratic strain function of stretch, we introduce a new type of springs called tensile biquadratic springs. In a second stage, we extend this equivalence to non-linear membranes (St Venant-Kirchhoff materials) on triangular meshes, leading to triangular biquadratic and quadratic springs. These tensile and angular springs produce isotropic deformations parameterized by Young's modulus and Poisson's ratio on unstructured meshes in an efficient and simple way. For a specific choice of the Poisson ratio, 0.3, we show that regular spring-mass models may be used realistically to simulate a membrane behavior. Finally, the different spring formulations are tested in pure traction and cloth simulation experiments.
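
    As a sketch of the basic building block behind such spring-mass models, a linear tensile spring between two mesh nodes stores energy E = k/2 (|x_j - x_i| - L)^2 and exerts a force along the edge (this is the generic linear spring, not the paper's biquadratic formulation; names are illustrative):

```python
# Minimal sketch of the tensile-spring force on one node of a 2D mesh edge.
import math

def spring_force(xi, xj, rest_length, k):
    """Force on node i from a linear tensile spring to node j (2D)."""
    dx = xj[0] - xi[0]
    dy = xj[1] - xi[1]
    dist = math.hypot(dx, dy)
    # Magnitude proportional to the deviation from rest length, directed
    # along the spring axis; exactly zero at the rest length.
    scale = k * (dist - rest_length) / dist
    return (scale * dx, scale * dy)
```

    Summing this force over every edge incident to a node, plus the angular terms the paper adds, gives the nodal forces integrated in a spring-mass simulation.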

  9. Fish Springs NWR Water Use Report : 2010

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report contains locations and water use at Fish Springs National Wildlife Refuge (NWR) for 2010. A general background is presented on historical spring water...

  10. Amendment #2 to the Hunting and Fishing Plan : Mingo National Wildlife Refuge : Spring Firearm Turkey Hunting

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This amendment to the Mingo NWR Hunting and Fishing Plan expands the turkey hunting program to include the Missouri Spring Firearm turkey season.

  11. The first CERN Spring Campus

    CERN Multimedia

    CERN Bulletin

    2014-01-01

    From 14 to 16 April, the first edition of the CERN Spring Campus took place in Spain. Over three intensive days, this event brought experts from CERN together at the University of Oviedo, where they met the engineers and scientists of the future in a programme of scientific and technological dissemination and cultural exchange.   The young participants of the first CERN Spring Campus and their instructors show their enthusiasm after the intensive three-day course. “This three-day school focuses on preparing young engineers for the job market, with a particular emphasis on computing,” explains Derek Mathieson, Advanced Information Systems Group Leader in the GS Department and Head of the CERN Spring Campus organising committee. “We organised talks on entrepreneurship and IT, as well as on job interviews and CV writing. It was also an important opportunity for the participants to meet CERN computing engineers to find out what it is like to work in I...

  12. 75 FR 39241 - Hooper Springs Project

    Science.gov (United States)

    2010-07-08

    ... Bonneville Power Administration Hooper Springs Project AGENCY: Bonneville Power Administration (BPA... Hooper Springs Project). The new BPA substation would be called Hooper Springs Substation and would be... 115-kV Lane Creek Substation, east of the City of Wayan, Idaho. The proposed project would address...

  13. 49 CFR 236.822 - Switch, spring.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Switch, spring. 236.822 Section 236.822 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Switch, spring. A switch equipped with a spring device which forces the points to their original position...

  14. Okanogan Basin Spring Spawner Report for 2007.

    Energy Technology Data Exchange (ETDEWEB)

    Colville Tribes, Department of Fish & Wildlife

    2007-09-01

    The Okanogan Basin Monitoring and Evaluation Program collected data related to spring spawning anadromous salmonid stocks across the entire Okanogan River basin. Data were collected using redd surveys, traps, underwater video, and PIT-tag technology, then summarized and analyzed using simple estimate models. From these efforts we estimated that 1,266 summer steelhead spawned in the Okanogan River basin and constructed 552 redds; 152 of these fish were of natural origin. Of these, 121 summer steelhead, including 29 of natural origin, created an estimated 70 redds in the Canadian portion of the Okanagan basin. We estimated summer steelhead spawner escapement into each sub-watershed along with the number from natural origin and the number and density of redds. We documented redd desiccation in Loup Loup Creek, habitat utilization in Salmon Creek as a result of a new water lease program, and 10 spring Chinook returning to Omak Creek. High water through most of the redd survey period resulted in the development of new modeling techniques and allowed us to survey additional tributaries, including the observation of summer steelhead spawning in Wanacut Creek. These 2007 data provide additional support that redd surveys conducted within the United States are well founded and provide essential information for tracking the recovery of listed summer steelhead. Conversely, redd surveys do not appear to be the best approach for enumerating steelhead spawners or their distribution within Canada. We also identified that spawning distributions within the Okanogan River basin vary widely and that stocking location may play an overriding role in this variability.

  15. The international management of big scientific research programs. The example of particle physics; La gestion internationale des grands programmes de recherche scientifique l'exemple de la physique des particules

    Energy Technology Data Exchange (ETDEWEB)

    Feltesse, J. [CEA Saclay, Dept. d' Astrophysique, de Physique des Particules, de Physique Nucleaire et de l' Instrumentation Associee, 91- Gif sur Yvette (France); Comite de Directives Scientifique du CERN (France)

    2004-07-01

    High energy physics is a basic research domain with well-established European and international cooperation. Cooperation can take different forms depending on the size of the facilities involved (accelerators), on their financing, and on the type of experiments that use these facilities. CERN, the European center for nuclear research, created in October 1954, is the best example of such cooperation. This article first examines the legal and scientific structure of CERN and the mode of organization of big experiments. Then, it presents the role of international committees in the establishment of a common scientific policy in Europe and in the rest of the world. Finally, the possible future evolution of CERN towards a worldwide project is discussed. (J.S.)

  16. Final Scientific/Technical Report to the U.S. Department of Energy on NOVA's Einstein's Big Idea (Project title: E-mc2, A Two-Hour Television Program on NOVA)

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Susanne

    2007-05-07

    A woman in the early 1700s who became one of Europe’s leading interpreters of mathematics and a poor bookbinder who became one of the giants of nineteenth-century science are just two of the pioneers whose stories NOVA explored in Einstein’s Big Idea. This two-hour documentary premiered on PBS in October 2005 and is based on the best-selling book by David Bodanis, E=mc2: A Biography of the World’s Most Famous Equation. The film and book chronicle the scientific challenges and discoveries leading up to Einstein’s startling conclusion that mass and energy are one, related by the formula E = mc2.

  17. The Design of Intelligent Repair Welding Mechanism and Relative Control System of Big Gear

    Directory of Open Access Journals (Sweden)

    Hong-Yu LIU

    2014-10-01

    Full Text Available Effective repair of worn big gears has a large influence on ensuring production safety and enhancing economic benefits. A kind of intelligent repair welding method was put forward, mainly aimed at the constraints of big-gear repair: high production cost, long production cycle and high-intensity manual repair welding work. A big-gear repair welding mechanism was designed in this paper, and its working principle and part selection are introduced. The three-dimensional model of the big-gear repair welding mechanism was constructed in the Pro/E three-dimensional design software. Three-dimensional motions are realized by motors driving ball screws. According to the involute gear feature, the complicated curved motion on the gear surface can be transformed into linear motion by orientation. In this way, repair welding on a worn gear area can be realized. For the control system of the big-gear repair welding mechanism, Siemens S7-200 series hardware was chosen, with Siemens STEP7 programming software as the system design tool. The entire repair welding process was simulated by experiment. This provides a practical and feasible method for the intelligent repair welding of big worn gears.

  18. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  19. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  20. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  1. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  2. Medical big data: promise and challenges

    OpenAIRE

    Choong Ho Lee; Hyung-Jin Yoon

    2017-01-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct fr...

  3. How could discharge management affect Florida spring fish assemblage structure?

    Science.gov (United States)

    Work, Kirsten; Codner, Keneil; Gibbs, Melissa

    2017-08-01

Freshwater bodies are increasingly affected by reductions in water quantity and quality and by invasions of exotic species. To protect water quantity and maintain the ecological integrity of many water bodies in central Florida, a program of adopting Minimum Flows and Levels (MFLs) has begun for both lentic and lotic waters. The purpose of this study was to determine whether there were relationships between discharge and stage, water quality, and biological parameters for Volusia Blue Spring, a first-magnitude spring (discharge > 380,000 m³/day, or 100 mgd) for which an MFL program was adopted in 2006. Over the course of fourteen years, we assessed fish density and diversity weekly, monthly, or seasonally with seine and snorkel counts. We evaluated annual changes in the assemblages for relationships with water quantity and quality. Low discharge and dissolved oxygen combined with high stage and conductivity produced a fish population with a lower density and diversity in 2014 than in previous years. Densities of fish taxonomic/functional groups also were low in 2014 and measures of water quantity were significant predictors of fish assemblage structure. As a result of the strong relationships between variation in discharge and an array of chemical and biological characteristics of the spring, we conclude that maintaining the historical discharge rate is important for preserving the ecological integrity of Volusia Blue Spring. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Multi-Objective Big Bang–Big Crunch Optimization Algorithm For Recursive Digital Filter Design

    OpenAIRE

    Ms. Rashmi Singh; Dr. H. K. Verma

    2012-01-01

The paper presents the design of a recursive second-order Butterworth low-pass digital filter that optimizes both the magnitude and group delay simultaneously under the Multi-Objective Big Bang-Big Crunch Optimization algorithm. The multi-objective problem of magnitude and group delay is solved using the Multi-Objective BB-BC Optimization algorithm, which operates on a complex, continuous search space and optimizes by statistically determining the abilities of the Big Bang phase and the Big Crunch phase. Her...
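The abstract describes the heuristic only in outline; the core Big Bang-Big Crunch loop (random scatter, then contraction to a fitness-weighted center of mass with shrinking spread) can be sketched as follows. This is a minimal single-objective illustration on a toy sphere function, not the authors' filter-design code; the function name `big_bang_big_crunch` is illustrative.

```python
import random

def big_bang_big_crunch(objective, dim, bounds, pop_size=50, iters=100, seed=0):
    """Minimize `objective` with a basic Big Bang-Big Crunch loop.

    Big Bang: scatter candidates randomly; Big Crunch: contract them to a
    fitness-weighted center of mass; repeat with a spread that shrinks
    with the iteration count.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    # Initial Big Bang: uniform random candidates over the search box.
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best, best_f = None, float("inf")
    for it in range(1, iters + 1):
        fits = [objective(x) for x in pop]
        for x, f in zip(pop, fits):
            if f < best_f:
                best, best_f = list(x), f
        # Big Crunch: center of mass weighted by inverse fitness (lower is better).
        weights = [1.0 / (1e-12 + f) for f in fits]
        total = sum(weights)
        center = [sum(w * x[d] for w, x in zip(weights, pop)) / total
                  for d in range(dim)]
        # Next Big Bang: normal scatter around the center, shrinking as 1/it.
        spread = (hi - lo) / it
        pop = [[min(hi, max(lo, center[d] + rng.gauss(0, 1) * spread))
                for d in range(dim)] for _ in range(pop_size)]
    return best, best_f

# Toy run: minimize the sphere function, whose optimum is the origin.
sol, val = big_bang_big_crunch(lambda x: sum(v * v for v in x), dim=2, bounds=(-5, 5))
```

A multi-objective variant, as in the paper, would replace the scalar fitness with a dominance-based ranking over the magnitude and group-delay objectives.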

  5. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as some applications of big data optimization for interested academics and practitioners, to the benefit of society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  6. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.
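The propensity-score methods mentioned above can be illustrated with a small simulation. The sketch below uses inverse-probability weighting on synthetic data; all variable names are illustrative, and for brevity the true treatment propensity is used directly rather than estimated from the data, as it would be in practice.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = rng.standard_normal(n)                       # confounder (e.g. disease severity)
p_treat = 1.0 / (1.0 + np.exp(-1.5 * x))         # sicker patients treated more often
t = rng.random(n) < p_treat                      # treatment assignment
y = 2.0 * t - 3.0 * x + rng.standard_normal(n)   # true treatment effect = +2.0

# Naive comparison: biased because treated patients are sicker on average.
naive = y[t].mean() - y[~t].mean()

# Inverse-probability weighting: reweight each group by 1/Pr(observed treatment),
# then compare weighted means (Hajek estimator).
ps = p_treat
w = t / ps + (~t) / (1.0 - ps)
ipw = (np.sum(w * t * y) / np.sum(w * t)) - (np.sum(w * (~t) * y) / np.sum(w * (~t)))
```

On this synthetic example the naive contrast is badly biased (even its sign is wrong), while the weighted estimate recovers the true effect of +2.0, which is the point the abstract makes about residual confounding.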

  7. Storage and Database Management for Big Data

    Science.gov (United States)

    2015-07-27

Vijay Gadepally, Jeremy Kepner, and Albert Reuther, MIT Lincoln Laboratory, Lexington, MA, USA (vijayg@ll.mit.edu, kepner@ll.mit.edu, reuther@ll.mit.edu). Distribution A: Public Release, July 27, 2015. Figure 1.2: A standard big data pipeline consists of five steps to go from raw...

  8. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop, and for those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have basic knowledge of R.

  9. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  10. Organizational Design Challenges Resulting From Big Data

    Directory of Open Access Journals (Sweden)

    Jay R. Galbraith

    2014-04-01

    Full Text Available Business firms and other types of organizations are feverishly exploring ways of taking advantage of the big data phenomenon. This article discusses firms that are at the leading edge of developing a big data analytics capability. Firms that are currently enjoying the most success in this area are able to use big data not only to improve their existing businesses but to create new businesses as well. Putting a strategic emphasis on big data requires adding an analytics capability to the existing organization. This transformation process results in power shifting to analytics experts and in decisions being made in real time.

  11. Biomass Experiment for Wyoming big sagebrush (Artemisia tridentata subsp. wyomingensis), Spring 2010

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The semi-arid sagebrush steppe ecosystem is one of the largest biomes in North America. The steppe provides critical habitat and forage for wildlife and is...

  12. The Big Bang Theory on TV: How to make a Big Bang with the Public

    Science.gov (United States)

    Prady, Bill

    2011-04-01

Is it possible for a television sitcom to accurately portray scientists? Probably not, but with some effort it can accurately portray science. Since its debut in 2007, The Big Bang Theory on CBS has striven to include accurate and current references to physics, astrophysics and other disciplines. This attention to detail (which means that Big Bang is the first television comedy to employ a physicist as a consultant) is an obsession of its co-creator and executive producer, Bill Prady. Prady, whose twenty-six-year career in television has taken him from Jim Henson's Muppets to this current project, began his working life as a computer programmer. His frustration with how inaccurately science and technology are generally depicted in film and television led him to ask if it was possible to be both correct and funny. Using clips from the show as examples, we will engage in a discussion of the depiction of science on this program and in popular entertainment in general.

  13. Spring Recipes A Problem-solution Approach

    CERN Document Server

    Long, Josh; Mak, Gary

    2010-01-01

With over 3 million users/developers, the Spring Framework is the leading "out of the box" Java framework. Spring addresses and offers simple solutions for most aspects of your Java/Java EE application development, and guides you to use industry best practices to design and implement your applications. The release of Spring Framework 3 has ushered in many improvements and new features. Spring Recipes: A Problem-Solution Approach, Second Edition builds upon the bestselling success of the previous edition but focuses on the latest Spring 3 features for building enterprise Java applications.

  14. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  15. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  16. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions connected with big data. In this first part I will try to set out the relevant Big Data theory and

  17. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  18. Big Deployables in Small Satellites

    OpenAIRE

    Davis, Bruce; Francis, William; Goff, Jonathan; Cross, Michael; Copel, Daniel

    2014-01-01

    The concept of utilizing small satellites to perform big mission objectives has grown from a distant idea to a demonstrated reality. One of the challenges in using small-satellite platforms for high-value missions is the packaging of long and large surface-area devices such as antennae, solar arrays and sensor positioning booms. One possible enabling technology is the slit-tube, or a deployable “tape-measure” boom which can be flattened and rolled into a coil achieving a high volumetric packa...

  19. Fitting ERGMs on big networks.

    Science.gov (United States)

    An, Weihua

    2016-09-01

The exponential random graph model (ERGM) has become a valuable tool for modeling social networks. In particular, ERGM provides great flexibility to account for both covariate effects on tie formation and endogenous network formation processes. However, there are both conceptual and computational issues for fitting ERGMs on big networks. This paper describes a framework and a series of methods (based on existing algorithms) to address these issues. It also outlines the advantages and disadvantages of the methods and the conditions to which they are most applicable. Selected methods are illustrated through examples. Copyright © 2016 Elsevier Inc. All rights reserved.
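One standard computational shortcut in this area is maximum pseudolikelihood estimation (MPLE). Under the simplifying assumption of a dyad-independent ERGM (an edges term plus one dyadic covariate), MPLE reduces exactly to logistic regression over dyads. The sketch below (hypothetical `ergm_mple`, not the paper's code) fits such a model by gradient ascent and recovers known parameters from a simulated network.

```python
import numpy as np

def ergm_mple(adj, covariate, steps=2000, lr=0.1):
    """MPLE for a dyad-independent ERGM with an edges term and one dyadic
    covariate: each dyad's tie is Bernoulli with
    logit = theta0 + theta1 * covariate, so the pseudolikelihood is an
    ordinary logistic likelihood over dyads (maximized here by gradient ascent)."""
    n = adj.shape[0]
    iu = np.triu_indices(n, k=1)               # undirected: upper-triangle dyads
    y = adj[iu].astype(float)
    X = np.column_stack([np.ones(y.size), covariate[iu]])
    theta = np.zeros(2)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ theta))
        theta += lr * X.T @ (y - p) / y.size   # averaged log-likelihood gradient
    return theta

# Simulate a 300-node network with known parameters and recover them.
rng = np.random.default_rng(0)
n = 300
z = rng.standard_normal((n, n))
z = (z + z.T) / 2                              # symmetric dyadic covariate
p = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * z)))    # true theta = (-1.0, 0.8)
u = np.triu(rng.random((n, n)), 1)
u = u + u.T                                    # symmetric uniforms for tie draws
adj = (u < p).astype(int)
np.fill_diagonal(adj, 0)
theta = ergm_mple(adj, z)
```

For models with endogenous terms (triangles, stars), MPLE is only an approximation, which is one of the computational issues the paper addresses.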

  20. Big Data and Intelligence: Applications, Human Capital, and Education

    Directory of Open Access Journals (Sweden)

    Michael Landon-Murray

    2016-06-01

Full Text Available The potential for big data to contribute to the US intelligence mission goes beyond bulk collection, social media and counterterrorism. Applications will speak to a range of issues of major concern to intelligence agencies, from military operations to climate change to cyber security. There are challenges too: procurement lags, data stovepiping, separating signal from noise, sources and methods, a range of normative issues, and central to managing these challenges, human capital. These potential applications and challenges are discussed and a closer look at what data scientists do in the Intelligence Community (IC) is offered. Effectively filling the ranks of the IC’s data science workforce will depend on the provision of well-trained data scientists from the higher education system. Program offerings at America’s top fifty universities will thus be surveyed (just a few years ago there were reportedly no degrees in data science). One Master’s program that has melded data science with intelligence is examined as well as a university big data research center focused on security and intelligence. This discussion goes a long way to clarify the prospective uses of data science in intelligence while probing perhaps the key challenge to optimal application of big data in the IC.

  1. Challenges and potential solutions for big data implementations in developing countries.

    Science.gov (United States)

    Luna, D; Mayan, J C; García, M J; Almerares, A A; Househ, M

    2014-08-15

The volume of data, the velocity with which they are generated, and their variety and lack of structure hinder their use. This creates the need to change the way information is captured, stored, processed, and analyzed, leading to the paradigm shift called Big Data. The objective of this review is to describe the challenges and possible solutions for developing countries when implementing Big Data projects in the health sector. A non-systematic review of the literature was performed in PubMed and Google Scholar. The following keywords were used: "big data", "developing countries", "data mining", "health information systems", and "computing methodologies". A thematic review of selected articles was performed. There are challenges when implementing any Big Data program, including exponential growth of data, special infrastructure needs, the need for a trained workforce, the need to agree on interoperability standards, privacy and security issues, and the need to include people, processes, and policies to ensure their adoption. Developing countries have particular characteristics that hinder further development of these projects. The advent of Big Data promises great opportunities for the healthcare field. In this article, we attempt to describe the challenges developing countries would face and enumerate the options to be used to achieve successful implementations of Big Data programs.

  2. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  3. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
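The authors demonstrate their procedure in R; the same split/analyze/meta-analyze idea can be sketched language-neutrally as follows. This toy version estimates a correlation from a large dataset by analyzing chunks separately and pooling the per-chunk estimates with a fixed-effect (inverse-variance) meta-analysis on Fisher-z values; it is an illustration of the approach, not the paper's code.

```python
import math
import random

def split_analyze_meta(xs, ys, n_chunks=10):
    """Split the data into chunks, compute a correlation per chunk,
    then pool the Fisher z-transformed estimates with inverse-variance
    (fixed-effect) weights and back-transform to a correlation."""
    size = len(xs) // n_chunks
    zs, ws = [], []
    for c in range(n_chunks):
        cx = xs[c * size:(c + 1) * size]
        cy = ys[c * size:(c + 1) * size]
        m = len(cx)
        mx, my = sum(cx) / m, sum(cy) / m
        sx = math.sqrt(sum((v - mx) ** 2 for v in cx))
        sy = math.sqrt(sum((v - my) ** 2 for v in cy))
        r = sum((a - mx) * (b - my) for a, b in zip(cx, cy)) / (sx * sy)
        zs.append(0.5 * math.log((1 + r) / (1 - r)))  # Fisher z-transform
        ws.append(m - 3)                              # 1 / var(z) = m - 3
    z_pooled = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(z_pooled)                        # back to a correlation

# Simulate data with true correlation 0.6 and recover it chunk-wise.
rng = random.Random(1)
xs = [rng.gauss(0, 1) for _ in range(5000)]
ys = [0.6 * x + 0.8 * rng.gauss(0, 1) for x in xs]
r_hat = split_analyze_meta(xs, ys)
```

The appeal of the approach is that each chunk analysis can be any familiar psychometric or multivariate procedure, and only the summary statistics need to be combined.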

  4. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    Science.gov (United States)

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  5. Optimizing Hadoop Performance for Big Data Analytics in Smart Grid

    Directory of Open Access Journals (Sweden)

    Mukhtaj Khan

    2017-01-01

Full Text Available The rapid deployment of Phasor Measurement Units (PMUs) in power systems globally is leading to Big Data challenges. New high-performance computing techniques are now required to process an ever increasing volume of data from PMUs. To that end, the Hadoop framework, an open-source implementation of the MapReduce computing model, is gaining momentum for Big Data analytics in smart grid applications. However, Hadoop has over 190 configuration parameters, which can have a significant impact on the performance of the Hadoop framework. This paper presents an Enhanced Parallel Detrended Fluctuation Analysis (EPDFA) algorithm for scalable analytics on massive volumes of PMU data. The novel EPDFA algorithm builds on an enhanced Hadoop platform whose configuration parameters are optimized by Gene Expression Programming. Experimental results show that the EPDFA is 29 times faster than the sequential DFA in processing PMU data and 1.87 times faster than a parallel DFA, which utilizes the default Hadoop configuration settings.
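The sequential DFA that EPDFA parallelizes can be sketched compactly: integrate the signal, split the profile into windows of several sizes, detrend each window with a linear fit, and regress log fluctuation on log window size to obtain the scaling exponent. The version below is a simplified illustration (hypothetical `dfa_exponent`; the paper's Hadoop parallelization and Gene Expression Programming tuning are not reproduced).

```python
import numpy as np

def dfa_exponent(signal, scales=(16, 32, 64, 128, 256)):
    """Sequential Detrended Fluctuation Analysis: return the scaling
    exponent alpha from the log-log fit of fluctuation vs. window size."""
    x = np.asarray(signal, dtype=float)
    profile = np.cumsum(x - x.mean())            # integrated (profile) series
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        t = np.arange(s)
        rms = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            coef = np.polyfit(t, seg, 1)         # linear trend in this window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))              # mean fluctuation at scale s
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

# Uncorrelated white noise should give alpha near 0.5.
rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(20000))
```

Because the per-window detrending steps are independent, they map naturally onto MapReduce: windows are processed in the map phase and the log-log regression is the reduce phase.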

  6. Weldon Spring Site environmental report for calendar year 1993. Weldon Springs Site Remedial Action Project

    Energy Technology Data Exchange (ETDEWEB)

    1994-05-01

This Site Environmental Report for Calendar Year 1993 describes the environmental monitoring programs at the Weldon Spring Site Remedial Action Project (WSSRAP). The objectives of these programs are to assess actual or potential exposure to contaminant effluents from the project area by providing public use scenarios and dose estimates, to demonstrate compliance with Federal and State permitted levels, and to summarize trends and/or changes in contaminant concentrations from the environmental monitoring programs. In 1993, the maximum committed dose to a hypothetical individual at the chemical plant site perimeter was 0.03 mrem (0.0003 mSv). The maximum committed dose to a hypothetical individual at the boundary of the Weldon Spring Quarry was 1.9 mrem (0.019 mSv). These scenarios assume an individual walking along the perimeter of the site (once a day at the chemical plant/raffinate pits and twice a day at the quarry) 250 days per year. This hypothetical individual also consumes fish, sediment, and water from lakes and other bodies of water in the area. The collective dose, based on an affected population of 112,000, was 0.12 person-rem (0.0012 person-Sv). This calculation is based on recreational use of the August A. Busch Memorial Conservation Area and the Missouri Department of Conservation recreational trail (the Katy Trail) near the quarry. These estimates are below the U.S. Department of Energy requirement of 100 mrem (1 mSv) annual committed effective dose equivalent for all exposure pathways. Results from air monitoring for the National Emission Standards for Hazardous Air Pollutants (NESHAPs) program indicated that the estimated dose was 0.38 mrem, which is below the U.S. Environmental Protection Agency (EPA) standard of 10 mrem per year.

  7. CACTUS SPRING ROADLESS AREA, CALIFORNIA.

    Science.gov (United States)

    Matti, Jonathan C.; Kuizon, Lucia

    1984-01-01

    Geologic, geochemical, and geophysical studies together with a review of historic mining and prospecting activities indicate that the Cactus Spring Roadless Area in California has little promise for the occurrence of mineral or energy resources. Marble bodies occur in the northern part of the roadless area and are possible resources for building stone, crushed and quarried aggregate, and lime and magnesium for Portland cement and industrial applications. It is recommended that the terrane of marble be mapped and sampled carefully in order to evaluate the quantity and quality of the carbonate resources.

  8. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  9. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first moments of the Universe, the Big Bang! Thanks to Cosmos, their supercomputer, and the Large Hadron Collider built by Eric, Annie's father, they will at last be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is afoot. Worse, all of scientific research is in peril! Swept into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and today's greatest scientists.

  10. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2017-06-01

    Cancer is a complex disease and unfortunately understanding how the components of the cancer system work does not help understand the behavior of the system as a whole. In the words of the Greek philosopher Aristotle "the whole is greater than the sum of parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand and a large part of the data are in the form of medical images. The opportunity is now to draw insight on the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is at the beginning but several useful applications can be envisaged including development of imaging biomarkers to predict disease outcome, assessing the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimizing patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact on the diagnostic pathway of the oncologic patient.

  11. The BigBOSS spectrograph

    Science.gov (United States)

    Jelinsky, Patrick; Bebek, Chris; Besuner, Robert; Carton, Pierre-Henri; Edelstein, Jerry; Lampton, Michael; Levi, Michael E.; Poppett, Claire; Prieto, Eric; Schlegel, David; Sholl, Michael

    2012-09-01

BigBOSS is a proposed ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a 14,000 square degree galaxy and quasi-stellar object redshift survey. It consists of a 5,000-fiber-positioner focal plane feeding the spectrographs. The optical fibers are separated into ten 500-fiber slit heads at the entrance of ten identical spectrographs in a thermally insulated room. Each of the ten spectrographs has a spectral resolution (λ/Δλ) between 1500 and 4000 over a wavelength range from 360 - 980 nm. Each spectrograph uses two dichroic beam splitters to separate the spectrograph into three arms. It uses volume phase holographic (VPH) gratings for high efficiency and compactness. Each arm uses a 4096x4096 15 μm pixel charge coupled device (CCD) for the detector. We describe the requirements and current design of the BigBOSS spectrograph. Design trades (e.g. refractive versus reflective) and manufacturability are also discussed.

  12. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  13. In Search of the Big Bubble

    Science.gov (United States)

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  14. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  15. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

    In this report we describe results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a show case. This includes both

  16. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, has opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  17. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  18. Big Red: A Development Environment for Bigraphs

    DEFF Research Database (Denmark)

    Faithfull, Alexander John; Perrone, Gian David; Hildebrandt, Thomas

    2013-01-01

We present Big Red, a visual editor for bigraphs and bigraphical reactive systems, based upon Eclipse. The editor integrates with several existing bigraph tools to permit simulation and model-checking of bigraphical models. We give a brief introduction to the bigraphs formalism, and show how these concepts manifest within the tool using a small motivating example bigraphical model developed in Big Red.

  19. Big history and the future of humanity

    NARCIS (Netherlands)

    Spier, F.

    2011-01-01

Big History and the Future of Humanity presents a theoretical approach that makes "big history" - the placing of the human past within the history of life, the Earth, and the Universe - accessible to general readers while revealing insights into what the future may hold for humanity.

  20. Big Science and Long-tail Science

    CERN Multimedia

    2008-01-01

Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  1. The DBMS - your Big Data Sommelier

    NARCIS (Netherlands)

    Kargın, Y.; Kersten, M.; Manegold, S.; Pirk, H.

    2015-01-01

When addressing the problem of "big" data volume, preparation costs are one of the key challenges: the high costs of loading, aggregating and indexing data lead to a long data-to-insight time. In addition to being a nuisance to the end-user, this latency prevents real-time analytics on "big" data.

  2. Kansen voor Big data – WPA Vertrouwen

    NARCIS (Netherlands)

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for

  3. The evolution of risk communication at the Weldon Spring site

    Energy Technology Data Exchange (ETDEWEB)

McCracken, S. [USDOE Weldon Spring Site, St. Charles, MO (United States); Sizemore, M.; Meyer, L. [MK-Ferguson Co., Weldon Spring, MO (United States); Jacobs Engineering Group, Inc., Weldon Spring, MO (United States)]; MacDonell, M.; Haroun, L. [Argonne National Lab., IL (United States)

    1993-11-01

Clear risk communication is one of the keys to establishing a positive relationship with the public at an environmental restoration site. This effort has been evolving at the Weldon Spring site over the past few years, with considerable input from the local community. The recent signing of the major cleanup decision for this site, which identifies on-site disposal as the remedy, reflects the strength of the communication program that has evolved for the project.

  4. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  5. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  6. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  7. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  8. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  9. Impacts of Improved Switchgrass and Big Bluestem Selections on Yield, Morphological Characteristics, and Biomass Quality

    Directory of Open Access Journals (Sweden)

    Erik Delaquis

    2014-01-01

Full Text Available Switchgrass (Panicum virgatum L.) and big bluestem (Andropogon gerardii V.) are promising warm-season grasses for biomass production. Understanding the morphological and quality-related traits of these grasses can guide breeders in developing strategies to improve yield and quality for bioindustrial applications. Elite selections were made in Southern Quebec from four promising varieties of switchgrass and one of big bluestem. Biomass yield, morphological characteristics, and selected quality traits were evaluated at two sites in 2011 and 2012. Significant variation was detected for all measured characteristics, with differences varying by site and year. In some cases the selection process modified characteristics, including increased height and reduced tiller mortality. Switchgrasses reached a similar tiller equilibrium density in both years of 690 m−2 and 379 m−2 at a productive and a marginal site, respectively. Differences in yield were pronounced at the marginal site, with some advanced selections having a higher yield than their parent varieties. Switchgrass yields were generally greater than those of big bluestem. A delayed spring harvest date greatly reduced yield but reduced moisture content and slightly increased cellulose concentration. Big bluestem had a higher cellulose content than switchgrass, likely due to greater stem content.

  10. The Islamic State’s Tactics in Syria: Role of Social Media in Shifting a Peaceful Arab Spring into Terrorism

    Science.gov (United States)

    2017-06-09

The Islamic State's Tactics in Syria: Role of Social Media in Shifting a Peaceful Arab Spring into Terrorism (thesis). Subject terms: Islamic State, Arab Spring, Social Media, Terrorism, Syria.

  11. Forget the hype or reality. Big data presents new opportunities in Earth Science.

    Science.gov (United States)

    Lee, T. J.

    2015-12-01

Earth science is arguably one of the most mature science disciplines, constantly acquiring, curating, and utilizing a large volume of data with diverse variety. We dealt with big data before there was big data. For example, while developing the EOS program in the 1980s, the EOS Data and Information System (EOSDIS) was developed to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has continued to be a shining example of modern science data systems over the past two decades. With the explosion of the internet, the usage of social media, and the provision of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system. The open-source communities quickly followed up and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day. There is a significant shortage of algorithms and knowledgeable manpower to mine the data. In response, the federal government developed big data programs that fund research and development projects and training programs to tackle these new challenges. Meanwhile, compared to the internet data explosion, the Earth science big data problem has become quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline and solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiry.
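The MapReduce workloads mentioned above follow a simple map-shuffle-reduce pattern. A minimal, self-contained sketch of that pattern (illustrative only, not tied to Hadoop's actual API) is:

```python
from collections import defaultdict
from itertools import chain

def map_phase(records, mapper):
    """Apply the mapper to every record, yielding (key, value) pairs."""
    return chain.from_iterable(mapper(r) for r in records)

def shuffle(pairs):
    """Group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Apply the reducer to each key's grouped values."""
    return {key: reducer(key, values) for key, values in groups.items()}

# The classic word-count example: count word occurrences across documents.
docs = ["big data big iron", "big data analysis"]
counts = reduce_phase(
    shuffle(map_phase(docs, lambda doc: ((w, 1) for w in doc.split()))),
    lambda word, ones: sum(ones),
)
# counts == {'big': 3, 'data': 2, 'iron': 1, 'analysis': 1}
```

In a real cluster the map and reduce phases run in parallel across machines and the shuffle moves data over the network; the structure of the computation, however, is exactly this.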

  12. Robert oerley and Tuzla mineral springs

    OpenAIRE

    Ateş Can, Sevim

    2014-01-01

    In the Tuzla borough of İstanbul are located mineral springs of great importance which have been known and used since ancient times. Although the therapeutic value of Tuzla Mineral Springs has always been recognized, this value in terms of the history of architecture has not been very well understood. The current facilities in the springs were built in the Republican Period. The most important among them is the Spa Hotel, whose project was initiated by Robert Oerley. The oth...

  13. The big war over brackets.

    Science.gov (United States)

    Alvarez, R O

    1994-01-01

    The Third Preparatory Committee Meeting for the International Conference on Population and Development (ICPD), PrepCom III, was held at UN headquarters in New York on April 4-22, 1994. It was the last big preparatory meeting leading to the ICPD to be held in Cairo, Egypt, in September 1994. The author attended the second week of meetings as the official delegate of the Institute for Social Studies and Action. Debates mostly focused upon reproductive health and rights, sexual health and rights, family planning, contraception, condom use, fertility regulation, pregnancy termination, and safe motherhood. The Vatican and its allies' preoccupation with discussing language which may imply abortion caused sustainable development, population, consumption patterns, internal and international migration, economic strategies, and budgetary allocations to be discussed less extensively than they should have been. The author describes points of controversy, the power of women at the meetings, and afterthoughts on the meetings.

  14. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.
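Both verification methods mentioned in the abstract exploit the frequency dependence introduced by the Doppler shift. To first order in v/c, these are standard textbook relations (not formulas quoted from the paper itself):

```latex
T(\theta) = T_0\left(1 + \frac{v}{c}\cos\theta\right),
\qquad
\Delta I(\nu,\theta) = \frac{v}{c}\cos\theta
  \left[\,3\,I_0(\nu) - \nu\,\frac{dI_0}{d\nu}\right],
```

so the dipole amplitude at each frequency depends on the slope of the monopole spectrum I_0(ν), which is why measuring the dipole anisotropy as a function of frequency constrains the absolute background spectrum.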

  15. Intelligent search in Big Data

    Science.gov (United States)

    Birialtsev, E.; Bukharaev, N.; Gusenkov, A.

    2017-10-01

An approach to data integration, aimed at ontology-based intelligent search in Big Data, is considered for the case when information objects are represented in the form of relational databases (RDBs), structurally marked by their schemes. The source of information for constructing an ontology and, later on, for organizing the search is text in natural language, treated as semi-structured data. For the RDBs, these are the comments on the names of tables and their attributes. A formal definition of the RDB integration model in terms of ontologies is given. Within the framework of the model, a universal RDB representation ontology, an oil-production subject domain ontology, and a linguistic thesaurus of the subject domain language are built. A technique for the automatic generation of SQL queries for subject domain specialists is proposed. On its basis, an information system for the TATNEFT oil-producing company's RDBs was implemented. Exploitation of the system showed good relevance for the majority of queries.
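The automatic SQL generation described here can be sketched very roughly as a thesaurus lookup followed by query assembly. The schema, term, and field names below are hypothetical, invented purely for illustration; the real system's ontology machinery is far richer:

```python
# Hypothetical thesaurus built from table/column comments: each domain
# term maps to a (table, column) pair in the relational schema.
thesaurus = {
    "daily output": ("wells", "oil_rate"),
    "field":        ("wells", "field_name"),
}

def build_query(select_concepts, filter_concept, value):
    """Translate domain concepts into a SELECT ... WHERE statement."""
    cols = [thesaurus[c][1] for c in select_concepts]
    table = thesaurus[select_concepts[0]][0]
    f_col = thesaurus[filter_concept][1]
    return f"SELECT {', '.join(cols)} FROM {table} WHERE {f_col} = '{value}'"

# A domain specialist asks for daily output and field, filtered by field.
sql = build_query(["daily output", "field"], "field", "Romashkino")
# -> "SELECT oil_rate, field_name FROM wells WHERE field_name = 'Romashkino'"
```

The value of the ontology layer is that the specialist never needs to know the physical table or column names, only the domain vocabulary.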

  16. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid-technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...

  17. Radon activity measurements around Bakreswar thermal springs

    Energy Technology Data Exchange (ETDEWEB)

    Chaudhuri, Hirok [Saha Institute of Nuclear Physics, 1/AF Bidhannagar, Kolkata, West Bengal 700 064 (India); Das, Nisith K., E-mail: nkdas@veccal.ernet.i [Variable Energy Cyclotron Centre, Atomic Energy, 1/AF Bidhannagar, Kolkata, West Bengal 700 064 (India); Bhandari, Rakesh K. [Variable Energy Cyclotron Centre, Atomic Energy, 1/AF Bidhannagar, Kolkata, West Bengal 700 064 (India); Sen, Prasanta [Saha Institute of Nuclear Physics, 1/AF Bidhannagar, Kolkata, West Bengal 700 064 (India); Sinha, Bikash [Variable Energy Cyclotron Centre, Atomic Energy, 1/AF Bidhannagar, Kolkata, West Bengal 700 064 (India)

    2010-01-15

²²²Rn concentrations were measured in the bubble gases, spring waters, soil gases and in the ambient air around the thermal springs at Bakreswar in West Bengal, India. This group of springs lies within a geothermal zone having an exceptionally high heat flow of about 230 mW/m², resembling young oceanic ridges. The spring gas has a high radon activity (≈885 kBq/m³) and is rich in helium (≈1.4 vol.%) with an appreciably large flow rate. The measured radon exhalation rates in the soils of the spring area show extensive variations, from 831 to 4550 mBq/m² h, while ²²²Rn concentrations in the different spring waters vary from 3.18 to 46.9 kBq/m³. Surface air at a radius of 40 m around the springs, within which are situated the Bakreswar temple complex and a group of dwellings, has a radon concentration between 450 and 500 Bq/m³. In the present paper we assess the radon activity background in and around the spring area due to the different contributing sources and its possible effect on visiting pilgrims and the people who reside close to the springs.

  18. Matrix sketching for big data reduction (Conference Presentation)

    Science.gov (United States)

    Ezekiel, Soundararajan; Giansiracusa, Michael

    2017-05-01

In recent years, the concept of Big Data has become a more prominent issue as the volume of data, as well as the velocity at which it is produced, increases exponentially. By 2020 the amount of data being stored is estimated to be 44 zettabytes, and currently over 31 terabytes of data are being generated every second. Algorithms and applications must be able to scale effectively to the volume of data being generated. One such application designed to work effectively and efficiently with Big Data is IBM's Skylark. Part of DARPA's XDATA program, an open-source catalog of tools for dealing with Big Data, Skylark (Sketching-based Matrix Computations for Machine Learning) is a library of functions designed to reduce the complexity of large-scale matrix problems that also implements kernel-based machine learning tasks. Sketching reduces the dimensionality of matrices through randomization and compresses matrices while preserving key properties, speeding up computations. Matrix sketches can be used to find accurate solutions to computations in less time, or can summarize data by identifying important rows and columns. In this paper, we investigate the effectiveness of sketched matrix computations using IBM's Skylark versus non-sketched computations. We judge effectiveness based on several factors: computational complexity and validity of outputs. Initial results from testing with smaller matrices are promising, showing that Skylark achieves a considerable reduction ratio while still accurately performing matrix computations.
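A minimal illustration of the sketching idea (using a plain Gaussian random projection in NumPy, not Skylark's own API) shows how a tall matrix can be compressed while approximately preserving its Gram matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tall "big data" matrix: 10,000 observations in 50 dimensions.
A = rng.standard_normal((10_000, 50))

# Gaussian sketching matrix S with k << n rows; S @ A compresses the
# row dimension while approximately preserving inner products and norms
# (a Johnson-Lindenstrauss-style guarantee).
k = 500
S = rng.standard_normal((k, A.shape[0])) / np.sqrt(k)
A_sketch = S @ A  # shape (500, 50) instead of (10000, 50)

# The sketch's Gram matrix approximates the original's, so downstream
# computations (e.g. least squares) can run on the much smaller matrix.
err = np.linalg.norm(A.T @ A - A_sketch.T @ A_sketch) / np.linalg.norm(A.T @ A)
```

The relative error shrinks as k grows, giving the practitioner an explicit trade-off between sketch size and accuracy.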

  19. Next Generation Workload Management and Analysis System for Big Data

    Energy Technology Data Exchange (ETDEWEB)

    De, Kaushik [Univ. of Texas, Arlington, TX (United States)

    2017-04-24

    We report on the activities and accomplishments of a four-year project (a three-year grant followed by a one-year no cost extension) to develop a next generation workload management system for Big Data. The new system is based on the highly successful PanDA software developed for High Energy Physics (HEP) in 2005. PanDA is used by the ATLAS experiment at the Large Hadron Collider (LHC), and the AMS experiment at the space station. The program of work described here was carried out by two teams of developers working collaboratively at Brookhaven National Laboratory (BNL) and the University of Texas at Arlington (UTA). These teams worked closely with the original PanDA team – for the sake of clarity the work of the next generation team will be referred to as the BigPanDA project. Their work has led to the adoption of BigPanDA by the COMPASS experiment at CERN, and many other experiments and science projects worldwide.

  20. What makes Big Data, Big Data? Exploring the ontological characteristics of 26 datasets

    Directory of Open Access Journals (Sweden)

    Rob Kitchin

    2016-02-01

Full Text Available Big Data has been variously defined in the literature. In the main, definitions suggest that Big Data possess a suite of key traits: volume, velocity and variety (the 3Vs), but also exhaustivity, resolution, indexicality, relationality, extensionality and scalability. However, these definitions lack ontological clarity, with the term acting as an amorphous, catch-all label for a wide selection of data. In this paper, we consider the question ‘what makes Big Data, Big Data?’, applying Kitchin’s taxonomy of seven Big Data traits to 26 datasets drawn from seven domains, each of which is considered in the literature to constitute Big Data. The results demonstrate that only a handful of datasets possess all seven traits, and some do not possess volume and/or variety. Instead, there are multiple forms of Big Data. Our analysis reveals that the key definitional boundary markers are the traits of velocity and exhaustivity. We contend that Big Data as an analytical category needs to be unpacked, with the genus of Big Data further delineated and its various species identified. It is only through such ontological work that we will gain conceptual clarity about what constitutes Big Data, formulate how best to make sense of it, and identify how it might be best used to make sense of the world.

  1. The study on stress-strain state of the spring at high temperature using ABAQUS

    Directory of Open Access Journals (Sweden)

    H Sun

    2014-01-01

Full Text Available Cylindrical helical springs are widely used in the elements of thermal energy devices, and it is necessary to guarantee the stability of the stress state of a spring at high temperature. The phenomenon of stress relaxation is studied in this paper. Calculations are carried out in the ABAQUS environment and verified against analytical calculations. The paper describes the distribution and character of the stress contour lines on the cross section of the spring under an instantaneous load, and explicates how relaxation develops with time. The research object is a cylindrical helical spring working at high temperature; the purpose of this work is to obtain the stress relaxation law of the spring and to guarantee its long-term strength. The article presents the basic theory of the helical spring, establishes a mathematical model of spring creep under compressive and torsional loads, and gives the stress formulas for each component in the cross section of the spring. The calculation process of relaxation is analyzed in the program ABAQUS, and the analytical formulas for spring stress are compared with the simulation results produced by ABAQUS. A finite element model for creep stress analysis in the cross section is created; the spring material is stainless steel 10X18N9T, and the springs are used at a temperature of 650 °C. At the beginning, the stress-strain state of the spring is elastic. The change law of creep stress is then analyzed under the conditions of constant load and of fixed compression. Under fixed compression, the stresses decrease quickly over most of the cross-section area, and the point of minimum shear stress gradually moves toward the outer diameter; because of this, stresses in a small area near the center at first increase slowly and then decrease gradually with time. Under constant load, the stresses decrease quickly in the surrounding area and increase
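For reference, the classical static shear-stress formula for a helical compression spring (a standard textbook result with the Wahl correction factor, not a formula reproduced from the paper) gives the kind of cross-section stress expression the abstract refers to:

```latex
\tau_{\max} = K_w\,\frac{8FD}{\pi d^{3}},
\qquad
K_w = \frac{4C-1}{4C-4} + \frac{0.615}{C},
\qquad
C = \frac{D}{d},
```

where F is the axial load, D the mean coil diameter, and d the wire diameter. The creep analysis in the paper starts from an elastic state of this kind and tracks how the stresses redistribute over time.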

  2. Visualization at Supercomputing Centers: The Tale of Little Big Iron and the Three Skinny Guys

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes; van Rosendale, John; Southard, Dale; Gaither, Kelly; Childs, Hank; Brugger, Eric; Ahern, Sean

    2010-12-01

Supercomputing Centers (SCs) are unique resources that aim to enable scientific knowledge discovery through the use of large computational resources, the Big Iron. Design, acquisition, installation, and management of the Big Iron are activities that are carefully planned and monitored. Since these Big Iron systems produce a tsunami of data, it is natural to co-locate visualization and analysis infrastructure as part of the same facility. This infrastructure consists of hardware (the Little Iron) and staff (the Skinny Guys). Our collective experience suggests that design, acquisition, installation, and management of the Little Iron and Skinny Guys do not receive the same level of treatment as that of the Big Iron. The main focus of this article is to explore different aspects of planning, designing, fielding, and maintaining the visualization and analysis infrastructure at supercomputing centers. Some of the questions we explore in this article include: "How should the Little Iron be sized to adequately support visualization and analysis of data coming off the Big Iron?" and "What sort of capabilities does it need to have?" Related questions concern the size of the visualization support staff: "How big should a visualization program be (number of persons) and what should the staff do?" and "How much of the visualization should be provided as a support service, and how much should applications scientists be expected to do on their own?"

  3. Progress in breeding of Novi Sad spring wheat cultivars

    Directory of Open Access Journals (Sweden)

    Rončević Petar

    2006-01-01

Full Text Available The Institute of Field and Vegetable Crops in Novi Sad began working on spring wheat breeding in 1979 in order to develop cultivars that could be grown in conditions and years unfavorable for winter wheat cultivation. At the start of the program, a collection of spring wheat cultivars from all over the world was assembled for hybridization purposes, with cultivars from Mexico being the most numerous group. Parental pairs were first chosen based on the concept of cultivar, then trait, and, finally and most recently, the concept of gene. After the selection of parental pairs, the hybridization process began, and a total of 1,700 combinations have been made since. The material was bred using pedigree selection. A large number of lines were developed by positive selection and the best among them were tested in variety trials of the State Variety Commission. Based on the results of those trials, 31 spring wheat cultivars from the Novi Sad program have been released so far. Among them, the cultivars Jarka, Nevesinjka (a facultative variety), Venera, and, more recently, Nataša have proven particularly successful in commercial production. Some of these varieties have also been released in foreign countries or are presently being tested for registration abroad. In order to assess the progress of spring wheat breeding at the Institute of Field and Vegetable Crops in Novi Sad, a trial with all the cultivars released by the Institute thus far was set up. Statistical analysis of the trial confirmed that significant progress towards better wheat cultivars has been made since the program was founded.

  4. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are more and more executed electronically, consumers produce more and more data themselves - e.g. in social networks - and digitalization is ever increasing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions are coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  5. Interoperability Outlook in the Big Data Future

    Science.gov (United States)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data and Information System (EOSDIS) doubtlessly propelled the interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, two decades later we obviously still find it wanting. We believe the inadequate interoperability we experience is a result of the current practice in which data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its/his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center

  6. Habitat Quality and Anadromous Fish Production Potential on the Warm Springs Indian Reservation: Annual Report 1987.

    Energy Technology Data Exchange (ETDEWEB)

    Heinith, Robert

    1987-12-01

    In 1987, the Warm Springs Indian Reservation Anadromous Fish Production and Habitat Improvement Program was in the sixth year of a scheduled eleven-year program. To date, 21 kilometers of reservation stream habitat have been enhanced for salmonid production benefits. Unusual climatic conditions created a severe drought throughout the Warm Springs River Basin and Shitike Creek in 1987. Temperature extremes and low annual discharges ensued throughout reservation waters. Study sites, located in the Warm Springs River Basin and Shitike Creek, continued to be monitored for physical and biological parameters. Post-treatment evaluation of bioengineering work in Mill Creek (Strawberry Falls Project) was conducted. Despite low discharges, physical habitat parameters were improved and notable gains were observed in both spring chinook salmon (Oncorhynchus tshawytscha) and summer steelhead trout (Salmo gairdneri) abundance and biomass at post-treatment sites. Major bioengineering work was completed at the Mill Creek (Potter's Pond) Site. 19 refs., 24 figs., 16 tabs.

  7. Simulated big sagebrush regeneration supports predicted changes at the trailing and leading edges of distribution shifts

    Science.gov (United States)

    Schlaepfer, Daniel R.; Taylor, Kyle A.; Pennington, Victoria E.; Nelson, Kellen N.; Martin, Trace E.; Rottler, Caitlin M.; Lauenroth, William K.; Bradford, John B.

    2015-01-01

    Many semi-arid plant communities in western North America are dominated by big sagebrush. These ecosystems are being reduced in extent and quality due to economic development, invasive species, and climate change. These pervasive modifications have generated concern about the long-term viability of sagebrush habitat and sagebrush-obligate wildlife species (notably greater sage-grouse), highlighting the need for better understanding of the future big sagebrush distribution, particularly at the species' range margins. These leading and trailing edges of potential climate-driven sagebrush distribution shifts are likely to be areas most sensitive to climate change. We used a process-based regeneration model for big sagebrush, which simulates potential germination and seedling survival in response to climatic and edaphic conditions and tested expectations about current and future regeneration responses at trailing and leading edges that were previously identified using traditional species distribution models. Our results confirmed expectations of increased probability of regeneration at the leading edge and decreased probability of regeneration at the trailing edge below current levels. Our simulations indicated that soil water dynamics at the leading edge became more similar to the typical seasonal ecohydrological conditions observed within the current range of big sagebrush ecosystems. At the trailing edge, an increased winter and spring dryness represented a departure from conditions typically supportive of big sagebrush. Our results highlighted that minimum and maximum daily temperatures as well as soil water recharge and summer dry periods are important constraints for big sagebrush regeneration. Overall, our results confirmed previous predictions, i.e., we see consistent changes in areas identified as trailing and leading edges; however, we also identified potential local refugia within the trailing edge, mostly at sites at higher elevation. Decreasing

  8. Big data and the electronic health record.

    Science.gov (United States)

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization.

  9. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  10. Ethics and Epistemology of Big Data.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian

    2017-12-01

    In this Symposium on the Ethics and Epistemology of Big Data, we present four perspectives on the ways in which the rapid growth in size of research databanks-i.e. their shift into the realm of "big data"-has changed their moral, socio-political, and epistemic status. While there is clearly something different about "big data" databanks, we encourage readers to place the arguments presented in this Symposium in the context of longstanding debates about the ethics, politics, and epistemology of biobank, database, genetic, and epidemiological research.

  11. Smart Information Management in Health Big Data.

    Science.gov (United States)

    Muteba A, Eustache

    2017-01-01

    The smart information management system (SIMS) is concerned with the organization of anonymous patient records in a big data and their extraction in order to provide needful real-time intelligence. The purpose of the present study is to highlight the design and the implementation of the smart information management system. We emphasis, in one hand, the organization of a big data in flat file in simulation of nosql database, and in the other hand, the extraction of information based on lookup table and cache mechanism. The SIMS in the health big data aims the identification of new therapies and approaches to delivering care.

  12. Rheological behavior of a confined bead-spring cube consisting of equal Fraenkel springs

    NARCIS (Netherlands)

    Denneman, A.I.M.; Denneman, A.I.M.; Jongschaap, R.J.J.; Mellema, J.

    2000-01-01

    A general bead-spring model is used to predict linear viscoelastic properties of a non-Hookean bead-spring cube immersed in a Newtonian fluid. This K×K×K cube consists of K³ beads with equal friction coefficients and 3K²(K−1) equal Fraenkel springs of length q. The cube has a topology based upon
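
    The bead and spring counts quoted above follow from the cubic-lattice topology; a small sketch (illustrative only, not the authors' model) builds a K×K×K lattice and checks the closed forms K³ and 3K²(K−1) by direct enumeration:

```python
from itertools import product

def bead_spring_counts(K):
    """Count beads and nearest-neighbour springs in a K x K x K cubic lattice."""
    beads = list(product(range(K), repeat=3))
    springs = 0
    for i, j, k in beads:
        # A spring joins each pair of beads adjacent along exactly one axis.
        for di, dj, dk in ((1, 0, 0), (0, 1, 0), (0, 0, 1)):
            if i + di < K and j + dj < K and k + dk < K:
                springs += 1
    return len(beads), springs

# Check the closed forms from the abstract: K**3 beads, 3*K**2*(K-1) springs.
for K in (2, 3, 4):
    beads, springs = bead_spring_counts(K)
    assert beads == K**3 and springs == 3 * K**2 * (K - 1)
```

    Each axis contributes K²(K−1) springs (K−1 links per row, K² rows), giving the 3K²(K−1) total stated in the abstract.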

  13. BIG DATA, BIG CONSEQUENCES? EEN VERKENNING NAAR PRIVACY EN BIG DATA GEBRUIK BINNEN DE OPSPORING, VERVOLGING EN RECHTSPRAAK

    OpenAIRE

    Lodder, A.R.; van der Meulen, N.S.; Wisman, T.H.A.; Meij, Lisette; Zwinkels, C.M.M.

    2014-01-01

    This exploratory study examines the privacy aspects of Big Data analysis within the Security and Justice domain. Applications within the judiciary are discussed, such as predicting rulings and use in court cases. With regard to criminal investigation, topics addressed include predictive policing and online investigation. After an exposition of the privacy norms and possible applications, the following six principles for Big Data applications are proposed: 7 A.R. Lodder e.a. ‐ Bi...

  14. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets than can be collected by a single investigator's lab or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies face external and internal pressures that push them toward counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed about, and engaged in, the advocacy and governance of large ecological projects.

  15. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Science.gov (United States)

    2011-02-11

    ...Energy Regulatory Commission: Big Rivers Electric Corporation; Notice of Filing. Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its Second... December 1, 2010, the date that Big Rivers integrated its transmission facilities with the Midwest...

  16. Inleiding symposium 'Silent Spring, 50 jaar later'

    NARCIS (Netherlands)

    Lenteren, van J.C.; Hunneman, H.

    2013-01-01

    Half a century ago, Rachel Carson published the book Silent Spring, which drew attention to environmental problems and marked the beginning of the environmental movement. On Saturday 17 November 2012, the symposium 'Silent Spring, 50 jaar later' took place in Leiden. The Nederlandse Entomologische Vereniging

  17. Nonlinear Vibration of a Magnetic Spring

    Science.gov (United States)

    Zhong, Juhua; Cheng, Zhongqi; Ge, Ziming; Zhang, Yuelan; Lu, Wenqiang; Song, Feng; Li, Chuanyong

    2012-01-01

    To demonstrate the different vibration characteristics of a magnetic spring compared with those of a metal one, a magnetic spring apparatus was constructed from a pair of circular magnets of the same size with an inside diameter of 2.07 cm and an outside diameter of 4.50 cm. To keep the upper magnet in a suspension state, the two magnets were…

  18. 1988 Hanford riverbank springs characterization report

    Energy Technology Data Exchange (ETDEWEB)

    Dirkes, R.L.

    1990-12-01

    This report presents the results of a special study undertaken to characterize the riverbank springs (i.e., ground-water seepage) entering the Columbia River along the Hanford Site. Radiological and nonradiological analyses were performed. River water samples were also analyzed from upstream and downstream of the Site, as well as from the immediate vicinity of the springs. In addition, irrigation return water and spring water entering the river along the shoreline opposite Hanford were analyzed. Hanford-origin contaminants were detected in spring water entering the Columbia River along the Hanford Site. The type and concentrations of contaminants in the spring water were similar to those known to exist in the ground water near the river. The location and extent of the contaminated discharges compared favorably with recent ground-water reports and predictions. Spring discharge volumes remain very small relative to the flow of the Columbia. Downstream river sampling demonstrates the impact of ground-water discharges to be minimal, and negligible in most cases. Radionuclide concentrations were below US Department of Energy Derived Concentration Guides (DCGs), with the exception of ⁹⁰Sr near the 100-N Area. Tritium, while below the DCG, was detected at concentrations above the US Environmental Protection Agency drinking water standards in several springs. All other radionuclide concentrations were below drinking water standards. Nonradiological contaminants were generally undetectable in the spring water. River water contaminant concentrations, outside of the immediate discharge zones, were below drinking water standards in all cases. 19 refs., 5 figs., 12 tabs.

  19. The Damper Spring Unit of the Sentinel 1 Solar Array

    Science.gov (United States)

    Doejaaren, Frans; Ellenbroek, Marcel

    2012-01-01

    The Damper Spring Unit (DSU, see Figure 1) has been designed to provide the damping required to control the deployment speed of the spring-driven solar array deployment in an ARA Mk3 or FRED based Solar Array in situations where the standard application of a damper at the root hinge is not feasible. The unit consists of four major parts: a main bracket, an eddy current damper, a spring unit, and an actuation pulley which is coupled via Kevlar cables to a synchro-pulley of a hinge. The damper slows down the deployment speed and prevents deployment shocks at deployment completion. The spring unit includes 4 springs which overcome the resistances of the damper and the specific DSU control cable loop. This means it can be added to any spring-driven deployment system without major modifications of that system. Engineering models of the Sentinel 1 solar array wing have been built to identify the deployment behavior, to help determine the optimal pulley ratios of the solar array, and to finalize the DSU design. During the functional tests, the behavior proved to be very sensitive to the alignment of the DSU. This was therefore monitored carefully during the qualification program, especially prior to the TV cold testing. During TV "Cold" testing the measured retarding torque exceeded the maximum required value: 284 N·mm versus the required 247 N·mm. Although this requirement was not met, the torque balance analysis shows that the 284 N·mm can be accepted, because the spring unit can provide 1.5 times more torque than required. Some functional tests of the DSU have been performed without the eddy current damper attached. These provided input data for the ADAMS solar array wing model. Simulation of the Sentinel-1 deployment (including the DSU) in ADAMS allowed the actual wing deployment tests to be limited in both complexity and number. The DSU for the Sentinel-1 solar array was successfully qualified and the flight models are in production.
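
    The torque-balance argument can be spot-checked with the figures quoted in the abstract (a back-of-envelope sketch; the 1.5× spring margin and the two torque values come from the text, everything else is assumed):

```python
# Figures quoted in the abstract, in N*mm.
required_retarding_torque = 247.0   # maximum required retarding torque
measured_retarding_torque = 284.0   # value measured during TV cold testing
spring_margin_factor = 1.5          # spring unit delivers 1.5x the required torque

available_spring_torque = spring_margin_factor * required_retarding_torque

# The formal requirement is exceeded by the measurement...
exceeds_requirement = measured_retarding_torque > required_retarding_torque
# ...but the springs still comfortably overcome the measured drag,
# which is why the torque balance analysis accepts the 284 N*mm value.
deployment_assured = available_spring_torque > measured_retarding_torque

print(available_spring_torque, exceeds_requirement, deployment_assured)
```

    The available spring torque (1.5 × 247 = 370.5 N·mm) exceeds the measured 284 N·mm, consistent with the acceptance rationale in the abstract.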

  20. NOAA Big Data Partnership RFI

    Science.gov (United States)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  1. Manchester Spring Chinook Broodstock Project : Progress Report, 2000.

    Energy Technology Data Exchange (ETDEWEB)

    McAuley, W. Carlin; Wastel, Michael R.; Flagg, Thomas A. (Thomas Alvin)

    2000-11-01

    In spring 1995 the Idaho Department of Fish and Game (IDFG) and the Oregon Department of Fish and Wildlife (ODFW) initiated captive broodstocks as part of conservation efforts for ESA-listed stocks of Snake River spring/summer chinook salmon (Oncorhynchus tshawytscha). The need for this captive broodstock strategy was identified as critical in the National Marine Fisheries Service (NMFS) Proposed Recovery Plan for Snake River Salmon. These captive broodstock programs are being coordinated by the Bonneville Power Administration (BPA) through the Chinook Salmon Captive Propagation Technical Oversight Committee (CSCPTOC). Oregon's Snake River spring/summer chinook salmon captive broodstock program currently focuses on three stocks captured as juveniles from the Grande Ronde River Basin: the upper Grande Ronde River, Catherine Creek, and the Lostine River. Idaho's Snake River program includes three stocks captured as eggs and juveniles from the Salmon River Basin: the Lemhi River, East Fork Salmon River, and West Fork Yankee Fork. The majority of captive fish from each stock of the Grande Ronde Basin will be grown to maturity in freshwater at the ODFW Bonneville Hatchery. A minority of the Salmon River Basin stocks will be grown to maturity in freshwater at the IDFG Eagle Hatchery. However, the IDFG and ODFW requested that a portion of each group also be reared in protective culture in seawater. In August 1996, NMFS began a BPA funded project (Project 96-067-00) to rear Snake River spring/summer chinook salmon captive broodstocks in seawater at the NMFS Manchester Research Station. During 1997-1999, facilities modifications were undertaken at Manchester to provide secure facilities for rearing of these ESA-listed fish. This included construction of a building housing a total of twenty 6.1-m diameter fiberglass rearing tanks, upgrade of the Manchester salt water pumping and filtration/sterilization systems to a total capacity of 5,670 L/min (1,500 gpm), and

  2. Estimation of springing response for 550 000 DWT ore carrier

    Science.gov (United States)

    Adenya, Christiaan Adika; Ren, Huilong; Li, Hui; Wang, Di

    2016-09-01

    The desire to benefit from economy of scale is one of the major driving forces behind the continuous growth in ship sizes. However, models of new large ships need to be thoroughly investigated to determine the carrier's response in waves. In this work, experimental and numerical assessments of the motion and load response of a 550,000 DWT ore carrier are performed using prototype ships with softer stiffness, and towing tank tests are conducted using a segmented model with two schemes of softer stiffness. Numerical analyses are performed employing both rigid body and linear hydroelasticity theories using an in-house program and a comparison is then made between experimental and numerical results to establish the influence of stiffness on the ore carrier's springing response. Results show that softer stiffness models can be used when studying the springing response of ships in waves.

  3. Big Bend National Park: Acoustical Monitoring 2010

    Science.gov (United States)

    2013-06-01

    During the summer of 2010 (September–October 2010), the Volpe Center collected baseline acoustical data at Big Bend National Park (BIBE) at four sites deployed for approximately 30 days each. The baseline data collected during this period will be...

  4. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  5. Statistical Challenges in Modeling Big Brain Signals

    KAUST Repository

    Yu, Zhaoxia

    2017-11-01

    Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible solutions, and highlight future research directions.

  6. Statistical Challenges in Modeling Big Brain Signals

    OpenAIRE

    Yu, Zhaoxia; Pluta, Dustin; Shen, Tong; Chen, Chuansheng; Xue, Gui; Ombao, Hernando

    2017-01-01

    Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible solutions, and highlight future research directions.

  7. Fisicos argentinos reproduciran el Big Bang

    CERN Multimedia

    De Ambrosio, Martin

    2008-01-01

    Two groups of Argentine physicists, from the Universities of La Plata and Buenos Aires, are working on a series of experiments that will recreate the conditions of the big explosion at the origin of the universe. (1 page)

  8. 2015 OLC Lidar DEM: Big Wood, ID

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Quantum Spatial has collected Light Detection and Ranging (LiDAR) data for the Oregon LiDAR Consortium (OLC) Big Wood 2015 study area. This study area is located in...

  9. Big data: survey, technologies, opportunities, and challenges

    National Research Council Canada - National Science Library

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range...

  10. Cosmic relics from the big bang

    Energy Technology Data Exchange (ETDEWEB)

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  11. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges for processing and analysis. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, it presents approaches ranging from efficient methods of processing large clinical data sets to predictive models that could generate better predictions from healthcare data.

  12. Applications of big data to smart cities

    National Research Council Canada - National Science Library

    Al Nuaimi, Eiman; Al Neyadi, Hind; Mohamed, Nader; Al-Jaroodi, Jameela

    2015-01-01

    Many governments are considering adopting the smart city concept in their cities and implementing big data applications that support smart city components to reach the required level of sustainability...

  13. Scaling big data with Hadoop and Solr

    CERN Document Server

    Karambelkar, Hrishikesh Vijay

    2015-01-01

    This book is aimed at developers, designers, and architects who would like to build big data enterprise search solutions for their customers or organizations. No prior knowledge of Apache Hadoop and Apache Solr/Lucene technologies is required.

  14. ARC Code TI: BigView

    Data.gov (United States)

    National Aeronautics and Space Administration — BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running linux. Additionally, it can work in a multi-screen environment...

  15. Big data en handschriften van Christiaan Huygens

    NARCIS (Netherlands)

    Damen, J.C.M.

    2013-01-01

    In the eighth installment of a series of combined reviews (digitaalandspeciaal), Jos Damen considers a study of big data from library catalogues and a catalogue of the work of the Dutch genius Christiaan Huygens.

  16. Big data business models: Challenges and opportunities

    Directory of Open Access Journals (Sweden)

    Ralph Schroeder

    2016-12-01

    This paper, based on 28 interviews with a range of business leaders and practitioners, examines the current state of big data use in business, as well as the main opportunities and challenges presented by big data. It begins with an account of the current landscape and what is meant by big data. Next, it draws distinctions between the ways organisations use data and provides a taxonomy of big data business models. We observe a variety of different business models, depending not only on sector, but also on whether the main advantages derive from analytics capabilities or from having ready access to valuable data sources. Some major challenges emerge from this account, including data quality and protectiveness about sharing data. The conclusion discusses these challenges and points to the tensions and differing perceptions about how data should be governed between business practitioners, the promoters of open data, and the wider public.

  17. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Farming is undergoing a digital revolution. Our review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies, and it allows for a focus on the material consequences of big data in society.

  18. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    In this study, some of the most promising Big Data usage domains are connected with distinguished player groups in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data-related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the business ecosystem's major player types disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players are explained through new Big Data opportunities and threats and through players' responsive strategies. System dynamics was used to visualize the relationships in the resulting model.

  19. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, encompassing the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data spans data mining, data analysis, and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future, and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing-system integrity and the implementation of intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing, and decision making in a realistic and effective way. Considering a big size of data, variety of data and frequent chan...

  20. Cincinnati Big Area Additive Manufacturing (BAAM)

    Energy Technology Data Exchange (ETDEWEB)

    Duty, Chad E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  1. COBE looks back to the Big Bang

    Science.gov (United States)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  2. Big data in the new media environment.

    Science.gov (United States)

    O'Donnell, Matthew Brook; Falk, Emily B; Konrath, Sara

    2014-02-01

    Bentley et al. argue for the social scientific contextualization of "big data" by proposing a four-quadrant model. We suggest extensions of the east-west (i.e., socially motivated versus independently motivated) decision-making dimension in light of findings from social psychology and neuroscience. We outline a method that leverages linguistic tools to connect insights across fields that address the individuals underlying big-data media streams.

  3. Weldon Spring historical dose estimate

    Energy Technology Data Exchange (ETDEWEB)

    Meshkov, N.; Benioff, P.; Wang, J.; Yuan, Y.

    1986-07-01

    This study was conducted to determine the estimated radiation doses that individuals in five nearby population groups and the general population in the surrounding area may have received as a consequence of activities at a uranium processing plant in Weldon Spring, Missouri. The study is retrospective and encompasses plant operations (1957-1966), cleanup (1967-1969), and maintenance (1969-1982). The dose estimates for members of the nearby population groups are as follows. Of the three periods considered, the largest doses to the general population in the surrounding area would have occurred during the plant operations period (1957-1966). Dose estimates for the cleanup (1967-1969) and maintenance (1969-1982) periods are negligible in comparison. Based on the monitoring data, if there was a person residing continually in a dwelling 1.2 km (0.75 mi) north of the plant, this person is estimated to have received an average of about 96 mrem/yr (ranging from 50 to 160 mrem/yr) above background during plant operations, whereas the dose to a nearby resident during later years is estimated to have been about 0.4 mrem/yr during cleanup and about 0.2 mrem/yr during the maintenance period. These values may be compared with the background dose in Missouri of 120 mrem/yr.

  4. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  5. Big-bang nucleosynthesis revisited

    Science.gov (United States)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Y_p, is ≤ 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7, we limit the baryon-to-photon ratio (η in units of 10^-10) to 2.6 ≤ η₁₀ ≤ 4.3, which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Y_p of 0.24 constrains the number of light neutrinos to N_ν ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Y_p ≤ 0.245.
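
    The link between the quoted η range and the baryon-density window can be made explicit. A sketch using the standard approximate conversion between the baryon-to-photon ratio and the baryon density parameter, with h the Hubble constant in units of 100 km s⁻¹ Mpc⁻¹ (the factor of 273 and the h range below are standard values supplied here for illustration, not taken from the abstract):

```latex
\eta_{10} \approx 273\,\Omega_b h^2
\quad\Longrightarrow\quad
2.6 \le \eta_{10} \le 4.3
\;\;\Leftrightarrow\;\;
0.0095 \lesssim \Omega_b h^2 \lesssim 0.016
```

    For a Hubble parameter in the range h ≈ 0.4-0.7, this corresponds to Ω_b between roughly 0.02 and 0.11, consistent with the bounds quoted in the abstract.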

  6. Neutrinos and Big Bang Nucleosynthesis

    Directory of Open Access Journals (Sweden)

    Gary Steigman

    2012-01-01

    Full Text Available According to the standard models of particle physics and cosmology, there should be a background of cosmic neutrinos in the present Universe, similar to the cosmic microwave photon background. The weakness of the weak interactions renders this neutrino background undetectable with current technology. The cosmic neutrino background can, however, be probed indirectly through its cosmological effects on big bang nucleosynthesis (BBN and the cosmic microwave background (CMB radiation. In this BBN review, focused on neutrinos and more generally on dark radiation, the BBN constraints on the number of “equivalent neutrinos” (dark radiation, on the baryon asymmetry (baryon density, and on a possible lepton asymmetry (neutrino degeneracy are reviewed and updated. The BBN constraints on dark radiation and on the baryon density following from considerations of the primordial abundances of deuterium and helium-4 are in excellent agreement with the complementary results from the CMB, providing a suggestive, but currently inconclusive, hint of the presence of dark radiation, and they constrain any lepton asymmetry. For all the cases considered here there is a “lithium problem”: the BBN-predicted lithium abundance exceeds the observationally inferred primordial value by a factor of ~3.

  7. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  8. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  9. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  10. Slaves to Big Data. Or Are We?

    Directory of Open Access Journals (Sweden)

    Mireille Hildebrandt

    2013-10-01

    Full Text Available

    In this contribution, the notion of Big Data is discussed in relation to the monetisation of personal data. The claim of some proponents, as well as adversaries, that Big Data implies that ‘n = all’, meaning that we no longer need to rely on samples because we have all the data, is scrutinised and found to be both overly optimistic and unnecessarily pessimistic. A set of epistemological and ethical issues is presented, focusing on the implications of Big Data for our perception, cognition, fairness, privacy and due process. The article then looks into the idea of user-centric personal data management to investigate to what extent it provides solutions for some of the problems triggered by the Big Data conundrum. Special attention is paid to the core principle of data protection legislation, namely purpose binding. Finally, this contribution seeks to inquire into the influence of Big Data politics on self, mind and society, and asks how we can prevent ourselves from becoming slaves to Big Data.

  11. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  12. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  13. Preparing a Data Scientist: A Pedagogic Experience in Designing a Big Data Analytics Course

    Science.gov (United States)

    Asamoah, Daniel Adomako; Sharda, Ramesh; Hassan Zadeh, Amir; Kalgotra, Pankush

    2017-01-01

    In this article, we present an experiential perspective on how a big data analytics course was designed and delivered to students at a major Midwestern university. In reference to the "MSIS 2006 Model Curriculum," we designed this course as a level 2 course, with prerequisites in databases, computer programming, statistics, and data…

  14. Spring/dimple instrument tube restraint

    Science.gov (United States)

    DeMario, Edmund E.; Lawson, Charles N.

    1993-01-01

    A nuclear fuel assembly for a pressurized water nuclear reactor has a spring and dimple structure formed in a non-radioactive insert tube placed in the top of a sensor receiving instrumentation tube thimble disposed in the fuel assembly and attached at a top nozzle, a bottom nozzle, and intermediate grids. The instrumentation tube thimble is open at the top, where the sensor or its connection extends through the cooling water for coupling to a sensor signal processor. The spring and dimple insert tube is mounted within the instrumentation tube thimble and extends downwardly adjacent the top. The springs and dimples restrain the sensor and its connections against lateral displacement causing impact with the instrumentation tube thimble due to the strong axial flow of cooling water. The instrumentation tube has a stainless steel outer sleeve and a zirconium alloy inner sleeve below the insert tube adjacent the top. The insert tube is relatively non-radioactivated inconel alloy. The opposed springs and dimples are formed on diametrically opposite inner walls of the insert tube, the springs being formed as spaced axial cuts in the insert tube, with a web of the insert tube between the cuts bowed radially inwardly for forming the spring, and the dimples being formed as radially inward protrusions opposed to the springs.

  15. Alternative Spring Break at the Savannah College of Art and Design: Engaging Art and Design Students in Community Service

    Science.gov (United States)

    Hoey, J. Joseph; Feld-Gore, Jeffrey A.

    2014-01-01

    This chapter describes the impact of an alternative spring break program on students at the Savannah College of Art and Design over a set of years as well as its effectiveness as a service-learning tool.

  16. Vanishing Springs in Nepalese Mountains: Assessment of Water Sources, Farmers' Perceptions, and Climate Change Adaptation

    Directory of Open Access Journals (Sweden)

    Durga D. Poudel

    2017-02-01

    Full Text Available The Thulokhola watershed of the Nuwakot district in the midhills region of Nepal can be considered typical of climate change-related stresses in the region. To assess the status of water resources and document farmers' perceptions of and adaptation to climate change impacts in this watershed, we invited community groups to monitor water quality and conducted 6 focus group meetings, 3 participatory rural appraisals, and spring and household surveys in 2011 and 2012. Historical precipitation data from a nearby weather station and discharge data for the Tadi Khola, the nearest major river, were also analyzed. The spring survey results confirmed farmers' perceptions and showed that 73.2% of the springs used as water sources had a decreased flow and 12.2% had dried up over the past 10 or more years, as recognized by local residents. In response to the severe decline of precipitation and the drying up of springs, local communities have implemented some climate change adaptation measures, such as constructing water tanks at water sources, using pipes to transport drinking water, diverting water from other springs, digging deeper wells, and traveling farther to wash clothes and fetch drinking water. To enhance drinking water supplies and ensure the agricultural, ecological, and environmental integrity of the watershed, initiatives such as comprehensive research on springs and groundwater hydrology, a spring rejuvenation program, and community capacity building for water sustainability and climate change adaptation are suggested.

  17. 77 FR 9169 - Standard Instrument Approach Procedures, and Takeoff Minimums and Obstacle Departure Procedures...

    Science.gov (United States)

    2012-02-16

    ...), Flight Technologies and Programs Divisions, Flight Standards Service, Federal Aviation Administration..., ILS OR LOC RWY 8, Amdt 5G Big Spring, TX, Big Spring McMahon-Wrinkle, Takeoff Minimums and Obstacle DP...

  18. Boosting Big National Lab Data

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world's most intractable problems requires the ability to make sense of huge and complex sets of data, and to do it quickly. Speeding up the process, from hours to minutes or from weeks to days, is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to investigate, for example, the validity of new drugs, the root causes of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, or the optimal ingredient composition for chocolate, or to determine how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influence the rate of scientific progress, industrial innovation and competitiveness. And gaining groundbreaking new insights faster is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the 'Eye of Gaia' billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in the data volumes, rates and variety produced by the instruments used for experimental work. This increase coincides with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  19. Big things start in small ways.

    Science.gov (United States)

    Rawlings, N

    1990-12-01

    This statement from the President of the 31st December Women's Movement in Ghana was part of a larger text presented at the World NGO Conference in Tokyo, July 1-4, 1990. The women's movement in Ghana strives to achieve equal opportunity, social justice, and sustainable development in the face of social discrimination against women. Planning and development have focused on women in socioeconomic development. Specific projects at the core of creating positive conditions for socioeconomic growth, raising the standard of living, and expanding the economy involve food and cash-crop production, food processing, food preparation, and small-scale industrial activities such as ceramics and crafts. Income supplementation helps parents to send children to school instead of to work. Daycare centers operating near workplaces benefit mothers by providing a vacation, adult literacy programs, and family counseling sessions. The Movement actively mobilizes women to have their children vaccinated. Access to credit for women and the utilization of technology enrich life for women and reduce backbreaking labor. The Movement is building wells in rural areas to reduce parasitic infection and create easy access to a water supply. 252 projects have been completed and 100 are in progress. The Movement provides a development model for integrating the resources of government, NGOs, and members of the community. The self-confidence of women has assured the success of the projects. The Sasakawa Foundation has contributed technology and Japanese volunteers to improve the cultivation of food crops and, by example, express humble, respectful, hardworking, and happy models of big things starting in small ways.

  20. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  1. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF THE INTERIOR Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and Wildlife Service, Interior. ACTION: Notice of availability; request for comments...

  2. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  3. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TPC), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  4. [Big data from clinical routine].

    Science.gov (United States)

    Mansmann, U

    2018-02-16

    Over the past 100 years, evidence-based medicine has undergone several fundamental changes. Through the field of physiology, medical doctors were introduced to the natural sciences. Since the late 1940s, randomized and epidemiological studies have come to provide the evidence for medical practice, which led to the emergence of clinical epidemiology as a new field in the medical sciences. Within the past few years, big data has become the driving force behind the vision for having a comprehensive set of health-related data which tracks individual healthcare histories and consequently that of large populations. The aim of this article is to discuss the implications of data-driven medicine, and to examine how it can find a place within clinical care. The EU-wide discussion on the development of data-driven medicine is presented. The following features and suggested actions were identified: harmonizing data formats, data processing and analysis, data exchange, related legal frameworks and ethical challenges. For the effective development of data-driven medicine, pilot projects need to be conducted to allow for open and transparent discussion on the advantages and challenges. The Federal Ministry of Education and Research ("Bundesministerium für Bildung und Forschung," BMBF) Arthromark project is an important example. Another example is the Medical Informatics Initiative of the BMBF. The digital revolution affects clinic practice. Data can be generated and stored in quantities that are almost unimaginable. It is possible to take advantage of this for development of a learning healthcare system if the principles of medical evidence generation are integrated into innovative IT-infrastructures and processes.

  5. Reactive Programming in Java

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Reactive Programming is gaining a lot of excitement. Many libraries, tools, and frameworks are beginning to make use of reactive libraries. Besides, applications dealing with big data or high-frequency data can benefit from this programming paradigm. Come to this presentation to learn what reactive programming is, what kinds of problems it solves, and how it solves them. We will take an example-oriented approach to learning the programming model and the abstraction.
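
    The core of the paradigm described above is that values are pushed through a declared pipeline as they arrive, rather than pulled on demand. Below is a minimal, language-agnostic sketch of that push-based model; the `Observable`, `map`, `filter`, and `subscribe` names are illustrative and do not correspond to any specific Java reactive library's API.

```python
# Minimal push-based "reactive" sketch: a source pushes events to
# subscribers, which transform and filter the stream declaratively.

class Observable:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def emit(self, value):
        # push the value to every downstream subscriber
        for callback in self._subscribers:
            callback(value)

    def map(self, fn):
        out = Observable()
        self.subscribe(lambda v: out.emit(fn(v)))
        return out

    def filter(self, pred):
        out = Observable()
        self.subscribe(lambda v: pred(v) and out.emit(v))
        return out


ticks = Observable()
results = []
# Declare the pipeline once; each emitted value flows through it.
ticks.map(lambda x: x * x).filter(lambda x: x % 2 == 0).subscribe(results.append)

for i in range(5):
    ticks.emit(i)

print(results)  # even squares of 0..4: [0, 4, 16]
```

    The same shape, with backpressure and scheduling added, is what the reactive libraries discussed in the talk provide.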

  6. ALCF Data Science Program: Productive Data-centric Supercomputing

    Science.gov (United States)

    Romero, Nichols; Vishwanath, Venkatram

    The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCFs forthcoming 8.5 petaflops Intel/Cray system. The program will transition to the 200 petaflop/s Aurora supercomputing system when it becomes available. In 2016, four projects have been selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals with the next call expected in Spring 2017.http://www.alcf.anl.gov/alcf-data-science-program This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.

  7. Development of Multiple Big Data Analytics Platforms with Rapid Response

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2017-01-01

    Full Text Available The crucial problem in integrating multiple platforms is how to adapt to their respective computing features so as to execute assignments most efficiently and obtain the best outcome. This paper introduces new approaches to the big data platforms RHhadoop and SparkR and integrates them into a high-performance, multi-platform big data analytics system, as part of business intelligence (BI), to carry out rapid data retrieval and analytics with R programming. This paper aims to optimize job scheduling using the MSHEFT algorithm and to implement optimized platform selection based on computing features, improving system throughput significantly. In addition, users simply issue R commands rather than running Java or Scala programs to perform data retrieval and analytics on the proposed platforms. As a result, according to the performance index calculated for the various methods, optimized platform selection significantly reduces the execution time of data retrieval and analytics, and scheduling optimization further increases overall system efficiency.
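
    The platform-selection idea above can be sketched as a greedy choice of the platform whose computing profile predicts the shortest execution time for each job. This is an illustrative sketch only, not the paper's MSHEFT implementation; the platform names are taken from the abstract, while the job types and timing figures are made up for the example.

```python
# Hypothetical predicted seconds per job type on each platform.
PREDICTED_TIME = {
    "RHhadoop": {"batch_etl": 40, "iterative_ml": 90},
    "SparkR":   {"batch_etl": 55, "iterative_ml": 25},
}

def choose_platform(job_type):
    """Pick the platform with the lowest predicted time for this job type."""
    return min(PREDICTED_TIME, key=lambda p: PREDICTED_TIME[p][job_type])

jobs = ["batch_etl", "iterative_ml", "iterative_ml"]
schedule = [(job, choose_platform(job)) for job in jobs]
print(schedule)
# [('batch_etl', 'RHhadoop'), ('iterative_ml', 'SparkR'), ('iterative_ml', 'SparkR')]
```

    A full scheduler would additionally order jobs by rank and account for queueing on each platform, which is where a HEFT-style algorithm goes beyond this greedy sketch.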

  8. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  9. Process for Forming a High Temperature Single Crystal Canted Spring

    Science.gov (United States)

    DeMange, Jeffrey J (Inventor); Ritzert, Frank J (Inventor); Nathal, Michael V (Inventor); Dunlap, Patrick H (Inventor); Steinetz, Bruce M (Inventor)

    2017-01-01

    A process for forming a high temperature single crystal canted spring is provided. In one embodiment, the process includes fabricating a rapid prototype spring configuration to produce a sacrificial mold pattern, creating a ceramic mold from that pattern, and casting at least one canted coil spring configuration based on the ceramic mold. The high temperature single crystal canted spring is formed from a nickel-based alloy containing rhenium using the at least one coil spring configuration.

  10. Design and development of a medical big data processing system based on Hadoop.

    Science.gov (United States)

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data reveals trends in hospital information technology and is of great significance for hospital information systems that are being designed and expanded. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional standalone systems incapable of processing these data. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop cluster that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors as reflected in the various data produced by different hospital information systems during daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient processing of medical big data in healthcare services and clinical research compared with single-node processing. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
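    The MapReduce pattern described in this record can be sketched in plain Python. This is a toy illustration only, assuming hypothetical log records as input: a real Hadoop job would implement Mapper and Reducer classes (typically in Java) and distribute the map, shuffle, and reduce phases across cluster nodes, but the data flow is the same.

    ```python
    # Toy illustration of the map -> shuffle -> reduce flow that Hadoop
    # MapReduce distributes across a cluster. Counts word occurrences
    # in a list of (hypothetical) log records.
    from collections import defaultdict

    def map_phase(records):
        # Mapper: emit a (key, 1) pair for each word in each input record.
        for record in records:
            for word in record.split():
                yield (word, 1)

    def shuffle(pairs):
        # Shuffle: group all emitted values by key. Hadoop performs this
        # grouping between the map and reduce phases.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        # Reducer: aggregate the grouped values for each key (here, a sum).
        return {key: sum(values) for key, values in groups.items()}

    logs = ["login login query", "query export"]
    counts = reduce_phase(shuffle(map_phase(logs)))
    print(counts)  # {'login': 2, 'query': 2, 'export': 1}
    ```

    The same three-phase structure scales because mappers and reducers are stateless with respect to one another, so each phase can run in parallel on different nodes.
    
    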

  11. Occurrence and transport of nitrogen in the Big Sunflower River, northwestern Mississippi, October 2009-June 2011

    Science.gov (United States)

    Barlow, Jeannie R.B.; Coupe, Richard H.

    2014-01-01

    The Big Sunflower River Basin, located within the Yazoo River Basin, is subject to large annual inputs of nitrogen from agriculture, atmospheric deposition, and point sources. Understanding how nutrients are transported in, and downstream from, the Big Sunflower River is key to quantifying their eutrophying effects on the Gulf of Mexico. Recent results from two Spatially Referenced Regressions on Watershed attributes (SPARROW) models, which include the Big Sunflower River, indicate minimal losses of nitrogen in stream reaches typical of the main channels of major river systems. If SPARROW assumptions of relatively conservative transport of nitrogen are correct and surface-water losses through the bed of the Big Sunflower River are negligible, then options for managing nutrient loads to the Gulf of Mexico may be limited. Simply put, if every pound of nitrogen entering the Delta is eventually delivered to the Gulf, then the only effective nutrient management option in the Delta is to reduce inputs. If, on the other hand, it can be shown that processes within river channels of the Mississippi Delta act to reduce the mass of nitrogen in transport, other hydrologic approaches may be designed to further limit nitrogen transport. Direct validation of existing SPARROW models for the Delta is a first step in assessing the assumptions underlying those models. In order to characterize spatial and temporal variability of nitrogen in the Big Sunflower River Basin, water samples were collected at four U.S. Geological Survey gaging stations located on the Big Sunflower River between October 1, 2009, and June 30, 2011. Nitrogen concentrations were generally highest at each site during the spring of the 2010 water year and the fall and winter of the 2011 water year. Additionally, the dominant form of nitrogen varied between sites. For example, in samples collected from the most upstream site (Clarksdale), the concentration of organic nitrogen was generally higher than the concentrations of...

  12. Report on hard red spring wheat varieties grown in cooperative plot and nursery experiments in the spring wheat region in 2008

    Science.gov (United States)

    The Hard Red Spring Wheat Uniform Regional Nursery (HRSWURN) was planted for the 80th year in 2008. The nursery contained 37 entries submitted by 13 different scientific or industry breeding programs, and 5 checks. Trials were conducted as randomized complete blocks with three replicates except wher...

  13. Report on hard red spring wheat varieties grown in cooperative plot and nursery experiments in the spring wheat region in 2013

    Science.gov (United States)

    The Hard Red Spring Wheat Uniform Regional Nursery (HRSWURN) was planted for the 83rd year in 2013. The nursery contained 29 entries submitted by 7 different scientific or industry breeding programs, and 5 checks (Table 1). Trials were conducted as randomized complete blocks with three replicates ex...

  14. Report on Hard Red Spring Wheat Varieties Grown in Cooperative Plot and Nursery Experiments in the Spring Wheat Region in 2009

    Science.gov (United States)

    The Hard Red Spring Wheat Uniform Regional Nursery (HRSWURN) was planted for the 81st year in 2009. The nursery contained 32 entries submitted by 8 different scientific or industry breeding programs, and 5 checks. Trials were conducted as randomized complete blocks with three replicates except where...

  15. Report on hard red spring wheat varieties grown in cooperative plot and nursery experiments in the spring wheat region in 2016

    Science.gov (United States)

    The Hard Red Spring Wheat Uniform Regional Nursery (HRSWURN) was planted for the 86th year in 2016. The nursery contained 26 entries submitted by 8 different scientific or industry breeding programs, and 5 checks (Table 1). Trials were conducted as randomized complete blocks with three replicates ...

  16. Report on hard red spring wheat varieties grown in cooperative plot and nursery experiments in the spring wheat region in 2014

    Science.gov (United States)

    The Hard Red Spring Wheat Uniform Regional Nursery (HRSWURN) was planted for the 84th year in 2014. The nursery contained 26 entries submitted by 6 different scientific or industry breeding programs, and 5 checks (Table 1). Trials were conducted as randomized complete blocks with three replicates ex...

  17. Report on hard red spring wheat varieties grown in cooperative plot and nursery experiments in the spring wheat region in 2010

    Science.gov (United States)

    The Hard Red Spring Wheat Uniform Regional Nursery (HRSWURN) was planted for the 82nd year in 2010. The nursery contained 32 entries submitted by 7 different scientific or industry breeding programs, and 5 checks. Trials were conducted as randomized complete blocks with three replicates except where...

  18. Fish Springs pond snail : Refuge communication scenario

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Communication scenario between the branch of Listing and Recovery, Fish and Wildlife Enhancement, and Fish Springs National Wildlife Refuge (NWR), in regards to the...

  19. Fish Springs National Wildlife Refuge habitat map

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Habitat map for Fish Springs National Wildlife Refuge. This habitat map was created along with the National Vegetation Classification (NVC) map of the refuge. Refuge...

  20. Pagosa Springs geothermal project. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    1984-10-19

    This booklet discusses some ideas and methods for using Colorado geothermal energy. A project installed in Pagosa Springs, which consists of a pipeline laid down 8th street with service to residences retrofitted to geothermal space heating, is described. (ACR)