WorldWideScience

Sample records for big rock point

  1. Big Rock Point

    International Nuclear Information System (INIS)

    The Big Rock Point Nuclear Plant is the second oldest operating nuclear power plant in the United States. Its 25-yr history is an embodiment of the history of commercial nuclear power. In some respects, its situation today - 5 yr past the midpoint of its design life - can provide operators of other nuclear plants a glimpse of where they will be in another decade. Construction on Big Rock Point began in 1960. It was completed just 2 1/2 yr later at a cost of $27 million. The plant is a General Electric (GE)-designed boiling water direct cycle, forced circulation, high power density reactor. Its construction was undertaken by Consumers Power under the third round of the U.S. Atomic Energy Commission's (AEC's) Power Demonstration Reactor Program. It was an advanced version of GE's Vallecitos boiling water reactor. The plant's fuel was GE's responsibility and, under contract with the AEC, it conducted a fuel research and development (R&D) program involving the plant. Although the plant was designed for research - its original electrical capacity was set at 50 MW(electric) - the unit was subsequently uprated to 69 MW(net electric). The original plant staff included only 44 people and minimal security. Mirroring the industry experience, the number of people on-site had quadrupled

  2. Big Rock Point: An Analysis of Project Estimate Performance

    International Nuclear Information System (INIS)

    The Big Rock Point Restoration Project is well into its third year of decommissioning and restoring the site to a green-field condition. Although the project has gone well and remains on schedule and within budget, much has been learned about decommissioning cost estimates versus actual costs, as well as areas in which the estimate appears adequate and in which the estimate is challenged. These items are briefly described in this report

  3. 78 FR 61401 - Entergy Nuclear Operations, Inc.; Big Rock Point; Independent Spent Fuel Storage Installation

    Science.gov (United States)

    2013-10-03

    ... COMMISSION Entergy Nuclear Operations, Inc.; Big Rock Point; Independent Spent Fuel Storage Installation..., Inc. (ENO) on June 20, 2012, for the Big Rock Point (BRP) Independent Spent Fuel Storage Installation... Regulatory Evaluation In the Final Rule for Storage of Spent Fuel in NRC-Approved Storage Casks at...

  4. Big rock point restoration project BWR major component removal, packaging and shipping - planning and experience

    International Nuclear Information System (INIS)

    The Big Rock Point boiling water reactor (BWR) at Charlevoix, MI was permanently shut down on August 29, 1997. In 1999 BNFL Inc.'s Reactor Decommissioning Group (RDG) was awarded a contract by Consumers Energy (CECo) for the Big Rock Point (BRP) Major Component Removal (MCR) project. BNFL Inc. RDG has teamed with MOTA, Sargent and Lundy, and MDM Services to plan and execute MCR in support of the facility restoration project. The facility restoration project will be completed by 2005. Key to the success of the project has been the integration of the best available demonstrated technology into a robust and responsive project management approach, which places emphasis on safety and quality assurance in achieving project milestones linked to time and cost. To support decommissioning of the BRP MCR activities, a reactor vessel (RV) shipping container is required. This paper discusses the design and fabrication of a 10 CFR Part 71 Type B container necessary to ship the BRP RV. The container to be used for transportation of the RV to the burial site was designed as an Exclusive Use Type B package for shipment and burial at the Barnwell, South Carolina (SC) disposal facility. (author)

  5. Decommissioning Experience: Big Rock Point Nuclear Power Plant, United States of America

    International Nuclear Information System (INIS)

    Full text: Big Rock Point has successfully employed a well planned and executed process to clean out the fuel pool in around 13 months, as reported by the site decommissioning project. The success was also reported by the contractor and in an independent publication. The reactor was a 67 MW BWR, which was shut down in 1997. Full decommissioning and unrestricted use of the site were achieved in 2007. The contractors were familiar with the site as they had successfully cleaned up another pool on the site in 1996. When completed, approximately 4800 TBq of waste, consisting of channel assemblies, control blades, satellite rollers, in-core detectors and other miscellaneous components, had been removed and shipped for storage or disposal. The project was within the ALARA dose budget, within the project cost budget and schedule, with no reportable incidents. (author)

  6. Integrated plant-safety assessment, Systematic Evaluation Program: Big Rock Point Plant (Docket No. 50-155)

    International Nuclear Information System (INIS)

    The Systematic Evaluation Program was initiated in February 1977 by the US Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  7. Integrated plant safety assessment. Systematic evaluation program, Big Rock Point Plant (Docket No. 50-155). Final report

    International Nuclear Information System (INIS)

    The Systematic Evaluation Program was initiated in February 1977 by the U.S. Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. The review provides (1) an assessment of how these plants compare with current licensing safety requirements relating to selected issues, (2) a basis for deciding how these differences should be resolved in an integrated plant review, and (3) a documented evaluation of plant safety when the supplement to the Final Integrated Plant Safety Assessment Report has been issued. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  8. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

    This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurement of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics which might restrict the continued irradiation of the fuel

  9. Big Bang Day : Physics Rocks

    CERN Multimedia

    Brian Cox; John Barrowman; Eddie Izzard

    2008-01-01

    Is particle physics the new rock 'n' roll? The fundamental questions about the nature of the universe that particle physics hopes to answer have attracted the attention of some very high profile and unusual fans. Alan Alda, Ben Miller, Eddie Izzard, Dara O'Briain and John Barrowman all have interests in this branch of physics. Brian Cox, CERN physicist and former member of the '90s band D:Ream, tracks down some very well known celebrity enthusiasts and takes a light-hearted look at why this subject can appeal to all of us.

  10. CLASSIFICATION OF BIG POINT CLOUD DATA USING CLOUD COMPUTING

    OpenAIRE

    Liu, K.; J. Boehm

    2015-01-01

    Point cloud data plays a significant role in various geospatial applications, as it conveys plentiful information which can be used for different types of analysis. Semantic analysis, an important example, aims to label points as different categories. In machine learning, this problem is called classification. In addition, processing point data is becoming more and more challenging due to the growing data volume. In this paper, we address point data classification in a big data co...

  11. Big Rock Point Nuclear Plant. Annual operating report for 1976

    International Nuclear Information System (INIS)

    Net electrical power generated was 244,492.9 MWH with the reactor on line 4,405 hrs. Information is presented concerning operations, power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, release of radioactive materials, reportable occurrences, and fuel performance

  12. Application Study of Self-balanced Testing Method on Big Diameter Rock-socketed Piles

    Directory of Open Access Journals (Sweden)

    Qing-biao WANG

    2013-07-01

    Full Text Available Through a technological test of the self-balanced testing method on big-diameter rock-socketed piles at the broadcasting centre building of Tai'an, this paper studies and analyzes balance position selection, load cell production and installation, displacement sensor selection and installation, loading steps, stability conditions, and determination of bearing capacity in the self-balanced testing process. It summarizes the key technology and engineering experience of the self-balanced testing method for big-diameter rock-socketed piles and also analyzes the difficult technical problems that urgently need to be resolved. The conclusions of the study are of significance for the popularization and application of the self-balanced testing method and for similar projects.

  13. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    Science.gov (United States)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
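
    As an illustration of the map-based ingestion pattern described above, a minimal PySpark sketch follows. It is not the authors' implementation: the tile file names are hypothetical, and laspy stands in for a generic single-CPU point cloud library.

```python
# Minimal PySpark sketch of map-based point cloud ingestion (illustrative).
import numpy as np
from pyspark.sql import SparkSession

def read_las_points(path):
    """Read one LAS tile into (x, y, z) tuples; runs on an executor."""
    import laspy  # the single-CPU reader is imported on the worker
    las = laspy.read(path)
    return [tuple(p) for p in np.vstack([las.x, las.y, las.z]).T]

spark = SparkSession.builder.appName("pc-ingest").getOrCreate()
tiles = ["tile_001.las", "tile_002.las", "tile_003.las"]  # hypothetical tiles

# parallelize() distributes the file list across the cluster; flatMap()
# applies the single-CPU reader once per file, yielding one distributed RDD.
points = spark.sparkContext.parallelize(tiles, len(tiles)).flatMap(read_las_points)
print(points.count())  # total number of ingested points
```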

  14. Automated Rock Detection and Shape Analysis from Mars Rover Imagery and 3D Point Cloud Data

    Institute of Scientific and Technical Information of China (English)

    Kaichang Di; Zongyu Yue; Zhaoqin Liu; Shuliang Wang

    2013-01-01

    A new object-oriented method has been developed for the extraction of Mars rocks from Mars rover data. It is based on a combination of Mars rover imagery and 3D point cloud data. First, Navcam or Pancam images taken by the Mars rovers are segmented into homogeneous objects with a mean-shift algorithm. Then, the objects in the segmented images are classified into small rock candidates, rock shadows, and large objects. Rock shadows and large objects are considered as the regions within which large rocks may exist. In these regions, large rock candidates are extracted through ground-plane fitting with the 3D point cloud data. Small and large rock candidates are combined and postprocessed to obtain the final rock extraction results. The shape properties of the rocks (angularity, circularity, width, height, and width-height ratio) have been calculated for subsequent geological studies.

  15. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    Science.gov (United States)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest neighbour algorithm
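
    The bit-interleaving step is compact enough to sketch directly; the Python below reproduces the worked example from the abstract (x = 1, y = 3 giving code 11) and shows how the bits-per-dimension parameter controls partition granularity.

```python
def morton_2d(x: int, y: int, bits: int) -> int:
    """Interleave the bits of two grid coordinates into a Z-order code."""
    code = 0
    for i in reversed(range(bits)):
        code = (code << 1) | ((y >> i) & 1)  # y bit first (high bit)
        code = (code << 1) | ((x >> i) & 1)
    return code

def partition_key(px: float, py: float, cell: float, bits: int) -> int:
    """Quantize a point to a grid cell and return its Z-order partition key;
    fewer bits per dimension means a coarser grid and more points per key."""
    mask = (1 << bits) - 1
    return morton_2d(int(px / cell) & mask, int(py / cell) & mask, bits)

# The example from the abstract: x = 1 (binary 01) and y = 3 (binary 11)
# interleave to binary 1011 = 11.
assert morton_2d(1, 3, bits=2) == 11
```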

  16. 33 CFR 80.760 - Horseshoe Point, FL to Rock Island, FL.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Horseshoe Point, FL to Rock Island... Rock Island, FL. (a) Except inside lines specifically described in this section, the 72... Islands. (b) A north-south line drawn through Steinhatchee River Light 21. (c) A line drawn...

  17. Field Plot and Accuracy Assessment Points for Pictured Rocks National Lakeshore Vegetation Mapping Project

    Data.gov (United States)

    National Park Service, Department of the Interior — The vegetation point data for Pictured Rocks National Lakeshore (PIRO) was developed to support two projects associated with the 2004 vegetation map, the collection...

  18. Downstream-migrating fluvial point bars in the rock record

    Science.gov (United States)

    Ghinassi, Massimiliano; Ielpi, Alessandro; Aldinucci, Mauro; Fustic, Milovan

    2016-04-01

    Classical models developed for ancient fluvial point bars are based on the assumption that meander bends invariably increase their radius as meander-bend apices migrate in a direction transverse to the channel-belt axis (i.e., meander bend expansion). However, many modern meandering rivers are also characterized by down-valley migration of the bend apex, a mechanism that takes place without a significant change in meander radius and wavelength. Downstream-migrating fluvial point bars (DMFPB) are the dominant architectural element of these types of meander belts. Yet they are poorly known from ancient fluvial-channel belts, since their disambiguation from expansional point bars often requires fully-3D perspectives. This study aims to review DMFPB deposits spanning in age from the Devonian to the Holocene, and to discuss their main architectural and sedimentological features from published outcrop, borehole and 3D-seismic datasets. Fluvial successions hosting DMFPB mainly accumulated in low-accommodation conditions, where channel belts were affected by different degrees of morphological (e.g., valleys) or tectonic (e.g., axial drainage of shortening basins) confinement. In confined settings, bends migrate downstream along the erosion-resistant valley flanks and little or no floodplain deposits are preserved. Progressive floor aggradation (e.g., valley filling) allows meander belts with DMFPB to decrease their degree of confinement. In less confined settings, meander bends migrate downstream mainly after impinging against older, erosion-resistant channel-fill mud. By contrast, tectonic confinement is commonly associated with uplifted alluvial plains that prevented meander-bend expansion, in turn triggering downstream translation. At the scale of individual point bars, translational morphodynamics promote the preservation of downstream-bar deposits, whereas the coarser-grained upstream and central beds are less frequently preserved. However, enhanced preservation of upstream

  19. Small Stress Change Triggering a Big Earthquake: a Test of the Critical Point Hypothesis for Earthquakes

    Institute of Scientific and Technical Information of China (English)

    万永革; 吴忠良; 周公威

    2003-01-01

    Whether or not a small stress change can trigger a big earthquake is one of the most important problems related to the critical point hypothesis for earthquakes. We investigate global earthquakes with different focal mechanisms, which have different levels of ambient shear stress. This ambient stress level is the stress level required by the earthquakes for their occurrence. Earthquake pairs are studied to see whether the occurrence of the preceding event encourages the occurrence of the succeeding one in terms of Coulomb stress triggering. It is observed that the stress triggering effect produced by a change of Coulomb failure stress of the same order of magnitude, about 10⁻² MPa, is distinctly different for different focal mechanisms, and thus for different ambient stress levels. For non-strike-slip earthquakes with a relatively low ambient stress level, the triggering effect is more evident, while for strike-slip earthquakes with a relatively high ambient stress level, there is no evident triggering effect. This water level test provides observational support to the critical point hypothesis for earthquakes.
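
    For reference, the Coulomb failure stress change underlying this test has a standard form, sketched below; the effective friction coefficient is a commonly used placeholder, not a value taken from the paper.

```python
def coulomb_stress_change(d_tau: float, d_sigma_n: float,
                          mu_eff: float = 0.4) -> float:
    """Standard Coulomb failure stress change on a receiver fault (MPa):
    dCFS = d_tau + mu' * d_sigma_n, with the shear stress change resolved
    in the slip direction and the normal stress change positive for
    unclamping. mu' = 0.4 is only a conventional placeholder value."""
    return d_tau + mu_eff * d_sigma_n

# A change on the order of 10^-2 MPa, the triggering level discussed above:
print(coulomb_stress_change(0.008, 0.005))  # 0.01 MPa
```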

  20. Address Points, This is an ESRI feature class of address points within the unincorporated areas of Rock County., Published in 2005, Rock County Planning, Economic, and Community Development Agency.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Address Points dataset is current as of 2005. It is described as 'This is an ESRI feature class of address points within the unincorporated areas of Rock County.' Data...

  1. Big Rock Point Nuclear Plant. 23rd semiannual report of operations, July--December 1976

    International Nuclear Information System (INIS)

    Net electrical power generated was 240,333.9 MWh(e) with the reactor on line 4,316.6 hr. Information is presented concerning operation, power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, release of radioactive materials, changes, tests, experiments, and environmental monitoring

  2. Plutonium recycle R and D and operating experience at Big Rock Point

    International Nuclear Information System (INIS)

    A utility's progressive nuclear fuel management philosophy must include consideration of both the front and back ends of the nuclear fuel cycle. Operation of a nuclear facility since 1962 with various fuel designs and a large inventory of spent fuel containing generated plutonium provided the impetus for a plutonium recycle R and D program with the goal of obtaining safe and economic fuel design and performance. In cooperation with different fuel vendors and outside groups, highly favorable mixed oxide fuel operating and handling experience has been achieved since 1969, in concert with continuously developing regulatory requirements. This experience, shared with fuel manufacturers, provides a data base for optimized fuel design and utilization and hence lower fuel costs. MO2 fuels have performed as well as urania fuels

  3. Seismic capacities of masonry walls at the big rock point nuclear generating plant

    International Nuclear Information System (INIS)

    An evaluation to determine the ability of selected concrete block walls in the vicinity of essential equipment to withstand seismic excitation was conducted. The seismic input to the walls was developed in accordance with the Systematic Evaluation Program (SEP) site-specific response spectra for the site. Time-history inputs to the walls were determined from the response of the turbine building complex. Analyses were performed to determine the capacities of the walls to withstand both in-plane and transverse seismic loads. Transverse load capacities were determined from time-history analyses of nonlinear two-dimensional analytical models of the walls. Separate inputs were used at the tops and bottoms of the walls to reflect the amplification through the building. The walls were unreinforced vertically with one exception, and have unsupported heights as high as 20'-8''. Also, cantilever walls as high as 11'-2'' were included in the evaluation. Factors of safety based on stability of the walls were determined for the transverse response, and on code allowable stresses (Reference 1) for the in-plane response

  4. Forests and Forest Cover - TREES_BIG2005_IN: Champion Tree Locations for 2005 in Indiana (Bernardin-Lochmueller and Associates, Point Shapefile)

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — TREES_BIG2005_IN is a point shapefile showing the locations of state champion trees in Indiana. The register is updated every 5 years. Each location represents a...

  5. Potential of acoustic emissions from three point bending tests as rock failure precursors

    Institute of Scientific and Technical Information of China (English)

    Agioutantis Z.; Kaklis K.; Mavrigiannakis S.; Verigakis M.; Vallianatos F.; Saltas V.

    2016-01-01

    Development of failure in brittle materials is associated with microcracks, which release energy in the form of elastic waves called acoustic emissions. This paper presents results from acoustic emission measurements obtained during three point bending tests on Nestos marble under laboratory conditions. Acoustic emission activity was monitored using piezoelectric acoustic emission sensors, and the potential for accurate prediction of rock damage based on acoustic emission data was investigated. Damage localization was determined based on acoustic emissions generated from the critically stressed region as scattered events at stresses below and close to the strength of the material.

  6. Unioned layer for the Point of Rocks-Black Butte coal assessment area, Green River Basin, Wyoming (porbbfing.shp)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This ArcView shapefile contains a polygon representation of the spatial query layer for the Point of Rocks-Black Butte coalfield, Greater Green River Basin,...

  7. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and
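
    The K-Nearest Neighbor + PCA step at the core of steps (i)-(ii) can be illustrated in a few lines. The sketch below is Python rather than the authors' Matlab tool, and the neighborhood size is an arbitrary illustrative value.

```python
# Minimal sketch of kNN + PCA local plane estimation on an (n, 3) cloud.
import numpy as np
from scipy.spatial import cKDTree

def local_normals(points: np.ndarray, k: int = 20):
    """Return per-point unit normals and a planarity score in [0, 1].

    The normal is the eigenvector of the local covariance matrix with the
    smallest eigenvalue; planarity is high where neighbors are coplanar."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)  # k nearest neighbors of every point
    normals = np.empty_like(points)
    planarity = np.empty(len(points))
    for i, nb in enumerate(idx):
        centered = points[nb] - points[nb].mean(axis=0)
        w, v = np.linalg.eigh(centered.T @ centered)  # ascending eigenvalues
        normals[i] = v[:, 0]
        planarity[i] = 1.0 - w[0] / max(w[2], 1e-12)
    return normals, planarity
```

    Clustering the resulting normals, for example with the kernel density estimation the abstract mentions, then yields the discontinuity set orientations.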

  8. An Approach for Automatic Orientation of Big Point Clouds from the Stationary Scanners Based on the Spherical Targets

    Directory of Open Access Journals (Sweden)

    YAO Jili

    2015-04-01

    Full Text Available Terrestrial laser scanning (TLS) technology offers high-speed data acquisition, large point cloud volumes, and long measuring range. However, there are disadvantages, such as the distance limitation in target detection, lag in point cloud processing, low automation, and poor suitability for long-distance topographic survey. To address this, we put forward a method for long-range target detection in the orientation of big point clouds. The method first searches for point cloud rings that contain targets according to the engineering coordinate system. The detected rings are then divided into sectors so that targets can be detected in a very short time, yielding the central coordinates of the targets. Finally, the position and orientation parameters of the scanner are calculated, and point clouds in the scanner's own coordinate system (SOCS) are converted into the engineering coordinate system. The method can be applied on ordinary computers for long-distance topographic survey (scanner-to-target distances ranging from 180 to 700 m) in mountainous areas, with a target radius of 0.162 m.
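
    The closing orientation step, solving the scanner's position and orientation from matched target centres, is a rigid-body fit. Below is a minimal SVD-based (Kabsch) sketch, assuming at least three well-distributed spherical targets have been matched; this is a generic solution, not necessarily the authors' algorithm.

```python
# Rigid-body fit from target centres in the scanner's own coordinate
# system (SOCS) to the same targets' engineering coordinates.
import numpy as np

def fit_rigid(socs: np.ndarray, eng: np.ndarray):
    """Return rotation R and translation t such that eng ~ R @ socs + t."""
    cs, ce = socs.mean(axis=0), eng.mean(axis=0)
    H = (socs - cs).T @ (eng - ce)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, ce - R @ cs

# Applying the transform to a whole cloud: cloud_eng = cloud_socs @ R.T + t
```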

  9. Development of uniform hazard response spectra for rock sites considering line and point sources of earthquakes

    International Nuclear Information System (INIS)

    Traditionally, the seismic design basis ground motion has been specified by normalised response spectral shapes and peak ground acceleration (PGA). The mean recurrence interval (MRI) used to be computed for the PGA only. It is shown that the MRI associated with such response spectra is not the same at all frequencies. The present work develops uniform hazard response spectra, i.e. spectra having the same MRI at all frequencies, for line and point sources of earthquakes by using a large number of strong motion accelerograms recorded on rock sites. The sensitivity of the results to changes in various parameters is also presented. This work is an extension of an earlier work for areal sources of earthquakes. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)
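
    The defining property, one mean recurrence interval shared by every spectral ordinate, can be illustrated with a short sketch: given one hazard curve per frequency, pick at each frequency the amplitude whose annual exceedance rate equals 1/MRI. The hazard-curve numbers below are synthetic placeholders, not data from the study.

```python
import numpy as np

def uniform_hazard_spectrum(sa_grid, rate_curves, mri_years):
    """Interpolate, per frequency, the spectral amplitude whose annual
    exceedance rate is 1/MRI, so all ordinates share the same MRI."""
    target = np.log(1.0 / mri_years)
    sa = [np.interp(target, np.log(rates[::-1]), np.log(sa_grid[::-1]))
          for rates in rate_curves]  # rates decrease as amplitude grows
    return np.exp(sa)

sa_grid = np.array([0.05, 0.1, 0.2, 0.4, 0.8])               # amplitude grid, g
rate_curves = [np.array([0.1, 0.03, 0.008, 0.002, 0.0004]),  # frequency 1
               np.array([0.2, 0.06, 0.015, 0.003, 0.0005])]  # frequency 2
print(uniform_hazard_spectrum(sa_grid, rate_curves, mri_years=500))
```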

  10. Brit Crit: Turning Points in British Rock Criticism 1960-1990

    DEFF Research Database (Denmark)

    Gudmundsson, Gestur; Lindberg, U.; Michelsen, M.

    2002-01-01

    The article examines the development of rock criticism in the United Kingdom from the perspective of a Bourdieuan field-analysis. Early British rock critics, like Nik Cohn, were international pioneers; a few years later there was a strong American influence, but British rock criticism has always had nationally specific traits, and there have been more profound paradigm shifts than in American rock criticism. This is primarily explained by the fact that American rock criticism is more strongly connected to general cultural history, while UK rock criticism has been more alienated from dominant...

  11. The tipping point how little things can make a big difference

    CERN Document Server

    Gladwell, Malcolm

    2002-01-01

    The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire. Just as a single sick person can start an epidemic of the flu, so too can a small but precisely targeted push cause a fashion trend, the popularity of a new product, or a drop in the crime rate. This widely acclaimed bestseller, in which Malcolm Gladwell explores and brilliantly illuminates the tipping point phenomenon, is already changing the way people throughout the world think about selling products and disseminating ideas.

  12. Schools K-12, This is a point feature class of Schools within Rock County. This data does not contain religious or parochial schools, or schools affiliated with churches., Published in 2005, Rock County Planning, Economic, and Community Development Agency.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Schools K-12 dataset was produced all or in part from other information as of 2005. It is described as 'This is a point feature class of Schools within Rock...

  13. Spectral characteristics of Acoustic Emission of rock based on Singular point of HHT Analysis

    Directory of Open Access Journals (Sweden)

    Zhou Xiaoshan

    2016-01-01

    Full Text Available A uniaxial compression acoustic emission (AE) test on sandstone has been studied; HHT analysis is applied to the processing of the AE signals, and analysis of the AE signal reveals the process of rock fracture. The results show that HHT is a time-frequency analysis method based on principal component analysis. The HHT method handles singular signals very conveniently and can determine the main components of a singular signal; the instantaneous frequency can be used to describe precisely the time-frequency characteristics of a singular signal. The method is of great significance for revealing the frequency characteristics of AE signals. In the rock failure process, the AE signal is decomposed by EMD into 8 IMF components, of which IMF1-IMF4 are the main components and IMF5-IMF8 are low-frequency noise. Through the EMD of the AE signal frequency, rock fracture is divided into three stages: an initial zone, a wave zone, and a quiet zone. This shows that noise interference must be eliminated when analyzing the AE signal characteristics of rupture.
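
    A minimal sketch of the HHT workflow described above, EMD into IMFs followed by instantaneous frequency from the Hilbert transform, is given below; it uses the PyEMD package and a synthetic decaying burst standing in for a recorded AE waveform.

```python
# HHT sketch: EMD decomposition, then instantaneous frequency per IMF.
import numpy as np
from PyEMD import EMD              # pip install EMD-signal
from scipy.signal import hilbert

fs = 1_000_000                     # 1 MHz sampling, a typical order for AE
t = np.arange(0, 0.001, 1 / fs)
signal = np.sin(2 * np.pi * 150e3 * t) * np.exp(-5e3 * t)  # synthetic burst

imfs = EMD().emd(signal, t)        # IMFs ordered from high to low frequency

for i, imf in enumerate(imfs, start=1):
    phase = np.unwrap(np.angle(hilbert(imf)))
    inst_freq = np.diff(phase) / (2 * np.pi) * fs   # Hz, per sample
    print(f"IMF{i}: mean instantaneous frequency {inst_freq.mean():.0f} Hz")
```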

  14. Harnessing big data for precision medicine: A panel of experts elucidates the data challenges and proposes key strategic decision points

    Directory of Open Access Journals (Sweden)

    Carol Isaacson Barash

    2015-03-01

    Full Text Available A group of disparate translational bioinformatics experts convened at the 6th Annual Precision Medicine Partnership Meeting, October 29–30, 2014 to discuss big data challenges and key strategic decisions needed to advance precision medicine, emerging solutions, and the anticipated path to success. This article reports the panel discussion.

  15. Petrofabrics of High-Pressure Rocks Exhumed at the Slab-Mantle Interface from the 'Point of No Return'

    Science.gov (United States)

    Whitney, D. L.; Teyssier, C. P.; Seaton, N. C.; Fornash, K.

    2014-12-01

    The highest pressure typically recorded by metamorphic rocks exhumed from oceanic subduction zones is ~2.5±1 GPa, corresponding to the maximum decoupling depth (MDD) (80±10 km) identified in active subduction zones; beyond the MDD (the 'point of no return') exhumation is unlikely. One of the few places where rocks returned from the MDD largely unaltered is Sivrihisar, Turkey: a structurally coherent terrane of lawsonite eclogite and blueschist facies rocks in which assemblages and fabrics record P-T-fluid-deformation conditions during exhumation from ~80 to 45 km. Crystallographic fabrics and other structural features of high-pressure metasedimentary and metabasaltic rocks record transitions during exhumation. In quartzite, heterogeneous microstructures and crystallographic fabrics record deformation and dynamic recrystallization from ~2.6 GPa to ~1.5 GPa, as expressed by transition from prism c-axis patterns through progressive overprinting and activation of rhomb and basal slip. Omphacite, glaucophane, phengite, and lawsonite in quartzite remained stable during deformation. In marble, CaCO3 deformed in dislocation creep as aragonite, producing strong crystallographic fabrics. This fabric persisted through formation of calcite and destruction of the shape-preferred orientation, indicating the strength of aragonite marble. Omphacite in metabasalt and quartzite displays an L-type crystallographic fabric. Lawsonite kinematic vorticity data and other fabrics in metabasalt are consistent with exhumation involving increasing amounts of pure shear relative to simple shear and indicate strain localization and simple shear near the fault contact between the high-pressure unit and a serpentinite body. This large coaxial component multiplied the exhuming power of the subduction channel and forced rocks to return from the MDD.

  16. Toward a Learning Health-care System - Knowledge Delivery at the Point of Care Empowered by Big Data and NLP.

    Science.gov (United States)

    Kaggal, Vinod C; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P; Ross, Jason L; Chaudhry, Rajeev; Buntrock, James D; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, i.e., the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and in real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the advantages of big data over two other environments. Big data infrastructure significantly outperformed other infrastructure in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future. PMID:27385912

  17. Compliance Monitoring of Underwater Blasting for Rock Removal at Warrior Point, Columbia River Channel Improvement Project, 2009/2010

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, Thomas J.; Johnson, Gary E.; Woodley, Christa M.; Skalski, J. R.; Seaburg, Adam

    2011-05-10

    The U.S. Army Corps of Engineers, Portland District (USACE) conducted the 20-year Columbia River Channel Improvement Project (CRCIP) to deepen the navigation channel between Portland, Oregon, and the Pacific Ocean to allow transit of fully loaded Panamax ships (100 ft wide, 600 to 700 ft long, with a draft of 45 to 50 ft). In the vicinity of Warrior Point, between river miles (RM) 87 and 88 near St. Helens, Oregon, the USACE conducted underwater blasting and dredging to remove 300,000 yd³ of a basalt rock formation to reach a depth of 44 ft in the Columbia River navigation channel. The purpose of this report is to document the methods and results of the compliance monitoring study for the blasting project at Warrior Point in the Columbia River.

  18. A 3D clustering approach for point clouds to detect and quantify changes at a rock glacier front

    Science.gov (United States)

    Micheletti, Natan; Tonini, Marj; Lane, Stuart N.

    2016-04-01

    Terrestrial Laser Scanners (TLS) are extensively used in geomorphology to remotely sense landforms and surfaces of any type and to derive digital elevation models (DEMs). Modern devices are able to collect many millions of points, so that working on the resulting dataset is often troublesome in terms of computational effort. Indeed, it is not unusual that raw point clouds are filtered prior to DEM creation, so that only a subset of points is retained and the interpolation process becomes less of a burden. Whilst this procedure is in many cases necessary, it entails a considerable loss of valuable information. First, and even without eliminating points, the common interpolation of points to a regular grid causes a loss of potentially useful detail. Second, it inevitably causes the transition from 3D information to only 2.5D data, where each (x,y) pair must have a unique z-value. Vector-based DEMs (e.g. triangulated irregular networks) partially mitigate these issues, but still require a set of parameters to be set and impose a considerable burden in terms of calculation and storage. For the reasons above, being able to perform geomorphological research directly on point clouds would be profitable. Here, we propose an approach to identify erosion and deposition patterns on a very active rock glacier front in the Swiss Alps to monitor sediment dynamics. The general aim is to set up a semiautomatic method to isolate mass movements using 3D-feature identification directly from LiDAR data. An ultra-long-range LiDAR RIEGL VZ-6000 scanner was employed to acquire point clouds during three consecutive summers. In order to isolate single clusters of erosion and deposition we applied the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm, previously successfully employed by Tonini and Abellan (2014) in a similar case for rockfall detection. DBSCAN requires two input parameters, strongly influencing the number, shape and size of the detected clusters: the minimum number of
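
    The clustering step itself is compact; a minimal sketch with scikit-learn's DBSCAN applied to the 3D coordinates of points flagged as significant change is shown below. The eps and min_samples values are placeholders rather than the study's settings, and the random array stands in for real change points.

```python
# DBSCAN on detected-change points to isolate erosion/deposition clusters.
import numpy as np
from sklearn.cluster import DBSCAN

change_points = np.random.rand(5000, 3) * 5.0    # stand-in (x, y, z), metres

labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(change_points)

n_clusters = labels.max() + 1                    # noise points are labelled -1
print(f"{n_clusters} clusters, {(labels == -1).sum()} noise points")
```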

  19. Fundamental study on long-term stability of rock from the macroscopic point of view

    International Nuclear Information System (INIS)

    In 1994, this project was started and a pneumatic creep testing machine was modified. Inada granite was purchased, and preliminary tests such as P-wave velocity measurement and Schmidt hammer testing were carried out. In 1995, a specimen of Tage tuff under water-saturated conditions was loaded uniaxially in the pneumatic creep testing machine. Uniaxial compression and tension tests and a short-term creep test of Inada granite were also carried out in servo-controlled testing machines to obtain the complete stress-strain curves. In 1996, creep, compression and tension tests were carried out. Though the creep stress of Tage tuff was set as low as 30% of the uniaxial compressive strength, creep strain was continuously increasing after the elapsed time exceeded 2 years. Creep tests of sandstone and Inada granite were also carried out. Uniaxial compression and uniaxial tension tests of sandstone were carried out in servo-controlled testing machines to obtain the complete stress-strain curves. Also, an indirect tension test was carried out for comparison with the results obtained under uniaxial conditions. Two types of pressure maintenance equipment were developed and examined. The hydraulic type equipment, modified for long-term creep testing to ensure durability and stability, was found to be precise and reliable. The pneumatic type equipment newly developed was comparatively less precise and reliable. A constitutive equation of variable compliance type was discussed based on the experimental results. Several ways to obtain the set of four constants required to solve the equation were described. The constant strain-rate test, compared with creep, constant stress-rate and relaxation tests, is recommended as the most appropriate for obtaining the constants readily and easily. Based on seven Japanese rocks, the effects of confining pressure and moisture content on the value of each constant were discussed. (J.P.N.)

  20. Life's Little (and Big) Lessons: Identity Statuses and Meaning-Making in the Turning Point Narratives of Emerging Adults

    Science.gov (United States)

    McLean, Kate C.; Pratt, Michael W.

    2006-01-01

    A longitudinal study examined relations between 2 approaches to identity development: the identity status model and the narrative life story model. Turning point narratives were collected from emerging adults at age 23 years. Identity statuses were collected at several points across adolescence and emerging adulthood, as were measures of…

  1. Big Society, Big Deal?

    Science.gov (United States)

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  2. The Research about Strengthening Project of Middle Rock Pillars in the Big Section and Small Clear Distance Tunnel

    Institute of Scientific and Technical Information of China (English)

    陈佳

    2014-01-01

    This paper discusses reinforcement schemes for the middle rock pillar of a big-section, small-clear-distance tunnel in poor-quality Grade V surrounding rock. It concludes that when the distance between the left and right tunnel bores is less than the minimum clear distance, grouting reinforcement of the middle rock pillar can markedly improve the displacement around the tunnel and the internal forces in the lining, whereas reinforcing the middle rock pillar with opposed prestressed anchor bolts does not have a very good effect. The findings provide a useful reference for reinforcing the middle rock pillar in big-section, small-clear-distance tunnels.

  3. BigDog

    Science.gov (United States)

    Playter, R.; Buehler, M.; Raibert, M.

    2006-05-01

    BigDog's goal is to be the world's most advanced quadruped robot for outdoor applications. BigDog is aimed at the mission of a mechanical mule - a category with few competitors to date: power autonomous quadrupeds capable of carrying significant payloads, operating outdoors, with static and dynamic mobility, and fully integrated sensing. BigDog is about 1 m tall, 1 m long and 0.3 m wide, and weighs about 90 kg. BigDog has demonstrated walking and trotting gaits, as well as standing up and sitting down. Since its creation in the fall of 2004, BigDog has logged tens of hours of walking, climbing and running time. It has walked up and down 25 & 35 degree inclines and trotted at speeds up to 1.8 m/s. BigDog has walked at 0.7 m/s over loose rock beds and carried over 50 kg of payload. We are currently working to expand BigDog's rough terrain mobility through the creation of robust locomotion strategies and terrain sensing capabilities.

  4. Contrasting Nature of Magnetic Anomalies over Thin Sections Made out of Barrandien’s Basaltic Rocks Points to their Origin

    Czech Academy of Sciences Publication Activity Database

    Kletetschka, Günther; Pruner, Petr; Schnabl, Petr; Šifnerová, Kristýna

    -, special issue (2012), s. 69-70. ISSN 1335-2806. [Castle Meeting New Trends in Geomagnetism: Paleo, Rock and Environmental Magnetism /13./, 17.06.2012-23.06.2012, Zvolen] R&D Projects: GA ČR GAP210/10/2351 Institutional support: RVO:67985831 Keywords: magnetic anomalies * thin sections * volcanic rocks Subject RIV: DE - Earth Magnetism, Geodesy, Geography http://gauss.savba.sk/GPIweb/conferences/Castle2012/abstrCastle.pdf

  5. Kimberley rock art dating project

    International Nuclear Information System (INIS)

    The art's additional value, unequalled by traditionally recognised artefacts, is its permanent pictorial documentation, presenting a 'window' into the otherwise intangible elements of perception, vision and mind of prehistoric cultures. Unfortunately, its potential in establishing the Kimberley archaeological 'big picture' still remains largely unrecognised. Some findings of the Kimberley Rock Art Dating Project, using AMS and optically stimulated luminescence (OSL) dating techniques, are outlined. It is estimated that these findings will encourage involvement by a greater diversity of specialist disciplines to tie findings into levels of this art sequence as a primary reference point. The sequence represents a sound basis for selecting specific defined images for targeting detailed studies by a range of dating techniques. This effectively removes the undesirable ad hoc sampling of 'apparently old paintings', a process which must unavoidably remain the case for researchers working on most global bodies of rock art

  6. Heart tissue of harlequin (hq)/Big Blue mice has elevated reactive oxygen species without significant impact on the frequency and nature of point mutations in nuclear DNA

    International Nuclear Information System (INIS)

    Age is a major risk factor for heart disease, and cardiac aging is characterized by elevated mitochondrial reactive oxygen species (ROS) with compromised mitochondrial and nuclear DNA integrity. To assess links between increased ROS levels and mutations, we examined in situ levels of ROS and cII mutation frequency, pattern and spectrum in the heart of harlequin (hq)/Big Blue mice. The hq mouse is a model of premature aging with mitochondrial dysfunction and increased risk of oxidative stress-induced heart disease with the means for in vivo mutation detection. The hq mutation produces a significant downregulation in the X-linked apoptosis-inducing factor gene (Aif) impairing both the antioxidant and oxidative phosphorylation functions of AIF. Brain and skin of hq disease mice have elevated frequencies of point mutations in nuclear DNA and histopathology characterized by cell loss. Reports of associated elevations in ROS in brain and skin have mixed results. Herein, heart in situ ROS levels were elevated in hq disease compared to AIF-proficient mice (p < 0.0001) yet, mutation frequency and pattern were similar in hq disease, hq carrier and AIF-proficient mice. Heart cII mutations were also assessed 15 days following an acute exposure to an exogenous ROS inducer (10 mg paraquat/kg). Acute paraquat exposure with a short mutant manifestation period was insufficient to elevate mutation frequency or alter mutation pattern in the post-mitotic heart tissue of AIF-proficient mice. Paraquat induction of ROS requires mitochondrial complex I and thus is likely compromised in hq mice. Results of this preliminary survey and the context of recent literature suggest that determining causal links between AIF deficiency and the premature aging phenotypes of specific tissues is better addressed with assay of mitochondrial ROS and large-scale changes in mitochondrial DNA in specific cell types.

  7. Heart tissue of harlequin (hq)/Big Blue mice has elevated reactive oxygen species without significant impact on the frequency and nature of point mutations in nuclear DNA

    Energy Technology Data Exchange (ETDEWEB)

    Crabbe, Rory A. [Department of Biology, University of Western Ontario, London, Ontario, N6A 5B7 (Canada); Hill, Kathleen A., E-mail: khill22@uwo.ca [Department of Biology, University of Western Ontario, London, Ontario, N6A 5B7 (Canada)

    2010-09-10

    Age is a major risk factor for heart disease, and cardiac aging is characterized by elevated mitochondrial reactive oxygen species (ROS) with compromised mitochondrial and nuclear DNA integrity. To assess links between increased ROS levels and mutations, we examined in situ levels of ROS and cII mutation frequency, pattern and spectrum in the heart of harlequin (hq)/Big Blue mice. The hq mouse is a model of premature aging with mitochondrial dysfunction and increased risk of oxidative stress-induced heart disease with the means for in vivo mutation detection. The hq mutation produces a significant downregulation in the X-linked apoptosis-inducing factor gene (Aif) impairing both the antioxidant and oxidative phosphorylation functions of AIF. Brain and skin of hq disease mice have elevated frequencies of point mutations in nuclear DNA and histopathology characterized by cell loss. Reports of associated elevations in ROS in brain and skin have mixed results. Herein, heart in situ ROS levels were elevated in hq disease compared to AIF-proficient mice (p < 0.0001) yet, mutation frequency and pattern were similar in hq disease, hq carrier and AIF-proficient mice. Heart cII mutations were also assessed 15 days following an acute exposure to an exogenous ROS inducer (10 mg paraquat/kg). Acute paraquat exposure with a short mutant manifestation period was insufficient to elevate mutation frequency or alter mutation pattern in the post-mitotic heart tissue of AIF-proficient mice. Paraquat induction of ROS requires mitochondrial complex I and thus is likely compromised in hq mice. Results of this preliminary survey and the context of recent literature suggest that determining causal links between AIF deficiency and the premature aging phenotypes of specific tissues is better addressed with assay of mitochondrial ROS and large-scale changes in mitochondrial DNA in specific cell types.

  8. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

    Full Text Available The demand for and rapid accumulation of data have given rise to the new term "Big Data". Data is massively generated accidentally, incidentally, and by the interactions of people. This big data must be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists, and many other varieties of intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia, and every space where large groups of people leave digital traces and deposit data. Given the rise of big data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions, and its biases. Big data, which refers to data sets that are too big to be handled using existing database management tools, is emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology. Big data presents a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as solutions that can be used to address them. Big data differs from other data in five characteristics: volume, variety, value, velocity, and complexity. The article focuses on some current and future cases of and causes for big data.

  9. Big Data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big...

  10. Petrofabrics of high-pressure rocks exhumed at the slab-mantle interface from the "point of no return" in a subduction zone (Sivrihisar, Turkey)

    Science.gov (United States)

    Whitney, Donna L.; Teyssier, Christian; Seaton, Nicholas C. A.; Fornash, Katherine F.

    2014-12-01

    The highest pressure recorded by metamorphic rocks exhumed from oceanic subduction zones is ~2.5 GPa, corresponding to the maximum decoupling depth (MDD) (80 ± 10 km) identified in active subduction zones; beyond the MDD (the "point of no return") exhumation is unlikely. The Sivrihisar massif (Turkey) is a coherent terrane of lawsonite eclogite and blueschist facies rocks in which assemblages and fabrics record P-T-fluid-deformation conditions during exhumation from ~80 to 45 km. Crystallographic fabrics and other features of high-pressure metasedimentary and metabasaltic rocks record transitions during exhumation. In quartzite, microstructures and crystallographic fabrics record deformation in the dislocation creep regime, including dynamic recrystallization during decompression, and a transition from prism slip to activation of rhomb and basal slip that may be related to a decrease in water fugacity during decompression (~2.5 to ~1.5 GPa). Phengite, lawsonite, and omphacite or glaucophane in quartzite and metabasalt remained stable during deformation, and omphacite developed an L-type crystallographic fabric. In marble, aragonite developed columnar textures with strong crystallographic fabrics that persisted during partial to complete dynamic recrystallization that was likely achieved in the stability field of aragonite (P > ~1.2 GPa). Results of kinematic vorticity analysis based on lawsonite shape fabrics are consistent with shear criteria in quartzite and metabasalt and indicate a large component of coaxial deformation in the exhuming channel beneath a simple shear dominated interface. This large coaxial component may have multiplied the exhuming power of the subduction channel and forced deeply subducted rocks to flow back from the point of no return.

  11. Research of the Rock Art from the point of view of geography: the neolithic painting of the Mediterranean area of the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    Cruz Berrocal, María

    2004-12-01

    Full Text Available The rock art of the Mediterranean Arch (which includes what are conventionally called Levantine Rock Art, Schematic Rock Art and Macroschematic Rock Art, among other styles), designated World Heritage in 1998, is studied from the point of view of the Archaeology of Landscape. The information sources used were field work, cartographic analysis and analysis in GIS, besides two Rock Art Archives: the UNESCO Document and the Corpus of Levantine Cave Painting (Corpus de Pintura Rupestre Levantina). The initial hypothesis was that this rock art was involved in the process of neolithisation of the eastern part of Iberia, of which it is a symptom and a result, and that it must be understood as an element of landscape construction. If this is true, it would have a concrete distribution in the form of locational patterns. Through statistical procedures and heuristic approaches, it has been demonstrated that there is a structure to the neolithic landscape, defined by rock art, which can be interpreted functionally and economically.


  12. Accurate 3D point cloud comparison and volumetric change analysis of Terrestrial Laser Scan data in a hard rock coastal cliff environment

    Science.gov (United States)

    Earlie, C. S.; Masselink, G.; Russell, P.; Shail, R.; Kingston, K.

    2013-12-01

    Our understanding of the evolution of hard rock coastlines is limited due to the episodic nature and 'slow' rate at which changes occur. High-resolution surveying techniques, such as Terrestrial Laser Scanning (TLS), have only recently been adopted as a method of obtaining detailed point cloud data to monitor topographical changes over short periods of time (weeks to months). However, the difficulties involved in comparing consecutive point cloud data sets in a complex three-dimensional plane, such as occlusion due to surface roughness and the positioning of the data capture point in a consistently changing environment (a beach profile), mean that comparing data sets can lead to errors in the region of 10-20 cm. Meshing techniques are often used for point cloud data analysis of simple surfaces, but for surfaces such as rocky cliff faces this technique has been found to be ineffective. Recession rates of hard rock coastlines in the UK are typically determined using aerial photography or airborne LiDAR data, yet the detail of the important changes occurring to the cliff face and toe is missed by such techniques. In this study we apply an algorithm (M3C2 - Multiscale Model to Model Cloud Comparison), initially developed for analysing fluvial morphological change, that directly compares point cloud data using surface normals that are consistent with surface roughness, and measures the change that occurs along the normal direction (Lague et al., 2013). The surface changes are analysed using a set of user-defined scales based on surface roughness and registration error. Once the correct parameters are defined, the volumetric cliff face changes are calculated by integrating the mean distance between the point clouds. The analysis has been undertaken at two hard rock sites identified for their active erosion, located on the UK's south west peninsula at Porthleven in south west Cornwall and Godrevy in north Cornwall. Alongside TLS point cloud data, in
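
    A minimal sketch of the core M3C2 idea described above: change is measured along a locally fitted surface normal rather than point-to-point. It assumes numpy and scipy are available; the neighbourhood radius is illustrative, and the full algorithm of Lague et al. (2013) adds multiscale normal selection, cylindrical averaging and confidence intervals that are omitted here.

        import numpy as np
        from scipy.spatial import cKDTree

        def normals_by_pca(cloud, radius):
            """Estimate a unit normal at each point from a PCA of its neighbourhood."""
            tree = cKDTree(cloud)
            normals = np.zeros_like(cloud)
            for i, p in enumerate(cloud):
                nbrs = cloud[tree.query_ball_point(p, radius)]
                if len(nbrs) < 3:
                    continue  # too few neighbours to fit a plane
                cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
                eigvals, eigvecs = np.linalg.eigh(cov)
                normals[i] = eigvecs[:, 0]  # smallest principal axis ~ surface normal
            return normals

        def m3c2_like_distance(ref, other, radius):
            """Signed change at each reference point, measured along its normal
            (normal orientation is arbitrary in this sketch)."""
            normals = normals_by_pca(ref, radius)
            tree = cKDTree(other)
            dist = np.full(len(ref), np.nan)
            for i, (p, n) in enumerate(zip(ref, normals)):
                idx = tree.query_ball_point(p, radius)
                if idx:
                    dist[i] = np.dot(other[idx].mean(axis=0) - p, n)
            return dist

    Volumetric change then follows, as the abstract describes, by integrating the mean signed distance over the cliff-face area.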

  13. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel;

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay…

  14. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo; Valentin, Finn

    We utilize a stochastic actor-oriented model (SAOM) to analyze both the network-endogenous mechanisms and the individual agency driving the collaboration network, and further whether being a Big Ego in Big Science translates into increasing performance. Our findings suggest that the selection of collaborators is not based on preferential attachment, but rather on an assortativity effect, creating not merely a rich-gets-richer effect but an elitist network with high entry barriers. In this acclaimed democratic and collaborative environment of Big Science, the elite closes in on itself. We propose this tendency to be even…

  15. Toward a Learning Health-care System – Knowledge Delivery at the Point of Care Empowered by Big Data and NLP

    Science.gov (United States)

    Kaggal, Vinod C.; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J.; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P.; Ross, Jason L.; Chaudhry, Rajeev; Buntrock, James D.; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, ie, the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the advantages of big data over two other environments. Big data infrastructure significantly outperformed other infrastructure in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future. PMID:27385912

  16. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  17. Big Science

    International Nuclear Information System (INIS)

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions

  18. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  19. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  20. Comparing Two Photo-Reconstruction Methods to Produce High Density Point Clouds and DEMs in the Corral del Veleta Rock Glacier (Sierra Nevada, Spain

    Directory of Open Access Journals (Sweden)

    Álvaro Gómez-Gutiérrez

    2014-06-01

    Full Text Available In this paper, two methods based on computer vision are presented in order to produce dense point clouds and high resolution DEMs (digital elevation models of the Corral del Veleta rock glacier in Sierra Nevada (Spain. The first one is a semi-automatic 3D photo-reconstruction method (SA-3D-PR based on the Scale-Invariant Feature Transform algorithm and the epipolar geometry theory that uses oblique photographs and camera calibration parameters as input. The second method is fully automatic (FA-3D-PR and is based on the recently released software 123D-Catch that uses the Structure from Motion and MultiView Stereo algorithms and needs as input oblique photographs and some measurements in order to scale and geo-reference the resulting model. The accuracy of the models was tested using as benchmark a 3D model registered by means of a Terrestrial Laser Scanner (TLS. The results indicate that both methods can be applied to micro-scale study of rock glacier morphologies and processes with average distances to the TLS point cloud of 0.28 m and 0.21 m, for the SA-3D-PR and the FA-3D-PR methods, respectively. The performance of the models was also tested by means of the dimensionless relative precision ratio parameter resulting in figures of 1:1071 and 1:1429 for the SA-3D-PR and the FA-3D-PR methods, respectively. Finally, Digital Elevation Models (DEMs of the study area were produced and compared with the TLS-derived DEM. The results showed average absolute differences with the TLS-derived DEM of 0.52 m and 0.51 m for the SA-3D-PR and the FA-3D-PR methods, respectively.
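
    A minimal sketch of the DEM comparison reported above (the average absolute difference between a photo-reconstruction DEM and the TLS-derived DEM), assuming numpy is available and that both DEMs have already been interpolated onto the same grid; the file names in the usage comment are illustrative.

        import numpy as np

        def dem_mean_abs_difference(dem_a, dem_b):
            """Mean absolute elevation difference between two co-registered DEM grids."""
            a = np.asarray(dem_a, dtype=float)
            b = np.asarray(dem_b, dtype=float)
            if a.shape != b.shape:
                raise ValueError("DEMs must share the same grid")
            valid = ~(np.isnan(a) | np.isnan(b))  # skip cells missing in either DEM
            return float(np.abs(a[valid] - b[valid]).mean())

        # Illustrative use; values near 0.5 m would match the averages reported above.
        # print(dem_mean_abs_difference(np.load("fa3dpr_dem.npy"), np.load("tls_dem.npy")))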

  1. Antigravity and the big crunch/big bang transition

    OpenAIRE

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.(Princeton Center for Theoretical Science, Princeton University, Princeton, NJ, 08544, USA); Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition...

  2. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  3. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unattainable. Big data is generally characterized by factors such as volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  4. Big queues

    CERN Document Server

    Ganesh, Ayalvadi; Wischik, Damon

    2004-01-01

    Big Queues aims to give a simple and elegant account of how large deviations theory can be applied to queueing problems. Large deviations theory is a collection of powerful results and general techniques for studying rare events, and has been applied to queueing problems in a variety of ways. The strengths of large deviations theory are these: it is powerful enough that one can answer many questions which are hard to answer otherwise, and it is general enough that one can draw broad conclusions without relying on special case calculations.

  5. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  6. Deep crescentic features caused by subglacial boulder point pressure on jointed rock; an example from Virkisjökull, SE Iceland

    Science.gov (United States)

    Krabbendam, M.; Bradwell, T.; Everest, J.

    2012-04-01

    A variety of subglacially formed, erosional crescentic features (e.g. crescentic gouges, lunate fractures) have been widely reported on deglaciated bedrock surfaces. They are characterised by a conchoidal fracture that dips in the same direction as the palaeo-ice flow direction, and a steeper fracture that faces against the ice flow. They are generally interpreted as being formed by point pressure exerted by large boulders entrained in basal ice. They are significant in that they record palaeo-ice flow even if shallower glacial striae are obliterated by post-glacial weathering [1, 2, 3]. This contribution reports on deep, scallop-shaped crescentic depressions observed on abraded surfaces of roche moutonnées and whalebacks recently exposed at Virkisjökull. The depressions are cut into smoothed, abraded surfaces festooned with abundant glacial striae. Differences with previously reported crescentic features are: • the scallop-shaped depressions are considerably deeper (5-20 cm); • the steep fracture facing ice flow coincides in all cases with a pre-existing joint that cuts the entire whaleback. The steep joints thus developed before the conchoidal fracture, whereas in previously reported crescentic features they develop after the conchoidal fracture. We suggest the following formation mechanism. A boulder encased in basal ice exerts continuous pressure on its contact point as it moves across the ice-bedrock contact. This sets up a stress field in the bedrock that does not necessarily exceed the intact rock strength (other crescentic features are rare to absent at Virkisjökull). However, as the stress field migrates (with the transported boulder) and encounters a subvertical, pre-existing joint, stress concentrations build up that do exceed the intact rock strength, resulting in a new (conchoidal) fracture that 'spalls' off a thick, scallop-shaped fragment. The significance of the deep scallop-shaped crescentic depressions is that: • in common with other crescentic features they…
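
    The boulder point-pressure mechanism described above can be illustrated with the classical Boussinesq solution for a point load on an elastic half-space, a standard result not taken from the record. It shows how the stress induced by the contact point decays with depth and lateral distance, so that failure occurs only where the migrating stress field meets a weakness such as a pre-existing joint:

        % Vertical normal stress below a point load P (Boussinesq):
        \sigma_z = \frac{3 P z^3}{2 \pi R^5}, \qquad R = \sqrt{r^2 + z^2}

    where z is the depth below the contact point and r the horizontal distance from it.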

  7. Rock History and Culture

    OpenAIRE

    Gonzalez, Éric

    2013-01-01

    Two ambitious works written by French-speaking scholars tackle rock music as a research object, from different but complementary perspectives. Both are a definite must-read for anyone interested in the contextualisation of rock music in western popular culture. In Une histoire musicale du rock (i.e. A Musical History of Rock), rock music is approached from the point of view of the people – musicians and industry – behind the music. Christophe Pirenne endeavours to examine that field from a m...

  8. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  9. CRITERIA FOR ROCK ENGINEERING FAILURE

    Institute of Scientific and Technical Information of China (English)

    ZHU Deren; ZHANG Yuzhuo

    1995-01-01

    A great number of underground rock projects are maintained in rock masses which are subject to rock damage and failure development. In many cases, the rock engineering is still under normal working conditions even though the rock has already failed to some extent. This paper introduces two different concepts: rock failure and rock engineering failure. Rock failure is defined as a mechanical state under which an applicable characteristic is changed or lost. Rock engineering failure, however, is an engineering state under which an applicable function is changed or lost. The failure of surrounding rocks is the major cause of rock engineering failure. The criterion of rock engineering failure depends on the limit of applicable functions. Each rock engineering failure state possesses a corresponding point in the rock failure state. In this paper, a description of the rock engineering failure criterion is given using a simple mechanical equation or expression. It is expected that the study of rock engineering failure criteria will be an optimal approach that combines research on rock mechanics with rock engineering problems.

  10. Magnetostratigraphy of a Marine Triassic-Jurassic Boundary Section, Kennecott Point, Queen Charlotte Islands: Implications for the Temporal Correlation of a 'Big Five' Mass Extinction Event.

    Science.gov (United States)

    Hilburn, I. A.; Kirschvink, J. L.; Ward, P. D.; Haggart, J. W.; Raub, T. D.

    2008-12-01

    Several causes have been proposed for Triassic-Jurassic (T-J) boundary extinctions, including global ocean anoxia/euxinia, an impact event, and/or eruption of the massive Central Atlantic Magmatic Province (CAMP), but poor intercontinental correlation makes testing these difficult. Sections at Kennecott Point, Queen Charlotte Islands, British Columbia span the late Norian through Rhaetian (Triassic) and into the earliest Hettangian (Jurassic) and provide the best integrated magneto- and chemostratigraphic framework for placing necessary temporal constraints upon the T-J mass extinctions. At Kennecott Point, turnover of radiolaria and ammonoids define the T-J boundary marine extinction and are coincident with a 2 ‰ negative excursion in δ13Corg similar in magnitude to that observed at Ferguson Hill (Muller Canyon), Nevada (1, 2). With Conodont Alteration Index values in the 1-2 range, Kennecott Point provides the ideal setting for use of magnetostratigraphy to tie the marine isotope excursion into the chronostratigraphic framework of the Newark, Hartford, and Fundy Basins. In the summer of 2005, we collected a ~1m resolution magnetostratigraphic section from 105 m of deep marine, silt- and sandstone turbidites and interbedded mudstones, spanning the T-J boundary at Kennecott Point. Hybrid progressive demagnetization - including zero-field, low-temperature cycling; low-field AF cleaning; and thermal demagnetization in ~25°C steps to 445°C under flowing N2 gas (3) - first removed a Northerly, steeply inclined component interpreted to be a Tertiary overprint, revealing an underlying dual-polarity component of moderate inclination. Five major polarity zones extend through our section, with several short, one-sample reversals interspersed amongst them. Comparison of this pattern with other T-J boundary sections (4-6) argues for a Northern hemisphere origin of our site, albeit with large vertical-axis rotations. A long normal chron bounds the T-J boundary punctuated

  11. Big Data : Overview

    OpenAIRE

    Richa Gupta; Sunny Gupta; Anuradha Singhal

    2014-01-01

    Big data is data that exceeds the processing capacity of traditional databases. The data is too big to be processed by a single machine. New and innovative methods are required to process and store such large volumes of data. This paper provides an overview on big data, its importance in our live and some technologies to handle big data.

  12. Big Data: Overview

    OpenAIRE

    Gupta, Richa; Gupta, Sunny; Singhal, Anuradha

    2014-01-01

    Big data is data that exceeds the processing capacity of traditional databases. The data is too big to be processed by a single machine. New and innovative methods are required to process and store such large volumes of data. This paper provides an overview on big data, its importance in our live and some technologies to handle big data.

  13. Efficiency study of a big volume well type NaI(Tl) detector by point and voluminous sources and Monte-Carlo simulation

    International Nuclear Information System (INIS)

    The activity of environmental samples is usually measured by high-resolution HPGe gamma spectrometers. In this work a set-up with a 9 in. × 9 in. NaI well-detector of 3 in. thickness and a 3 in. × 3 in. plug detector in a 15-cm-thick lead shielding is considered as an alternative (Hansman, 2014). In spite of its much poorer resolution, it requires shorter measurement times and may possibly give better detection limits. In order to determine the U-238, Th-232, and K-40 content of samples with this NaI(Tl) detector, the corresponding photopeak efficiencies must be known. These efficiencies can be found for a given source matrix and geometry by Geant4 simulation. We found a discrepancy between simulated and experimental efficiencies of 5-50%, which may be mainly due to effects of light collection within the detector volume, an effect which was not taken into account by the simulations. The influence of random coincidence summing on detection efficiency, for radionuclide activities in the range 130-4000 Bq, was negligible. This paper also describes how the efficiency of the detector depends on the position of the radioactive point source. To avoid large dead time, relatively weak Mn-54, Co-60 and Na-22 point sources of a few kBq were used. Results for single gamma lines and also for coincidence-summing gamma lines are presented. - Highlights: • 9 in. × 9 in. NaI well detector and 3 in. × 3 in. plug detector studied. • Peak efficiency simulated with Geant4. • Results show discrepancy with measurements. • High efficiency useful for environmental samples
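
    A minimal sketch of how a photopeak efficiency is estimated from a Monte Carlo run such as the Geant4 simulation described above, assuming the simulation has already produced a per-decay list of deposited energies; the event list, window width and line energy below are illustrative only.

        import numpy as np

        def photopeak_efficiency(deposited_keV, line_keV, n_decays, window_keV):
            """Fraction of simulated decays depositing the full line energy,
            counted within +/- window_keV of the photopeak."""
            deposited = np.asarray(deposited_keV)
            in_peak = np.abs(deposited - line_keV) <= window_keV
            eff = in_peak.sum() / n_decays
            sigma = np.sqrt(eff * (1.0 - eff) / n_decays)  # binomial counting error
            return eff, sigma

        # Illustrative stand-in for simulation output: 1e6 decays of a 661.7 keV line.
        rng = np.random.default_rng(0)
        fake = rng.choice([661.7, 300.0, 0.0], size=1_000_000, p=[0.4, 0.3, 0.3])
        print(photopeak_efficiency(fake, 661.7, 1_000_000, window_keV=25.0))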

  14. A study on density, melting point, thermal expansion, creep, thermal diffusivity and thermal conductivity of the simulated rock-like oxide (ROX) fuels

    Energy Technology Data Exchange (ETDEWEB)

    Yanagisawa, Kazuaki; Shirasu, Noriko; Muromura, Tadasumi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ohmichi, Toshihiko; Matsuda, Tetsushi

    1999-03-01

    A new type of fuel, the rock-like oxide (ROX) fuel composed of PuO2-SZR (stabilized zirconia)-MgAl2O4, is under development at JAERI. To prepare the data base, a simulated ROX fuel in which the original PuO2 was replaced by UO2 was fabricated and subjected to out-of-pile tests. The main findings within this experimental scope are: (1) The simulated ROX fuel was successfully fabricated. (2) The gas immersion density of the simulated ROX fuels ranged from 4.9 to 5.4 g/cc, on the order of about 47-52% of that of UO2; SZR increased the density while MgAl2O4 decreased it. (3) The melting point of the simulated ROX fuel was 1,911 ± 39°C, about 30% lower than that of UO2 fuel. (4) The difference in linear thermal expansion (LTE) between the simulated ROX fuel and the UO2 fuel was small up to temperatures of 1,500°C. The LTE increased with the increase of SZR. (5) The creep rate of the simulated ROX fuel was strongly dependent on the amount of MgAl2O4, where the role of Al2O3 dissolved in MgAl2O4 is important. The creep propensity was similar between the simulated ROX fuel and UO2 fuel. (6) The magnitude of hardness (Hv) was sensitive to the Al2O3 contained in MgAl2O4; hence an increase of Al2O3 made the simulated ROX fuel harder. (7) The difference in thermal diffusivity between the simulated ROX fuel and UO2 was not significant. (8) The difference in thermal conductivity between the simulated ROX and UO2 fuel is small; degradation of the thermal conductivity occurred with the increase of SZR. (9) Among the three candidate simulated ROX fuels studied in the present paper, the sample consisting of 26wt%UO2-24wt%SZR-50wt%MgAl2O4 seems to have the most feasible performance for future studies. (J.P.N.)
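
    A quick check of the density figures in item (2), assuming the comparison is against a typical sintered UO2 pellet density of about 10.4 g/cc (roughly 95% of the 10.96 g/cc theoretical density; the reference value is an assumption, not stated in the record):

        \frac{4.9}{10.4} \approx 0.47, \qquad \frac{5.4}{10.4} \approx 0.52

    which reproduces the quoted 47-52% range.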

  15. The safety of big workmanship

    International Nuclear Information System (INIS)

    This book brings together the contributions of a colloquium held in memory of Pierre Londe (1922-1999) and dealing with the safety of large structures. The main topics concern: 3-D flow of water under pressure inside fractured media; the Rion-Antirion bridge: reliability and seismic design of the foundations; the Rion-Antirion bridge: design and realization; geology and the safety of dams; risk assessment; salt storage cavities: evaluation of tightness; safety of tunnel supports in deformed rock massifs: application to the El Achir tunnel; instability risk of rock formations on the natural slopes of the Alps; the safety approach applied to the civil engineering of nuclear facilities; lessons learnt from accidents on offshore platforms; the engineer in the face of natural hazards; science and regulation. (J.S.)

  16. Optimization Method for Distribution Network Repair Stagnation Points Based on Big Data Analysis

    Institute of Scientific and Technical Information of China (English)

    陆如; 范宏; 周献远

    2015-01-01

    Emergency repair is an important task in the operation of a distribution network, for which scientific and efficient management and implementation methods are vital to improving the reliability and service quality of the network. A method based on Hadoop is proposed to solve the optimization problem of distribution network emergency repair stagnation points. The factors that affect the efficiency of distribution network emergency repair are analyzed comprehensively, an optimization model for emergency repair stagnation points is built, and data mining techniques for processing big data are introduced to enhance the efficiency of model analysis. In addition, the reasonable and effective allocation of emergency repair resources is achieved through quick and accurate estimation of the fault time and fault location, together with comprehensive analysis and location of distribution network emergency repair points and states, thus improving the service quality and efficiency of emergency repair.
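
    The record does not give the optimization model itself; as a hedged illustration of what a stagnation-point optimization can look like, the sketch below places k repair stagnation points so as to minimize the fault-frequency-weighted distance to historical fault locations (a simple weighted k-means heuristic; the data layout and the choice of k are assumptions).

        import numpy as np

        def choose_stagnation_points(fault_xy, weights, k, iters=100, seed=0):
            """Weighted k-means: place k repair stagnation points to minimize the
            frequency-weighted distance to historical fault locations."""
            rng = np.random.default_rng(seed)
            centers = fault_xy[rng.choice(len(fault_xy), k, replace=False)]
            for _ in range(iters):
                d = np.linalg.norm(fault_xy[:, None, :] - centers[None, :, :], axis=2)
                assign = d.argmin(axis=1)  # nearest stagnation point per fault
                for j in range(k):
                    m = assign == j
                    if m.any():
                        centers[j] = np.average(fault_xy[m], axis=0, weights=weights[m])
            return centers

        # Illustrative: 500 historical faults with frequencies, 3 stagnation points.
        rng = np.random.default_rng(1)
        faults = rng.uniform(0.0, 10.0, size=(500, 2))
        freq = rng.integers(1, 5, size=500)
        print(choose_stagnation_points(faults, freq, k=3))

    In a Hadoop setting as described above, the fault statistics feeding such a model would be aggregated from large repair and outage logs before the comparatively small optimization step is run.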

  17. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  18. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  19. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  20. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  1. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  2. Mining "big data" using big data services

    OpenAIRE

    Reips, Ulf-Dietrich; Matzat, Uwe

    2014-01-01

    While many colleagues within science are fed up with the "big data fad", the empirical analyses we conducted for the current editorial actually show an inconsistent picture: we use big data services to determine whether there really is an increase in writing about big data or even widespread use of the term. Google Correlate (http://www.google.com/trends/correlate/), the first free tool we are presenting here, doesn't list the term, showing that the number of searches for it is below an absolute min...

  3. Rock Stars

    Institute of Scientific and Technical Information of China (English)

    张国平

    2000-01-01

    Around the world young people are spending unbelievable sums of money to listen to rock music. Forbes Magazine reports that at least fifty rock stars have incomes between two million and six million dollars per year.

  4. KREEP Rocks

    Institute of Scientific and Technical Information of China (English)

    邹永廖; 徐琳; 欧阳自远

    2004-01-01

    KREEP rocks with high contents of K, REE and P were first recognized in Apollo-12 samples, and it was confirmed later that there were KREEP rock fragments in all of the Apollo samples, particularly in the Apollo-12 and -14 samples. The KREEP rocks distributed on the lunar surface are very important objects for studying the evolution of the moon, as well as for evaluating the utilization prospects of the REE in KREEP rocks. Based on previous studies and lunar exploration data, the authors analyzed the chemical and mineral characteristics of KREEP rocks and the abundance of Th in lunar surface materials, examined the correlation between Th and REE abundances in KREEP rocks, studied the distribution regions of KREEP rocks on the lunar surface, and further evaluated the utilization prospects of the REE in KREEP rocks.

  5. From Big Crunch to Big Bang

    OpenAIRE

    Khoury, Justin; Ovrut, Burt A.; Seiberg, Nathan; Steinhardt, Paul J.(Princeton Center for Theoretical Science, Princeton University, Princeton, NJ, 08544, USA); Turok, Neil

    2001-01-01

    We consider conditions under which a universe contracting towards a big crunch can make a transition to an expanding big bang universe. A promising example is 11-dimensional M-theory in which the eleventh dimension collapses, bounces, and re-expands. At the bounce, the model can reduce to a weakly coupled heterotic string theory and, we conjecture, it may be possible to follow the transition from contraction to expansion. The possibility opens the door to new classes of cosmological models. F...

  6. The Big Group of People Looking at How to Control Putting the Parts of the Air That Are the Same as What You Breathe Out Into Small Spaces in Rocks

    Energy Technology Data Exchange (ETDEWEB)

    Stack, Andrew

    2013-07-18

    Representing the Nanoscale Control of Geologic CO2 (NCGC), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE energy. The mission of NCGC is to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to the injection and storage of carbon dioxide (CO2) in subsurface reservoirs.

  7. Results of new petrologic and remote sensing studies in the Big Bend region

    Science.gov (United States)

    Benker, Stevan Christian

    The initial section of this manuscript involves the South Rim Formation, a series of 32.2-32 Ma comenditic quartz trachytic-rhyolitic volcanics and associated intrusives, which erupted and was emplaced in Big Bend National Park, Texas. Magmatic parameters have only been interpreted for one of the two diverse petrogenetic suites comprising this formation. Here, new mineralogic data for the South Rim Formation rocks are presented. Magmatic parameters interpreted from these data assist in deciphering lithospheric characteristics during the mid-Tertiary. Results indicate low temperatures (…). Fledermaus 3D three-dimensional visualization software was used to drape Google Earth horizontal positions over a National Elevation Dataset (NED) digital elevation model (DEM) in order to adopt a large set of elevation data. A vertical position accuracy of 1.63 meters RMSE was determined between 268 Google Earth data points and the NED. Since the determined accuracies were considerably lower than those reported in previous investigations, we devoted a later portion of this investigation to testing Google Earth-NED data in paleo-surface modeling of the Big Bend region. An 18 x 30 kilometer area in easternmost Big Bend Ranch State Park was selected to create a post-Laramide paleo-surface model via interpolation of approximately 2900 Google Earth-NED data points representing sections of an early Tertiary

  8. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  9. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  10. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  11. Antigravity and the big crunch/big bang transition

    CERN Document Server

    Bars, Itzhak; Steinhardt, Paul J; Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  12. Big fundamental groups: generalizing homotopy and big homotopy

    OpenAIRE

    Penrod, Keith

    2014-01-01

    The concept of big homotopy theory was introduced by J. Cannon and G. Conner using big intervals of arbitrarily large cardinality to detect big loops. We find, for each space, a canonical cardinal that is sufficient to detect all big loops and all big homotopies in the space.

  13. ANALYSIS OF BIG DATA

    OpenAIRE

    Anshul Sharma; Preeti Gulia

    2014-01-01

    Big Data is data that either is too large, grows too fast, or does not fit into traditional architectures. Within such data can be valuable information that can be discovered through data analysis [1]. Big data is a collection of complex and large data sets that are difficult to process and mine for patterns and knowledge using traditional database management tools or data processing and mining systems. Big Data is data whose scale, diversity and complexity require new architecture, technique...

  14. Matrix Big Brunch

    OpenAIRE

    Bedford, J; Papageorgakis, C.; Rodriguez-Gomez, D.; Ward, J.

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  15. The big bang

    International Nuclear Information System (INIS)

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence for an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second after the big bang), and the fate of the Universe are all discussed. (U.K.)

  16. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on patterns we identify in the data, rather than trying to understand the underlying causes in more detail. It highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  17. CERN Rocks

    CERN Multimedia

    2004-01-01

    The 15th CERN Hardronic Festival took place on 17 July on the terrace of Rest 3 (Prévessin). Over 1000 people, from CERN and other International Organizations, came to enjoy the warm summer night, and to watch the best of the World's High Energy music. Jazz, rock, pop, country, metal, blues, funk and punk blasted out from 9 bands from the CERN Musiclub and Jazz club, alternating on two stages in a non-stop show.  The night reached its hottest point when The Canettes Blues Band got everybody dancing to sixties R&B tunes (pictured). Meanwhile, the bars and food vans were working at full capacity, under the expert management of the CERN Softball club, who were at the same time running a Softball tournament in the adjacent "Higgs Field". The Hardronic Festival is the main yearly CERN music event, and it is organized with the support of the Staff Association and the CERN Administration.

  18. Pre-Big Bang, vacuum and noncyclic cosmologies

    OpenAIRE

    Gonzalez-Mestres, L.

    2011-01-01

    WMAP and Planck open the way to unprecedented Big Bang phenomenology, potentially allowing to test the standard Big Bang model as well as less conventional approaches including noncyclic pre-Big Bang cosmologies that would incorporate a new fundamental scale beyond the Planck scale and, possibly, new ultimate constituents of matter. Alternatives to standard physics can be considered from a cosmological point of view concerning vacuum structure, the nature of space-time, the origin and evoluti...

  19. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  20. Bliver big data til big business? [Will big data become big business?]

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers that make a leading position possible, but only if companies ready themselves for the next big data wave.

  1. Big Ideas in Art

    Science.gov (United States)

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  2. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled. PMID:26173222

  3. Flow characteristics of caved ore and rock in the multiple draw-point condition

    Institute of Scientific and Technical Information of China (English)

    孙浩; 金爱兵; 高永涛; 孟新秋

    2015-01-01

    基于离散元理论和PFC3D程序构建放矿模型,探究多放矿口条件下崩落矿岩流动特性,实现多放矿口条件下放出体及矿石残留体形态变化过程的可视化。同时,将模拟结果与已有研究结论进行对比分析,验证基于PFC程序的放矿模型在崩落矿岩流动特性研究中的可靠性。放矿PFC模拟结果表明,多放矿口条件下放出体形态会因各放矿口间的相互影响而产生交错、缺失等程度的不同变异,并不是一个规则的椭球体。在单一放矿口和多放矿口条件下,放出体高度的变化趋势均可概括为两个阶段:在放矿初始阶段,放出体高度呈指数形式快速增加,随放矿量的增加,其增长率逐渐减小;随后,放出体高度将随放矿量的增加而呈线性增长的趋势。矿石损失率随放矿口尺寸及崩落矿石层高度的增大而减小,随放矿口间距的增大而增大。当相邻放矿口间产生相互影响时,平面放矿方式与立面放矿方式相比,其矿石残留量更小,且崩落矿岩接触面呈近似水平状态下降。%Based on the particle flow theory and PFC3D code, a draw model was constructed to research the flow characteristics of caved ore and rock in the multiple draw-point condition and visualize the form-changing process of the isolated extraction zone ( IEZ) and the ridge hangover body. Simultaneously, the suitability and reliability of this draw model were validated in the flow characteristics study of caved ore and rock by comparative analysis between simulated results and existing research conclusions. Due to interactions among multiple draw-points, the IEZ’ s form produces different degrees of variation in the multiple draw-point condition, including interlacement and deficiency, which result in that the IEZ’ s form is not a regular ellipsoid. The height changing trend of the IEZ in both the isolated draw-point condition and the multiple draw-point

  4. Big Data, Big Knowledge: Big Data for Personalized Healthcare.

    OpenAIRE

    Viceconti, M.; Hunter, P.; Hose, R.

    2015-01-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine soluti...

  5. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority. PMID:26218867

  6. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  7. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  8. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, the zettabyte, and the yottabyte, to express the amount of data. This growth creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. The challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. Moreover, Big Data is now recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  9. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  10. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, so often mentioned in relation to big data, stand for? As an introduction to this…

  11. Prestudy Oskarshamn. Soils, rocks and deformation zones

    International Nuclear Information System (INIS)

    Soil and geology of the Oskarshamn area are described, as well as deformation zones and seismicity. Several areas of the inland are judged to be potentially well suited for a spent fuel repository. In the Simpevarp peninsula, it may be difficult to locate a rock mass big enough, between the fracture zones, to host a repository

  12. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data, and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  13. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  14. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  15. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore, we compare and contrast the two geometries throughout.

  16. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, consisting mostly of unstructured data such as emails, blogs, Twitter and Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available on the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what big data is, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.
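
    As a concrete illustration of how clusters of inexpensive computers discover patterns in such collections, here is a minimal, hypothetical map-reduce word count in Python; a pool of two local processes stands in for the cluster that a framework such as Hadoop or Spark would provide.

        # Toy map-reduce: each worker counts words in its shard (map),
        # and the partial counts are merged into a total (reduce).
        from collections import Counter
        from functools import reduce
        from multiprocessing import Pool

        def map_count(shard):
            return Counter(shard.split())

        def reduce_counts(a, b):
            a.update(b)
            return a

        if __name__ == "__main__":
            shards = ["big data big insight", "big cluster small cost"]
            with Pool(2) as pool:
                partials = pool.map(map_count, shards)
            totals = reduce(reduce_counts, partials, Counter())
            print(totals.most_common(2))   # -> [('big', 3), ('data', 1)]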

  17. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  18. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along...

  19. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  20. Sharing big biomedical data

    OpenAIRE

    Toga, Arthur W.; Dinov, Ivo D.

    2015-01-01

    Background: The promise of Big Biomedical Data may be offset by the enormous challenges in handling, analyzing, and sharing it. In this paper, we provide a framework for developing practical and reasonable data sharing policies that incorporate the sociological, financial, technical and scientific requirements of a sustainable Big Data dependent scientific community. Findings: Many biomedical and healthcare studies may be significantly impacted by using large, heterogeneous and incongruent data...

  1. Testing Big Bang Nucleosynthesis

    OpenAIRE

    Steigman, Gary

    1996-01-01

    Big Bang Nucleosynthesis (BBN), along with the cosmic background radiation and the Hubble expansion, is one of the pillars of the standard, hot, big bang cosmology, since the primordial synthesis of the light nuclides (D, $^3$He, $^4$He, $^7$Li) must have occurred during the early evolution of a universe described by this model. The overall consistency between the predicted and observed abundances of the light nuclides, each of which spans a range of some nine orders of magnitude, provides impr...

  2. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    This paper's objective is to assess, in light of the main works of Minsky, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  3. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  4. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Given the importance that the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; finally, it sought to identify the most relevant characteristics of Big Data management, so that everything concerning the central topic of the research could be understood. The methodology included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; presenting some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and showing data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research is a first step toward understanding the Big Data landscape.

  5. "Big Data": Big Knowledge Gaps in the Field of Internet Science

    Directory of Open Access Journals (Sweden)

    Ulf-Dietrich Reips

    2012-01-01

    Research on so-called ‘Big Data’ has received considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as ‘small world’ properties. Much less is known about the underlying micro-processes leading to these properties. The models used by Big Data researchers are usually inspired by mathematical ease of exposition. We propose to follow, in addition, a different strategy that leads to knowledge about micro-processes that match actual online behavior. This knowledge can then be used for the selection of mathematically tractable models of online network formation and evolution. Insight from social and behavioral research is needed for pursuing this strategy of knowledge generation about micro-processes. Accordingly, our proposal points to a unique role that social scientists could play in Big Data research. ...

  6. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-01

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  7. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper "New games, new rules: big data and the changing context of strategy" as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution. The massive involvement of lay publics as instrumented by social media breaks with the strong expert cultures that have underlain the production and use of data in modern organizations. It also sets apart the interactive and communicative processes by which social data is produced from sensor data and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim...

  8. Big Data and Peacebuilding

    Directory of Open Access Journals (Sweden)

    Sanjana Hattotuwa

    2013-11-01

    Any peace process is an exercise in the negotiation of big data. From centuries-old communal hagiography to the reams of official texts, media coverage and social media updates, peace negotiations generate data. Peacebuilding and peacekeeping today are informed by, and often respond and contribute to, big data. This is no easy task. As recently as a few years ago, before the term big data embraced the virtual on the web, what informed peace process design and implementation was in the physical domain – from contested borders and resources to background information in the form of text. The move from analogue, face-to-face negotiations to online, asynchronous, web-mediated negotiations – which can still include real world meetings – has profound implications for how peace is strengthened in fragile democracies.

  9. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. PMID:24183925

  10. Intellektuaalne rock

    Index Scriptorium Estoniae

    2007-01-01

    British singer-songwriter and actress Toyah Willcox, together with Bill Rieflin of R.E.M. and Pat Mastelotto of King Crimson, will perform with the bands The Humans and Tuner on 25 October at the Rock Café in Tallinn and on 27 October at St. John's Church in Tartu.

  11. Primordial Big Bang Nucleosynthesis

    OpenAIRE

    Olive, Keith A.

    1999-01-01

    Big Bang Nucleosynthesis is the theory of the production of the light element isotopes D, He3, He4, and Li7. After a brief review of the essential elements of the standard Big Bang model at a temperature of about 1 MeV, the theoretical input and predictions of BBN are discussed. The theory is tested by the observational determinations of the light element abundances, and the current status of these observations is reviewed. Concordance of the standard model and the related observations is f...

  12. Networks & big data

    OpenAIRE

    Litvak, Nelly; Meulen, van der, P.

    2015-01-01

    Once a year, the NWO cluster Stochastics – Theoretical and Applied Research (STAR) organises a STAR Outreach Day, a one-day event around a theme that is of a broad interest to the stochastics community in the Netherlands. The last Outreach Day took place at Eurandom on 12 December 2014. The theme of the day was ‘Networks & Big Data’. The topic is very timely. The Vision document 2025 of the PlatformWiskunde Nederland (PWN) mentions big data as one of the six “major societal and scientific tre...

  13. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their supercomputer, and the Large Hadron Collider created by Éric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is underway. Worse, all of scientific research is in peril! Drawn into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and today's greatest scientists.

  14. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  15. Aspects of the Flora and Vegetation of the “Izvorul Bigăr” Nature Reserve (South-Western Romania)

    Directory of Open Access Journals (Sweden)

    Ilinca M. IMBREA

    2009-06-01

    The “Izvorul Bigăr” Nature Reserve is located in south-western Romania. The aim of the present paper is to describe some aspects of the flora and vegetation around the Bigăr spring. The analysis of the plant association was carried out using the method of the Central-European phytocoenological school. The vegetation around the Bigăr spring and waterfall is dominated by compact beech forests with a frequently reduced grassy layer and soil rich in humus. On the banks of the watercourse and on the rocks around the spring there are species specific to the flooding plains of the beech sub-stratum, as well as thermophilous and xerophilous species, many of them Balkan, Pontic or Mediterranean elements. The phytocoenoses we analysed belong to the Phyllitidi - Fagetum Vida (1959) 1963 association. The association is characteristic of shady and moist slopes with soils rich in humus formed on a limestone substratum, sometimes with surface rocks. The species with high abundance-dominance values are: Fagus sylvatica, Fraxinus ornus, Acer pseudoplatanus, Tilia cordata, Hedera helix, Asplenium scolopendrium, Arum orientale, Asarum europaeum, Cardamine bulbifera, Lunaria annua, Polypodium vulgare. Such species as Carpinus orientalis, Cotinus coggygria, Fraxinus ornus, Ruscus hypoglossum and Syringa vulgaris point out the thermophilous character of the forests in southern Banat.

  16. [Utilization of Big Data in Medicine and Future Outlook].

    Science.gov (United States)

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan. PMID:27363223

  17. Rocks As Symbols: The Selection Of Raw Material For Projectile Points In Patagonian Plateau Environments

    Directory of Open Access Journals (Sweden)

    Darío Hermo

    2008-12-01

    Analyses of hunter-gatherer lithic technologies from Argentinean Patagonia have been developed from ecological-economic perspectives, leaving aside the social and symbolic aspects that influenced technological organization. Based on the analysis of projectile points of diverse ages from the Patagonian plateaus, the visual aspect of the rocks, such as their color and brightness, is proposed as a non-exclusionary criterion for the selection of the raw materials out of which they were manufactured. The proposed approach could open new interpretive paths for the region.

  18. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  19. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our combined resource includes all necessary areas of space science for grades five to eight. Get the big picture about the Solar System, galaxies, and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, comets, asteroids, meteoroids, stars, and constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  20. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  1. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  2. Governing Big Data

    Directory of Open Access Journals (Sweden)

    Andrej J. Zwitter

    2014-04-01

    2.5 quintillion bytes of data are created every day through pictures, messages, GPS data, etc. "Big Data" is seen simultaneously as the new Philosopher's Stone and as Pandora's box: a source of great knowledge and power, but equally the root of serious problems.

  3. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  4. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  5. Big is beautiful

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2007-06-08

    Although big solar systems are both effective and architecturally pleasing, they are still not widespread in Germany. Recently, politicians reacted by improving funding conditions. In order to prevent planning errors, planners and fitters must be better trained, and standardisation of systems must be enhanced. (orig.)

  6. Big ideas: innovation policy

    OpenAIRE

    Van Reenen, John

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  7. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  8. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  9. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  10. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of a GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. Overall every sedimentary formation investigated

  11. Pre-big bang geometric extensions of inflationary cosmologies

    CERN Document Server

    Klein, David

    2016-01-01

    Robertson-Walker cosmologies within a large class are geometrically extended to larger spacetimes that include spacetime points with zero and negative cosmological times. In the extended spacetimes, the big bang is lightlike, and though singular, it inherits some geometric structure from the original spacetime. Spacelike geodesics are continuous across the cosmological time zero submanifold, which is parameterized by the radius of Fermi space slices, i.e., by the proper distances along spacelike geodesics from a comoving observer to the big bang. The continuous extension of the metric, and the continuously differentiable extension of the leading Fermi metric coefficient $g_{\tau\tau}$ of the observer, restrict the geometry of spacetime points with pre-big bang cosmological time coordinates. In our extensions the big bang is two-dimensional in a certain sense, consistent with some findings in quantum gravity.

  12. Big Data: Survey, Technologies, Opportunities, and Challenges

    OpenAIRE

    Nawsher Khan; Ibrar Yaqoob; Ibrahim Abaker Targio Hashem; Zakira Inayat; Waleed Kamaleldin Mahmoud Ali; Muhammad Alam; Muhammad Shiraz; Abdullah Gani

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information...

  13. Big is beautiful for c. h. p

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-30

    The CEGB in the United Kingdom has retreated to rural surroundings for siting power stations, but this poses no obstacle to the development of combined heat and power. It is pointed out that the cost of transporting hot water across country is not a problem, provided only that the operation is on a large scale. Factors supporting the decision for a big city to become a pioneer in installing a combined heat and power scheme are discussed. (MCW)

  14. Rock stresses (Grimsel rock laboratory)

    International Nuclear Information System (INIS)

    On the research and development project 'Rock Stress Measurements', the BGR has developed and tested several test devices and methods at the GTS for use in boreholes at a depth of 200 m, and has carried out rock mechanical and engineering geological investigations for the evaluation and interpretation of the stress measurements. For the first time, a computer for data processing was installed in the borehole together with the BGR probe. Laboratory tests on hollow cylinders were made to study the stress-deformation behavior. To validate and interpret the measurement results, some test methods were modelled using the finite-element method. The dilatometer tests yielded high values of Young's modulus, whereas laboratory tests showed lower values with a distinct deformation anisotropy. Stress measurements with the BGR probe yielded horizontal stresses higher than the theoretical overburden pressure and vertical stresses which agree well with the theoretical overburden pressure. These results are comparable to the results of the hydraulic fracturing tests, whereas stresses obtained with CSIR triaxial cells are generally lower. The detailed geological mapping of the borehole indicated relationships between stress and geology. With regard to borehole depth, different zones of rock structure (joint frequency, joint orientation, and orientation of microfissures) as well as stress magnitude, stress direction, and degree of deformation anisotropy could be distinguished. (orig./HP)

  15. Permeability Evolution and Rock Brittle Failure

    Directory of Open Access Journals (Sweden)

    Sun Qiang

    2015-08-01

    This paper reports an experimental study of the evolution of permeability during rock brittle failure and a theoretical analysis of the rock's critical stress level. It is assumed that the rock is a strain-softening medium whose strength can be described by Weibull's distribution. Based on the two-dimensional renormalization group theory, it is found that the stress level λc (the ratio of the stress at the critical point to the peak stress) depends mainly on the homogeneity index or shape parameter m of the Weibull distribution for the rock. Experimental results show that the evolution of permeability is closely related to the stages of rock deformation: the permeability increases rapidly with the growth of cracks and their surface areas (i.e., at the onset of the fracture coalescence point), and reaches its maximum at rock failure. Both the experimental and analytical results show that this point of rapid increase in permeability on the permeability-pressure curve corresponds to the critical point on the stress-strain curve; for rock in compression, the stress at this point is approximately 80% of the peak strength. Thus, monitoring the evolution of permeability may provide a new means of identifying the critical point of rock brittle fracture.
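
    In the notation of this abstract, the element strengths follow a Weibull distribution and the critical stress level is a stress ratio. A compact restatement (the renormalization-group derivation that links $\lambda_c$ to the shape parameter $m$ is in the paper and is not reproduced here):

        P(\sigma) = 1 - \exp\left[-\left(\sigma/\sigma_0\right)^{m}\right],
        \qquad
        \lambda_c = \sigma_c/\sigma_p \approx 0.8 \quad \text{(rock in compression)}

    where $P(\sigma)$ is the failure probability of an element at stress $\sigma$, $\sigma_0$ is a scale parameter, $\sigma_c$ is the stress at the critical point (onset of fracture coalescence and of the rapid permeability increase), and $\sigma_p$ is the peak strength.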

  16. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Asst. Prof. Shubhada Talegaon

    2014-10-01

    Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis, and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to survey various techniques for analyzing data.

  17. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Prof. Shubhada Talegaon

    2015-10-01

    Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis, and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to survey various techniques for analyzing data.
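
    One way to read "techniques that make use of the complete data set, instead of sampling" is one-pass streaming computation. Below is a hedged illustration, not taken from the paper: Welford's online algorithm computes the exact mean and variance of a stream in a single pass, so every record contributes and nothing needs to be held in memory.

        # Welford's online algorithm: exact mean/variance in one pass.
        def running_stats(stream):
            n, mean, m2 = 0, 0.0, 0.0
            for x in stream:
                n += 1
                delta = x - mean
                mean += delta / n
                m2 += delta * (x - mean)
            variance = m2 / (n - 1) if n > 1 else 0.0
            return n, mean, variance

        n, mean, var = running_stats(iter(range(1_000_000)))
        print(n, round(mean, 1))   # -> 1000000 499999.5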

  18. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of the international development agenda to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development policies, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  19. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  20. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  1. Big Data Refinement

    OpenAIRE

    Boiten, Eerke Albert

    2016-01-01

    "Big data" has become a major area of research and associated funding, as well as a focus of utopian thinking. In the still growing research community, one of the favourite optimistic analogies for data processing is that of the oil refinery, extracting the essence out of the raw data. Pessimists look for their imagery to the other end of the petrol cycle, and talk about the "data exhausts" of our society. Obviously, the refinement community knows how to do "refining". This paper explores...

  2. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  3. Canonical Big Operators

    OpenAIRE

    Bertot, Yves; Gonthier, Georges; Ould Biha, Sidi; Pasca, Ioana

    2008-01-01

    In this paper, we present an approach to describe uniformly iterated “big” operations and to provide lemmas that encapsulate all the commonly used reasoning steps on these constructs. We show that these iterated operations can be handled generically using the syntactic notation and canonical structure facilities provided by the Coq system. We then show how these canonical big operations played a crucial enabling role in the study of various parts of linear algebra and multi-dimensional real a...

  4. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday, and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  5. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday, and narrative. In Volume 7, alongside an introduction, many current aspects of quantum mechanics (e.g., teleportation) and electrodynamics (e.g., electrosmog) are covered, as well as climate issues and chaos theory.

  6. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday, and narrative. Volume 8 conveys, in an accessible way, relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  7. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday, and narrative. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  8. Think Small Go Big

    Institute of Scientific and Technical Information of China (English)

    汤维维

    2006-01-01

    Before Vepoo was founded, its team went through three startup pivots. In their own words, moving from "think big go small" to "think small go big" took a year. During that period they exhausted their initial start-up funding; fortunately, at the last moment, they saw the first light of dawn.

  9. From Big Bang to Big Crunch and Beyond

    OpenAIRE

    Elitzur, S.; Giveon, A.; Kutasov, D.; Rabinovici, E.

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a "big bang" singularity, expands and then contracts to a "big crunch" singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spaceti...

  10. Water - rock interaction in different rock environments

    International Nuclear Information System (INIS)

    The study assesses the groundwater geochemistry and geological environment of 44 study sites for radioactive waste disposal. Initially, the study sites were divided by rock type into 5 groups: (1) acid - intermediate rocks, (2) mafic - ultramafic rocks, (3) gabbros, amphibolites and gneisses that contain calc-silicate (skarn) rocks, (4) carbonates and (5) sandstones. Separate assessments are made of acid - intermediate plutonic rocks and of a subgroup that comprises migmatites, granite and mica gneiss. These all belong to the group of acid - intermediate rocks. Within the mafic -ultramafic rock group, a subgroup that comprises mafic - ultramafic plutonic rocks, serpentinites, mafic - ultramafic volcanic rocks and volcanic - sedimentary schists is also evaluated separately. Bedrock groundwaters are classified by their concentration of total dissolved solids as fresh, brackish, saline, strongly saline and brine-class groundwaters. (75 refs., 24 figs., 3 tabs.)

  11. NEW THEORY IN TUNNEL STABILITY CONTROL OF SOFT ROCK: MECHANICS OF SOFT ROCK ENGINEERING

    Institute of Scientific and Technical Information of China (English)

    何满朝

    1996-01-01

    Tunnel stability control is a difficult problem worldwide. For the sake of solving it, the new theory of soft rock engineering mechanics has been established. Some key points, such as the definition and classification of soft rock, the mechanical deformation mechanism of a soft rock tunnel, the critical support technique of soft rock tunnels, and the new theory of soft rock tunnel stability control, are proposed in this paper.

  12. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through the iCollaboratory, www.icollaboratory.org, an online learning environment where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
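
    The computation the students perform is simple enough to state in a few lines. The sketch below, with the classic Alexandria/Syene figures filled in as assumed example values, follows Eratosthenes' reasoning: with parallel sun rays, the difference in local-noon shadow angles at two sites on the same meridian equals the arc of the Earth between them.

        # Eratosthenes' method: the shadow-angle difference between two
        # sites is the slice of the full 360 degrees that separates them.
        def circumference_km(angle1_deg, angle2_deg, distance_km):
            arc = abs(angle1_deg - angle2_deg)
            return 360.0 / arc * distance_km

        # ~7.2 degree difference over ~800 km (assumed example values)
        print(round(circumference_km(7.2, 0.0, 800)))   # -> 40000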

  13. Big Crunch-based omnidirectional light concentrators

    International Nuclear Information System (INIS)

    Omnidirectional light concentration remains an unsolved problem despite such important practical applications as the design of efficient mobile photovoltaic cells. Recently developed optical black hole designs offer partial solutions to this problem. However, even these solutions are not truly omnidirectional since they do not exhibit a horizon, and at large enough incidence angles the light may be trapped into quasi-stationary orbits around such imperfect optical black holes. Here, we propose and realize experimentally another gravity-inspired design of a broadband omnidirectional light concentrator based on the cosmological Big Crunch solutions. By mimicking the Big Crunch spacetime via a corresponding effective optical metric, we make sure that every photon world line terminates in a single point. (paper)

  14. Big Crunch-based omnidirectional light concentrators

    CERN Document Server

    Smolyaninov, Igor I

    2014-01-01

    Omnidirectional light concentration remains an unsolved problem despite such important practical applications as the design of efficient mobile photovoltaic cells. Optical black hole designs developed recently offer a partial solution to this problem. However, even these solutions are not truly omnidirectional since they do not exhibit a horizon, and at large enough incidence angles light may be trapped into quasi-stationary orbits around such imperfect optical black holes. Here we propose and realize experimentally another gravity-inspired design of a broadband omnidirectional light concentrator based on the cosmological Big Crunch solutions. By mimicking the Big Crunch spacetime via a corresponding effective optical metric, we make sure that every photon world line terminates in a single point.

  15. Big Bang Nucleosynthesis Calculation

    CERN Document Server

    Kurki-Suonio, H

    2001-01-01

    I review standard big bang nucleosynthesis (BBN) and some versions of nonstandard BBN. The abundances of the primordial isotopes D, He-3, and Li-7 produced in standard BBN can be calculated as a function of the baryon density with an accuracy of about 10%. For He-4 the accuracy is better than 1%. The calculated abundances agree fairly well with observations, but the baryon density of the universe cannot be determined with high precision. Possibilities for nonstandard BBN include inhomogeneous and antimatter BBN and nonzero neutrino chemical potentials.

  16. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  17. Big and little OER

    OpenAIRE

    Weller, Martin

    2010-01-01

    Much of the attention around OERs has been on institutional projects which make explicit learning content available. These can be classified as ‘big OER’, but another form of OER is that of small scale, individually produced resources using web 2.0 type services, which are classified as ‘little OER’. This paper examines some of the differences between the use of these two types of OER to highlight issues in open education. These include attitudes towards reputation, the intentionality of the ...

  18. Big Red Telephone, Gone

    Institute of Scientific and Technical Information of China (English)

    Toni Piech

    2006-01-01

    The Chinese big red telephones looked exactly as I imagined the ones servicing the direct emergency line between the Kremlin and the White House during the Cold War era would have looked. But here in China, every kiosk seemed to have such a device in the 1990s, and anyone could use it for just 0.2 yuan. The government did not just install public phones on street corners; instead, they let small-business owners participate in telecommunication. Supply and demand were juggled by a kind of Hutong capitalism.

  19. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems, and others all use piles of data which are further used for creating reports in order to ensure continuity of the services they offer. The processing behind the results that these entities request represents a challenge for software developers and for the companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  20. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today, truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  1. Bigness in compatible systems

    OpenAIRE

    Snowden, Andrew; Wiles, Andrew

    2009-01-01

    Clozel, Harris and Taylor have recently proved a modularity lifting theorem of the following general form: if rho is an l-adic representation of the absolute Galois group of a number field for which the residual representation rho-bar comes from a modular form then so does rho. This theorem has numerous hypotheses; a crucial one is that the image of rho-bar must be "big," a technical condition on subgroups of GL(n). In this paper we investigate this condition in compatible systems. Our main r...

  2. Big Data as a Source for Official Statistics

    Directory of Open Access Journals (Sweden)

    Daas Piet J.H.

    2015-06-01

    Full Text Available More and more data are being produced by an increasing number of electronic devices physically surrounding us and on the internet. The large amount of data and the high frequency at which they are produced have resulted in the introduction of the term ‘Big Data’. Because these data reflect many different aspects of our daily lives and because of their abundance and availability, Big Data sources are very interesting from an official statistics point of view. This article discusses the exploration of both opportunities and challenges for official statistics associated with the application of Big Data. Experiences gained with analyses of large amounts of Dutch traffic loop detection records and Dutch social media messages are described to illustrate the topics characteristic of the statistical analysis and use of Big Data.

  3. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, 3He, 4He, and 7Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and 4He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, 7Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed
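
    One common way to propagate cross-section uncertainties into abundance predictions is a Monte Carlo over log-linear rate sensitivities. The sketch below, in Python, is a toy version of that procedure; the central value, sensitivity coefficients, and rate uncertainties are illustrative placeholders, not the evaluated values used in the literature:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy log-linear model: Y = Y0 * prod_k (R_k / R_k0)^alpha_k, where the
    # R_k are reaction rates and the alpha_k are sensitivity coefficients.
    # All numbers below are illustrative placeholders only.
    Y0 = 2.5e-5                               # central D/H prediction
    alphas = np.array([0.8, -0.55, -0.45])    # sensitivities to three rates
    sigmas = np.array([0.05, 0.08, 0.10])     # fractional 1-sigma rate errors

    n = 100_000
    # Sample each rate as a lognormal scale factor around its central value.
    scales = rng.lognormal(mean=0.0, sigma=sigmas, size=(n, len(sigmas)))
    Y = Y0 * np.prod(scales ** alphas, axis=1)

    print(f"D/H = {Y.mean():.3e} +/- {Y.std():.3e} "
          f"({100 * Y.std() / Y.mean():.1f}% relative error)")
    ```

    The same machinery, run with evaluated sensitivities for each isotope, is how quoted abundance uncertainties of the kind discussed above are obtained.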

  4. Institute for Rock Magnetism established

    Science.gov (United States)

    Banerjee, Subir K.

    There is a new focal point for cooperative research in advanced rock magnetism. The University of Minnesota in Minneapolis has established an Institute for Rock Magnetism (IRM) that will provide free access to modern equipment and encourage visiting fellows to focus on important topics in rock magnetism and related interdisciplinary research. Funding for the first three years has been secured from the National Science Foundation, the W.M. Keck Foundation, and the University of Minnesota. In the fall of 1986, the Geomagnetism and Paleomagnetism (GP) section of the AGU held a workshop at Asilomar, Calif., to pinpoint important and emerging research areas in paleomagnetism and rock magnetism, and the means by which to achieve them. In a report of this workshop published by the AGU in September 1987, two urgent needs were set forth. The first was for interdisciplinary research involving rock magnetism and mineralogy, petrology, sedimentology, and the like. The second need was to ease the access of rock magnetists and paleomagnetists around the country to the latest equipment in modern magnetics technology, such as magneto-optics or electron optics. Three years after the publication of the report, we announced the opening of these facilities at the GP section of the AGU Fall 1990 Meeting. A classified advertisement inviting applications for visiting fellowships was published in the January 22, 1991, issue of Eos.

  5. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang, with a big public inauguration event. Poster and programme.

  6. IZVEDBENI ELEMENTI U BIG BROTHERU

    OpenAIRE

    Radman, Korana

    2009-01-01

    Big Brother offers its audience an "ultimate reality" ensured by round-the-clock surveillance television cameras, something that has been debated ever since it first aired in Europe and around the world. With this in mind, this paper approaches Big Brother from the perspective of performance studies, attempting to recognize in it some possible performances.

  7. The Big Read: Case Studies

    Science.gov (United States)

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  8. Resources: Building Big (and Small)

    OpenAIRE

    Kelley, Todd R.

    2007-01-01

    The article offers a set of videos and web resources for elementary teachers to help them explore five different structures that have been built big to meet human needs and wants: bridges, domes, skyscrapers, dams, and tunnels. It includes the miniseries video "Building Big" by David Macaulay and the website www.pbs.org/buildingbig.com.

  9. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility of using the dark matter mass and its interaction cross section as a smoking-gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model-independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new venue for achieving the observed relic abundance, in which a significantly smaller amount of dark matter--compared to the standard cosmology--is produced and survives until today, diluted only by the cosmic expansion since the radiation-dominated era. Once the dark matter mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  10. A matrix big bang

    International Nuclear Information System (INIS)

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control
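
    For orientation, the background being studied can be written compactly in string-frame light-cone coordinates, where the metric is flat and only the dilaton depends on light-cone time (a sketch in standard conventions; the normalization of Q varies between references):

    ```latex
    % Light-like linear dilaton background (string frame), sketch
    ds^2_{\text{string}} = -2\,dx^+\,dx^- + \sum_{i=1}^{8} dx_i^2,
    \qquad
    \phi = -Q\,x^+, \quad Q > 0 .
    ```

    The string coupling g_s = e^phi then blows up as x^+ -> -infinity and vanishes as x^+ -> +infinity; rescaling to the Einstein frame is what turns this perfectly flat string-frame metric into the geodesically incomplete geometry described above.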

  11. A Matrix Big Bang

    CERN Document Server

    Craps, B; Verlinde, E; Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  12. Big nuclear accidents

    International Nuclear Information System (INIS)

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  13. DPF Big One

    International Nuclear Information System (INIS)

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallel sessions sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  14. Mining “Big Data” using Big Data Services

    OpenAIRE

    2014-01-01

    While many colleagues within science are fed up with the "big data fad", empirical analyses we conducted for the current editorial actually show an inconsistent picture: we use big data services to determine whether there really is an increase in writing about big data or even widespread use of the term. Google Correlate (http://www.google.com/trends/correlate/), the first free tool we are presenting here, doesn't list the term, showing that the number of searches for it is below an absolute min...

  15. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  16. Exploring Data in Human Resources Big Data

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2016-01-01

    Full Text Available Nowadays, social networks and informatics technologies and infrastructures are constantly developing and affecting each other. In this context, the HR recruitment process has become complex, and many multinational organizations have encountered selection issues. The objective of the paper is to develop a prototype system for assisting the selection of candidates for intelligent management of human resources. Such a system can be a starting point for the efficient organization of semi-structured and unstructured data on recruitment activities. The article extends the research presented at the 14th International Conference on Informatics in Economy (IE 2015) in the scientific paper "Big Data challenges for human resources management".

  17. Dynamic experimental study on rock meso-cracks growth by digital image processing technique

    Institute of Scientific and Technical Information of China (English)

    朱珍德; 倪骁慧; 王伟; 李双蓓; 赵杰; 武沂泉

    2008-01-01

    A new meso-mechanical testing scheme based on SEM was developed to carry out experiments on the microfracturing process of rocks. Using this scheme, the microfracturing process of a pre-cracked marble sample from the surrounding rock of the submerged Long-big tunnel at the Jinping Cascade II Hydropower Station was recorded under uniaxial compression. According to stereology theory, the propagation and coalescence of cracks at the meso-scale were quantitatively investigated with digital technology. Basic geometric information about the rock microcracks, such as area, angle, length, width, and perimeter, was obtained from binary images after segmentation. The failure mechanism of the specimen under uniaxial compression was then studied from macroscopic and microscopic points of view using this quantitative information. The results show that the microfracturing process of the specimen can be observed and recorded digitally. As the specimen is damaged, the distribution of microcracks in the specimen remains exponentially distributed, with some microcracks concentrated in certain regions. Finally, the change law of the fractal dimension of local elements in the marble sample under different external load conditions is obtained by statistical calculation of the fractal dimension.
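
    The measurement pipeline described above (segment the SEM image, then read geometric properties off the binary image) maps directly onto standard image-processing primitives. A minimal sketch in Python with scikit-image; the file name, threshold choice, and noise cutoff are placeholders, and this is a generic illustration rather than the authors' actual code:

    ```python
    import numpy as np
    from skimage import io, measure
    from skimage.filters import threshold_otsu

    # Load a grayscale SEM image (placeholder file name).
    image = io.imread("sem_marble_specimen.png", as_gray=True)

    # Segment: cracks are darker than the matrix, so threshold and invert.
    binary = image < threshold_otsu(image)

    # Label connected components and extract per-crack geometry.
    labels = measure.label(binary)
    for region in measure.regionprops(labels):
        if region.area < 20:          # skip segmentation noise
            continue
        print(f"crack {region.label}: area={region.area}px, "
              f"length={region.major_axis_length:.1f}px, "
              f"width={region.minor_axis_length:.1f}px, "
              f"angle={np.degrees(region.orientation):.1f}deg, "
              f"perimeter={region.perimeter:.1f}px")
    ```

    Statistics such as the exponential length distribution or a box-counting fractal dimension can then be computed over the per-crack records this loop produces.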

  18. ATLAS: civil engineering Point 1

    CERN Multimedia

    2000-01-01

    The ATLAS experimental area is located at Point 1, just across from the main CERN entrance, in the commune of Meyrin. There, people are busy finishing the various infrastructures for ATLAS. Real underground video. Nice view from the surface down the pit to the cavern - all the big machines look very small. The film has original working sound.

  19. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  20. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  1. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  2. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    Full Text Available We are now in the Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from complex data which can't be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining of large datasets using distributed algorithms.

  3. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
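
    For readers who have not yet touched the API the book covers, the basic query round-trip looks like the following (a minimal sketch using the official google-cloud-bigquery Python client against a public sample dataset; the project name is hypothetical and credentials are assumed to be configured in the environment):

    ```python
    from google.cloud import bigquery

    # Credentials are picked up from the environment (for example via
    # GOOGLE_APPLICATION_CREDENTIALS); the project name is hypothetical.
    client = bigquery.Client(project="my-analytics-project")

    query = """
        SELECT word, SUM(word_count) AS total
        FROM `bigquery-public-data.samples.shakespeare`
        GROUP BY word
        ORDER BY total DESC
        LIMIT 5
    """

    # Runs the query as a job and blocks until the results are ready.
    for row in client.query(query).result():
        print(f"{row.word}: {row.total}")
    ```

    Everything else the book discusses - streaming ingestion, the REST API, cost control - layers on top of this job-based query model.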

  4. The challenges of big data

    Science.gov (United States)

    2016-01-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  5. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    Full Text Available The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current state of big data development, what it can do today, and what it may be able to do in the near future. The paper focuses on explaining, to non-technical readers and to technical specialists outside the database field, what big data basically is; it presents the three most important V's as well as the newer ones, the most important solutions used by companies like Google or Amazon, and some interesting perceptions based on this subject.

  6. A view on big data and its relation to Informetrics

    Institute of Scientific and Technical Information of China (English)

    Ronald; ROUSSEAU

    2012-01-01

    Purpose: Big data offer a huge challenge. Their very existence leads to the contradiction that the more data we have, the less accessible they become, as the particular piece of information one is searching for may be buried among terabytes of other data. In this contribution we discuss the origin of big data and point to three challenges that arise when big data do: data storage, data processing, and generating insights. Design/methodology/approach: Computer-related challenges can be expressed by the CAP theorem, which states that it is only possible to simultaneously provide any two of the three following properties in distributed applications: consistency (C), availability (A), and partition tolerance (P). As an aside we mention Amdahl's law and its application to scientific collaboration. We further discuss data mining in large databases and knowledge representation for handling the results of data mining exercises. We also offer a short informetric study of the field of big data, and point to the ethical dimension of the big data phenomenon. Findings: There still are serious problems to overcome before the field of big data can deliver on its promises. Implications and limitations: This contribution offers a personal view, focusing on the information science aspects, but much more can be said about software aspects. Originality/value: We express the hope that information scientists, including librarians, will be able to play their full role within the knowledge discovery, data mining, and big data communities, leading to exciting developments, the reduction of scientific bottlenecks, and really innovative applications.
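
    Amdahl's law, mentioned above in passing, bounds the speedup of any workload of which only a fraction p can be parallelized. A quick illustration in Python (the generic formula, not tied to data from the article):

    ```python
    def amdahl_speedup(p: float, n: int) -> float:
        """Maximum speedup when a fraction p of the work is parallelized
        across n workers; the remaining (1 - p) stays serial."""
        return 1.0 / ((1.0 - p) + p / n)

    # Even with 1,000 workers, a 5% serial fraction caps speedup near 20x.
    for n in (10, 100, 1000):
        print(f"n={n:5d}: speedup = {amdahl_speedup(0.95, n):.1f}x")
    ```

    The serial fraction, not the worker count, quickly becomes the binding constraint - which is exactly why big data processing emphasizes removing sequential bottlenecks rather than simply adding machines.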

  7. Detecting and understanding big events in big cities

    OpenAIRE

    Furletti, Barbara; Trasarti, Roberto; Gabrielli, Lorenzo; Smoreda, Zbigniew; Vanhoof, Maarten; Ziemlicki, Cezary

    2015-01-01

    Recent studies have shown the great potential of big data such as mobile phone location data to model human behavior. Big data allow to analyze people presence in a territory in a fast and effective way with respect to the classical surveys (diaries or questionnaires). One of the drawbacks of these collection systems is incompleteness of the users' traces; people are localized only when they are using their phones. In this work we define a data mining method for identifying people presence an...

  8. Quantum Fields in a Big Crunch/Big Bang Spacetime

    OpenAIRE

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the Big Crunch/Big Bang transition postulated in the ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it re-expands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interacti...

  9. Sailing through the big crunch-big bang transition

    OpenAIRE

    Bars, Itzhak; Steinhardt, Paul; Turok, Neil

    2013-01-01

    In a recent series of papers, we have shown that theories with scalar fields coupled to gravity (e.g., the standard model) can be lifted to a Weyl-invariant equivalent theory in which it is possible to unambiguously trace the classical cosmological evolution through the transition from big crunch to big bang. The key was identifying a sufficient number of finite, Weyl-invariant conserved quantities to uniquely match the fundamental cosmological degrees of freedom across the transition. In so ...

  10. Hey, big spender

    International Nuclear Information System (INIS)

    Business-to-business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in

  11. CERN’s Summer of Rock

    CERN Multimedia

    Katarina Anthony

    2015-01-01

    When a rock star visits CERN, they don’t just bring their entourage with them. Along for the ride are legions of fans across the world – many of whom may not be the typical CERN audience. In July alone, four big acts paid CERN a visit, sharing their experience with the world: Scorpions, The Script, Kings of Leon and Patti Smith.   @TheScript tweeted: #paleofestival we had the best time! Big love. #CERN (Image: Twitter).   It all started with the Scorpions, the classic rock band whose “Wind of Change” became an anthem in the early 1990s. On 19 July, the band braved the 35-degree heat to tour the CERN site on foot – visiting the Synchrocyclotron and the new Microcosm exhibition. The rockers were very enthusiastic about the research carried out at CERN, and talked about returning in the autumn during their next tour stop. The Scorpions visit Microcosm. Two days later, The Script rolled in. This Irish pop-rock band has been hittin...

  12. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV per dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  13. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  14. Big Lake Dam Inspection Report

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes an inspection of the Big Lake Dam that was done in September of 1983. The inspection did not reveal any conditions that constitute and...

  15. Le Big Bang en laboratoire

    CERN Multimedia

    Roy, Christelle

    2006-01-01

    Physicists have been dreaming of it for 30 years: thanks to huge particle accelerators, they have been able to observe matter as it was a few instants after the Big Bang (three different articles over 10 pages).

  16. Big Data Analytics in Healthcare

    OpenAIRE

    Ashwin Belle; Raghuram Thiagarajan; S. M. Reza Soroushmehr; Fatemeh Navidi; Daniel A Beard; Kayvan Najarian

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is sti...

  17. Big Data and Ambulatory Care

    OpenAIRE

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2014-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an ov...

  18. The role of big laboratories

    International Nuclear Information System (INIS)

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  19. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  20. Big bang theory

    International Nuclear Information System (INIS)

    The development of a new well stimulation technique that uses propellant technology was discussed. The enhanced oil production process involves igniting a cylinder of solid rocket propellant positioned across a perforated zone to disrupt clogging sand. The StimGun assembly is a hybrid method of perforating with simultaneous propellant stimulation. High pressure gas enters the perforated zone and breaks through damage around the tunnel creating fractures. Data recorders can also run with the StimGun to record downhole pressure and provide a data model for estimating rock properties and propellant penetration. The StimGun assembly was developed by Marathon Oil, Computalog Ltd., Owen Oil Tools, and HTH Technical Services Inc., and is available in Canada from Computalog. 1 fig

  1. Big bang nucleosynthesis: Present status

    Science.gov (United States)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density, yielding a 2σ upper limit on Nν from these data. In contrast with D/H and 4He, 7Li predictions continue to disagree with observations, perhaps pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.

  2. On The Big Bang Singularity in $k=0$ FLRW Cosmologies

    CERN Document Server

    Kohli, Ikjyot Singh

    2016-01-01

    In this brief paper, we consider the dynamics of a spatially flat FLRW spacetime with a positive cosmological constant and matter obeying a barotropic equation of state. By performing a change of variables on the Raychaudhuri equation, we are able to compactify the big bang singularity to a finite point. We then use Chetaev's instability theorem to prove that such a model is always past asymptotic to a big bang singularity assuming only the weak energy condition, which is more general than the strong energy condition used in the classical singularity theorems of cosmology.
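
    For reference, the equation being transformed is the Raychaudhuri equation specialized to a spatially flat FLRW spacetime with cosmological constant Λ and barotropic fluid p = wρ. In units 8πG = c = 1 it takes the standard form below (the paper's particular change of variables is not reproduced here):

    ```latex
    \dot{H} = -\frac{3}{2}\,(1+w)\left(H^{2} - \frac{\Lambda}{3}\right),
    \qquad
    H \equiv \frac{\dot{a}}{a} .
    ```

    Since H diverges at the big bang, a bounded change of variables (an arctangent-type substitution, for example) maps the singularity to a finite point of the new phase space, where instability results such as Chetaev's theorem can be brought to bear.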

  3. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    soil C in the partnership region, and to design a risk/cost effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed this quarter--a literature review/database to assess the soil carbon on rangelands, and the draft protocols, contracting options for soil carbon trading. To date, there has been little research on soil carbon on rangelands, and since rangeland constitutes a major land use in the Big Sky region, this is important in achieving a better understanding of terrestrial sinks. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO{sub 2} concentrations. Progress on other deliverables is noted in the PowerPoint presentations. A series of meetings held during the second quarter have laid the foundations for assessing the issues surrounding the implementation of a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. Finally, the education and outreach efforts have resulted in a comprehensive plan and process which serves as a guide for implementing the outreach activities under Phase I. While we are still working on the public website, we have made many presentations to stakeholders and policy makers, connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, we have laid plans for integration of our outreach efforts with the students, especially at the tribal colleges and at the universities involved in our partnership

  4. Pre-Big Bang, vacuum and noncyclic cosmologies

    CERN Document Server

    Gonzalez-Mestres, Luis

    2012-01-01

    WMAP and Planck open the way to unprecedented Big Bang phenomenology, potentially allowing tests of the standard Big Bang model as well as of less conventional approaches, including noncyclic pre-Big Bang cosmologies that would incorporate a new fundamental scale beyond the Planck scale and, possibly, new ultimate constituents of matter. Alternatives to standard physics can be considered from a cosmological point of view concerning vacuum structure, the nature of space-time, the origin and evolution of our Universe, the validity of quantum field theory and conventional symmetries, solutions to the cosmological constant problem, inflationary scenarios, dark matter and dark energy, the interpretation of string-like theories... Lorentz-like symmetries for the properties of matter (standard or superbradyonic) can then be naturally stable space-time configurations resulting from general cosmological scenarios that incorporate physics beyond the Planck scale and describe the formation and evolution of the present vacuum...

  5. Small Places, Big Stakes

    DEFF Research Database (Denmark)

    Garsten, Christina; Sörbom, Adrienne

    Ethnographic fieldwork in organizations – such as corporations, state agencies, and international organizations – often entails that the ethnographer has to rely to a large extent on meetings as the primary point of access. Oftentimes, this involves doing fieldwork in workshops, at ceremonies, an...

  6. Pore-scale analysis of electrical properties in thinly bedded rock using digital rock physics

    International Nuclear Information System (INIS)

    We investigated the electrical properties of laminated rock consisting of macro-porous layers and micro-porous layers using digital rock technology. Due to the bedding effect and anisotropy, the traditional Archie equations cannot adequately describe the electrical behavior of laminated rock. The RI-Sw curve of laminated rock is nonlinear and can be divided into two linear segments with different saturation exponents. Laminated sand-shale sequences and laminated sands of different porosity or grain size yield macroscopic electrical anisotropy. Numerical simulation and theoretical analysis lead to the conclusion that the electrical anisotropy coefficient of laminated rock is a strong function of water saturation; the function curve can be divided into three segments by turning points. The electrical behavior of laminated rock should therefore be considered in oil exploration and development. (paper)
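
    The quantities involved are easy to state concretely: in Archie's framework the resistivity index is RI = Rt/R0 = Sw^(-n), and a two-segment RI-Sw curve like the one reported above can be modeled by letting the saturation exponent n switch at a crossover saturation. A small Python sketch; the exponent values and crossover point are illustrative assumptions, not the paper's fitted values:

    ```python
    import numpy as np

    def resistivity_index(sw: np.ndarray, n_low: float = 1.6,
                          n_high: float = 2.4,
                          sw_cross: float = 0.5) -> np.ndarray:
        """Two-segment Archie resistivity index RI = Sw^-n.

        Above sw_cross the macro-porous layers dominate conduction (n_low);
        below it, conduction shifts to the micro-porous layers (n_high).
        Parameter values are illustrative placeholders only.
        """
        n = np.where(sw >= sw_cross, n_low, n_high)
        # Scale the steep segment so the two pieces meet at sw_cross.
        scale = np.where(sw >= sw_cross, 1.0, sw_cross ** (n_high - n_low))
        return scale * sw ** (-n)

    sw = np.linspace(0.2, 1.0, 5)
    for s, ri in zip(sw, resistivity_index(sw)):
        print(f"Sw={s:.2f}  RI={ri:.2f}")
    ```

    On a log-log plot the two segments appear as straight lines of slope -n_low and -n_high, which is exactly the piecewise-linear RI-Sw behavior the abstract describes.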

  7. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting which will showcase the architecture of the GIS framework and initial results for sources and sinks, discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  8. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting which will showcase the architecture of the GIS framework and initial results for sources and sinks, discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  9. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process. PMID:27068058

  10. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871

  11. The rock diet

    OpenAIRE

    Fordyce, Fiona; Johnson, Chris

    2002-01-01

    You may think there is little connection between rocks and our diet, indeed a serving of rocks may sound very unappetising! But rocks are a vital source of the essential elements and minerals we need to keep us healthy, such as calcium for healthy teeth and bones.

  12. My Pet Rock

    Science.gov (United States)

    Lark, Adam; Kramp, Robyne; Nurnberger-Haag, Julie

    2008-01-01

    Many teachers and students have experienced the classic pet rock experiment in conjunction with a geology unit. A teacher has students bring in a "pet" rock found outside of school, and the students run geologic tests on the rock. The tests include determining relative hardness using Mohs scale, checking for magnetization, and assessing luster.…

  13. Dual of Big-bang and Big-crunch

    OpenAIRE

    Bak, Dongsu

    2006-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are not singular at all, as the coupling goes to zero in the N=4 Super Yang-Mills theory. The cosmological sing...

  14. Turning big bang into big bounce: II. Quantum dynamics

    International Nuclear Information System (INIS)

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete, with equally spaced levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at the semiclassical level, which may be detectable in astrophysical and cosmological observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  15. Physical properties of rocks. Subvol. a

    International Nuclear Information System (INIS)

    The geophysical data of solid earth are compiled in the first two volumes of group V in the New Series of Landolt-Boernstein. V/1 contains a compilation of the physical properties of rocks while vol. V/2 treats physics of the earth as a whole. The present subvolume V/1a includes an introduction on the rocks of the earth, and data tables on the following properties of rocks and minerals: density, porosity, permeability, elasticity and inelasticity, thermal properties (thermal conductivity, specific heat, melting points, radioactive heat generation). (GSCH)

  16. CLOUD COMPUTING WITH BIG DATA: A REVIEW

    OpenAIRE

    Anjali; Er. Amandeep Kaur; Mrs. Shakshi

    2016-01-01

    Big data is a collection of huge quantities of data, and big data analytics is the process of examining such large amounts of data. Big data and cloud computing are hot issues in information technology, and handling big data is one of the main problems today. Researchers are focusing on how to handle huge amounts of data with cloud computing and how to secure big data in the cloud. To handle the big data problem, the Hadoop framework is used, in which data is fragmented and processed in parallel.
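
    The "fragment and process in parallel" idea that Hadoop implements can be shown in miniature without the framework itself. A toy MapReduce-style word count in plain Python (standard library only; an illustration of the pattern, not Hadoop code):

    ```python
    from collections import Counter
    from multiprocessing import Pool

    def map_count(chunk: str) -> Counter:
        """Map step: count words in one fragment of the input."""
        return Counter(chunk.split())

    def reduce_counts(counts: list[Counter]) -> Counter:
        """Reduce step: merge the per-fragment counts."""
        total = Counter()
        for c in counts:
            total += c
        return total

    if __name__ == "__main__":
        # Fragment the input data, process the fragments in parallel, merge.
        chunks = ["big data on big clusters",
                  "cloud computing handles big data",
                  "hadoop fragments data and runs in parallel"]
        with Pool() as pool:
            partial = pool.map(map_count, chunks)
        print(reduce_counts(partial).most_common(3))
    ```

    Hadoop applies the same map-then-reduce structure, but distributes the fragments across a cluster and replicates them for fault tolerance rather than across local processes.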

  17. Big Data Analytics in Healthcare

    Directory of Open Access Journals (Sweden)

    Ashwin Belle

    2015-01-01

    Full Text Available The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  18. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  19. Big Data: Astronomical or Genomical?

    Science.gov (United States)

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade. PMID:26151137

  20. Big Data: Astronomical or Genomical?

    Directory of Open Access Journals (Sweden)

    Zachary D Stephens

    2015-07-01

    Full Text Available Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  1. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  2. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new and, to many, unfamiliar term. The research is motivated by the scarcity of prior work on the topic, the complete absence of Finnish-language research, and the potentially essential role that social customer relationship management may play in companies' operations in the future. Studies of big data often concentrate on its technical side rather than on applicatio...

  3. AAPOR Report on Big Data

    OpenAIRE

    Task Force Members Include: Lilli Japec; Frauke Kreuter; Marcus Berg; Paul Biemer; Paul Decker; Cliff Lampe; Julia Lane; Cathy O'Neil; Abe Usher

    2015-01-01

    In recent years we have seen an increase in the amount of statistics in society describing different phenomena based on so-called Big Data. The term Big Data is used for a variety of data, as explained in the report, many of them characterized not just by their large volume, but also by their variety and velocity, the organic way in which they are created, and the new types of processes needed to analyze them and make inference from them. The change in the nature of the new types of data, thei...

  4. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  5. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of these information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions of the effective use of the opportunities: proper infrastructure, well defined regulatory environment with particular emphasis on data protection and privacy. These issues and the current actions for solution are also presented. PMID:26614539

  6. The BigBOSS Experiment

    OpenAIRE

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Prieto, C. Allende; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra...

  7. Release plan for Big Pete

    International Nuclear Information System (INIS)

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of "Big Pete," which was used in the removal of "Spacers" from the N-Reactor. Prior to performing surveys on the rear end portion of "Big Pete," it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  8. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

    During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed thi

  9. [Algorithms, machine intelligence, big data : general considerations].

    Science.gov (United States)

    Radermacher, F J

    2015-08-01

    We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, the performance and efficiency of elementary arithmetic operations increase a thousand-fold every 20 years, which amounts to doubling roughly every two years (2^10 = 1024 ≈ 1000). Although we have not reached the point where machines have become as "intelligent" as people in any singular sense, machines are becoming steadily better. The Internet of Things has again helped to massively increase the efficiency of machines. Big data and suitable analytics do the same. If we simply let these processes continue, our civilization may be endangered in many respects. If the "containment" of these processes succeeds in the context of reasonable political global governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point in time, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges. PMID:26141245

  10. Analyzing Big Data with Dynamic Quantum Clustering

    CERN Document Server

    Weinstein, M; Hume, A; Sciau, Ph; Shaked, G; Hofstetter, R; Persi, E; Mehta, A; Horn, D

    2013-01-01

    How does one search for a needle in a multi-dimensional haystack without knowing what a needle is and without knowing if there is one in the haystack? This kind of problem requires a paradigm shift - away from hypothesis driven searches of the data - towards a methodology that lets the data speak for itself. Dynamic Quantum Clustering (DQC) is such a methodology. DQC is a powerful visual method that works with big, high-dimensional data. It exploits variations of the density of the data (in feature space) and unearths subsets of the data that exhibit correlations among all the measured variables. The outcome of a DQC analysis is a movie that shows how and why sets of data-points are eventually classified as members of simple clusters or as members of - what we call - extended structures. This allows DQC to be successfully used in a non-conventional exploratory mode where one searches data for unexpected information without the need to model the data. We show how this works for big, complex, real-world dataset...
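
    The quantum-potential machinery behind DQC can be sketched in a few lines. The toy below is a minimal illustration of the underlying quantum-clustering idea (a Parzen-window wavefunction plus gradient descent on the quantum potential), not the authors' implementation; the kernel width sigma, the step size, and the two-blob test data are arbitrary assumptions of this sketch.

      # Toy quantum-clustering step in the spirit of DQC (illustrative sketch only).
      import numpy as np

      def quantum_potential(centers, x, sigma):
          # Parzen wavefunction: psi(x) = sum_i exp(-|x - x_i|^2 / (2 sigma^2))
          d2 = ((x - centers) ** 2).sum(axis=1)
          w = np.exp(-d2 / (2 * sigma ** 2))
          psi = w.sum()
          # For Gaussian kernels: laplacian(psi) = sum_i (d2/sigma^4 - dim/sigma^2) w_i
          dim = centers.shape[1]
          lap = ((d2 / sigma ** 4 - dim / sigma ** 2) * w).sum()
          return (sigma ** 2 / 2) * lap / psi  # quantum potential, up to a constant

      def descend(points, sigma=0.5, lr=0.05, n_steps=50, eps=1e-4):
          pts = points.copy()
          for _ in range(n_steps):
              grads = np.zeros_like(pts)
              for i, x in enumerate(pts):
                  for d in range(pts.shape[1]):  # central-difference gradient of V
                      e = np.zeros(pts.shape[1])
                      e[d] = eps
                      grads[i, d] = (quantum_potential(points, x + e, sigma)
                                     - quantum_potential(points, x - e, sigma)) / (2 * eps)
              pts -= lr * grads  # slide replicas toward minima of the potential
          return pts

      # Two well-separated blobs collapse toward two cluster centres:
      rng = np.random.default_rng(0)
      data = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
      print(descend(data).round(2))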

  11. Do Big Bottles Kickstart Infant Weight Issues?

    Science.gov (United States)

    Source: nih.gov/medlineplus/news/fullstory_159241.html. 2016 (HealthDay News) -- Feeding babies formula from a big bottle might put them at higher risk for ...

  12. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presents the basic concepts of Big Data and the new field to which it gave rise, Data Science. Within Data Science, the notion of dimensionality reduction of data is discussed and illustrated with examples.

  13. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is exciting but problematic with regard to causality, atheism and stereotypes about hunter-gatherers....

  14. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  15. YOUNG CITY,BIG PARTY

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The Shenzhen Universiade united the world's young people through sports with none of the usual hoopla: no fireworks, no grand performances by celebrities and superstars. The Shenzhen Summer Universiade lowered the curtain on a big party for youth and college students on August 23.

  16. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with a lot of data originating from social media sites and mobile communications, alongside data from business environments and institutions, led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, and to provide a real image of supply and demand, thereby generating market advantages. Thus, the companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also refers to graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  17. Characterizing and Subsetting Big Data Workloads

    OpenAIRE

    Jia, Zhen; Zhan, Jianfeng; Wang, Lei; Han, Rui; Mckee, Sally A.; Yang, Qiang; Luo, Chunjie; Li, Jingwei

    2014-01-01

    Big data benchmark suites must include a diversity of data and workloads to be useful in fairly evaluating big data systems and architectures. However, using truly comprehensive benchmarks poses great challenges for the architecture community. First, we need to thoroughly understand the behaviors of a variety of workloads. Second, our usual simulation-based research methods become prohibitively expensive for big data. As big data is an emerging field, more and more software stacks are being p...

  18. Big Graph Mining: Frameworks and Techniques

    OpenAIRE

    Aridhi, Sabeur; Nguifo, Engelbert Mephu

    2016-01-01

    Big graph mining is an important research area that has attracted considerable attention. It allows one to process, analyze, and extract meaningful information from large amounts of graph data. Big graph mining has been highly motivated not only by the tremendously increasing size of graphs but also by its huge number of applications. Such applications include bioinformatics, chemoinformatics and social networks. One of the most challenging tasks in big graph mining is pattern mining in big gra...

  19. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieve a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (kmax = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (kmax = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
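
    As a quick worked example of the quoted resolving power (the numbers are illustrative, using the record's own definition R = λ/Δλ): at λ = 500 nm and a mid-range R = 4000, the resolution element is Δλ = λ/R = 500 nm / 4000 = 0.125 nm.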

  20. Judging Big Deals: Challenges, Outcomes, and Advice

    Science.gov (United States)

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good…

  1. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  2. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  3. "Big Data" - Grosse Daten, viel Wissen?

    OpenAIRE

    Hothorn, Torsten

    2015-01-01

    For some years now, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  4. The BigBoss Experiment

    International Nuclear Information System (INIS)

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieve a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (kmax = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (kmax = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  5. Kasner solutions, climbing scalars and big-bang singularity

    International Nuclear Information System (INIS)

    We elaborate on a recently discovered phenomenon where a scalar field close to the big bang is forced by its dynamics to climb a steep potential. We analyze the phenomenon in more general terms by writing the leading order equations of motion near the singularity. We formulate the conditions for climbing to exist in the case of several scalars and after inclusion of higher-derivative corrections, and we apply our results to some models of moduli stabilization. We analyze an example with a steep stabilizing potential and notice again a related critical behavior: for a potential steepness above a critical value, going backwards towards the big bang, the scalar undergoes wilder oscillations, with the steep potential pushing it back at every passage and not allowing the scalar to escape to infinity. Whereas it was pointed out earlier that there are possible implications of the climbing phase for the CMB, we point out here another potential application, to the issue of initial conditions in inflation

  6. Global Fluctuation Spectra in Big Crunch/Big Bang String Vacua

    OpenAIRE

    Craps, Ben; Ovrut, Burt A.

    2003-01-01

    We study Big Crunch/Big Bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a Big Crunch and a Big Bang cosmology, as well as additional ``whisker'' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the Big Crunch fluctuation spectrum is altered while passing through the bounce...

  7. Hungry for Rocks

    Science.gov (United States)

    2004-01-01

    This image from the Mars Exploration Rover Spirit hazard identification camera shows the rover's perspective just before its first post-egress drive on Mars. On Sunday, the 15th martian day, or sol, of Spirit's journey, engineers drove Spirit approximately 3 meters (10 feet) toward its first rock target, a football-sized, mountain-shaped rock called Adirondack (not pictured). In the foreground of this image are 'Sashimi' and 'Sushi' - two rocks that scientists considered investigating first. Ultimately, these rocks were not chosen because their rough and dusty surfaces are ill-suited for grinding.

  8. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  9. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions for their strategies. It has been said and proved through case studies that "More data usually beats better algorithms". With this statement companies started to realize that they can choose to invest more in processing larger sets of data rather than investing in expensive algorithms. A large quantity of data is better used as a whole because of the correlations that emerge at larger scale, correlations that can never be found if the data is analyzed in separate sets or in a smaller set. A larger amount of data gives a better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  10. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  11. Web Science Big Wins: Information Big Bang & Fundamental Constants

    OpenAIRE

    Carr, Les

    2010-01-01

    We take for granted a Web that provides free and unrestricted information exchange, but the Web is under pressure to change in order to respond to issues of security, commerce, criminality, privacy. Web Science needs to explain how the Web impacts society and predict the outcomes of proposed changes to Web infrastructure on business and society. Using the analogy of the Big Bang, this presentation describes how the Web spread the conditions of its initial creation throughout the whole of soci...

  12. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  13. Augmented Borders:Big Data and the Ethics of Immigration Control

    OpenAIRE

    Ajana, Btihaj

    2015-01-01

    Purpose: The aim of this paper is to consider some of the issues in light of the application of Big Data in the domain of border security and immigration management. Investment in the technologies of borders and their securitisation continues to be a focal point for many governments across the globe. This paper is concerned with a particular example of such technologies, namely "Big Data" analytics. In the past two years, the technology of Big Data has gained a remarkable popularity within a...

  14. Nástroje pro Big Data Analytics

    OpenAIRE

    Miloš, Marek

    2013-01-01

    The thesis covers Big Data, a term for a specific kind of data analysis. It first defines the term Big Data and explains why it arose from the growing need for deeper data processing and analysis tools and methods. The thesis also covers some of the technical aspects of Big Data tools, focusing on Apache Hadoop in detail. The later chapters contain a Big Data market analysis and describe the biggest Big Data competitors and tools. The practical part of the thesis presents a way...

  15. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    Directory of Open Access Journals (Sweden)

    Jaseena K.U.

    2014-12-01

    Full Text Available Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This paper presents a literature review of big data mining and of its issues and challenges, with emphasis on the distinguishing features of Big Data. It also discusses some methods for dealing with big data.

  16. Macro mechanical parameters' size effect of surrounding rock of Shuibuya project's underground power station

    Institute of Scientific and Technical Information of China (English)

    GUO Zhi-hua; ZHOU Chuang-bing; ZHOU Huo-ming; SHENG Qian; LENG Xian-lun

    2005-01-01

    Scale effect is one of the important aspects of research on the macro-mechanical parameters of rock mass. From a new point of view, the size effect of the rock mass deformation modulus of the surrounding rock of the Shuibuya Project's underground power station was studied by means of laboratory and field rock mechanics tests, establishment of the E~Vp relation, classification of the engineering rock mass, numerical simulation tests, and back analysis based on the displacement monitoring results for the surrounding rock. It is shown that the scale effect of the rock mass deformation modulus of the surrounding rock of the Shuibuya Project's underground power station is obvious: the stabilized rock mass deformation modulus is 20% of that of the intact rock. Finally, the relation between the rock mass deformation modulus and the scale of research was established.

  17. Soft rocks in Argentina

    Institute of Scientific and Technical Information of China (English)

    Giambastiani; Mauricio

    2014-01-01

    Soft rocks are a still fairly unexplored chapter in rock mechanics. Within this category are clastic sedimentary rocks and pyroclastic volcanic rocks of low to moderate lithification (consolidation, cementation, newly formed minerals), chemical sedimentary rocks, and metamorphic rocks formed by minerals with Mohs hardness less than 3.5, such as limestone, gypsum, halite and sylvite among the former, and phyllites, graphitic schist, chloritic shale, talc, etc., among the latter. They also include any type of rock that has undergone alteration processes (hydrothermal or weathering). In Argentina the study of low-strength rocks has not received much attention despite their extensive outcrops in the Andes and their great impact on design criteria. Correlation of geomechanical properties (UCS, deformability) with physical indices (porosity, density, etc.) has shown promising results and deserves further study. There are many studies and engineering projects in Argentina in soft rock geological environments, some cited in the text (Chihuído dam, N. Kirchner dam, J. Cepernic Dam, etc.) and others such as the International Tunnel in the Province of Mendoza (Corredor Bioceánico), which will require valuable contributions from rock mechanics. The lack of consistency between some of the physical and mechanical parameters explored in studies in the country may be due to an insufficient amount of information and/or non-standardization of criteria for testing materials. It is understood that more and better academic and professional efforts at improving techniques will result in a better understanding of the geomechanics of weak rocks.

  18. Big data is not a monolith

    CERN Document Server

    Sugimoto, Cassidy R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  19. Big Numbers in String Theory

    CERN Document Server

    Schellekens, A N

    2016-01-01

    This paper contains some personal reflections on several computational contributions to what is now known as the "String Theory Landscape". It consists of two parts. The first part concerns the origin of big numbers, and especially the number $10^{1500}$ that appeared in work on the covariant lattice construction (with W. Lerche and D. Luest). This part contains some new results. I correct a huge but inconsequential error, discuss some more accurate estimates, and compare with the counting for free fermion constructions. In particular I prove that the latter only provide an exponentially small fraction of all even self-dual lattices for large lattice dimensions. The second part of the paper concerns dealing with big numbers, and contains some lessons learned from various vacuum scanning projects.

  20. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  1. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  2. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research. PMID:26844660

  3. Big Bounce in Dipole Cosmology

    OpenAIRE

    Battisti, Marco Valerio; Marciano, Antonino

    2010-01-01

    We derive the cosmological Big Bounce scenario from the dipole approximation of Loop Quantum Gravity. We show that a non-singular evolution takes place for any matter field and that, by considering a massless scalar field as a relational clock for the dynamics, the semi-classical properties of an initial state are preserved on the other side of the bounce. This model thus enhances the relation between Loop Quantum Cosmology and the full theory.

  4. BIG DATA IN BUSINESS ENVIRONMENT

    OpenAIRE

    Logica BANICA; Alina HAGIU

    2015-01-01

    In recent years, dealing with a lot of data originating from social media sites and mobile communications, alongside data from business environments and institutions, led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image ...

  5. BIG Data – A Review.

    OpenAIRE

    Anuradha Bhatia; Gaurav Vaswani

    2013-01-01

    As more data becomes available from an abundance of sources both within and outside the organization, firms are seeking to use those resources to increase innovation, retain customers, and increase operational efficiency. At the same time, organizations are challenged by their end users, who are demanding greater capability and integration to mine and analyze burgeoning new sources of information. Big Data provides opportunities for business users to ask questions they never were able to ask ...

  6. Big data processing with Hadoop

    OpenAIRE

    Wu, Shiqi

    2015-01-01

    Computing technology has changed the way we work, study, and live. Distributed data processing technology is one of the popular topics in the IT field. It provides a simple and centralized computing platform by reducing the cost of the hardware. The characteristics of distributed data processing technology have changed the whole industry. Hadoop, an open source project of the Apache foundation, is the most representative platform for distributed big data processing. The Hadoop distribu...

  7. Big Bang Nucleosynthesis: An Update

    OpenAIRE

    Olive, Keith A.; Scully, Sean T.

    1995-01-01

    The current status of big bang nucleosynthesis is reviewed with an emphasis on the comparison between the observational determination of the light element abundances of D, 3He, 4He and 7Li and the predictions from theory. In particular, we present new analyses for 4He and 7Li. Implications for physics beyond the standard model are also discussed. Limits on the effective number of neutrino flavors are also updated.

  8. Industrialization and the Big Push

    OpenAIRE

    1988-01-01

    This paper explores Rosenstein-Rodan's (1943) idea that simultaneous industrialization of many sectors of the economy can be profitable for all of them, even when no sector can break even industrializing alone. We analyze this idea in the context of an imperfectly competitive economy with aggregate demand spillovers, and interpret the big push into industrialization as a move from a bad to a good equilibrium. We show that for two equilibria to exist, it must be the case that an industrializi...

  9. Pragmatic Interaction between Big Powers

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    It is very difficult to summarize the relationship among big powers in 2004. Looking west, there existed a "small cold war," as some media named it, between Europe and Russia and between the United States and Russia; with regard to the "orange revolution" in Ukraine at the end of the year, a rival show was displayed between America, Europe and Russia. Looking east, a fresh scent seems to fill the air.

  10. Rock Cycle Roulette.

    Science.gov (United States)

    Schmidt, Stan M.; Palmer, Courtney

    2000-01-01

    Introduces an activity on the rock cycle. Sets up 11 stations representing the transitions of an earth material in the rock cycle. Builds a six-sided die for each station, and students move among the stations depending on the roll of the die. Evaluates students through discussion of several questions in the classroom. Provides instructional information for…

  11. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Bailey, S; Baltay, C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yang, X; Yeche, Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  12. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster. PMID:23074865

  13. Tipping Point

    Medline Plus

    A U.S. Consumer Product Safety Commission (CPSC) OnSafety video, "The Tipping Point," from the 60 Seconds of Safety series, on the tip-over dangers that televisions and unsecured furniture pose, including electrical, fall, and head-injury hazards.

  14. Possible triggering of solar activity to big earthquakes (Ms ≥ 8) in faults with near west-east strike in China

    Institute of Scientific and Technical Information of China (English)

    HAN; Yanben; GUO; Zengjian; WU; Jinbing; MA; Lihua

    2004-01-01

    This paper studies the relationship between solar activity and big earthquakes (Ms ≥ 8) that occurred in China and western Mongolia. It is discovered that the occurrence dates of most of the big earthquakes in and near faults with west-east strike are close to the maximum years of sunspot numbers, whereas the dates of some big earthquakes not located in such faults are not close to the maximum years. We consider that this is possibly because many magnetic storms appear in the maximum years of solar activity. The magnetic storms result in anomalies of the geomagnetic field and then produce eddy currents in the faults gestating earthquakes with near west-east strike. Perhaps the gestated big earthquakes occur more easily because the eddy currents heat the rocks in the faults and therefore decrease the shear resistance and static friction limit of the rocks.

  15. What's Next for Big Bang Nucleosynthesis?

    International Nuclear Information System (INIS)

    Big bang nucleosynthesis (BBN) plays an important role in the standard hot big bang cosmology. BBN theory is used to predict the primordial abundances of the lightest elements: hydrogen, helium and lithium. Comparison between the predicted and observationally determined light element abundances provides a general test of concordance and can be used to fix the baryon content in the universe. Measurements of the cosmic microwave background (CMB) anisotropies now supplant BBN as the premier baryometer, especially with the latest results from the WMAP satellite. With the WMAP baryon density, the test of concordance can be made even more precise. Any disagreement between theory predictions and observations requires careful discussion. Several possibilities exist to explain discrepancies: (1) observational systematics (either physical or technical) may not be properly treated in determining primordial light element abundances; (2) nuclear inputs that determine the BBN predictions may have unknown systematics or may be incomplete; and (3) physics beyond that included in the standard BBN scenario may need to be included in the theory calculation. Before we can be absolutely sure new physics is warranted, points (1) and (2) must be addressed and ruled out. All of these scenarios rely on experimental or observational data to make definitive statements of their applicability and range of validity, which currently is not at the level necessary to discern between these possibilities with high confidence. Thus, new light element abundance observations and nuclear experiments are needed to probe these further. Assuming concordance is established, one can use the light element observations to explore the evolution from their primordial values. This can provide useful information on stellar evolution, cosmic rays and other nuclear astrophysics. When combined with detailed models, BBN, the CMB anisotropy and nuclear astrophysics can provide us with information about the populations

  16. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    OpenAIRE

    Ms. Rashmi Singh; Dr. H. K. Verma

    2012-01-01

    The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method that relies on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, a Big Bang–Big Crunch algorithm has been used for the design of linear phase finite impulse response (FIR) filters. The fitness function is based on the mean squared error between the actual and the ideal filter response. This paper presents the plot of magnitude response ...
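
    A minimal sketch of the BB–BC loop applied to linear-phase FIR design may help fix ideas. Everything numeric below (filter length, cutoff, population size, shrink schedule) is an assumption of this illustration rather than a setting from the paper: each "Big Bang" scatters candidate half-coefficient vectors around the current centre, each "Big Crunch" contracts the population to a fitness-weighted centre of mass, and the fitness is the mean squared error between a candidate's amplitude response and an ideal lowpass response.

      # Big Bang-Big Crunch search for a length-21 linear-phase lowpass FIR filter.
      import numpy as np

      M = 10                                     # half-order; full length N = 2*M + 1
      w = np.linspace(0, np.pi, 256)             # dense frequency grid
      ideal = (w <= 0.4 * np.pi).astype(float)   # ideal lowpass response (assumed cutoff)

      def amplitude(half):
          # Symmetric (linear-phase) FIR: A(w) = h[M] + 2 * sum_k h[M-k] * cos(k*w)
          k = np.arange(1, M + 1)[:, None]
          return half[0] + 2 * (half[1:, None] * np.cos(k * w)).sum(axis=0)

      def fitness(half):
          return np.mean((amplitude(half) - ideal) ** 2)  # MSE vs ideal response

      rng = np.random.default_rng(1)
      center = rng.normal(0, 0.1, M + 1)
      for it in range(1, 201):
          # Big Bang: scatter candidates around the centre, shrinking with iteration
          pop = center + rng.normal(0, 1.0 / it, (50, M + 1))
          f = np.array([fitness(p) for p in pop])
          # Big Crunch: contract to a centre of mass weighted by inverse fitness
          wts = 1.0 / (f + 1e-12)
          center = (pop * wts[:, None]).sum(axis=0) / wts.sum()

      h = np.concatenate([center[::-1], center[1:]])  # mirror into the full symmetric filter
      print("final MSE:", fitness(center))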

  17. Military Simulation Big Data: Background, State of the Art, and Challenges

    Directory of Open Access Journals (Sweden)

    Xiao Song

    2015-01-01

    Full Text Available Big data technology has undergone rapid development and attained great success in the business field. Military simulation (MS is another application domain producing massive datasets created by high-resolution models and large-scale simulations. It is used to study complicated problems such as weapon systems acquisition, combat analysis, and military training. This paper firstly reviewed several large-scale military simulations producing big data (MS big data for a variety of usages and summarized the main characteristics of result data. Then we looked at the technical details involving the generation, collection, processing, and analysis of MS big data. Two frameworks were also surveyed to trace the development of the underlying software platform. Finally, we identified some key challenges and proposed a framework as a basis for future work. This framework considered both the simulation and big data management at the same time based on layered and service oriented architectures. The objective of this review is to help interested researchers learn the key points of MS big data and provide references for tackling the big data problem and performing further research.

  18. An Overview of Big Data Privacy Issues

    OpenAIRE

    Patrick Hung

    2013-01-01

    Big data is the term for a collection of large and complex datasets from different sources that are difficult to process using traditional data management and processing applications. In these datasets, some information must be kept secret from others. On the other hand, some information has to be released for informational or big data analytical services. The research challenge is how to protect private information in the context of big data. Privacy is described by the ability ...

  19. Social Big Data and Privacy Awareness

    OpenAIRE

    Sang, Lin

    2015-01-01

    With the rapid development of Big Data, data from online social networks have become a major part of it. Big data has made social networks data-oriented rather than social-oriented. Taking this into account, this dissertation presents a qualitative study of how the data-oriented social network affects its users' privacy management today. Within this dissertation, an overview of Big Data and privacy issues on the social network is presented as a background study. ...

  20. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  1. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  2. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  3. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  4. Quantization of Big Bang in crypto-Hermitian Heisenberg picture

    CERN Document Server

    Znojil, Miloslav

    2015-01-01

    A background-independent quantization of the Universe near its Big Bang singularity is considered using a drastically simplified toy model. Several conceptual issues are addressed. (1) The observable spatial-geometry characteristics of our empty-space expanding Universe are sampled by the time-dependent operator $Q=Q(t)$ of the distance between two space-attached observers ("Alice and Bob"). (2) For any pre-selected guess of the simple, non-covariant time-dependent observable $Q(t)$, one of Kato's exceptional points (viz., $t=\tau_{(EP)}$) is postulated real-valued. This enables us to treat it as the time of Big Bang. (3) During our "Eon" (i.e., at all $t>\tau_{(EP)}$) the observability status of operator $Q(t)$ is mathematically guaranteed by its self-adjoint nature with respect to an ad hoc Hilbert-space metric $\Theta(t)$ ...

  5. Simulating Smoke Filling in Big Halls by Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    W. K. Chow

    2011-01-01

    Full Text Available Many tall halls of big space volume have been built, and are to be built, in many construction projects in the Far East, particularly Mainland China, Hong Kong, and Taiwan. Smoke is identified as the key hazard to handle. Consequently, smoke exhaust systems are specified in the fire codes in those areas. An update on applying Computational Fluid Dynamics (CFD) in smoke exhaust design is presented in this paper. Key points to note in CFD simulations of smoke filling due to a fire in a big hall are discussed. Mathematical aspects concerning the discretization of the partial differential equations and the algorithms for solving the velocity-pressure linked equations are briefly outlined. Results predicted by CFD with different free boundary conditions are compared with those from room fire tests. Standards on grid size, relaxation factors, convergence criteria, and false diffusion should be set up for numerical experiments with CFD.
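
    The closing warning about grid size and false diffusion can be illustrated with a minimal sketch (not the paper's code; the scheme, grid sizes, and front position are illustrative): a first-order upwind discretization of 1D advection smears a sharp smoke front through numerical ("false") diffusion, and refining the grid reduces the smearing.

        import numpy as np

        def advect_upwind(n_cells, cfl=0.5, t_end=0.5):
            """Advect a sharp front with first-order upwind differencing.

            The truncation error of the scheme acts as numerical ("false")
            diffusion that smears the front; refining dx reduces the smearing.
            """
            u = 1.0                            # advection velocity
            dx = 1.0 / n_cells
            dt = cfl * dx / u                  # CFL-limited time step
            x = np.linspace(0.0, 1.0, n_cells)
            phi = np.where(x < 0.1, 1.0, 0.0)  # sharp "smoke front"
            t = 0.0
            while t < t_end:
                phi[1:] -= u * dt / dx * (phi[1:] - phi[:-1])  # upwind update
                t += dt
            return x, phi

        def front_width(x, phi, lo=0.1, hi=0.9):
            """Distance over which phi falls from `hi` to `lo` (smearing)."""
            return x[phi > lo][-1] - x[phi > hi][-1]

        for n in (50, 100, 200, 400):
            x, phi = advect_upwind(n)
            print(f"{n:4d} cells: front width = {front_width(x, phi):.3f}")

    The smeared width shrinks as the cell size is reduced (roughly with its square root for this first-order scheme), which is one reason grid-refinement standards of the kind the authors call for are needed before CFD smoke predictions can be trusted.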

  6. Fitting ERGMs on big networks.

    Science.gov (United States)

    An, Weihua

    2016-09-01

    The exponential random graph model (ERGM) has become a valuable tool for modeling social networks. In particular, ERGM provides great flexibility to account for both covariate effects on tie formation and endogenous network formation processes. However, there are both conceptual and computational issues in fitting ERGMs on big networks. This paper describes a framework and a series of methods (based on existing algorithms) to address these issues. It also outlines the advantages and disadvantages of the methods and the conditions under which they are most applicable. Selected methods are illustrated through examples. PMID:27480375
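
    One of the existing fast algorithms such a framework can build on is maximum pseudolikelihood estimation (MPLE), which reduces ERGM fitting to a logistic regression on dyad-level change statistics. A minimal sketch for an edges-plus-triangles model on a synthetic undirected graph follows (illustrative only; real analyses would use a dedicated ERGM package):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 30
        A = (rng.random((n, n)) < 0.1).astype(float)
        A = np.triu(A, 1)
        A = A + A.T                                 # synthetic undirected graph

        # Change statistics per dyad (i < j): adding the tie i-j adds one
        # edge and as many triangles as i and j have common neighbours.
        rows, cols = np.triu_indices(n, 1)
        y = A[rows, cols]                           # observed tie indicators
        common = (A @ A)[rows, cols]                # common-neighbour counts
        X = np.column_stack([np.ones_like(common), common])

        # MPLE = logistic regression of y on the change statistics,
        # fitted here by Newton-Raphson.
        theta = np.zeros(2)
        for _ in range(25):
            p = 1.0 / (1.0 + np.exp(-X @ theta))
            grad = X.T @ (y - p)
            hess = -(X.T * (p * (1 - p))) @ X
            theta -= np.linalg.solve(hess, grad)

        print("MPLE estimates (edges, triangles):", theta)

    MPLE scales to big networks but understates uncertainty when dyads are strongly dependent, which is exactly the kind of trade-off such a framework of methods must weigh.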

  7. Big deformation in 17C

    International Nuclear Information System (INIS)

    Reaction and interaction cross sections of 17C on a carbon target have been re-analyzed using the modified Glauber model. The analysis with a deformed Woods-Saxon density/potential suggests a big deformation structure for 17C. The existence of a tail in the density distribution supports the possibility of it being a one-neutron halo structure. Under a deformed core plus a single-particle assumption, analysis shows a dominant d-wave of the valence neutron in 17C. (authors)

  8. Big bang nucleosynthesis: An update

    International Nuclear Information System (INIS)

    An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, 4He, and 7Li is discussed and compared to their observational determination. While concordance for D and 4He is satisfactory, the prediction for 7Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed
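
    For orientation, the quantities behind this summary can be written out; the numbers below are the commonly quoted WMAP-era figures, added for context rather than taken from this record:

        \eta \equiv \frac{n_b}{n_\gamma} \simeq 6.1 \times 10^{-10}, \qquad
        \left(\mathrm{^7Li/H}\right)_{\rm BBN} \approx 5 \times 10^{-10}, \qquad
        \left(\mathrm{^7Li/H}\right)_{\rm obs} \approx (1.2\text{--}1.6) \times 10^{-10},

    a discrepancy of a factor of about four, the "lithium problem" referred to above.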

  9. Big Five -persoonallisuuspiirteiden yhteydet unettomuuteen

    OpenAIRE

    Aronen, Aino

    2015-01-01

    The aim of this study was to examine whether the Big Five personality traits (neuroticism, extraversion, conscientiousness, openness to experience, and agreeableness) are associated with symptoms of insomnia, namely difficulty falling asleep, night-time awakenings, difficulty staying asleep, and waking up tired after sleep of normal length. According to theories of insomnia, high neuroticism, low extraversion, low conscientiousness, and low agreeableness can...

  10. Pop & rock / Berk Vaher

    Index Scriptorium Estoniae

    Vaher, Berk, 1975-

    2001-01-01

    Brief introductions to the new albums Redman "Malpractice", Brian Eno & Peter Schwalm "Popstars", Clawfinger "A Whole Lot of Nothing", Dario G "In Full Color", and MLTR, i.e. Michael Learns To Rock, "Blue Night"

  11. Rock kinoekraanil / Katrin Rajasaare

    Index Scriptorium Estoniae

    Rajasaare, Katrin

    2008-01-01

    On the films portraying rock musicians screened during the film week "Rock On Screen" at the Sõprus cinema from 7 to 11 July: "Lou Reed's Berlin", "The Future Is Unwritten: Joe Strummer", "Control: Joy Division", "Hurriganes", "Shlaager"

  12. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  13. Big Data Analytics Using Cloud and Crowd

    OpenAIRE

    Allahbakhsh, Mohammad; Arbabi, Saeed; Motahari-Nezhad, Hamid-Reza; Benatallah, Boualem

    2016-01-01

    The increasing application of social and human-enabled systems in people's daily life on the one side, and the fast growth of mobile and smart phone technologies on the other, have resulted in the generation of tremendous amounts of data, also referred to as big data, and a need for analyzing these data, i.e., big data analytics. Recently a trend has emerged to incorporate human computing power into big data analytics to solve some shortcomings of existing big data analytics such as dealing with ...

  14. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data, by introducing some novel optimization algorithms and codes capable of working in the big data setting as well as some applications of big data optimization, for both interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  15. Big data: an introduction for librarians.

    Science.gov (United States)

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included. PMID:25023020

  16. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have a basic knowledge of R.

  17. Urgent Call for Nursing Big Data.

    Science.gov (United States)

    Delaney, Connie W

    2016-01-01

    The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to assure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative - Nursing Knowledge and Big Data Science includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference. PMID:27332330

  18. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang' an University School of Information Engineering, Xi' an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi' an (China)

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  19. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  20. CloudJet4BigData: Streamlining Big Data via an accelerated socket interface

    OpenAIRE

    Frank Z.Wang

    2014-01-01

    Big data needs to feed users with fresh processing results and cloud platforms can be used to speed up big data applications. This paper describes a new data communication protocol (CloudJet) for long distance and large volume big data accessing operations to alleviate the large latencies encountered in sharing big data resources in the clouds. It encapsulates a dynamic multi-stream/multi-path engine at the socket level, which conforms to Portable Operating System Interface (POSIX) and thereb...

  1. CloudJet4BigData: Streamlining Big Data via an Accelerated Socket Interface

    OpenAIRE

    Wang, Frank Zhigang; Dimitrakos, Theo; Helian, Na; Wu, Sining; Li, Ling; Yates, Rodric

    2014-01-01

    Big data needs to feed users with fresh processing results and cloud platforms can be used to speed up big data applications. This paper describes a new data communication protocol (CloudJet) for long distance and large volume big data accessing operations to alleviate the large latencies encountered in sharing big data resources in the clouds. It encapsulates a dynamic multi-stream/multi-path engine at the socket level, which conforms to Portable Operating System Interface (POSIX) and thereb...

  2. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... Register (73 FR 76677) on December 17, 2008. For more about the initial process and the history of this... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... comprehensive conservation plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife...

  3. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    A.G. Thalmayer; G. Saucier; A. Eigenhuis

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  4. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  5. Scaling Thomson scattering to big machines

    International Nuclear Information System (INIS)

    Thomson scattering is a widely used diagnostic tool for local measurement of both electron temperature and electron density. It is used for both low and high temperature plasmas and it is a key diagnostic on all fusion devices. The extremely low cross-section of the reaction increases the complexity of the design. Since the early days of fusion, when a simple single-point measurement was used, the design has moved to the multi-point systems with large numbers of spatial points, LIDAR systems, and high-repetition Thomson scattering diagnostics that are used nowadays. The initial low-electron-temperature approximation has been replaced by the full relativistic approach necessary for large devices as well as for ITER, with its expected higher plasma temperature. Along the way, the different development needs and the issues that exist need to be addressed to ensure that the technique is developed sufficiently to handle the challenges of the bigger devices of the future as well as the current developments needed for ITER. For large devices, achieving the necessary temperature range represents an important task. Both high and low temperatures can be measured; however, a large dynamic range makes the design difficult, as detector size and dynamic range are linked together. Therefore, the requirements of the new devices are extending the boundaries of these parameters. Namely, ITER presents challenges, as access is also difficult, but big efforts have been made to cope with this. This contribution contains a broad review of Thomson scattering diagnostics used in current devices together with comments on recent progress and speculation regarding future developments needed for future large-scale devices.

  6. Scaling Thomson scattering to big machines

    Science.gov (United States)

    Bílková, P.; Walsh, M.; Böhm, P.; Bassan, M.; Aftanas, M.; Pánek, R.

    2016-03-01

    Thomson scattering is a widely used diagnostic tool for local measurement of both electron temperature and electron density. It is used for both low and high temperature plasmas and it is a key diagnostic on all fusion devices. The extremely low cross-section of the reaction increases the complexity of the design. Since the early days of fusion, when a simple single-point measurement was used, the design has moved to the multi-point systems with large numbers of spatial points, LIDAR systems, and high-repetition Thomson scattering diagnostics that are used nowadays. The initial low-electron-temperature approximation has been replaced by the full relativistic approach necessary for large devices as well as for ITER, with its expected higher plasma temperature. Along the way, the different development needs and the issues that exist need to be addressed to ensure that the technique is developed sufficiently to handle the challenges of the bigger devices of the future as well as the current developments needed for ITER. For large devices, achieving the necessary temperature range represents an important task. Both high and low temperatures can be measured; however, a large dynamic range makes the design difficult, as detector size and dynamic range are linked together. Therefore, the requirements of the new devices are extending the boundaries of these parameters. Namely, ITER presents challenges, as access is also difficult, but big efforts have been made to cope with this. This contribution contains a broad review of Thomson scattering diagnostics used in current devices together with comments on recent progress and speculation regarding future developments needed for future large-scale devices.
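
    Both records rest on the same textbook relation, stated here for context in the non-relativistic limit (not quoted from the abstracts): light of wavelength \lambda_i scattered through angle \theta is Doppler-broadened by the electron thermal motion, with 1/e spectral half-width

        \Delta\lambda_{1/e} = \frac{2\,\lambda_i \sin(\theta/2)}{c}\,\sqrt{\frac{2 k_B T_e}{m_e}},

    so the width of the scattered spectrum yields the electron temperature T_e, while the total scattered power scales with the electron density n_e. At reactor-grade temperatures this Gaussian form is replaced by the full relativistic spectrum the abstracts mention.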

  7. Astronomical surveys and big data

    Science.gov (United States)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-rays, ROSAT, XMM and Chandra in X-rays, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  8. Environmental Consequences of Big Nasty Impacts on the Early Earth

    Science.gov (United States)

    Zahnle, Kevin

    2015-01-01

    The geological record of the Archean Earth is spattered with impact spherules from a dozen or so major cosmic collisions involving Earth and asteroids or comets (Lowe, Byerly 1986, 2015). Extrapolation of the documented deposits suggests that most of these impacts were as big or bigger than the Chicxulub event that famously ended the reign of the thunder lizards. As the Archean impacts were greater, the environmental effects were also greater. The number and magnitude of the impacts is bounded by the lunar record. There are no lunar craters bigger than Chicxulub that date to Earth's mid-to-late Archean. Chance dictates that Earth experienced no more than approximately 10 impacts bigger than Chicxulub between 3.5 and 2.5 billion years ago, the biggest of which were approximately 30-100 times more energetic, comparable to the Orientale impact on the Moon (1x10^26 joules). To quantify the thermal consequences of big impacts on the old Earth, we model the global flow of energy from the impact into the environment. The model presumes that a significant fraction of the impact energy goes into ejecta that interact with the atmosphere. Much of this energy is initially in rock vapor, melt, and high-speed particles. (i) The upper atmosphere is heated by ejecta as they reenter the atmosphere. The mix of hot air, rock vapor, and hot silicates cools by thermal radiation. Rock raindrops fall out as the upper atmosphere cools. (ii) The energy balance of the lower atmosphere is set by radiative exchange with the upper atmosphere and with the surface, and by evaporation of seawater. Subsequent cooling is governed by condensation of water vapor. (iii) The oceans are heated by thermal radiation and rock rain and cooled by evaporation. Surface waters become hot and salty; if a deep ocean remains, it is relatively cool. Subsequently, water vapor condenses to replenish the oceans with hot fresh water (how fresh depending on continental weathering, which might be rather rapid
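
    The scale of the ocean heating in step (iii) can be bounded with back-of-envelope arithmetic. The sketch below uses rounded textbook constants and an assumed fraction of impact energy reaching the ocean; it is illustrative only and is not the authors' model.

        # Rough thermal budget for an Orientale-scale Archean impact.
        E_IMPACT = 1e26      # J, energy quoted in the abstract
        F_OCEAN = 0.5        # assumed fraction of energy heating the ocean
        M_OCEAN = 1.4e21     # kg, mass of the modern ocean
        CP_WATER = 4.2e3     # J/(kg K), specific heat of liquid water
        L_VAP = 2.3e6        # J/kg, latent heat of vaporization

        e_ocean = F_OCEAN * E_IMPACT
        print(f"Mean ocean warming:    {e_ocean / (M_OCEAN * CP_WATER):.1f} K")
        print(f"Ocean fraction boiled: {e_ocean / (M_OCEAN * L_VAP):.1%}")

    Even with half the impact energy delivered to the sea, the mean warming is of order 10 K and only a percent or so of the ocean can be boiled off, consistent with the picture of hot, salty surface waters above a surviving cool deep ocean.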

  9. The geology and tectonic significance of the Big Creek Gneiss, Sierra Madre, southeastern Wyoming

    Science.gov (United States)

    Jones, Daniel S.

    The Big Creek Gneiss, southern Sierra Madre, southeastern Wyoming, is a heterogeneous suite of upper-amphibolite-facies metamorphic rocks intruded by post-metamorphic pegmatitic granite. The metamorphic rocks consist of three individual protolith suites: (1) pre- to syn-1780-Ma supracrustal rocks including clastic metasedimentary rocks, calc-silicate paragneiss, and metavolcanic rocks; (2) a bimodal intrusive suite composed of metagabbro and granodiorite-tonalite gneiss; and (3) a younger bimodal suite composed of garnet-bearing metagabbronorite and coarse-grained granitic gneiss. Zircon U-Pb ages from the Big Creek Gneiss demonstrate that: (1) the average age of detrital zircons in the supracrustal rocks is ~1805 Ma, requiring a significant source of 1805-Ma (or older) detritus during deposition, possibly representing an older phase of arc magmatism; (2) the older bimodal igneous suite crystallized at ~1780 Ma, correlative with arc-derived rocks of the Green Mountain Formation; (3) the younger bimodal igneous suite crystallized at ~1763 Ma, coeval with the extensional(?) Horse Creek anorthosite complex in the Laramie Mountains and the Sierra Madre Granite batholith in the southwestern Sierra Madre; (4) Big Creek Gneiss rocks were tectonically buried, metamorphosed, and partially melted at ~1750 Ma, coeval with the accretion of the Green Mountain arc to the Wyoming province along the Cheyenne belt; (5) the posttectonic granite and pegmatite bodies throughout the Big Creek Gneiss crystallized at ~1630 Ma and are correlative with the 'white quartz monzonite' of the south-central Sierra Madre. Geochemical analysis of the ~1780-Ma bimodal plutonic suite demonstrates a clear arc affinity for the mafic rocks, consistent with an origin in a subduction environment. The granodioritic rocks of this suite were not derived by fractional crystallization from coeval mafic magmas, but are instead interpreted as melts of lower-crustal mafic material. This combination of mantle

  10. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures" a

  11. Why Big Data Is a Big Deal (Ⅱ)

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    A new group of data mining technologies promises to change forever the way we sift through our vast stores of data, making it faster and cheaper. Some of the technologies are actively being used by people on the bleeding edge who need the technology now, like those involved in creating Web-based services that are driven by social media. They're also heavily contributing to these projects. In other vertical industries, businesses are realizing that much more of their value proposition is information-based than they had previously thought, which will allow big data technologies to gain traction quickly, Olofson says. Couple that with affordable hardware and software, and enterprises find themselves in a perfect storm of business transformation opportunities.

  12. Rock displacements measured during URL shaft sinking

    International Nuclear Information System (INIS)

    During sinking of the Canadian Underground Research Laboratory (URL) shaft, borehole extensometers were used to obtain rock displacement measurements and a tape extensometer was used to measure total convergences. The instruments, instrument modifications, and methods used are described. The measurements are summarized and assessed, with particular emphasis on the influence of natural fractures on rock-mass response and the performance of the instrumentation. Displacements varied from 0.09 mm to 1.75 mm. The frequency of sub-vertical fractures in the rock appeared to be the main factor causing the variation in the measured displacements. Although the displacement instrumentation met certain operational requirements well, lack of precision was a problem. Displacement instrumentation used in future URL experiments should have more measuring points, greater sensitivity, and greater accuracy to better measure small displacements

  13. The ethics of Big data: analytical survey

    OpenAIRE

    GIBER L.; KAZANTSEV N.

    2015-01-01

    The number of recent publications on the ethical challenges of implementing Big Data signifies the growing interest in all aspects of this issue. The proposed study specifically aims at analyzing ethical issues connected with Big Data.

  14. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  15. The Big Sleep in the Woods

    Institute of Scientific and Technical Information of China (English)

    王玉峰

    2002-01-01

    Now it's the time of the big sleep for the bees and the bears. Even the buds of the plants whose leaves fall off share in it. But the intensity of this winter sleep, or hibernation, depends on who's doing it. The big sleep of the bears, for instance, would probably be thought of as a

  16. Big Science and Long-tail Science

    CERN Multimedia

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the Atlas detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  17. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  18. Big Red: A Development Environment for Bigraphs

    DEFF Research Database (Denmark)

    Faithfull, Alexander John; Perrone, Gian David; Hildebrandt, Thomas

    2013-01-01

    We present Big Red, a visual editor for bigraphs and bigraphical reactive systems, based upon Eclipse. The editor integrates with several existing bigraph tools to permit simulation and model-checking of bigraphical models. We give a brief introduction to the bigraphs formalism, and show how these concepts manifest within the tool using a small motivating example bigraphical model developed in Big Red.

  19. Hom-Big Brackets: Theory and Applications

    OpenAIRE

    Cai, Liqiang; Sheng, Yunhe

    2015-01-01

    In this paper, we introduce the notion of hom-big brackets, which is a generalization of Kosmann-Schwarzbach's big brackets. We show that it gives rise to a graded hom-Lie algebra. Thus, it is a useful tool to study hom-structures. In particular, we use it to describe hom-Lie bialgebras and hom-Nijenhuis operators.

  20. Big system: Interactive graphics for the engineer

    Science.gov (United States)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device-independent code can be developed to assure maximum graphic terminal transferability.

  1. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined. PMID:9728415
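
    The minimax screening step can be made concrete: regress each candidate cluster score on the Big Five marker scores to get its multiple correlation, and compute an internal-consistency reliability such as Cronbach's alpha; clusters with low R and high alpha are the "beyond the Big Five" candidates. The sketch below uses synthetic data and illustrative names, not the study's variables.

        import numpy as np

        def multiple_r(cluster, big_five):
            """Multiple correlation of a cluster score with Big Five scores."""
            X = np.column_stack([np.ones(len(big_five)), big_five])
            beta, *_ = np.linalg.lstsq(X, cluster, rcond=None)
            return np.corrcoef(X @ beta, cluster)[0, 1]

        def cronbach_alpha(items):
            """Internal-consistency reliability (rows = persons, cols = items)."""
            k = items.shape[1]
            return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                                  / items.sum(axis=1).var(ddof=1))

        rng = np.random.default_rng(1)
        n = 500
        big_five = rng.normal(size=(n, 5))                          # marker factor scores
        items = rng.normal(size=(n, 4)) + rng.normal(size=(n, 1))   # one shared factor
        cluster = items.mean(axis=1)

        print(f"multiple R with Big Five: {multiple_r(cluster, big_five):.2f}")
        print(f"Cronbach's alpha:         {cronbach_alpha(items):.2f}")

    A cluster passing the screen in this synthetic setup shows alpha near 0.8 with R near zero, roughly the pattern the abstract describes for clusters such as Height or Religiousness.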

  2. Kansen voor Big data – WPA Vertrouwen

    NARCIS (Netherlands)

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for

  3. Big Food, Food Systems, and Global Health

    OpenAIRE

    Stuckler, David; Nestle, Marion

    2012-01-01

    In an article that forms part of the PLoS Medicine series on Big Food, guest editors David Stuckler and Marion Nestle lay out why more examination of the food industry is necessary, and offer three competing views on how public health professionals might engage with Big Food.

  4. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non-singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era

  5. Tipping Point

    Medline Plus

    Full Text Available Tipping Point, by CPSC Blogger, September 22. ... A TV falls with about the same force as a child falling from the third story of a building. ...

  6. Tipping Point

    Medline Plus

    Full Text Available Tipping Point, by CPSC Blogger, September 22. ... see news reports about horrible accidents involving young children and furniture, appliance, and TV tip-overs. The ...

  7. Turning Point

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Moves from the United States and North Korea give new impetus to nuclear disablement and U.S.-North Korea ties. The tense situation surrounding denuclearization on the Korean Peninsula has reached a turning point. On...

  8. BIG Data – A Review.

    Directory of Open Access Journals (Sweden)

    Anuradha Bhatia

    2013-08-01

    Full Text Available As more data becomes available from an abundance of sources both within and outside the organization, organizations are seeking to use those abundant resources to increase innovation, retain customers, and increase operational efficiency. At the same time, organizations are challenged by their end users, who are demanding greater capability and integration to mine and analyze burgeoning new sources of information. Big Data provides opportunities for business users to ask questions they were never able to ask before. How can a financial organization find better ways to detect fraud? How can an insurance company gain a deeper insight into its customers to see who may be the least economical to insure? How does a software company find its most at-risk customers, those who are about to deploy a competitive product? They need to integrate Big Data techniques with their current enterprise data to gain that competitive advantage. Heterogeneity, scale, timeliness, complexity, and privacy problems with Big Data impede progress at all phases of the pipeline that can create value from data. The problems start right away during data acquisition, when the data tsunami requires us to make decisions, currently in an ad hoc manner, about what data to keep and what to discard, and how to store what we keep reliably with the right metadata. Much data today is not natively in structured format; for example, tweets and blogs are weakly structured pieces of text, while images and video are structured for storage and display, but not for semantic content and search: transforming such content into a structured format for later analysis is a major challenge. The value of data explodes when it can be linked with other data; thus data integration is a major creator of value. Since most data is directly generated in digital format today, we have the opportunity and the challenge both to influence the creation to facilitate later linkage and to automatically link previously created data

  9. Rock magnetic properties

    International Nuclear Information System (INIS)

    In 1978 the Nuclear Fuel Waste Management Program began the long task of site selection and evaluation for nuclear waste disposal. The Canadian Nuclear Fuel Waste Management Program, administered by Atomic Energy of Canada Limited Research Company, has provided the geophysicist with the unique opportunity to evaluate many modes of geophysical investigation in conjunction with detailed geologic mapping at a number of research areas. Of particular interest is research area RA-7, East Bull Lake, Algoma District, Ontario. Geophysical survey methods applied to the study of this area included detailed gravity, ground magnetics, VLF, an airborne magnetic gradiometer survey, and an airborne helicopter magnetic and EM survey. A comprehensive suite of rock property studies was also undertaken, providing information on rock densities and magnetic rock properties. Preliminary modeling of the magnetic data sets assuming only induced magnetization illustrated the difficulty of arriving at a magnetic source geometry consistent with the mapped surficial and borehole geology. Integration of the magnetic rock property observations and industry-standard magnetic modelling techniques provides a source model geometry that is consistent with other geophysical/geological data sets, e.g. gravity and observed geology. The genesis of individual magnetic signatures in the East Bull Lake gabbro-anorthosite records the intrusion, metamorphism and fracture alteration of the pluton. As shown by this paper, only by understanding the rock magnetic signatures associated with each of these events is it possible to obtain geologically meaningful interpretative models

  10. Groundwater in granitic rocks

    International Nuclear Information System (INIS)

    A comparison of published chemical analyses of ground waters found in granitic rocks from a variety of locations shows that their compositions fall into two distinct classes. Ground waters from shallow wells and springs have a high bicarbonate/chloride ratio resulting from the neutralization of carbonic acid (dissolved CO2) by weathering reactions. The sodium, potassium, and silica released by weathering reactions drive the solutions away from equilibrium with the dominant minerals in the granites (i.e., quartz, muscovite, potassium feldspar, and albite). On the other hand, ground waters from deep wells and excavations are rich in chloride relative to bicarbonate. Their Na, K, H, and silica activities indicate that they are nearly equilibrated with the granite minerals suggesting a very long residence time in the host rock. These observations furnish the basis for a powerful tool to aid in selecting sites for radioactive waste disposal in granitic rocks. When water-bearing fractures are encountered in these rocks, a chemical analysis of the solutions contained within the fracture can determine whether the water came from the surface, i.e., is bicarbonate rich and not equilibrated, or whether it is some sort of connate water that has resided in the rock for a long period, i.e., chloride rich and equilibrated. This technique should allow immediate recognition of fracture systems in granitic radioactive waste repositories that would allow radionuclides to escape to the surface
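
    A minimal sketch of the screening rule described above follows (the numeric cutoff is an assumption for illustration; the paper argues from equilibrium chemistry rather than from a single fixed ratio):

        def classify_fracture_water(hco3_mg_l, cl_mg_l, ratio_cutoff=1.0):
            """Crude screen: bicarbonate-rich water suggests recent surface
            recharge; chloride-rich, rock-equilibrated water suggests long
            residence time (the safer signature for a repository site)."""
            ratio = hco3_mg_l / cl_mg_l
            if ratio > ratio_cutoff:
                return f"HCO3/Cl = {ratio:.2f}: bicarbonate-rich, likely surface origin"
            return f"HCO3/Cl = {ratio:.2f}: chloride-rich, likely long residence time"

        print(classify_fracture_water(hco3_mg_l=250.0, cl_mg_l=15.0))
        print(classify_fracture_water(hco3_mg_l=30.0, cl_mg_l=900.0))

    In practice the ratio would be read together with the Na, K, H, and silica activities mentioned above, not used alone.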

  11. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting-edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  12. Fracturing tests on reservoir rocks: Analysis of AE events and radial strain evolution

    CERN Document Server

    Pradhan, S; Fjær, E; Stenebråten, J; Lund, H K; Sønstebø, E F; Roy, S

    2015-01-01

    Fracturing in reservoir rocks is an important issue for the petroleum industry, as productivity can be enhanced by a controlled fracturing operation. Fracturing also has a big impact on CO2 storage, geothermal installations, and gas production at and from reservoir rocks. Therefore, understanding the fracturing behavior of different types of reservoir rocks is a basic need for planning field operations towards these activities. In our study, the fracturing of a rock sample is monitored by Acoustic Emission (AE) and post-experiment Computer Tomography (CT) scans. The fracturing experiments have been performed on hollow cylinder cores of different rocks - sandstones and chalks. Our analysis shows that the amplitudes and energies of acoustic events clearly indicate the initiation and propagation of the main fractures. The amplitudes of AE events follow an exponential distribution while the energies follow a power law distribution. The time evolution of the radial strain measured in the fracturing test will later be comp...
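
    Both distributional claims correspond to closed-form maximum-likelihood fits: the exponential rate is the reciprocal of the mean amplitude, and the power-law exponent above a threshold follows from the Hill estimator. The synthetic catalogues below stand in for real AE data.

        import numpy as np

        rng = np.random.default_rng(42)
        amplitudes = rng.exponential(scale=2.0, size=5000)      # stand-in AE amplitudes
        energies = 0.1 * (1 - rng.random(5000)) ** (-1 / 1.5)   # Pareto, exponent 2.5

        # Exponential fit: the MLE of the rate is 1 / mean.
        lam = 1.0 / amplitudes.mean()
        print(f"exponential rate lambda  = {lam:.2f} (true 0.50)")

        # Power-law fit above x_min: Hill/MLE estimator,
        # alpha = 1 + n / sum(log(x / x_min)).
        x_min = 0.1
        x = energies[energies >= x_min]
        alpha = 1.0 + x.size / np.log(x / x_min).sum()
        print(f"power-law exponent alpha = {alpha:.2f} (true 2.50)")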

  13. BEBC, the Big European Bubble Chamber

    CERN Multimedia

    1971-01-01

    The vessel of the Big European Bubble Chamber, BEBC, was installed at the beginning of the 1970s. The large stainless-steel vessel, measuring 3.7 metres in diameter and 4 metres in height, was filled with 35 cubic metres of liquid (hydrogen, deuterium or a neon-hydrogen mixture), whose sensitivity was regulated by means of a huge piston weighing 2 tonnes. During each expansion, the trajectories of the charged particles were marked by a trail of bubbles, where liquid reached boiling point as they passed through it. The first images were recorded in 1973 when BEBC, equipped with the largest superconducting magnet in service at the time, first received beam from the PS. In 1977, the bubble chamber was exposed to neutrino and hadron beams at higher energies of up to 450 GeV after the SPS came into operation. By the end of its active life in 1984, BEBC had delivered a total of 6.3 million photographs to 22 experiments devoted to neutrino or hadron physics. Around 600 scientists from some fifty laboratories through...

  14. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud-based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  15. Evidence of the Big Fix

    CERN Document Server

    Hamada, Yuta; Kawana, Kiyoharu

    2014-01-01

    We give evidence of the Big Fix. The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by experimental data, and we show that it is indeed true for the Higgs vacuum expectation value $v_{h}$. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary $v_{h}$. It turns out that the existence of atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; however, in our case it is required by the fundamental law.

  16. Big data ja yrityksen markkinointi

    OpenAIRE

    Perolainen, Pekka

    2014-01-01

    The aim of this thesis was to study the use of big data in a company's sales work and marketing. Companies can use data collected from their own or external sources to make their operations more efficient. A company's own data consists mainly of transaction data, loyalty-card data, logistics data, or sensor data. Camera recordings are also part of the data collected by companies; under the legislation, such data counts as personal register data. Companies can collect, process, and com...

  17. Spinoffs of big nuclear projects

    International Nuclear Information System (INIS)

    Spinoffs have so far been discussed only in connection with space travel. The question is well worth investigating whether big nuclear projects, such as the advanced reactor lines or the nuclear fuel cycle, also produce technical spinoffs. One misunderstanding should be cleared up right at the beginning: man did not travel to the moon to invent the teflon-coated frying pan. Nor is nuclear spinoff the actual purpose of the exercise. The high temperature reactor and the fast breeder reactor, or the closing of the nuclear fuel cycle, are justified independent goals of energy policy. However, if the overall benefit to the national economy of nuclear high technology is to be evaluated, the question of technical spinoff must also be considered. (orig.)

  18. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.
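
    The dipole method relies on a standard relation, reproduced here for context (textbook form, not quoted from the paper). Motion at velocity v through the background makes the observed temperature of a blackbody direction-dependent,

        T(\theta) \simeq T_0\left(1 + \frac{v}{c}\cos\theta\right), \qquad
        \Delta I_\nu(\theta) = T_0\,\frac{v}{c}\cos\theta\,
        \left.\frac{\partial B_\nu(T)}{\partial T}\right|_{T_0},

    so for a true Planck spectrum the dipole amplitude follows \partial B_\nu/\partial T exactly; any measured departure from that frequency dependence signals a distorted monopole spectrum, which is what makes the dipole a probe of the absolute intensity.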

  19. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing, scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became the unit of petascale data processing on the Grid. Splitting a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...
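
    The retry mechanism described here can be sketched in a few lines; the job function and failure model below are placeholders, not ATLAS production code.

        import random

        def run_with_retries(job, max_tries=3):
            """Re-run a job on transient failure, as TCP/IP re-sends a
            dropped packet; a task of many jobs succeeds only if every
            job eventually succeeds."""
            for attempt in range(1, max_tries + 1):
                try:
                    return job()
                except RuntimeError as err:   # stand-in for a transient Grid error
                    print(f"attempt {attempt} failed: {err}")
            raise RuntimeError(f"job failed {max_tries} times")

        def flaky_job():
            if random.random() < 0.3:         # 30% transient failure rate
                raise RuntimeError("worker node lost")
            return "output.root"

        results = [run_with_retries(flaky_job) for _ in range(5)]
        print(results)

    With a 30% transient failure rate, three tries drop the per-job failure probability to about 2.7%, illustrating how bounded re-tries push a many-job task toward the "Six Sigma" reliability the abstract refers to.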

  20. Big Book of Apple Hacks

    CERN Document Server

    Seibold, Chris

    2008-01-01

    Bigger in size, longer in length, broader in scope, and even more useful than our original Mac OS X Hacks, the new Big Book of Apple Hacks offers a grab bag of tips, tricks and hacks to get the most out of Mac OS X Leopard, as well as the new line of iPods, iPhone, and Apple TV. With 125 entirely new hacks presented in step-by-step fashion, this practical book is for serious Apple computer and gadget users who really want to take control of these systems. Many of the hacks take you under the hood and show you how to tweak system preferences, alter or add keyboard shortcuts, mount drives and

  1. Big Bang nucleosynthesis in crisis?

    International Nuclear Information System (INIS)

    A new evaluation of the constraint on the number of light neutrino species (Nν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial 4He abundance has been underestimated by 0.014±0.004 (1σ) or less than 10% (95% C.L.) of 3He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is Nν=2.1±0.3 (1σ) and the upper limit is Nν&lt;2.6 (95% C.L.), so that the standard model (Nν=3) is excluded at the 98.6% C.L. copyright 1995 The American Physical Society

  2. Exploring Relationships in Big Data

    Science.gov (United States)

    Mahabal, A.; Djorgovski, S. G.; Crichton, D. J.; Cinquini, L.; Kelly, S.; Colbert, M. A.; Kincaid, H.

    2015-12-01

    Big Data are characterized by several different 'V's: Volume, Veracity, Volatility, Value, and so on. For many datasets, Volumes inflated by redundant features often make the data noisier and Value harder to extract. This is especially true if one is comparing or combining different datasets and the metadata are diverse. We have been exploring ways to exploit such datasets through a variety of statistical machinery and visualization. We show how we have applied these methods to time series from large astronomical sky surveys. This was done in the Virtual Observatory framework. More recently we have been doing similar work for a completely different domain, viz. biology/cancer. The methodology reuse involves application to diverse datasets gathered through the various centers associated with the Early Detection Research Network (EDRN) for cancer, an initiative of the National Cancer Institute (NCI). Application to Geo datasets is a natural extension.

  3. NETIMIS: Dynamic Simulation of Health Economics Outcomes Using Big Data.

    Science.gov (United States)

    Johnson, Owen A; Hall, Peter S; Hulme, Claire

    2016-02-01

    Many healthcare organizations are now making good use of electronic health record (EHR) systems to record clinical information about their patients and the details of their healthcare. Electronic data in EHRs is generated by people engaged in complex processes within complex environments, and their human input, albeit shaped by computer systems, is compromised by many human factors. These data are potentially valuable to health economists and outcomes researchers but are sufficiently large and complex to be considered part of the new frontier of 'big data'. This paper describes emerging methods that draw together data mining, process modelling, activity-based costing and dynamic simulation models. Our research infrastructure includes safe links to Leeds hospital's EHRs with 3 million secondary and tertiary care patients. We created a multidisciplinary team of health economists, clinical specialists, and data and computer scientists, and developed a dynamic simulation tool called NETIMIS (Network Tools for Intervention Modelling with Intelligent Simulation; http://www.netimis.com) suitable for visualization of both human-designed and data-mined processes, which can then be used for 'what-if' analysis by stakeholders interested in costing, designing and evaluating healthcare interventions. We present two examples of model development to illustrate how dynamic simulation can be informed by big data from an EHR. We found the tool provided a focal point for multidisciplinary teamwork, helping the team iteratively and collaboratively 'deep dive' into big data. PMID:26879667
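
    The kind of 'what-if' simulation described can be caricatured in a few lines of code. The pathway, probabilities, and unit costs below are invented for illustration and have no connection to the Leeds data or to NETIMIS internals.

        import random

        # Toy care pathway mined from event logs (all numbers invented).
        PATHWAY = {
            "triage":  [("discharge", 0.3), ("scan", 0.7)],
            "scan":    [("discharge", 0.6), ("surgery", 0.4)],
            "surgery": [("discharge", 1.0)],
        }
        COSTS = {"triage": 50, "scan": 200, "surgery": 3000, "discharge": 0}

        def simulate_patient():
            state, cost = "triage", 0
            while state != "discharge":
                cost += COSTS[state]
                r, acc = random.random(), 0.0
                for nxt, p in PATHWAY[state]:
                    acc += p
                    if r < acc:
                        state = nxt
                        break
            return cost

        costs = [simulate_patient() for _ in range(10_000)]
        print(f"mean cost per patient: {sum(costs) / len(costs):.0f}")

    A 'what-if' analysis then amounts to editing a probability or cost (say, a cheaper scan that triages more patients away from surgery) and re-running the simulation to compare mean costs.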

  4. Constraining Big Bang lithium production with recent solar neutrino data

    CERN Document Server

    Takács, Marcell P; Szücs, Tamás; Zuber, Kai

    2015-01-01

    The 3He({\\alpha},{\\gamma})7Be reaction affects not only the production of 7Li in Big Bang nucleosynthesis, but also the fluxes of 7Be and 8B neutrinos from the Sun. This double role is exploited here to constrain the former by the latter. A number of recent experiments on 3He({\\alpha},{\\gamma})7Be provide precise cross section data at E = 0.5-1.0 MeV center-of-mass energy. However, there is a scarcity of precise data at Big Bang energies, 0.1-0.5 MeV, and below. This problem can be alleviated, based on precisely calibrated 7Be and 8B neutrino fluxes from the Sun that are now available, assuming the neutrino flavour oscillation framework to be correct. These fluxes and the standard solar model are used here to determine the 3He(alpha,gamma)7Be astrophysical S-factor at the solar Gamow peak, S(23+6-5 keV) = 0.548+/-0.054 keVb. This new data point is then included in a re-evaluation of the 3He({\\alpha},{\\gamma})7Be S-factor at Big Bang energies, following an approach recently developed for this reaction in the c...

  5. Three dimensional simulation for Big Hill Strategic Petroleum Reserve (SPR).

    Energy Technology Data Exchange (ETDEWEB)

    Ehgartner, Brian L. (Sandia National Laboratories, Albuquerque, NM); Park, Byoung Yoon; Sobolik, Steven Ronald (Sandia National Laboratories, Albuquerque, NM); Lee, Moo Yul (Sandia National Laboratories, Albuquerque, NM)

    2005-07-01

    3-D finite element analyses were performed to evaluate the structural integrity of caverns located at the Strategic Petroleum Reserve's Big Hill site. State-of-the-art analyses simulated the current site configuration and considered additional caverns. The addition of 5 caverns, to account for a full site, and a full dome containing 31 caverns were modeled. Operations including both normal and cavern workover pressures and cavern enlargement due to leaching were modeled to account for as many as 5 future oil drawdowns. Under the modeled conditions, caverns were placed very close to the edge of the salt dome. The web of salt separating the caverns and the web of salt between the caverns and the edge of the salt dome were reduced due to leaching. The impacts on cavern stability, underground creep closure, surface subsidence and infrastructure, and well integrity were quantified. The analyses included a recently derived damage criterion obtained from testing of Big Hill salt cores. The results show that, from a structural point of view, many additional caverns can be safely added to Big Hill.

  6. Digital carbonate rock physics

    Science.gov (United States)

    Saenger, Erik H.; Vialle, Stephanie; Lebedev, Maxim; Uribe, David; Osorno, Maria; Duda, Mandy; Steeb, Holger

    2016-08-01

    Modern estimation of rock properties combines imaging with advanced numerical simulations, an approach known as digital rock physics (DRP). In this paper we suggest a specific segmentation procedure of X-ray micro-computed tomography data with two different resolutions in the µm range for two sets of carbonate rock samples. These carbonates were already characterized in detail in a previous laboratory study which we complement with nanoindentation experiments (for local elastic properties). In a first step a non-local mean filter is applied to the raw image data. We then apply different thresholds to identify pores and solid phases. Because of a non-neglectable amount of unresolved microporosity (micritic phase) we also define intermediate threshold values for distinct phases. Based on this segmentation we determine porosity-dependent values for effective P- and S-wave velocities as well as for the intrinsic permeability. For effective velocities we confirm an observed two-phase trend reported in another study using a different carbonate data set. As an upscaling approach we use this two-phase trend as an effective medium approach to estimate the porosity-dependent elastic properties of the micritic phase for the low-resolution images. The porosity measured in the laboratory is then used to predict the effective rock properties from the observed trends for a comparison with experimental data. The two-phase trend can be regarded as an upper bound for elastic properties; the use of the two-phase trend for low-resolution images led to a good estimate for a lower bound of effective elastic properties. Anisotropy is observed for some of the considered subvolumes, but seems to be insignificant for the analysed rocks at the DRP scale. Because of the complexity of carbonates we suggest using DRP as a complementary tool for rock characterization in addition to classical experimental methods.
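
    The first two steps of this workflow map onto standard image-processing primitives. A minimal 2D sketch with scikit-image follows; the synthetic image, the filter parameters, and the single Otsu cut (in place of the paper's several thresholds for the micritic phase) are illustrative assumptions.

        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.restoration import denoise_nl_means

        # Synthetic stand-in for a micro-CT slice: dark pores, bright matrix.
        rng = np.random.default_rng(7)
        image = np.full((256, 256), 0.8)
        rr, cc = np.ogrid[:256, :256]
        for _ in range(40):                   # punch random circular "pores"
            r, c = rng.integers(20, 236, size=2)
            image[(rr - r) ** 2 + (cc - c) ** 2 < 8 ** 2] = 0.2
        image += rng.normal(0.0, 0.05, image.shape)   # acquisition noise

        # Step 1: non-local means filter applied to the raw image data.
        smooth = denoise_nl_means(image, patch_size=5, patch_distance=6, h=0.08)

        # Step 2: threshold into pore / solid phases.
        pores = smooth < threshold_otsu(smooth)
        print(f"segmented porosity: {pores.mean():.1%}")

    In the real workflow the segmented volume, not a single slice, would then feed the elastic and permeability solvers that produce the porosity trends discussed above.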

  7. Rock Hellsinki, Marketing Research

    OpenAIRE

    Todd, Roosa; Jalkanen, Katariina

    2013-01-01

    This paper is a qualitative study of rock and heavy metal music tourism in the capital city of Finland, Helsinki. As Helsinki can be considered a city of contrasts, a quiet nature city mixed with urban activities, it is important to also use the potential of loud rock and heavy metal music to contrast the silence. Finland is known abroad for bands such as HIM, Nightwish, Korpiklaani and Children of Bodom, so it would make sense to utilize these in the tourism sector as well. The...

  8. Session: Hard Rock Penetration

    Energy Technology Data Exchange (ETDEWEB)

    Tennyson, George P. Jr.; Dunn, James C.; Drumheller, Douglas S.; Glowka, David A.; Lysne, Peter

    1992-01-01

    This session at the Geothermal Energy Program Review X: Geothermal Energy and the Utility Market consisted of five presentations: "Hard Rock Penetration - Summary" by George P. Tennyson, Jr.; "Overview - Hard Rock Penetration" by James C. Dunn; "An Overview of Acoustic Telemetry" by Douglas S. Drumheller; "Lost Circulation Technology Development Status" by David A. Glowka; "Downhole Memory-Logging Tools" by Peter Lysne.

  9. Rock engineering applications, 1991

    International Nuclear Information System (INIS)

    This book demonstrates how to apply the theories and principles of rock engineering to actual engineering and construction tasks. It features insights on geology for mining and tunnelling applications. It is a practical resource that focuses on the latest technological innovations and examines up-to-date procedures used by engineers for coping with complex rock conditions. The authors also discuss questions related to underground space, from design approaches to underground housing and storage. And they cover the monitoring of storage caverns for liquid and gaseous products or toxic and radioactive wastes.

  10. Constraining big bang lithium production with recent solar neutrino data

    Science.gov (United States)

    Takács, Marcell P.; Bemmerer, Daniel; Szücs, Tamás; Zuber, Kai

    2015-06-01

    The 3He(α,γ)7Be reaction affects not only the production of 7Li in big bang nucleosynthesis, but also the fluxes of 7Be and 8B neutrinos from the Sun. This double role is exploited here to constrain the former by the latter. A number of recent experiments on 3He(α,γ)7Be provide precise cross section data at center-of-mass energies E = 0.5-1.0 MeV. However, there is a scarcity of precise data at big bang energies, 0.1-0.5 MeV, and below. This problem can be alleviated, based on the precisely calibrated 7Be and 8B neutrino fluxes from the Sun that are now available, assuming the neutrino flavor oscillation framework to be correct. These fluxes and the standard solar model are used here to determine the 3He(α,γ)7Be astrophysical S-factor at the solar Gamow peak, S34ν(23 +6/-5 keV) = 0.548 ± 0.054 keV b. This new data point is then included in a reevaluation of the 3He(α,γ)7Be S-factor at big bang energies, following an approach recently developed for this reaction in the context of solar fusion studies. The reevaluated S-factor curve is then used to redetermine the 3He(α,γ)7Be thermonuclear reaction rate at big bang energies. The predicted primordial lithium abundance is 7Li/H = 5.0 × 10^-10, far higher than the Spite plateau.
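
    For context, the solar Gamow-peak energy of about 23 keV quoted above follows from the standard textbook approximation; the check below uses standard solar-core values (not numbers taken from the paper itself) for the temperature and the 3He+4He reduced mass.

```latex
% Gamow peak for 3He(alpha,gamma)7Be at solar-core conditions.
% Textbook approximation with T_6 the temperature in 10^6 K and mu the
% reduced mass in amu; inputs are standard values, not from the paper.
\[
  E_0 \simeq 1.22\,\bigl(Z_1^2 Z_2^2\,\mu\,T_6^2\bigr)^{1/3}\ \mathrm{keV},
  \qquad
  Z_1 = Z_2 = 2,\quad
  \mu = \tfrac{3\cdot 4}{3+4} \approx 1.71,\quad
  T_6 \approx 15.7,
\]
\[
  E_0 \simeq 1.22\,(4 \cdot 4 \cdot 1.71 \cdot 15.7^2)^{1/3}
      \approx 23\ \mathrm{keV},
\]
% consistent with the S34(23 +6/-5 keV) evaluation point quoted above.
```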

  11. M theory model of a big crunch/big bang transition

    International Nuclear Information System (INIS)

    We consider a picture in which the transition from a big crunch to a big bang corresponds to the collision of two empty orbifold planes approaching each other at a constant nonrelativistic speed in a locally flat background space-time, a situation relevant to recently proposed cosmological models. We show that p-brane states which wind around the extra dimension propagate smoothly and unambiguously across the orbifold plane collision. In particular we calculate the quantum mechanical production of winding M2-branes extending from one orbifold to the other. We find that the resulting density is finite and that the resulting gravitational backreaction is small. These winding states, which include the string theory graviton, can be propagated smoothly across the transition using a perturbative expansion in the membrane tension, an expansion which from the point of view of string theory is an expansion in inverse powers of α'. The conventional description of a crunch based on Einstein general relativity, involving Kasner or mixmaster behavior is misleading, we argue, because general relativity is only the leading order approximation to string theory in an expansion in positive powers of α'. In contrast, in the M theory setup we argue that interactions should be well behaved because of the smooth evolution of the fields combined with the fact that the string coupling tends to zero at the crunch. The production of massive Kaluza-Klein states should also be exponentially suppressed for small collision speeds. We contrast this good behavior with that found in previous studies of strings in Lorentzian orbifolds

  12. Interaction between Injection Points during Hydraulic Fracturing

    OpenAIRE

    Hals, Kjetil M. D.; Berre, Inga

    2012-01-01

    We present a model of the hydraulic fracturing of heterogeneous poroelastic media. The formalism is an effective continuum model that captures the coupled dynamics of the fluid pressure and the fractured rock matrix and models both the tensile and shear failure of the rock. As an application of the formalism, we study the geomechanical stress interaction between two injection points during hydraulic fracturing (hydrofracking) and how this interaction influences the fracturing process. For inj...

  13. Rocking and Rolling Rattlebacks

    Science.gov (United States)

    Cross, Rod

    2013-01-01

    A rattleback is a well-known physics toy that has a preferred direction of rotation. If it is spun about a vertical axis in the "wrong" direction, it will slow down, start rocking from end to end, and then spin in the opposite (i.e. preferred) direction. Many articles have been written about rattlebacks. Some are highly mathematical and…

  14. Stanford Rock Physics database

    Energy Technology Data Exchange (ETDEWEB)

    Nolen-Hoeksema, R. (Stanford Univ., CA (United States)); Hart, C. (Envision Systems, Inc., Fremont, CA (United States))

    The authors have developed a relational database for the Stanford Rock Physics (SRP) Laboratory. The database is a flexible tool for helping researchers find relevant data. It significantly speeds retrieval of data and facilitates new organizations of rock physics information to get answers to research questions. The motivation for a database was to have a computer data storage, search, and display capability to explore the sensitivity of acoustic velocities to changes in the properties and states of rocks. Benefits include data exchange among researchers, discovery of new relations in existing data, and identification of new areas of research. The authors' goal was to build a database flexible enough for the dynamic and multidisciplinary research environment of rock physics. Databases are based on data models. A flexible data model must: (1) Not impose strong, prior constraints on the data; (2) not require a steep learning curve of the database architecture; and (3) be easy to modify. The authors' choice of the relational data model reflects these considerations. The database and some hardware and software considerations were influenced by their choice of data model, and their desire to provide a user-friendly interface for the database and build a distributed database system.
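
    As a toy illustration of the relational approach described above, the snippet below defines a pair of linked tables for rock samples and velocity measurements and runs the kind of ad hoc query the authors had in mind. The schema and column names are invented for the example and do not reflect the SRP database's actual design.

```python
# Toy relational schema in the spirit of the SRP database described
# above; table and column names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sample (
    sample_id    INTEGER PRIMARY KEY,
    lithology    TEXT NOT NULL,       -- e.g. 'sandstone'
    porosity     REAL                 -- fractional porosity
);
CREATE TABLE velocity (
    sample_id    INTEGER REFERENCES sample(sample_id),
    pressure_mpa REAL,                -- effective pressure
    vp_km_s      REAL,                -- P-wave velocity
    vs_km_s      REAL                 -- S-wave velocity
);
""")
conn.execute("INSERT INTO sample VALUES (1, 'sandstone', 0.18)")
conn.execute("INSERT INTO velocity VALUES (1, 20.0, 3.9, 2.4)")

# The payoff of the relational model: research questions become queries,
# e.g. how P-wave velocity relates to porosity across samples.
for row in conn.execute("""
    SELECT s.lithology, s.porosity, v.vp_km_s
    FROM sample s JOIN velocity v USING (sample_id)
    WHERE v.pressure_mpa = 20.0"""):
    print(row)
```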

  15. Rock solid energy solution

    International Nuclear Information System (INIS)

    Scientists believe naturally radioactive rocks below the earth's surface could provide an inexhaustible and environmentally friendly power source. And Australia could be a geological hotbed should the concept get off the ground. Despite the scale, the concept itself is simple. The Earth's reserves of heat in naturally radioactive rocks could provide an effectively inexhaustible and environmentally friendly source of power: no greenhouse gas emissions, little water usage and minimal pollution. Natural hot springs are already used to make power in some parts of the world, such as Iceland, but creating artificial hot springs by drilling deep into granite - the hardest of rocks - is a much more ambitious concept. One cubic kilometre of hot granite at 250 deg C has the stored energy equivalent of 40 million barrels of oil. In a nutshell, water is pumped into the hot zone - some 3 km to 5 km down in Australian conditions - and spreads through a 'reservoir' of hot, cracked rocks. Once superheated, it returns to the surface as steam through a separate production well to spin turbines and generate electricity. The water can then be recaptured and reused, with test sites around the world recovering up to around 90 per cent.
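
    A back-of-envelope check shows the "40 million barrels per cubic kilometre" figure is the right order of magnitude. The material constants and the usable temperature drop below are assumed textbook values, not numbers from the article.

```python
# Back-of-envelope check of the "one cubic kilometre of hot granite ~
# 40 million barrels of oil" claim; constants and the usable temperature
# drop are assumed textbook values, not from the article.
VOLUME_M3 = 1e9        # one cubic kilometre
DENSITY   = 2650.0     # granite density, kg/m^3
HEAT_CAP  = 790.0      # granite specific heat, J/(kg K)
DELTA_T   = 150.0      # usable temperature drop, K (assumption)
BARREL_J  = 6.1e9      # energy content of a barrel of oil, ~6.1 GJ

heat_j = VOLUME_M3 * DENSITY * HEAT_CAP * DELTA_T
print(f"stored heat: {heat_j:.2e} J = "
      f"{heat_j / BARREL_J / 1e6:.0f} million barrels of oil equivalent")
# -> roughly 50 million barrels, the same order as the article's figure
```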

  16. Umhlanga Rocks coastal defense

    NARCIS (Netherlands)

    De Jong, L.; De Jong, B.; Ivanova, M.; Gerritse, A.; Rietberg, D.; Dorrepaal, S.

    2014-01-01

    The eThekwini coastline is a vulnerable coastline subject to chronic erosion and damage due to sea level rise. In 2007 a severe storm caused major physical and economic damage along the coastline, proving the need for action. Umhlanga Rocks is a densely populated premium holiday destination on the e

  17. Rock-hard coatings

    NARCIS (Netherlands)

    Muller, M.

    2007-01-01

    Aircraft jet engines have to be able to withstand infernal conditions. Extreme heat and bitter cold tax coatings to the limit. Materials expert Dr Ir. Wim Sloof fits atoms together to develop rock-hard coatings. The latest invention in this field is known as ceramic matrix composites. Sloof has sign

  18. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost - the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in the context, and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware and engaged in the advocacy and governance of large ecological projects. PMID:25680334

  19. Big data and the electronic health record.

    Science.gov (United States)

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization. PMID:24887521

  20. BLENDING IOT AND BIG DATA ANALYTICS

    OpenAIRE

    Tulasi.B*; Girish J Vemulkar

    2016-01-01

    Internet is continuously evolving and changing. Internet of Things (IoT) can be considered the future of Internet applications, involving machine-to-machine (M2M) communication. Actionable intelligence can be derived through the fusion of Big Data and real-time analytics with IoT. Big Data and IoT can be viewed as two sides of a coin. With the connection between Big Data and the objects on the Internet, the benefits of IoT can be easily reaped. The applications of IoT spread across various domains l...

  1. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  2. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
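
    To make the MapReduce schema mentioned above concrete, here is a tiny pure-Python word count, the canonical MapReduce example, separating the map, shuffle, and reduce phases. It only mimics the programming model conceptually and implies nothing about Hadoop's actual APIs.

```python
# Pure-Python mock-up of the MapReduce programming model (word count).
# Conceptual only: this mimics the map -> shuffle -> reduce phases and
# is not Hadoop code.
from collections import defaultdict

def map_phase(doc: str):
    """map: emit (key, value) pairs -- here, (word, 1)."""
    for word in doc.lower().split():
        yield word, 1

def shuffle_phase(pairs):
    """shuffle: group all values by key, as the framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """reduce: combine the values for one key -- here, a sum."""
    return key, sum(values)

docs = ["big data needs big tools", "big clusters process big data"]
pairs = (pair for doc in docs for pair in map_phase(doc))
counts = dict(reduce_phase(k, v) for k, v in shuffle_phase(pairs).items())
print(counts)   # {'big': 4, 'data': 2, ...}
```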

  3. Cuttability Assessment of Selected Rocks Through Different Brittleness Values

    Science.gov (United States)

    Dursun, Arif Emre; Gokay, M. Kemal

    2016-04-01

    Prediction of cuttability is a critical issue for successful execution of tunnel or mining excavation projects. Rock cuttability is also used to determine specific energy, which is defined as the work done by the cutting force to excavate a unit volume of yield. Specific energy is a meaningful inverse measure of cutting efficiency, since it simply states how much energy must be expended to excavate a unit volume of rock. Brittleness is a fundamental rock property applied in drilling and rock excavation, and it is one of the most crucial rock features for rock excavation. For this reason, determining the relations between cuttability and brittleness will help rock engineers. This study aims to estimate the specific energy from different brittleness values of rocks by means of simple and multiple regression analyses. In this study, rock cutting, rock property, and brittleness index tests were carried out on 24 different rock samples with different strength values, including marble, travertine, and tuff, collected from sites around Konya Province, Turkey. Four previously used brittleness concepts were evaluated in this study, denoted as B1 (ratio of compressive to tensile strength), B2 (ratio of the difference between compressive and tensile strength to the sum of compressive and tensile strength), B3 (area under the stress-strain line in relation to compressive and tensile strength), and B9 = S20, the percentage of fines from the Norwegian University of Science and Technology (NTNU) model, as well as B9p (B9 as predicted from uniaxial compressive, Brazilian tensile, and point load strengths of rocks using multiple regression analysis). The results suggest that the proposed simple regression-based prediction models including B3, B9, and B9p outperform the models including B1 and B2 and can be used for more accurate and reliable estimation of specific energy.
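
    The strength-based indices B1 and B2 named above are simple enough to compute directly. The sketch below follows those standard definitions; all sample strengths and specific-energy values are invented for illustration and are not the paper's measurements.

```python
# Sketch of the strength-based brittleness indices discussed above,
# with sigma_c the uniaxial compressive strength and sigma_t the
# Brazilian tensile strength (MPa). B1 and B2 follow the standard
# definitions named in the abstract; all sample values are invented.
import numpy as np

def b1(sigma_c: np.ndarray, sigma_t: np.ndarray) -> np.ndarray:
    """B1: ratio of compressive to tensile strength."""
    return sigma_c / sigma_t

def b2(sigma_c: np.ndarray, sigma_t: np.ndarray) -> np.ndarray:
    """B2: (sigma_c - sigma_t) / (sigma_c + sigma_t)."""
    return (sigma_c - sigma_t) / (sigma_c + sigma_t)

# Invented strengths for three rocks (marble-, travertine-, tuff-like).
sigma_c = np.array([75.0, 55.0, 12.0])
sigma_t = np.array([6.0, 4.5, 1.5])

# Simple linear regression of specific energy (SE, MJ/m^3, invented)
# on B1, mirroring the paper's simple-regression approach.
se = np.array([18.0, 14.5, 6.0])
slope, intercept = np.polyfit(b1(sigma_c, sigma_t), se, 1)
print(f"B1 = {b1(sigma_c, sigma_t).round(2)}, B2 = {b2(sigma_c, sigma_t).round(2)}")
print(f"SE ~ {slope:.2f} * B1 + {intercept:.2f}")
```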

  4. BIG DATA, BIG CONSEQUENCES? EEN VERKENNING NAAR PRIVACY EN BIG DATA GEBRUIK BINNEN DE OPSPORING, VERVOLGING EN RECHTSPRAAK

    OpenAIRE

    Lodder, A.R.; Meulen, van der, N.; Wisman, T.H.A.; Meij, Lisette; Zwinkels, C.M.M.

    2014-01-01

    This exploratory study examines the privacy aspects of Big Data analysis within the domain of security and justice. Applications within the judiciary are discussed, such as the prediction of rulings and use in court cases. With respect to criminal investigation, topics including predictive policing and online investigation are addressed. After an exposition of the privacy norms and the possible applications, the following six principles for Big Data applications are proposed: 7 A.R. Lodder e.a. ‐ Bi...

  5. NOAA Big Data Partnership RFI

    Science.gov (United States)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  6. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  7. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    Directory of Open Access Journals (Sweden)

    Ms. Rashmi Singh; Dr. H. K. Verma

    2012-02-01

    The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method that relies on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, a Big Bang–Big Crunch algorithm has been used for the design of linear phase finite impulse response (FIR) filters. The fitness function used is based on the mean squared error between the actual and the ideal filter response. This paper presents the plot of the magnitude response of the FIR filters and the error graph. The BB-BC seems to be a promising tool for FIR filter design, especially in a dynamic environment where filter coefficients have to be adapted and fast convergence is of importance.
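
    A minimal sketch of the approach is given below: candidate tap vectors are scattered at random (Big Bang), collapsed to a fitness-weighted center of mass (Big Crunch), and re-scattered around that center with a shrinking radius, with the mean squared error against an ideal low-pass response as fitness, as in the paper. Population size, shrink schedule, and the cutoff frequency are illustrative choices, not the authors' settings.

```python
# Minimal sketch of Big Bang-Big Crunch (BB-BC) optimization for a
# linear-phase FIR low-pass design. Fitness is the MSE between the
# actual and ideal magnitude response, as in the paper; all settings
# here are illustrative, not the authors'.
import numpy as np

rng = np.random.default_rng(0)
N_TAPS, POP, GENS = 21, 60, 200
w = np.linspace(0, np.pi, 128)                    # frequency grid
ideal = (w <= 0.4 * np.pi).astype(float)          # ideal low-pass response

def magnitude(h):
    """|H(e^jw)| of taps h via the DTFT on the grid."""
    n = np.arange(len(h))
    return np.abs(np.exp(-1j * np.outer(w, n)) @ h)

def mse(h):
    return np.mean((magnitude(h) - ideal) ** 2)

# Big Bang: random population of half-filters; even symmetry
# h[n] = h[N-1-n] guarantees linear phase.
half = rng.uniform(-0.5, 0.5, size=(POP, (N_TAPS + 1) // 2))
mirror = lambda p: np.concatenate([p, p[-2::-1]])  # rebuild full taps

for gen in range(1, GENS + 1):
    fitness = np.array([mse(mirror(p)) for p in half])
    weights = 1.0 / (fitness + 1e-12)
    center = (weights[:, None] * half).sum(0) / weights.sum()  # Big Crunch
    spread = 0.5 / gen                                         # shrinking radius
    half = center + spread * rng.standard_normal(half.shape)   # next Big Bang
    half[0] = center                                           # keep the center

print(f"final MSE: {mse(mirror(center)):.5f}")
```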

  8. Imaging Structure, Stratigraphy and Groundwater with Ground-Penetrating Radar on the Big Island, Hawaii

    Science.gov (United States)

    Shapiro, S. R.; Tchakirides, T. F.; Brown, L. D.

    2004-12-01

    A series of exploratory ground-penetrating radar (GPR) surveys were carried out on the Big Island, Hawaii in March of 2004 to evaluate the efficacy of using GPR to address hydrological, volcanological, and tectonic issues in extrusive basaltic materials. Target sites included beach sands, nearshore lava flows, well-developed soil covers, lava tubes, and major fault zones. Surveys were carried out with a Sensors and Software PulseEkko 100, which was equipped with 50, 100, and 200 MHz antennae. Both reflection profiles and CMP expanding spreads were collected at most sites to provide both structural detail and in situ velocity estimation. In general, the volcanic rocks exhibited propagation velocities of ca 0.09-0.10 m/ns, a value which we interpret to reflect the large air-filled porosity of the media. Penetration in the nearshore area was expectedly small (less than 1 m), which we attribute to seawater infiltration. However, surveys in the volcanics away from the coast routinely probed to depths of 10 m or greater, even at 100 MHz. While internal layering and lava tubes could be identified from individual profiles, the complexity of returns suggests that 3D imaging is required before detailed stratigraphy can be usefully interpreted. A pilot 3D survey over a lava tube complex supports this conclusion, although it was prematurely terminated by bad weather. Although analysis of the CMP data does not show a clear systematic variation in radar velocity with age of flow, the dataset is too limited to support any firm conclusions on this point. Unusually distinct, subhorizontal reflectors on several profiles seem to mark groundwater. In one case, the water seems to lie within a lava tube with an air-filled roof zone. Surveys over part of the controversial Hilina fault zone clearly image the fault as a steeply dipping feature in the subsurface, albeit only to depths of a few meters. The results suggest, however, that deeper extensions of the faults could be mapped by
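
    The depth figures quoted above follow from the standard two-way travel-time conversion. The snippet below applies it with the mid-range of the reported velocities; the sample travel time is invented for illustration.

```python
# Velocity-to-depth conversion implicit in the survey above: with a
# radar velocity of ~0.095 m/ns (mid-range of the reported 0.09-0.10
# m/ns), a reflector at two-way travel time t sits at depth v*t/2.
# The sample travel time is invented.
V = 0.095                 # m/ns, from the CMP analyses quoted above

def depth_m(twt_ns: float, v: float = V) -> float:
    """Depth of a reflector from its two-way travel time (ns)."""
    return v * twt_ns / 2.0

print(f"{depth_m(210.0):.1f} m")   # ~10 m, matching the quoted penetration
```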

  9. Point Lepreau

    International Nuclear Information System (INIS)

    This brief pamphlet gives general information about the station. The Point Lepreau Nuclear Generating Station consists of a single CANDU 600 unit with a total net capacity of 630,000 kilowatts. This single reactor, the first nuclear installation in Atlantic Canada, is expected to supply about 20% of New Brunswick's electrical energy during the 1980's. The station is located on the Lepreau peninsula, overlooking the Bay of Fundy, 40 km southwest of Saint John on Route 790, off Highway 1. Construction of Point Lepreau began in May 1975 and was completed late in 1981. At the peak of construction activity in 1979, 3,300 workers were employed on the project. First power was produced in September 1982 and Lepreau began commercial operation early in 1983. Point Lepreau was built with provision for an additional 600 MW unit on the site and is essentially a duplicate of CANDU 600 reactors in Quebec, Argentina and Korea. Although started third, Lepreau was the first of these CANDU's in Canada and abroad to be licensed for operation, the first to achieve criticality (start-up), and the first to begin commercial operation. Lepreau is owned and operated by New Brunswick Power

  10. BDGS: A Scalable Big Data Generator Suite in Big Data Benchmarking

    OpenAIRE

    Ming, Zijian; Luo, Chunjie; Gao, Wanling; Han, Rui; Yang, Qiang; Wang, Lei; Zhan, Jianfeng

    2014-01-01

    Data generation is a key issue in big data benchmarking that aims to generate application-specific data sets to meet the 4V requirements of big data. Specifically, big data generators need to generate scalable data (Volume) of different types (Variety) under controllable generation rates (Velocity) while keeping the important characteristics of raw data (Veracity). This gives rise to various new challenges about how we design generators efficiently and successfully. To date, most existing tec...

  11. HOW BIG ARE ’BIG FOUR’ COMPANIES – EVIDENCE FROM ROMANIA

    OpenAIRE

    SORIN ROMULUS BERINDE

    2013-01-01

    The audit market is divided between two main categories of auditors: Big Four auditors and Non-Big Four auditors. The generally accepted opinion is that the former cover most audit services. The objective of the study is to quantify the share covered by Big Four auditors at the level of the Romanian market. In this respect, data were collected and processed from the audited companies of the North-West Region of Romania, which is considered representative for extrapolating the results at nat...

  12. BigDataBench: a Big Data Benchmark Suite from Web Search Engines

    OpenAIRE

    Gao, Wanling; Zhu, Yuqing; Jia, Zhen; Luo, Chunjie; Wang, Lei; Li, Zhiguo; Zhan, Jianfeng; Qi, Yong; He, Yongqiang; Gong, Shiming; Li, Xiaona; Zhang, Shujie; Qiu, Bizhu

    2013-01-01

    This paper presents our joint research efforts on big data benchmarking with several industrial partners. Considering the complexity, diversity, workload churns, and rapid evolution of big data systems, we take an incremental approach in big data benchmarking. For the first step, we pay attention to search engines, which are the most important domain in Internet services in terms of the number of page views and daily visitors. However, search engine service providers treat data, applications,...

  13. "Big Data" : big gaps of knowledge in the field of internet science

    OpenAIRE

    Snijders, CCP Chris; Matzat, U Uwe; Reips, UD

    2012-01-01

    Research on so-called 'Big Data' has received a considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as 'small world' properties. Much less is known about underlying micro-processes leading to these properties. The models used by Big Data researchers usually are inspired by mathematical ease of exposition. We propose to follow in...

  14. 6 Top Tools for Taming Big Data

    Institute of Scientific and Technical Information of China (English)

    Jakob Björklund

    2012-01-01

    The industry now has a buzzword, "big data," for how we're going to do something with the huge amount of information piling up. "Big data" is replacing "business intelligence," which subsumed "reporting," which put a nicer gloss on "spreadsheets," which beat out the old-fashioned "printouts." Managers who long ago studied printouts are now hiring mathematicians who claim to be big data specialists to help them solve the same old problem: What's selling and why?

  15. Musical Structure as Narrative in Rock

    Directory of Open Access Journals (Sweden)

    John Fernando Encarnacao

    2011-09-01

    Full Text Available In an attempt to take a fresh look at the analysis of form in rock music, this paper uses Susan McClary’s (2000 idea of ‘quest narrative’ in Western art music as a starting point. While much pop and rock adheres to the basic structure of the establishment of a home territory, episodes or adventures away, and then a return, my study suggests three categories of rock music form that provide alternatives to common combinations of verses, choruses and bridges through which the quest narrative is delivered. Labyrinth forms present more than the usual number of sections to confound our sense of ‘home’, and consequently of ‘quest’. Single-cell forms use repetition to suggest either a kind of stasis or to disrupt our expectations of beginning, middle and end. Immersive forms blur sectional divisions and invite more sensual and participatory responses to the recorded text. With regard to all of these alternative approaches to structure, Judy Lochhead’s (1992 concept of ‘forming’ is called upon to underline rock music forms that unfold as process, rather than map received formal constructs. Central to the argument are a couple of crucial definitions. Following Theodore Gracyk (1996, it is not songs, as such, but particular recordings that constitute rock music texts. Additionally, narrative is understood not in (direct relation to the lyrics of a song, nor in terms of artists’ biographies or the trajectories of musical styles, but considered in terms of musical structure. It is hoped that this outline of non-narrative musical structures in rock may have applications not only to other types of music, but to other time-based art forms.

  16. Geochemical and tectonic uplift controls on rock nitrogen inputs across terrestrial ecosystems

    Science.gov (United States)

    Morford, Scott L.; Houlton, Benjamin Z.; Dahlgren, Randy A.

    2016-02-01

    Rock contains > 99% of Earth's reactive nitrogen (N), but questions remain over the direct importance of rock N weathering inputs to terrestrial biogeochemical cycling. Here we investigate the factors that regulate rock N abundance and develop a new model for quantifying rock N mobilization fluxes across desert to temperate rainforest ecosystems in California, USA. We analyzed the N content of 968 rock samples from 531 locations and compiled 178 cosmogenically derived denudation estimates from across the region to identify landscapes and ecosystems where rocks account for a significant fraction of terrestrial N inputs. Strong coherence between rock N content and geophysical factors, such as protolith (i.e., parent rock), grain size, and thermal history, is observed. A spatial model that combines rock geochemistry with lithology and topography demonstrates that average rock N reservoirs range from 0.18 to 1.2 kg N m-3 (80 to 534 mg N kg-1) across the nine geomorphic provinces of California and estimates a rock N denudation flux of 20-92 Gg yr-1 across the entire study area (natural atmospheric inputs ~ 140 Gg yr-1). The model highlights regional differences in rock N mobilization and points to the Coast Ranges, Transverse Ranges, and the Klamath Mountains as regions where rock N could contribute meaningfully to ecosystem N cycling. Contrasting these data with global compilations suggests that our findings are broadly applicable beyond California and that the N abundance and variability in rock are well constrained across most of the Earth system.
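
    The mobilization flux above is, at heart, denudation rate times rock N concentration. The worked example below uses the mid-range of the reported N concentrations but an invented denudation rate, so its output is only meant to show that the arithmetic lands near the paper's range.

```python
# Worked example of the rock-N mobilization flux implied above:
# flux = denudation rate x rock N concentration. The denudation rate is
# an invented illustrative value; the N concentration is the mid-range
# of the 80-534 mg N/kg reported for the California provinces.
DENUDATION = 0.15      # kg rock / m^2 / yr (invented, plausible order)
N_CONTENT = 300e-6     # kg N / kg rock (300 mg N/kg, mid-range above)
AREA_M2 = 4.1e11       # ~California land area, m^2

flux_per_m2 = DENUDATION * N_CONTENT            # kg N / m^2 / yr
total_gg = flux_per_m2 * AREA_M2 / 1e6          # Gg N / yr (1 Gg = 1e6 kg)
print(f"{flux_per_m2 * 1e6:.0f} mg N m^-2 yr^-1, "
      f"~{total_gg:.0f} Gg N yr^-1 statewide")
# -> ~18 Gg/yr, just below the paper's 20-92 Gg/yr range, as expected
#    for a modest assumed denudation rate.
```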

  17. Relationship between natural radioactivity and rock type in the Van lake basin - Turkey

    International Nuclear Information System (INIS)

    The Van Lake basin is located in the eastern part of Turkey and essentially comprises two provinces, namely Van and Bitlis. Previous geochemical research indicated that the uranium concentrations of Van Lake water and deep sediments are 78-116 ppb and 0.1-0.5 ppm, respectively. Uranium was transported to Van Lake by rivers and streams that flow through outcrops of the Paleozoic Bitlis Massive and young Pleistocene alkaline/calc-alkaline volcanic rocks. This study focused on revealing the natural radioactivity and the secondary dispersion of radioactivity related to rock types in surface environments of the Van Lake basin. The basin essentially subdivides into three different parts: the eastern part characterized by Mesozoic basic and ultrabasic rocks, the southern part dominated by metamorphic rocks of the Bitlis Massive, and the western and northwestern parts covered by volcanic rocks of Pleistocene age. The volcanic rocks can be subdivided into two types. The first type is mafic, mainly composed of basalts. The second type is felsic, represented by rhyolites, dacites and pumice tuff. Surface gamma measurements (cps) and dose rate measurements (μR/h) show different values according to rock type. Surface gamma and dose rate values in the basaltic rocks are slightly higher than the average values (130 cps, 11 μR/h). In the felsic volcanic rocks such as rhyolites and dacites, surface gamma and dose rate values occasionally exceed the background. The highest values were obtained in the pumice tuffs. Rhyolitic eruptions related to Quaternary volcanic activity formed thick pumice (a natural glassy froth related to felsic volcanic rocks, exhibiting a spongy texture) sequences in the northern and western parts of the Van Lake basin. The dose rate of the pumice rocks was measured at a mean of 15 μR/h. The highest value for surface gamma measurements was recorded as 200 cps. The pumice has very big water capacity, due to porous texture of

  18. 'Big bang' of quantum universe

    International Nuclear Information System (INIS)

    The reparametrization-invariant generating functional for the unitary and causal perturbation theory in general relativity in a finite space-time is obtained. The classical cosmology of a Universe and the Faddeev-Popov-DeWitt functional correspond to different orders of decomposition of this functional over the inverse 'mass' of a Universe. It is shown that the invariant content of general relativity as a constrained system can be covered by two 'equivalent' unconstrained systems: the 'dynamic' (with 'dynamic' time as the cosmic scale factor and conformal field variables) and 'geometric' (given by the Levi-Civita type canonical transformation to the action-angle variables which determine initial cosmological states with the arrow of the proper time measured by the watch of an observer in the comoving frame). 'Big Bang', the Hubble evolution, and creation of 'dynamic' particles by the 'geometric' vacuum are determined by 'relations' between the dynamic and geometric systems as pure relativistic phenomena, like the Lorentz-type 'relation' between the rest and comoving frames in special relativity

  19. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  20. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  1. Big-bang nucleosynthesis revisited

    Science.gov (United States)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Yp, is less than or equal to 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio (η10, η in units of 10^-10) to 2.6 ≤ η10 ≤ 4.3, which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Yp of 0.24 constrains the number of light neutrinos to Nν ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Yp ≤ 0.245.

  2. Astronomical Surveys and Big Data

    CERN Document Server

    Mickaelian, A M

    2015-01-01

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of electromagnetic spectrum are reviewed, from Gamma-ray to radio, such as Fermi-GLAST and INTEGRAL in Gamma-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and II based catalogues (APM, MAPS, USNO, GSC) in optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in radio and many others, as well as most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS) and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era. Astrophysical Virtual Observatories and Computational Astrophysics play a...

  3. Deuterium and big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Measurements of deuterium absorption in high redshift quasar absorption systems provide a direct inference of the deuterium abundance produced by big bang nucleosynthesis (BBN). With measurements and limits from five independent absorption systems, we place strong constraints on the primordial ratio of deuterium to hydrogen, (D/H)p = (3.4 ± 0.3) × 10^-5 [1,2]. We employ a direct numerical treatment to improve the estimates of critical reaction rates and reduce the uncertainties in BBN predictions of D/H and 7Li/H by a factor of three [3] over previous efforts [4]. Using our measurements of (D/H)p and new BBN predictions, we find at 95% confidence the baryon density ρb = (3.6 ± 0.4) × 10^-31 g cm^-3 (Ωb h65² = 0.045 ± 0.006 in units of the critical density), and cosmological baryon-photon ratio η = (5.1 ± 0.6) × 10^-10
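
    As a consistency check on the numbers quoted above (not part of the abstract itself), the standard textbook conversion between the baryon-to-photon ratio and the baryon density parameter reproduces the quoted Ωb h65²:

```latex
% Standard eta <-> Omega_b conversion (textbook relation, assuming
% T_gamma = 2.725 K); a consistency check, not part of the abstract.
\[
  \eta_{10} \simeq 273\,\Omega_b h^2
  \quad\Longrightarrow\quad
  \Omega_b h^2 = \frac{5.1}{273} \approx 0.0187 .
\]
% With h = 0.65 h_{65}, i.e. h^2 = 0.4225 h_{65}^2:
\[
  \Omega_b h_{65}^2 = \frac{0.0187}{0.4225} \approx 0.044,
\]
% in agreement with the quoted Omega_b h_65^2 = 0.045 +/- 0.006.
```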

  4. Tick-Borne Diseases: The Big Two

    Science.gov (United States)

    ... muscle pain. The red-spotted rash usually happens 2 to 5 days after the fever begins. Antibiotics ...

  5. ARC Code TI: BigView

    Data.gov (United States)

    National Aeronautics and Space Administration — BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running linux. Additionally, it can work in a multi-screen environment...

  6. Heat Waves Pose Big Health Threats

    Science.gov (United States)

    Kids, elderly among those ... can be inherently dangerous, but the initial heat waves every summer can be particularly perilous to those ...

  7. Scaling big data with Hadoop and Solr

    CERN Document Server

    Karambelkar, Hrishikesh Vijay

    2015-01-01

    This book is aimed at developers, designers, and architects who would like to build big data enterprise search solutions for their customers or organizations. No prior knowledge of Apache Hadoop and Apache Solr/Lucene technologies is required.

  8. Cosmic relics from the big bang

    International Nuclear Information System (INIS)

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab

  9. Fisicos argentinos reproduciran el Big Bang

    CERN Multimedia

    De Ambrosio, Martin

    2008-01-01

    Two groups of Argentine physicists from the La Plata and Buenos Aires Universities are working on a series of experiments that will recreate the conditions of the big explosion at the origin of the universe. (1 page)

  10. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, encompassing the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data ranges over data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our life better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing-system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering a big size of data, variety of data and frequent chan...

  11. Big Fish and Prized Trees Gain Protection

    Institute of Scientific and Technical Information of China (English)

    Fred Pearce; 吴敏

    2004-01-01

    Decisions made at a key conservation meeting are good news for big and quirky fish and commercially prized trees. Several species will enjoy extra protection against trade following rulings made at the Convention on International Trade in Endangered Species (CITES).

  12. Hunting Plan : Big Stone National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Big Stone National Wildlife Refuge Hunting Plan provides guidance for the management of hunting on the refuge. Hunting program objectives include providing a...

  13. Conjecture on Avoidance of Big Crunch

    Institute of Scientific and Technical Information of China (English)

    SUN Cheng-Yi; ZHANG De-Hai

    2006-01-01

    By conjecturing the physics at the Planck scale, we modify the definition of the Hawking temperature and modify the Friedmann equation. It is found that we can avoid the singularity of the big crunch and obtain a bouncing cosmological model.

  14. Joint Commission on rock properties

    Science.gov (United States)

    A joint commission on Rock Properties for Petroleum Engineers (RPPE) has been established by the International Society of Rock Mechanics and the Society of Petroleum Engineers to set up data banks on the properties of sedimentary rocks encountered during drilling. Computer-based data banks of complete rock properties will be organized for sandstones (GRESA), shales (ARSHA) and carbonates (CARCA). The commission hopes to access data sources from members of the commission, private companies and the public domain.

  15. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Science.gov (United States)

    2011-02-11

    ... Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its Second Revised and Restated Open Access Transmission Tariff. Big Rivers also requests waiver of the...

  16. Did the big bang boil?

    CERN Document Server

    Wilczek, Frank

    2006-01-01

    "Standard theories tell us that, at some point in the Universe's evolution, free quarks and gluons must have become bound together into the hadronic matter we see today. But was this transition abrupt or smooth?

  17. Sufficient conditions for a period incrementing big bang bifurcation in one-dimensional maps

    International Nuclear Information System (INIS)

    Typically, big bang bifurcation occurs for one (or higher)-dimensional piecewise-defined discontinuous systems whenever two border collision bifurcation curves collide transversely in the parameter space. At that point, two (feasible) fixed points collide with one boundary in state space and become virtual, and, in the one-dimensional case, the map becomes continuous. Depending on the properties of the map near the codimension-two bifurcation point, there exist different scenarios regarding how the infinite number of periodic orbits are born, mainly the so-called period adding and period incrementing. In our work we prove that, in order to undergo a big bang bifurcation of the period incrementing type, it is sufficient for a piecewise-defined one-dimensional map that the colliding fixed points are attractive and with associated eigenvalues of different signs
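
    As an illustration of the kind of system described above (not taken from the paper), the sketch below iterates a discontinuous piecewise-linear 1-D map and reports the period of the attracting orbit as an offset parameter varies. The default slopes make both fixed points attractive with eigenvalues of opposite sign, matching the sufficient condition stated in the abstract; the map itself and all parameter values are invented for the example.

```python
# Toy discontinuous piecewise-linear 1-D map, in the spirit of the
# systems discussed above (illustrative, not the paper's map):
#   x -> a*x + mu        if x < 0
#   x -> b*x + mu - 1    if x >= 0
# With |a|, |b| < 1 and a, b of opposite sign (the abstract's sufficient
# condition), we scan mu and report the period of the attractor.
def step(x, mu, a=0.5, b=-0.5):
    return a * x + mu if x < 0 else b * x + mu - 1.0

def attractor_period(mu, n_transient=2000, n_max=64, tol=1e-9):
    x = 0.1
    for _ in range(n_transient):          # discard the transient
        x = step(x, mu)
    x0 = x
    for k in range(1, n_max + 1):         # look for a return to x0
        x = step(x, mu)
        if abs(x - x0) < tol:
            return k
    return None                           # no short period found

for mu in [0.1 * i for i in range(1, 10)]:
    print(f"mu = {mu:.1f}: period {attractor_period(mu)}")
```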

  18. From data quality to big data quality

    OpenAIRE

    Batini, C; Rula, A; Scannapieco, M; Viscusi, G

    2015-01-01

    This article investigates the evolution of data quality issues from traditional structured data managed in relational databases to Big Data. In particular, the paper examines the nature of the relationship between Data Quality and several research coordinates that are relevant in Big Data, such as the variety of data types, data sources and application domains, focusing on maps, semi-structured texts, linked open data, sensor & sensor networks and official statistics. Consequently a set of str...

  19. Adapting bioinformatics curricula for big data

    OpenAIRE

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S; Jason H Moore

    2015-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these...

  20. Mining Big Data to Predicting Future

    OpenAIRE

    Tyagi, Amit K.; Priya, R.

    2015-01-01

    Due to technological advances, vast data sets (e.g. big data) are increasing nowadays. Big Data, a new term, is used to identify the collected datasets. But due to their large size and complexity, our current methodologies and data mining software tools cannot manage or extract value from those datasets. Such datasets provide us with unparalleled opportunities for modelling and predicting the future, along with new challenges. So as an awareness of this and weaknesses as well as the possibilit...

  1. Scientific Big Data Analytics by HPC

    OpenAIRE

    Lippert, Thomas; Mallmann, Daniel; Riedel, Morris

    2016-01-01

    Storing, managing, sharing, curating and especially analysing huge amounts of data face an immense visibility and importance in industry and economy as well as in science and research. Industry and economy exploit "Big Data" for predictive analysis, to increase the efficiency of infrastructures, customer segmentation, and tailored services. In science, Big Data allows for addressing problems with complexities that were impossible to deal with so far. The amounts of data are growing exponentially i...

  2. Effective Dynamics of the Matrix Big Bang

    OpenAIRE

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-01-01

    We study the leading quantum effects in the recently introduced Matrix Big Bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the Big Bang. Surprisingly, the potential decays very rapidly at late times, where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form ...

  3. Dark energy, wormholes, and the Big Rip

    OpenAIRE

    Faraoni, Valerio; Israel, Werner

    2005-01-01

    The time evolution of a wormhole in a Friedmann universe approaching the Big Rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid - two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the Big Rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal.

  4. COBE looks back to the Big Bang

    Science.gov (United States)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  5. Leading Undergraduate Students to Big Data Generation

    OpenAIRE

    Yang, Jianjun; Shen, Ju

    2015-01-01

    People are facing a flood of data today. Data are being collected at unprecedented scale in many areas, such as networking, image processing, virtualization, scientific computation, and algorithms. The huge data of today are called Big Data. Big data is an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process them using traditional data processing applications. In this article, the authors present a unique way which uses network simula...

  6. Cincinnati Big Area Additive Manufacturing (BAAM)

    Energy Technology Data Exchange (ETDEWEB)

    Duty, Chad E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  7. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a ... example of a global process in which key lexical categories contest, trace and shape how global historical change is experienced and constituted through linguistic categories.

  8. Data Confidentiality Challenges in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize performance of such a system.

  9. ATLAS: civil engineering Point 1

    CERN Multimedia

    1998-01-01

    Different phases of the SX 15 realisation at Point 1, the zone of the ATLAS experiment: 00:13:43 Realization of the concrete floor 19-10-1998; 00:29:26 Putting up the metal rails for the roof 19-10-1998; 00:33:42 Road alignment entering POINT1 and in Bollot wood 27-10-1998; 00:41:53 General sight of the buildings in construction. Building SX provides the cover for the work at the experiment. It is used to shelter the pit and the work for the underground cavern, as well as to cover the ground work with big cranes that allow the lowering of the detector components. The hall is also used for detector-part storage and cover during assembly. It shelters the small mechanics and electronics workshops necessary for the assembly and maintenance of the ATLAS experiment.

  10. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  11. Rock and mineral magnetism

    CERN Document Server

    O’Reilly, W

    1984-01-01

    The past two decades have witnessed a revolution in the earth sciences. The quantitative, instrument-based measurements and physical models of geophysics, together with advances in technology, have radically transformed the way in which the Earth, and especially its crust, is described. The study of the magnetism of the rocks of the Earth's crust has played a major part in this transformation. Rocks, or more specifically their constituent magnetic minerals, can be regarded as a measuring instrument provided by nature, which can be employed in the service of the earth sciences. Thus magnetic minerals are a recording magnetometer; a goniometer or protractor, recording the directions of flows, fields and forces; a clock; a recording thermometer; a position recorder; a strain gauge; an instrument for geological surveying; a tracer in climatology and hydrology; a tool in petrology. No instrument is linear, or free from noise and systematic errors, and the performance of nature's instrument must be assessed and ...

  12. Uranium in alkaline rocks

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, M.; Wollenberg, H.; Strisower, B.; Bowman, H.; Flexser, S.; Carmichael, I.

    1978-04-01

    Geologic and geochemical criteria were developed for the occurrence of economic uranium deposits in alkaline igneous rocks. A literature search, a limited chemical analytical program, and visits to three prominent alkaline-rock localities (Ilimaussaq, Greenland; Pocos de Caldas, Brazil; and Powderhorn, Colorado) were made to establish criteria to determine if a site had some uranium resource potential. From the literature, four alkaline-intrusive occurrences of differing character were identified as type-localities for uranium mineralization, and the important aspects of these localities were described. These characteristics were used to categorize and evaluate U.S. occurrences. The literature search disclosed 69 U.S. sites, encompassing nepheline syenite, alkaline granite, and carbonatite. It was possible to compare two-thirds of these sites to the type localities. A ranking system identified ten of the sites as most likely to have uranium resource potential.

  13. Uranium in alkaline rocks

    International Nuclear Information System (INIS)

    Geologic and geochemical criteria were developed for the occurrence of economic uranium deposits in alkaline igneous rocks. A literature search, a limited chemical analytical program, and visits to three prominent alkaline-rock localities (Ilimaussaq, Greenland; Pocos de Caldas, Brazil; and Powderhorn, Colorado) were made to establish criteria to determine if a site had some uranium resource potential. From the literature, four alkaline-intrusive occurrences of differing character were identified as type-localities for uranium mineralization, and the important aspects of these localities were described. These characteristics were used to categorize and evaluate U.S. occurrences. The literature search disclosed 69 U.S. sites, encompassing nepheline syenite, alkaline granite, and carbonatite. It was possible to compare two-thirds of these sites to the type localities. A ranking system identified ten of the sites as most likely to have uranium resource potential

  14. Limados : Rock peruano

    OpenAIRE

    García Morete, Ramiro

    2013-01-01

    Encouraged by the new-wave current arriving from Mexico, he was singled out by specialists as a pioneer of punk, although the plan was simply to play with whatever was at hand. A tiny corner of a brief but surprisingly powerful period: the sixties in a country that made rock an expression of its own culture.

  15. Deformations of fractured rock

    International Nuclear Information System (INIS)

    Results of the DBM and FEM analysis in this study indicate that a suitable rock mass for a repository of radioactive waste should be moderately jointed (about 1 joint/m2) and surrounded by shear zones of the first order. This allows for gentle and flexible deformation under tectonic stresses and prevents the development of large cross-cutting failures in the repository area. (author)

  16. Relationalism Evolves the Universe Through the Big Bang

    CERN Document Server

    Koslowski, Tim A; Sloan, David

    2016-01-01

    We investigate the singularities of homogeneous cosmologies from the point of view of relational (and physically relevant) degrees of freedom of the gravitational field. These do not depend on absolute units of length and duration - thus they do not include the volume and extrinsic curvature. We find that the fully relational dynamical system remains well posed for all physical times, even at the point that would be described as the big bang when evolving present day data backwards in time. This result is achieved in two steps: (1) for solutions which are gravity-dominated near the singularity, we show that any extended physical clock (whose readings only depend on the relational degrees of freedom) will undergo an infinite number of ticks before reaching the big bang. The singularity is therefore pushed into the infinite physical past of any physical clock. (2) for solutions where a stiff matter component (e.g. a massless scalar field) dominates at the singularity, we show that the relational degrees of freed...
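
    Step (1) can be written schematically; this is a paraphrase of the abstract in formula form, not the paper's own notation.

    ```latex
    % A clock built from relational degrees of freedom, with instantaneous
    % tick period \tau(t), accumulates between the big bang t_{BB} and today t_0
    N = \int_{t_{\mathrm{BB}}}^{t_{0}} \frac{\mathrm{d}t}{\tau(t)}
    % ticks; the claim is that \tau(t) shrinks fast enough near t_{BB}
    % that N diverges, i.e. the singularity lies in the clock's
    % infinite physical past.
    ```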

  17. Analyzing Big Data with the Hybrid Interval Regression Methods

    OpenAIRE

    Chia-Hui Huang; Keng-Chieh Yang; Han-Ying Kao

    2014-01-01

    Big data is a new trend at present, with significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. Recently, the smooth support vector machine (SSVM...
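
    As background to the technique named here: the "smooth" in SSVM refers to replacing the non-differentiable plus function max(x, 0) in the SVM objective with a smooth approximation, so that fast Newton-type solvers apply. Below is a minimal sketch of that smoothing ingredient as it appears in the standard SSVM literature; it is not the authors' full hybrid interval method.

    ```python
    # The smoothing ingredient of SSVM: replace the plus function max(x, 0)
    # with p(x, a) = x + (1/a) * log(1 + exp(-a*x)), which is smooth
    # everywhere and converges to max(x, 0) as a -> infinity.
    import numpy as np

    def smooth_plus(x, a=5.0):
        """Smooth approximation of max(x, 0); exact in the limit a -> inf."""
        # np.logaddexp(0, -a*x) computes log(1 + exp(-a*x)) without overflow.
        return x + np.logaddexp(0.0, -a * x) / a

    x = np.linspace(-2.0, 2.0, 401)
    for a in (1.0, 5.0, 50.0):
        err = np.max(np.abs(smooth_plus(x, a) - np.maximum(x, 0.0)))
        print(f"a = {a:5.1f}   max deviation from max(x, 0): {err:.4f}")
    ```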

  18. Boosting Big National Lab Data

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world's most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to investigate, for example, the validity of new drugs, the root causes of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, the optimal ingredient composition for chocolate, or how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the 'Eye of Gaia' billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  19. Advances and Applications of Rock Physics for Hydrocarbon Exploration

    Directory of Open Access Journals (Sweden)

    Valle-Molina C.

    2012-10-01

    Full Text Available Integration of geological and geophysical information with different scales and features is the key point for establishing relationships between the petrophysical and elastic characteristics of the rocks in a reservoir. It is very important to present the fundamentals and current methodologies of rock physics analysis applied to hydrocarbon exploration to Mexican engineers and students. This work represents an effort to train oil-exploration personnel through a review of the subjects of rock physics. The main aim is to show updated improvements and applications of rock physics in seismology for exploration. Most of the methodologies presented in this document are related to the study of the physical and geological mechanisms that impact the elastic properties of rock reservoirs, based on rock-specimen characterization and geophysical borehole information. Predictions of rock properties (lithology, porosity, fluid in the voids) can be performed using 3D seismic data that shall be properly calibrated with experimental measurements on rock cores and seismic well log data

  20. Rock in Rio: forever young

    OpenAIRE

    Ricardo Ferreira Freitas; Flávio Lins Rodrigues

    2014-01-01

    The purpose of this article is to discuss the role of Rock in Rio: The Musical, as herald of the megafestival Rock in Rio. Driven by the success that musicals have achieved in Brazil, we believe that the design of this spectacle of music, dance and staging renews the brand of the rock festival, since it adds the force of young and healthy bodies to its concept. Moreover, the musical provides Rock in Rio with some distance from the controversial trilogy of sex, drugs and rock and roll, a strong mark ...

  1. Rock in Rio: forever young

    Directory of Open Access Journals (Sweden)

    Ricardo Ferreira Freitas

    2014-12-01

    Full Text Available The purpose of this article is to discuss the role of Rock in Rio: The Musical, as herald of the megafestival Rock in Rio. Driven by the success that musicals have achieved in Brazil, we believe that the design of this spectacle of music, dance and staging renews the brand of the rock festival, since it adds the force of young and healthy bodies to its concept. Moreover, the musical provides Rock in Rio with some distance from the controversial trilogy of sex, drugs and rock and roll, a strong mark of past festivals around the world. Thus, the musical expands the possibilities of growth for the brand.

  2. Small government or big government?

    Directory of Open Access Journals (Sweden)

    MATEO SPAHO

    2015-03-01

    Full Text Available Since the beginning of the twentieth century, economists and philosophers have been polarized over the role that the government should have in the economy. On one hand, John Maynard Keynes represented, within the framework of a market economy, the position that the state should intervene in the economy to maintain aggregate demand and employment in the country, without hesitating to create budget deficits and expand public debt. This applies especially in moments when the domestic economy and global economic trends show weak growth or a recession. It means heavy interference in the economy, with higher income but also high expenditure relative to GDP. On the other side, liberals and neoliberals led by Friedrich Hayek advocated a withdrawal of the government from economic activity not just in moments of economic growth but also during crises, believing that the market has self-regulating mechanisms within itself. The government, as a result, will have a smaller dimension, with lower income and also lower expenditures compared to the GDP of the country. We examine the countries of South-Eastern Europe, distinguishing those with a "Big Government" from those with a "Small Government", and analyze their economic performance during the global crisis (2007-2014). In which countries did the public debt grow less? Which countries managed to attract more investment, and which preserved the purchasing power of their consumers? We shall see whether, during the economic crisis in Eastern Europe, big government or the liberal, "small" one has been the more successful model.

  3. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  4. Characteristics and geological significance of olivine xenocrysts in Cenozoic volcanic rocks from western Qinling

    Institute of Scientific and Technical Information of China (English)

    SU Benxun; ZHANG Hongfu; XIAO Yan; ZHAO Xinmiao

    2006-01-01

    Cenozoic volcanic rocks from Haoti, Dangchang County in the western Qinling Mountains, contain a few clearly zoned olivines. These olivines are relatively large in grain size and usually show cracks or broken features. Their cores have compositions (Mg# = 90.4-91.0) similar to those of the peridotitic xenoliths entrained in the host volcanic rocks, and their rims are close to the compositions of olivine phenocrysts (Mg# = 85.5-81.9). The CaO contents of these zoned olivines are lower than 0.1%. These features demonstrate that the clearly zoned olivines are xenocrysts disaggregated from mantle peridotites. The zoned texture was the result of interaction between the olivine and the host magma. Available data show that the volcanic rocks would have been derived from a mantle source metasomatized by subducted, hydrothermally-altered oceanic crust. The formation of these Cenozoic volcanic rocks was perhaps related to the rapid uplift of the Tibetan Plateau.
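
    For readers outside petrology, the Mg# quoted above is the standard molar magnesium number; a one-line reminder of the conventional definition (standard usage, not specific to this paper):

    ```latex
    % Magnesium number as conventionally defined (molar proportions):
    \mathrm{Mg\#} = 100 \times \frac{\mathrm{Mg}}{\mathrm{Mg} + \mathrm{Fe}^{2+}}
    ```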

  5. Rock bolts - Improved design and possibilities

    OpenAIRE

    Thomas-Lepine, Capucine

    2012-01-01

    Summary: Rock Bolts, improved design and possibilities. Master thesis, NTNU 2012.
    Student: Capucine Thomas-Lepine
    Supervisor: Leif Lia
    Key words: rock foundation, small concrete dam, rock mass classification, rock joints, shear strength of rock discontinuities, fully grouted passive rock bolts design
    The Master's thesis "Rock bolts, improved design and possibilities" is a continuation of the Master's thesis NTNU 2011 "Rock bolts in dams, expected capacity" by Lars Kristian Neby. In...

  6. Big Data Using Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    何清

    2014-01-01

    The concept of big data has been mentioned by more and more people on more and more occasions, and it is often linked to cloud computing; the relationship between cloud computing and big data has become a hot topic. This report covers the following four thematic areas: first, the value of big data; second, the challenges brought by big data; third, big data research results; fourth, cloud computing as the mainstream way of big data mining. In this report, we describe our understanding of big data and of its value, explore big data processing and mining technology, and argue the following points: without the Internet there would be no cloud computing model; without the cloud computing model there would be no big data processing technology, and hence no big data mining technology.

  7. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  8. Big Data Big Changes

    Institute of Scientific and Technical Information of China (English)

    梁爽

    2014-01-01

    Big data is constantly being generated all around us; the big data era has arrived. This paper describes the characteristics of big data and analyzes the current state of big data research at home and abroad, as well as future directions for its application. Only by understanding big data anew, changing our thinking about it, adapting business models to its changes, innovating big data management, strengthening institutional construction, enhancing legal awareness, and ensuring personal and national security can we continuously promote the healthy development of big data.

  10. Rock Pore Structure as Main Reason of Rock Deterioration

    Science.gov (United States)

    Ondrášik, Martin; Kopecký, Miloslav

    2014-03-01

    Crushed or dimensional rocks have been used as natural construction material, decoration stone, or material for artistic sculptures. Old historical towns, and not only in Slovakia, have centuries of experience with the use of stones for construction purposes. Whole buildings were made from dimensional stone, like sandstone, limestone or rhyolite. Pavements were made especially from basalt, andesite, rhyolite or granite. The most common modern construction material, concrete, also includes large amounts of crushed rock, especially limestone, dolostone and andesite. However, rock, like any other material exposed to exogenous processes, starts to deteriorate. Mechanical weathering in particular can be very intensive if rock with unsuitable properties is used. For a long time it was believed that repeated freezing and thawing, in relation to high absorption, was the main reason for rock deterioration. In Slovakia, high water absorption was for many years set as an exclusion criterion for the use of rocks and stones in the building industry. Only after 1989 was absorption accepted as a merely informational rock property rather than an exclusion criterion. The reason for the change was not an understanding of the relationship between porosity and rock deterioration, but rather good experience with some highly porous rocks used in constructions exposed to severe weather conditions, proving a lack of relationship between rock freeze-thaw resistance and water absorption. Results of recent worldwide research suggest that the key to understanding the resistance of rocks against deterioration lies not in the absorption but in the structure of rock pores in relation to the thermodynamic properties of pore water and the tensile strength of rocks and rock minerals. This article also presents some results of research on rock deterioration and pore structure performed on 88 rock samples. The results divide the rocks tested into two groups - group N, in which the pore water does not freeze
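
    One standard back-of-envelope way to rationalize a group of rocks whose pore water does not freeze (my gloss, not a formula taken from the article) is the Gibbs-Thomson freezing-point depression for water confined in fine pores:

    ```latex
    % Gibbs-Thomson depression of the melting point T_m(r) of pore water in a
    % pore of radius r; \gamma_{sl}: ice-water interfacial energy, T_m: bulk
    % melting point, \rho_s: ice density, \Delta H_f: specific heat of fusion.
    \Delta T = T_m - T_m(r) \approx \frac{2\,\gamma_{sl}\,T_m}{\rho_s\,\Delta H_f\,r}
    ```

    The smaller the pore radius, the larger the depression, so water held in sufficiently fine pores can remain unfrozen at ordinary frost temperatures.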

  11. Characteristics of business intelligence and big data in e-government

    DEFF Research Database (Denmark)

    Gaardboe, Rikke; Jonasen, Tanja Svarre; Kanstrup, Anne Marie

    2015-01-01

    Business intelligence and big data represent two different technologies within decision support systems. The present paper concerns the two concepts within the context of e-government. Thus, the purpose of the paper is to present the preliminary findings regarding publication patterns and topic...... coverage within the two technologies by conducting a comparative literature review. A total of 281 papers published in the years 2005–2014 were included in the analysis. A rapid increase in papers regarding big data was identified, the majority being journal papers. As regards business intelligence......, researchers publish in conference proceedings to a greater extent. Further, big data journal papers are published within a broader range of journal topics compared to business intelligence journal papers. The paper concludes by pointing to further analyses that will be carried out within the 281 selected...

  12. Enhancement of β-catenin activity by BIG1 plus BIG2 via Arf activation and cAMP signals.

    Science.gov (United States)

    Li, Chun-Chun; Le, Kang; Kato, Jiro; Moss, Joel; Vaughan, Martha

    2016-05-24

    Multifunctional β-catenin, with critical roles in both cell-cell adhesion and Wnt-signaling pathways, was among HeLa cell proteins coimmunoprecipitated by antibodies against brefeldin A-inhibited guanine nucleotide-exchange factors 1 and 2 (BIG1 or BIG2) that activate ADP-ribosylation factors (Arfs) by accelerating the replacement of bound GDP with GTP. BIG proteins also contain A-kinase anchoring protein (AKAP) sequences that can act as scaffolds for multimolecular assemblies that facilitate and limit cAMP signaling temporally and spatially. Direct interaction of BIG1 N-terminal sequence with β-catenin was confirmed using yeast two-hybrid assays and in vitro synthesized proteins. Depletion of BIG1 and/or BIG2 or overexpression of guanine nucleotide-exchange factor inactive mutant, but not wild-type, proteins interfered with β-catenin trafficking, leading to accumulation at perinuclear Golgi structures. Both phospholipase D activity and vesicular trafficking were required for effects of BIG1 and BIG2 on β-catenin activation. Levels of PKA-phosphorylated β-catenin S675 and β-catenin association with PKA, BIG1, and BIG2 were also diminished after BIG1/BIG2 depletion. Inferring a requirement for BIG1 and/or BIG2 AKAP sequence in PKA modification of β-catenin and its effect on transcription activation, we confirmed dependence of S675 phosphorylation and transcription coactivator function on BIG2 AKAP-C sequence. PMID:27162341

  13. Soil biogeochemistry in the age of big data

    Science.gov (United States)

    Cécillon, Lauric; Barré, Pierre; Coissac, Eric; Plante, Alain; Rasse, Daniel

    2015-04-01

    Data is becoming one of the key resources of the 21st century. Soil biogeochemistry is not spared by this new movement. The conservation of soils and their services has recently entered the political agenda. However, clear knowledge of the links between soil characteristics and the various processes ensuring the provision of soil services is rare at the molecular or plot scale, and does not exist at the landscape scale. This split between society's expectations of its natural capital and scientific knowledge of the most complex material on earth has led to an increasing number of studies on soils, using an increasing number of techniques of increasing complexity, with increasing spatial and temporal coverage. From data scarcity with a basic data management system, soil biogeochemistry is now facing a proliferation of data, with few quality controls from data collection to publication and few skills to deal with them. Based on this observation, here we (1) address how big data could help in making sense of all these soil biogeochemical data, and (2) point out several shortcomings of big data that most biogeochemists will experience in their future careers. Massive storage of data is now common, and recent opportunities for cloud storage enable data sharing among researchers all over the world. The need for integrative and collaborative computational databases in soil biogeochemistry is emerging through pioneering initiatives in this direction (molTERdb; earthcube), following soil microbiologists (GenBank). We expect that a series of data storage and management systems will rapidly revolutionize the way of accessing raw biogeochemical data, published or not. Data mining techniques combined with cluster or cloud computing hold significant promise for facilitating the use of complex analytical methods, and for revealing new insights previously hidden in complex data on soil mineralogy, organic matter and biodiversity. Indeed, important scientific advances have

  14. Rock mechanics for hard rock nuclear waste repositories

    International Nuclear Information System (INIS)

    The mined geologic burial of high-level nuclear waste is now the favored option for disposal. The US National Waste Terminal Storage Program, designed to achieve this disposal, includes an extensive rock mechanics component related to the design of the waste repositories. The plan currently considers five candidate rock types. This paper deals with the three hard rocks among them: basalt, granite, and tuff. Their behavior is governed by geological discontinuities. Salt and shale, which exhibit behavior closer to that of a continuum, are not considered here. This paper discusses both the generic rock mechanics R&D required for repository design and examples of projects related to hard rock waste storage. The examples include programs in basalt (Hanford/Washington), in granitic rocks (Climax/Nevada Test Site, Idaho Springs/Colorado, Pinawa/Canada, Oracle/Arizona, and Stripa/Sweden), and in tuff

  15. Session: Hot Dry Rock

    Energy Technology Data Exchange (ETDEWEB)

    Tennyson, George P. Jr.; Duchane, David V.; Ponden, Raymond F.; Brown, Donald W.

    1992-01-01

    This session at the Geothermal Energy Program Review X: Geothermal Energy and the Utility Market consisted of four presentations: ''Hot Dry Rock - Summary'' by George P. Tennyson, Jr.; ''HDR Opportunities and Challenges Beyond the Long Term Flow Test'' by David V. Duchane; ''Start-Up Operations at the Fenton Hill HDR Pilot Plant'' by Raymond F. Ponden; and ''Update on the Long-Term Flow Testing Program'' by Donald W. Brown.

  16. Sealing of fractured rock

    International Nuclear Information System (INIS)

    This paper presents the third phase of the Stripa Project, which was dedicated to fracture sealing. First of all, it was necessary to show that fine-grained grouts could effectively be injected into relatively fine cracks, and that the fluidity of bentonite could also be enhanced. The field tests comprised investigation of excavation-induced disturbance and attempts to seal disturbed rock, and, in separate tests, grouting of deposition holes and a natural fine-fracture zone. (TEC). 12 figs., 1 tab., 6 refs

  17. From stones to rocks

    Science.gov (United States)

    Mortier, Marie-Astrid; Jean-Leroux, Kathleen; Cirio, Raymond

    2013-04-01

    With the L'Aquila earthquake in 2009, earthquake prediction has become more and more necessary, and people are waiting for ever more accurate data. Earthquake accuracy has increased in recent times mainly thanks to the understanding of how oceanic expansion works and the significant development of numerical seismic prediction models. Despite the improvements, locations and magnitudes cannot be as accurate as citizens and authorities would like. Anticipating earthquakes requires an understanding of: - the composition of the earth, - the structure of the earth, - the relations and movements between the different parts of the surface of the earth. In order to answer these questions, the Alps are an interesting field for students. This study combines natural curiosity about understanding the predictable part of natural hazards in geology with scientific skills on site: observing and drawing the landscape, choosing and reading a representative core drilling, placing the facts chronologically, and considering the age, the length of time and the strength needed. This experience requires students to approach time and space in a way radically different from the one they can consider in a classroom. It also limits their imagination, in a positive way, because they realize that prediction is based on real data and that some former theories have become present paradigms thanks to geologists. At each location the analyzed data include the landscape, the core drilling, and the relation established between them by the students. The data is used by the students to understand the meaning, so that the history of the formation of the rocks, as told by the rocks themselves, can be explained. Until this year, the CBGA's perspective on the study of the Alpine ground allowed students to build the story of the creation and disappearance of an ocean, which was a concept required by French educational authorities. But not long ago, the authorities changed their scientific expectations. To meet the

  18. Rock mechanics data package

    International Nuclear Information System (INIS)

    This data package provides a summary of available laboratory and in situ stress field test results from site characterization investigations by the Basalt Waste Isolation Project Modeling and Analysis Group. The objective is to furnish rock mechanics information for use by Rockwell Hanford Operations and their subcontractors in performance assessment and engineering studies. This release includes Reference Repository Location (RRL) site specific laboratory and field test data from boreholes RRL-2, RRL-6, and RRL-14 as well as previous Hanford wide data available as of April, 1985. 25 refs., 9 figs., 16 tabs

  19. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing need among practitioners to understand the phenomenon better, starting from a clearly stated definition. This book aims to be an introductory read for executives who want (and need) to keep pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data are explained, and a series of different strategic approaches are provided. By browsing the book, it is possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data are discussed, where some of them are more general - such as ethics, privacy, and ownership – while others concern more specific business situations (e.g., initial public offering, growth st...

  20. Relationship between carbonaceous rocks and uranium mineralization

    International Nuclear Information System (INIS)

    The relationship between carbonaceous material in rocks and the formation of hydrothermal uranium mineralization is discussed using the examples of super-large hydrothermal uranium deposits (such as Canada's Athabasca, Australia's East Alligator River, Germany's Schlema-Alberoda and Roenneberg, and Gabon's Franceville). Based on thermodynamic data, it is emphasized that the interaction between carbon and water causes the formation of gaseous reductants (such as CO2, CO, H2 and CH4) under conditions of higher temperature and lower pressure. It is indicated that CH4 should be the main gaseous reductant at the temperatures (150-200 degree C) and pressures (50-100 MPa) suitable for uranium metallogenesis. This conclusion accords with the observations in the deposits mentioned above, and at the same time contradicts the traditional view that carbonaceous rocks can be the uranium source during the formation of hydrothermal uranium deposits. (authors)

  1. Big Data, Big machines, Big Science : vers une société sans sujet et sans causalité ?

    OpenAIRE

    Ibekwe-SanJuan, Fidelia

    2014-01-01

    The latest "advances" in information and communication technologies (ICT) have accelerated the virtualization of many sectors of activity. Big Data, cloud computing, Open Data and the participatory web are bringing about major upheavals in science and in society. One effect that raises concern is the growing reliance on algorithms for processing massive data (Big Data) as a way of steering business. The B...

  2. Initial-stage examination of a testbed for the big data transfer over parallel links. The SDN approach

    Science.gov (United States)

    Khoruzhnikov, S. E.; Grudinin, V. A.; Sadov, O. L.; Shevel, A. E.; Titov, V. B.; Kairkanov, A. B.

    2015-04-01

    The transfer of Big Data over a computer network has been an important and unavoidable operation in the past and present, and will remain so in any feasible future. A large variety of astronomical projects produce Big Data. There are a number of methods to transfer the data over a global computer network (Internet) with a range of tools. In this paper we consider the transfer of one piece of Big Data from one point in the Internet to another, in general over a long distance: many thousands of kilometers. Several free-of-charge systems for transferring Big Data are analyzed here. The most important architectural features are emphasized, and the idea of adding the SDN OpenFlow protocol technique for fine-grained tuning of the data transfer process over several parallel data links is discussed.
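
    As a sketch of the core idea named in the abstract, one large object pushed over several parallel links, here is a toy chunked sender in Python. The hosts and ports are hypothetical placeholders; the real testbed's tools and its SDN OpenFlow control plane are not modeled.

    ```python
    # Toy sketch: push one large byte buffer over several parallel TCP links,
    # one chunk per link. A receiver (not shown) would reassemble the chunks
    # using the (index, length) header sent on each connection.
    import socket
    import struct
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical endpoints; one TCP connection per parallel link.
    LINKS = [("198.51.100.10", 9000 + i) for i in range(4)]

    def send_chunk(job):
        (host, port), index, chunk = job
        with socket.create_connection((host, port)) as s:
            s.sendall(struct.pack(">II", index, len(chunk)))  # 8-byte header
            s.sendall(chunk)
        return index, len(chunk)

    def parallel_send(data: bytes, links=LINKS):
        n = len(links)
        size = -(-len(data) // n)            # ceil division: bytes per link
        chunks = [data[i * size:(i + 1) * size] for i in range(n)]
        with ThreadPoolExecutor(max_workers=n) as pool:
            return list(pool.map(send_chunk, zip(links, range(n), chunks)))

    if __name__ == "__main__":
        payload = b"x" * (64 * 1024 * 1024)  # 64 MiB of test data
        for index, nbytes in parallel_send(payload):
            print(f"link {index}: sent {nbytes} bytes")
    ```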

  3. Relações hierárquicas entre os traços amplos do Big Five Hierarchical relationship between the broad traits of the Big Five

    Directory of Open Access Journals (Sweden)

    Cristiano Mauro Assis Gomes

    2012-01-01

    through path analysis: a four-level hierarchical model and a non-hierarchical one. The hierarchical model showed adequate data fit, pointing to its superiority over the non-hierarchical model, which did not. Implications for the Big Five Model are discussed.

  4. Rock Properties Model

    International Nuclear Information System (INIS)

    The purpose of this model report is to document the Rock Properties Model version 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties model provides mean matrix and lithophysae porosity, and the cross-correlated mean bulk density as direct input to the ''Saturated Zone Flow and Transport Model Abstraction'', MDL-NBS-HS-000021, REV 02 (BSC 2004 [DIRS 170042]). The constraints, caveats, and limitations associated with this model are discussed in Section 6.6 and 8.2. Model validation accomplished by corroboration with data not cited as direct input is discussed in Section 7. The revision of this model report was performed as part of activities being conducted under the ''Technical Work Plan for: The Integrated Site Model, Revision 05'' (BSC 2004 [DIRS 169635]). The purpose of this revision is to bring the report up to current procedural requirements and address the Regulatory Integration Team evaluation comments. The work plan describes the scope, objectives, tasks, methodology, and procedures for this process

  5. Overview: Hard Rock Penetration

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, J.C.

    1992-08-01

    The Hard Rock Penetration program is developing technology to reduce the costs of drilling and completing geothermal wells. Current projects include: lost circulation control, rock penetration mechanics, instrumentation, and industry/DOE cost shared projects of the Geothermal Drilling organization. Last year, a number of accomplishments were achieved in each of these areas. A new flow meter being developed to accurately measure drilling fluid outflow was tested extensively during Long Valley drilling. Results show that this meter is rugged, reliable, and can provide useful measurements of small differences in fluid inflow and outflow rates. By providing early indications of fluid gain or loss, improved control of blow-out and lost circulation problems during geothermal drilling can be expected. In the area of downhole tools for lost circulation control, the concept of a downhole injector for injecting a two-component, fast-setting cementitious mud was developed. DOE filed a patent application for this concept during FY 91. The design criteria for a high-temperature potassium, uranium, thorium logging tool featuring a downhole data storage computer were established, and a request for proposals was submitted to tool development companies. The fundamental theory of acoustic telemetry in drill strings was significantly advanced through field experimentation and analysis. A new understanding of energy loss mechanisms was developed.

  6. Overview: Hard Rock Penetration

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, J.C.

    1992-01-01

    The Hard Rock Penetration program is developing technology to reduce the costs of drilling and completing geothermal wells. Current projects include: lost circulation control, rock penetration mechanics, instrumentation, and industry/DOE cost shared projects of the Geothermal Drilling organization. Last year, a number of accomplishments were achieved in each of these areas. A new flow meter being developed to accurately measure drilling fluid outflow was tested extensively during Long Valley drilling. Results show that this meter is rugged, reliable, and can provide useful measurements of small differences in fluid inflow and outflow rates. By providing early indications of fluid gain or loss, improved control of blow-out and lost circulation problems during geothermal drilling can be expected. In the area of downhole tools for lost circulation control, the concept of a downhole injector for injecting a two-component, fast-setting cementitious mud was developed. DOE filed a patent application for this concept during FY 91. The design criteria for a high-temperature potassium, uranium, thorium logging tool featuring a downhole data storage computer were established, and a request for proposals was submitted to tool development companies. The fundamental theory of acoustic telemetry in drill strings was significantly advanced through field experimentation and analysis. A new understanding of energy loss mechanisms was developed.

  7. Overview - Hard Rock Penetration

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, James C.

    1992-03-24

    The Hard Rock Penetration program is developing technology to reduce the costs of drilling and completing geothermal wells. Current projects include: lost circulation control, rock penetration mechanics, instrumentation, and industry/DOE cost shared projects of the Geothermal Drilling Organization. Last year, a number of accomplishments were achieved in each of these areas. A new flow meter being developed to accurately measure drilling fluid outflow was tested extensively during Long Valley drilling. Results show that this meter is rugged, reliable, and can provide useful measurements of small differences in fluid inflow and outflow rates. By providing early indications of fluid gain or loss, improved control of blow-out and lost circulation problems during geothermal drilling can be expected. In the area of downhole tools for lost circulation control, the concept of a downhole injector for injecting a two-component, fast-setting cementitious mud was developed. DOE filed a patent application for this concept during FY 91. The design criteria for a high-temperature potassium, uranium, thorium logging tool featuring a downhole data storage computer were established, and a request for proposals was submitted to tool development companies. The fundamental theory of acoustic telemetry in drill strings was significantly advanced through field experimentation and analysis. A new understanding of energy loss mechanisms was developed.

  8. Rock Properties Model

    Energy Technology Data Exchange (ETDEWEB)

    C. Lum

    2004-09-16

    The purpose of this model report is to document the Rock Properties Model version 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties model provides mean matrix and lithophysae porosity, and the cross-correlated mean bulk density as direct input to the ''Saturated Zone Flow and Transport Model Abstraction'', MDL-NBS-HS-000021, REV 02 (BSC 2004 [DIRS 170042]). The constraints, caveats, and limitations associated with this model are discussed in Section 6.6 and 8.2. Model validation accomplished by corroboration with data not cited as direct input is discussed in Section 7. The revision of this model report was performed as part of activities being conducted under the ''Technical Work Plan for: The Integrated Site Model, Revision 05'' (BSC 2004 [DIRS 169635]). The purpose of this revision is to bring the report up to current procedural requirements and address the Regulatory Integration Team evaluation comments. The work plan describes the scope, objectives, tasks, methodology, and procedures for this process.
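
    To illustrate what "cross-correlated" porosity and bulk-density inputs mean numerically, here is a small sketch of jointly sampling the two properties with a prescribed correlation. The means, spreads and correlation coefficient below are made-up placeholders for illustration, not values from this model report.

    ```python
    # Illustrative only: jointly sampling porosity and bulk density with a
    # prescribed cross-correlation between the two properties.
    import numpy as np

    rng = np.random.default_rng(42)
    mean = np.array([0.12, 2300.0])   # [matrix porosity (-), bulk density (kg/m3)]
    std = np.array([0.03, 150.0])
    rho = -0.8                        # higher porosity tends to mean lower density
    cov = np.outer(std, std) * np.array([[1.0, rho], [rho, 1.0]])

    # Draw correlated (porosity, density) pairs, e.g. one per grid node.
    samples = rng.multivariate_normal(mean, cov, size=1000)
    print(np.corrcoef(samples[:, 0], samples[:, 1])[0, 1])   # close to -0.8
    ```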

  9. A smart rock

    Science.gov (United States)

    Pressel, Phil

    2014-12-01

    This project was to design and build a protective weapon for a group of associations that believed in aliens and UFOs. They collected enough contributions from societies and individuals to be able to sponsor and totally fund the design, fabrication and testing of this equipment. The location of this facility is classified. It was also eventually redesigned by the Quartus Engineering Company for use at a major amusement park as a "shoot at targets" facility. The challenge of this project was to design a "smart rock," namely an infrared bullet (the size of a gallon can of paint) that could be shot from the ground to intercept a UFO or any incoming suspicious item heading towards the earth. Among the challenges in designing this weapon were feeding cryogenic helium at 5 degrees Kelvin from an in-air environment through a unique rotary coupling and air-vacuum seal while spinning the bullet at 1500 rpm, and maintaining its dynamic stability (wobble) about its spin axis to less than 10 micro-radians (2 arc seconds) while it operated in a vacuum. Precision optics monitored the dynamic motion of the "smart rock."

  10. About Big Data and its Challenges and Benefits in Manufacturing

    OpenAIRE

    Bogdan NEDELCU

    2013-01-01

    The aim of this article is to show the importance of Big Data and its growing influence on companies. It also shows what kinds of big data are currently generated and how much big data is estimated to be generated. We can also see how much companies are willing to invest in big data and how much they are currently gaining from it. Also shown are some major influences that big data has on one major industry segment (manufacturing) and the challenges that appear.

  11. Bohmian Quantization of the Big Rip

    CERN Document Server

    Pinto-Neto, Nelson (DOI: 10.1103/PhysRevD.80.083509)

    2009-01-01

    It is shown in this paper that minisuperspace quantization of homogeneous and isotropic geometries with phantom scalar fields, when examined in the light of the Bohm-de Broglie interpretation of quantum mechanics, does not eliminate, in general, the classical big rip singularity present in the classical model. For some values of the Hamilton-Jacobi separation constant present in a class of quantum state solutions of the Wheeler-DeWitt equation, the big rip can be either completely eliminated or may still constitute a future attractor for all expanding solutions. This is contrary to the conclusion presented in Ref.[1], using a different interpretation of the wave function, where the big rip singularity is completely eliminated ("smoothed out") through quantization, independently of such separation constant and for all members of the above mentioned class of solutions. This is an example of the very peculiar situation where different interpretations of the same quantum state of a system are predicting different...

  12. Hot big bang or slow freeze?

    International Nuclear Information System (INIS)

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe
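
    A quick consistency check on the claim that growing masses and a shrinking gravitational constant leave Newtonian attraction unchanged; this is simple algebra under an assumed common scaling, not the paper's derivation.

    ```latex
    % If all particle masses are rescaled by a common factor \lambda(t) while
    % G scales as \lambda^{-2}, the Newtonian force at fixed separation r
    % is invariant:
    F = \frac{G\,m_1 m_2}{r^{2}}
    \;\longrightarrow\;
    \frac{(G/\lambda^{2})(\lambda m_1)(\lambda m_2)}{r^{2}}
    = \frac{G\,m_1 m_2}{r^{2}}
    ```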

  13. One Second After the Big Bang

    CERN Document Server

    CERN. Geneva

    2014-01-01

    A new experiment called PTOLEMY (Princeton Tritium Observatory for Light, Early-Universe, Massive-Neutrino Yield) is under development at the Princeton Plasma Physics Laboratory with the goal of challenging one of the most fundamental predictions of the Big Bang: the present-day existence of relic neutrinos produced less than one second after the Big Bang. Using a gigantic graphene surface to hold 100 grams of a single atomic layer of tritium, low-noise antennas that sense the radio waves of individual electrons undergoing cyclotron motion, and a massive array of cryogenic sensors that sit at the transition between normal and superconducting states, the PTOLEMY project has the potential to test this prediction, to uncover new interactions and properties of the neutrinos, and to search for the existence of a species of light dark matter known as sterile neutrinos.

  14. Big Data Issues: Performance, Scalability, Availability

    Directory of Open Access Journals (Sweden)

    Laura Matei

    2014-03-01

    Full Text Available Nowadays, Big Data is probably one of the most discussed topics not only in the area of data analysis but, I believe, in the whole realm of information technology. Simply typing the words "big data" into an online search engine like Google will retrieve approximately 1,660,000,000 results. With such a buzz gathered around this term, I could not help but wonder what this phenomenon means. The ever greater portion that the combination of the Internet, cloud computing and mobile devices has been occupying in our lives leads to an ever increasing amount of data that must be captured, communicated, aggregated, stored, and analyzed. These sets of data that we are generating are called Big Data.

  15. Rock critics as 'Mouldy Modernists'

    Directory of Open Access Journals (Sweden)

    Becky Shepherd

    2011-09-01

    Full Text Available Contemporary rock criticism appears to be firmly tied to the past. The specialist music press valorise rock music of the 1960s and 1970s, and new emerging artists are championed for their ‘retro’ sounding music by journalists who compare the sound of these new artists with those included in the established ‘canon’ of rock music. This article examines the narrative tropes of authenticity and nostalgia that frame the retrospective focus of this contemporary rock writing, and most significantly, the maintenance of the rock canon within contemporary popular culture. The article concludes by suggesting that while contemporary rock criticism is predominately characterised by nostalgia, this nostalgia is not simply a passive romanticism of the past. Rather, this nostalgia fuels a process of active recontextualisation within contemporary popular culture.

  16. Big Bear Exploration Ltd. 1998 annual report

    International Nuclear Information System (INIS)

    During the first quarter of 1998 Big Bear completed a purchase of additional assets in the Rainbow Lake area of Alberta; this light oil purchase was financed with new equity and bank debt. The business plan was to immediately exploit these light oil assets, the result of which would be increased reserves, production and cash flow. Although drilling results in the first quarter on the Rainbow Lake properties were mixed, oil prices started to free fall and drilling costs were much higher than expected. As a result, the company completed a reduced program, which resulted in less incremental loss and cash flow than it had budgeted for. On April 29, 1998, Big Bear entered into an agreement with Belco Oil and Gas Corp. and Moan Investments Ltd. for the issuance of convertible preferred shares at a gross value of $15,750,000, which shares were eventually converted at 70 cents per share to common equity. As a result of the continued plunge in oil prices, the lending value of the company's assets continued to fall, requiring it to take action in order to meet its financial commitments. Late in the third quarter Big Bear issued equity for proceeds of $11,032,000, which further reduced the company's debt. Although the company was extremely active in identifying and pursuing acquisition opportunities, it became evident that Belco Oil and Gas Corp. and Big Bear did not share common criteria for acquisitions, which resulted in the restructuring of their relationship in the fourth quarter. With the future of oil prices in question, Big Bear decided that it would change its focus to natural gas and would refocus its efforts on acquiring natural gas assets to fuel its growth. The purchase of Blue Range put Big Bear in a difficult position in terms of the latter's growth. In summary, what started as a difficult year ended in disappointment

  17. [Hearing disorders and rock music].

    Science.gov (United States)

    Lindhardt, Bjarne Orskov

    2008-12-15

    Only a few studies have investigated the frequency of hearing disorders in rock musicians. Performing rock music is apparently associated with hearing loss in a fraction of musicians. Tinnitus and hyperacusis are more common among rock musicians than in the background population. It seems as if some sort of resistance against further hearing loss develops over time. The use of ear protection devices has not been studied systematically but appears to be associated with diminished hearing loss. PMID:19128557

  18. Big Bang riddles and their revelations

    OpenAIRE

    Magueijo, Joao; Baskerville, Kim

    1999-01-01

    We describe how cosmology has converged towards a beautiful model of the Universe: the Big Bang Universe. We praise this model, but show there is a dark side to it. This dark side is usually called ``the cosmological problems'': a set of coincidences and fine tuning features required for the Big Bang Universe to be possible. After reviewing these ``riddles'' we show how they have acted as windows into the very early Universe, revealing new physics and new cosmology just as the Universe came i...

  19. SQL Engines for Big Data Analytics

    OpenAIRE

    Xue, Rui

    2015-01-01

    Traditional relational database systems cannot accommodate the need to analyze data of large volume and various formats, i.e., Big Data. Apache Hadoop, as the first generation of open-source Big Data solutions, provided a stable distributed data storage and resource management system. However, as a MapReduce framework, the only channel for utilizing the parallel computing power of Hadoop is its API. Given a problem, one has to code a corresponding MapReduce program in Java, which is time...
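
    To make concrete why "one has to code a corresponding MapReduce program" for every problem, here is the MapReduce programming model reduced to its skeleton in plain Python; this illustrates the model itself, not Hadoop's Java API.

    ```python
    # The MapReduce model in miniature: map emits (key, value) pairs, the
    # framework groups them by key, and reduce folds each group. Word count
    # is the canonical example; Hadoop requires the same stages, written
    # against its Java API, for each such problem.
    from collections import defaultdict

    def map_phase(document):
        for word in document.split():
            yield word.lower(), 1

    def reduce_phase(key, values):
        return key, sum(values)

    def mapreduce(documents):
        groups = defaultdict(list)
        for doc in documents:                      # map + shuffle
            for key, value in map_phase(doc):
                groups[key].append(value)
        return dict(reduce_phase(k, v) for k, v in groups.items())

    print(mapreduce(["big data needs big tools", "sql engines hide this plumbing"]))
    ```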

  20. Kansen voor Big data – WPA Vertrouwen

    OpenAIRE

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for ePrivacy. As such, it is a requirement for realizing economic value of services based on (personal) data. Businesses play a role in guaranteeing data security and privacy of data subjects, but als...

  1. The big head and the long tail

    DEFF Research Database (Denmark)

    Helles, Rasmus

    2013-01-01

    This paper discusses how the advent of big data challenges established theories in Internet studies to redevelop existing explanatory strategies in order to incorporate the possibilities offered by this new empirical resource. The article suggests that established analytical procedures and...... theoretical frameworks used in Internet studies can be fruitfully employed to explain high–level structural phenomena that are only observable through the use of big data. The present article exemplifies this by offering a detailed analysis of how genre analysis of Web sites may be used to shed light on the...

  2. New physics and the new big bang

    International Nuclear Information System (INIS)

    The old concept of the big bang is reviewed, and modifications that have recently occurred in the theory are described. The concept of the false vacuum is explained, and its role in the cosmic inflation scenario is shown. The way inflation solves critical problems of the old big bang scenario is indicated. The potential of supersymmetry and Kaluza-Klein theories for the development of a superunified theory of physical forces is discussed. Superstrings and their possible role in a superunified theory, including their usefulness in solving the problem of infinities, is considered

  3. Effective dynamics of the matrix big bang

    International Nuclear Information System (INIS)

    We study the leading quantum effects in the recently introduced matrix big bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the big bang. Surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics

  4. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    Master the ability to apply big data analytics to massive amounts of structured and unstructured data. Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a...

  5. From big data to smart data

    CERN Document Server

    Iafrate, Fernando

    2015-01-01

    A pragmatic approach to Big Data that takes the reader on a journey from Big Data (what it is) to Smart Data (what it is for). Today's decision making can be reached via information (related to the data), knowledge (related to people and processes), and timing (the capacity to decide, act and react at the right time). The huge increase in the volume of data traffic, and its format (unstructured data such as blogs, logs, and video) generated by the "digitalization" of our world, radically modifies our relationship to space (in motion) and time, dimension and, by capillarity, the enterpr...

  6. Knowledge Management Based on Big Data Processing

    Directory of Open Access Journals (Sweden)

    Li Baoan

    2014-01-01

    At present, many large enterprises, for example in the oil industry, have accumulated over the years large amounts of data holding a range of potentially valuable knowledge from their value activities. How to help them turn these data into wealth is a common problem faced by the IT industry and academia. This study analyzed in depth five key problems of big data processing and knowledge management, and then explained the composition and technical characteristics of a knowledge management system based on big data processing. It explored a new approach to knowledge management that can adapt to the ever-changing demands of enterprises.

  7. The challenges of big band wind instruments through an arranger's eyes

    OpenAIRE

    Soini, Rasmus

    2014-01-01

    In my thesis I examine big band arranging from the perspective of the wind instruments, aiming to identify the factors that make the parts more challenging to play. I also try to offer solutions for how an arranger can take these factors into account when arranging. I have noticed that, without first-hand experience of playing a wind instrument, the challenges involved in playing big band wind parts are very unfamiliar to many. In addition, at the end of my studies I find it worthwhile to compile, with a view to my future work, the essent...

  8. Effective Dynamics of the Matrix Big Bang

    CERN Document Server

    Craps, B; Sethi, S; Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-01-01

    We study the leading quantum effects in the recently introduced Matrix Big Bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the Big Bang. More surprisingly, the potential decays very rapidly at late times, where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics.

  9. Revenue management in the Imatra Big Band Festival restaurant operations

    OpenAIRE

    Lahtinen, Satu

    2015-01-01

    The subject of this thesis is revenue management in the restaurant operations of the Imatra Big Band Festival. The purpose of the work was to produce knowledge and good practices on how the most important methods of revenue management are applied in the festival's restaurant operations: what went well and what could be improved. Revenue management has long been established in the hotel business and among airlines, whereas in the restaurant sector it is not yet in very wide use. Revenue management in the resta...

  10. The big bang cosmology - enigmas and nostrums

    International Nuclear Information System (INIS)

    Some outstanding problems in connection with the big bang cosmology and relativity theory are reviewed under the headings of enigmas, and of nostrums and elixirs: the universe as Phoenix (an oscillating universe); the anthropomorphic universe (the existence of observers in the present universe); reproducing universes (could a mini big bang bounce, perhaps adding entropy and matter, and eventually develop into a suitable home for observers); variable strength of the gravitational interaction; and oscillating universes (possible bounce models that have led eventually to the present hospitable environment). (U.K.)

  11. Dissipative Future Universe without Big Rip

    CERN Document Server

    Yadav, Anil Kumar

    2010-01-01

    The present study deals with a dissipative future universe without big rip in the context of the Eckart formalism. The generalised Chaplygin gas, characterised by the equation of state $p=-\frac{A}{\rho^{1/\alpha}}$, has been considered as a model for dark energy due to its dark-energy-like evolution at late times. It is demonstrated that if the cosmic dark energy behaves like a fluid as well as a Chaplygin gas simultaneously, then the big rip problem does not arise and the scale factor is found to be regular for all time.
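
    As a worked illustration of why this equation of state forbids a big rip, the short LaTeX sketch below integrates the standard continuity equation for the quoted form of the generalised Chaplygin gas; the integration constant $B$ is introduced here for illustration and is not a quantity from the record.

        \dot{\rho} + 3\,\frac{\dot{a}}{a}\,(\rho + p) = 0,
        \qquad p = -\frac{A}{\rho^{1/\alpha}}
        \quad\Longrightarrow\quad
        \rho^{\,1+1/\alpha}(a) = A + \frac{B}{a^{\,3(1+1/\alpha)}}

    As the scale factor $a$ grows, the density tends to the finite constant $A^{\alpha/(\alpha+1)}$ instead of diverging, which is the sense in which the scale factor stays regular and no big rip occurs.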

  12. Ready to Rock and Roll

    Science.gov (United States)

    2004-01-01

    This image from the Mars Exploration Rover Spirit hazard-identification camera shows the rover's perspective just before its first post-egress drive on Mars. On Sunday, the 15th martian day, or sol, of Spirit's journey, engineers drove Spirit approximately 3 meters (10 feet) toward its first rock target, a football-sized, mountain-shaped rock called Adirondack (not pictured). In the foreground of this image are 'Sashimi' and 'Sushi' - two rocks that scientists considered investigating first. Ultimately, these rocks were not chosen because their rough and dusty surfaces are ill-suited for grinding.

  13. Electromagnetic emissions during rock blasting

    Science.gov (United States)

    O'Keefe, S. G.; Thiel, D. V.

    1991-05-01

    Radio emissions during quarry blasting have been recorded in the audio frequency band. Three distinct mechanisms are suggested to explain the observed results: rock fracture at the time of the explosion, charged rocks discharging on impact with the pit floor, and micro-fracture of the remaining rock wall due to pressure adjustment of the bench behind the blast. The last mechanism was evident from a train of discrete impulses recorded for up to one minute after the blast. It is assumed that during this time the rock behind the blast was subjected to a significant change in pressure. This may be related to ELF observations during earthquakes.

  14. Petrology of the igneous rocks

    Science.gov (United States)

    Mccallum, I. S.

    1987-01-01

    Papers published during the 1983-1986 period on the petrology and geochemistry of igneous rocks are discussed, with emphasis on tectonic environment. Consideration is given to oceanic rocks, subdivided into divergent margin suites (mid-ocean ridge basalts, ridge-related seamounts, and back-arc basin basalts) and intraplate suites (oceanic island basalts and nonridge seamounts), and to igneous rocks formed at convergent margins (island arc and continental arc suites), subdivided into volcanic associations and plutonic associations. Other rock groups discussed include continental flood basalts, layered mafic intrusions, continental alkalic associations, komatiites, ophiolites, ash-flow tuffs, anorthosites, and mantle xenoliths.

  15. Big bang nucleosynthesis: The standard model and alternatives

    Science.gov (United States)

    Schramm, David N.

    1991-01-01

    Big bang nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the big bang cosmological model. This paper reviews the standard homogeneous-isotropic calculation and shows how it fits the light element abundances ranging from He-4 at 24% by mass, through H-2 and He-3 at parts in 10^5, down to Li-7 at parts in 10^10. Furthermore, the recent Large Electron-Positron collider (LEP) (and Stanford Linear Collider (SLC)) results on the number of neutrinos are discussed as a positive laboratory test of the standard scenario. Discussion is presented on the improved observational data as well as the improved neutron lifetime data. Alternate scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the conclusion that Ω_b ≈ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible is less than Ω_b.
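
    To make the quoted 24% helium mass fraction concrete, the short LaTeX sketch below gives the standard back-of-the-envelope estimate: if the neutron-to-proton ratio has frozen out at roughly $n/p \approx 1/7$ by the onset of nucleosynthesis, and essentially all neutrons end up bound in He-4, then

        Y_p \;\simeq\; \frac{2\,(n/p)}{1 + (n/p)}
            \;\approx\; \frac{2 \times \tfrac{1}{7}}{1 + \tfrac{1}{7}}
            \;=\; \frac{2}{8} \;=\; 0.25,

    in good agreement with the abundance the review fits.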

  16. Big Data : Paving the Road to Improved Customer Support Efficiency

    Directory of Open Access Journals (Sweden)

    Ajay Parashar

    2016-03-01

    The organizational adage 'customer is king' is not new. With a significant number of organizational resources devoted to understanding the 'king's' needs and responding to them, this phrase, in today's competitive business arena, is an understatement. With the increasing customer touch points and avenues for customers to provide formal/informal feedback, the modern day customer support ecosystem is a complex environment. There is a need to fuse the different components of the support ecosystem to create a coherent system, and the Big Data platform is just the right catalyst that a flat-world organization today needs to re-energize its customer service effort and venture out to capture newer horizons. This white paper looks at the different components that make up the current customer support service environment and the challenges they pose to a uniform integration strategy. Finally, it highlights how Big Data can be leveraged to achieve this strategy.

  17. Big infrastructures effects on local developments

    Directory of Open Access Journals (Sweden)

    Bruna Vendemmia

    2011-10-01

    This research aims to clarify the consequences generated by regional infrastructure strategies for local city growth. Do regional infrastructure strategies activate transformation processes at the local level? And may these processes generate virtuous rules for local development in bottom-up transformations? To answer these questions, in my opinion, the Metropolitan Area of Naples represents an interesting case study. In this area, and due to the lack of institutions, the processes that are the object of this work are clearly visible: a coexistence of "top-down" projects and "bottom-up" transformations is highlighted. In 2010 Naples lies within a huge conurbation: the highway infrastructures reduced distances, increasing the accessibility of the region but without building a clear relation with the surroundings; as a consequence, the city sprawls, disrupting the previous rural structure. At the same time, the industrial areas produced visible fractures in the configuration of the territory. The different technologies produced physical changes in the Metropolitan Area, as well as in citizens' life styles. We are trying to understand, here, the relations between these two dynamics in order to measure the influences and forecast the transformations. An important fact is that nowadays, worldwide, we are witnessing the replacement of the industrial sector with global services and transport; commercial activities are transforming the landscape, finding their location in places that have well defined characteristics: big plots, high visibility, global connectivity and easy accessibility. In Naples they have been established in the same area where agriculture, industries and residential suburbs had already layered. Even so, here, they symbolize territorial references: "land-marks" (Lynch, 2006). New infrastructures have to be built in order to support these renewed uses of the territory. If the city can be described "as points of...

  18. Big data, big consequences? An exploration of privacy and the use of big data in criminal investigation, prosecution and the judiciary

    NARCIS (Netherlands)

    Lodder, A.R.; Meulen, van der N.S.; Wisman, T.H.A.; Meij, Lisette; Zwinkels, C.M.M.

    2014-01-01

    This exploration addresses the privacy aspects of Big Data analysis within the domain of Security and Justice. Applications within the judiciary are discussed, such as the prediction of rulings and use in court cases. With regard to criminal investigation, the discussion covers, among other things, predictive poli...

  19. Rock.XML - Towards a library of rock physics models

    Science.gov (United States)

    Jensen, Erling Hugo; Hauge, Ragnar; Ulvmoen, Marit; Johansen, Tor Arne; Drottning, Åsmund

    2016-08-01

    Rock physics modelling provides tools for correlating the physical properties of rocks and their constituents with the geophysical observations we measure on a larger scale. Many different theoretical and empirical models exist to cover the range of different rock types. However, upon reviewing these, we see that they are all built around a few main concepts. Based on this observation, we propose a format for digitally storing the specifications of rock physics models, which we have named Rock.XML. It contains not only data about the various constituents, but also the theories and how they are used to combine these building blocks into a representative model for a particular rock. The format is based on the Extensible Markup Language (XML), making it flexible enough to handle complex models as well as scalable towards extending it with new theories and models. This technology has great advantages for documenting and exchanging models in an unambiguous way between people and between software. Rock.XML can become a platform for creating a library of rock physics models, making them more accessible to everyone.
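
    The record does not reproduce the actual Rock.XML schema, so the sketch below is only a guess at its flavour: a small Python script that assembles a hypothetical rock description (invented element names, illustrative mineral moduli) with the standard library's ElementTree, separating the constituents from the combining theory as the abstract describes.

        # A minimal sketch of what a Rock.XML document might contain.
        # Element and attribute names are hypothetical; the record does
        # not specify the actual schema.
        import xml.etree.ElementTree as ET

        rock = ET.Element("rock", name="shaly_sandstone")

        # The constituents: building blocks with their elastic moduli.
        constituents = ET.SubElement(rock, "constituents")
        ET.SubElement(constituents, "mineral", name="quartz",
                      bulk_modulus="36.6", shear_modulus="45.0",
                      unit="GPa", fraction="0.8")
        ET.SubElement(constituents, "mineral", name="clay",
                      bulk_modulus="20.9", shear_modulus="6.9",
                      unit="GPa", fraction="0.2")

        # The theory used to combine the constituents into an
        # effective-medium model for this particular rock.
        ET.SubElement(rock, "model", theory="hertz-mindlin",
                      porosity="0.25")

        print(ET.tostring(rock, encoding="unicode"))

    Keeping the constituents and the combining theory in separate elements is what would let one library file serve many rocks, since the same minerals can be recombined under different theories.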

  20. Numerical analysis of tunnel reinforcing influences on failure process of surrounding rock under explosive stress waves

    Institute of Scientific and Technical Information of China (English)

    ZUO Yu-jun; TANG Chun-an; ZHU Wan-cheng; LI Di-yuan; LI Shu-cai

    2008-01-01

    Based on mesoscopic damage mechanics, the numerical code RFPA2D (dynamic edition) was developed to analyze the influence of tunnel reinforcement on the failure process of the surrounding rock under explosive stress waves. The results show that the propagation of stress waves in the rock surrounding the tunnel, and the failure process of that rock under explosive stress waves, are reproduced realistically by the numerical code RFPA2D. The failure process shows that tunnel reinforcement shifts the location at which the surrounding rock fractures, and that the rockfall and collapse caused by failure of the surrounding rock are restrained by the reinforcement. Furthermore, the absolute peak values of the major principal stress, the minimum principal stress and the shear stress at the center point of the tunnel roof are reduced by the reinforcement, and the displacement at the center point of the tunnel roof is reduced as well; consequently, the stability of the tunnel increases.
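
    RFPA2D itself is not specified in the record, but the kind of explicit time-stepping such wave-propagation codes rely on can be illustrated with a minimal 1D finite-difference solver for the elastic wave equation; the material parameters, grid sizes and impulsive source below are arbitrary assumptions for illustration, not values from the paper.

        # Minimal 1D elastic wave propagation sketch (not RFPA2D):
        # u_tt = c^2 u_xx, explicit central differences, with an
        # impulsive "explosive" source near one boundary.
        import numpy as np

        nx, nt = 400, 800          # grid points, time steps (assumed)
        c, dx = 3000.0, 1.0        # wave speed [m/s], spacing [m] (assumed)
        dt = 0.9 * dx / c          # CFL-stable time step
        r2 = (c * dt / dx) ** 2

        u_prev = np.zeros(nx)      # displacement at step n-1
        u_curr = np.zeros(nx)      # displacement at step n
        u_curr[1] = 1.0            # impulsive source near the boundary

        for _ in range(nt):
            u_next = np.zeros(nx)
            # Central-difference update on the interior points.
            u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                            + r2 * (u_curr[2:] - 2 * u_curr[1:-1]
                                    + u_curr[:-2]))
            u_prev, u_curr = u_curr, u_next

        # Stress is proportional to the strain du/dx; damage codes such
        # as RFPA2D additionally degrade element stiffness once a damage
        # threshold is exceeded, which this sketch omits.
        strain = np.gradient(u_curr, dx)
        print("peak |strain|:", np.abs(strain).max())

    The paper's reinforcement effect would enter such a scheme as locally increased stiffness and strength around the tunnel boundary, which is what shifts the fracture location and lowers the peak stresses at the roof.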