WorldWideScience

Sample records for big rock point

  1. 78 FR 58570 - Environmental Assessment; Entergy Nuclear Operations, Inc., Big Rock Point

    Science.gov (United States)

    2013-09-24

    ... Assessment; Entergy Nuclear Operations, Inc., Big Rock Point AGENCY: Nuclear Regulatory Commission. ACTION... applicant or the licensee), for the Big Rock Point (BRP) Independent Spent Fuel Storage Installation (ISFSI... Rock Point (BRP) Independent Spent Fuel Storage Installation (ISFSI). II. Environmental Assessment (EA...

  2. Big Rock Point: 35 years of electrical generation

    International Nuclear Information System (INIS)

    Petrosky, T.D.

    1998-01-01

    On September 27, 1962, the 75 MWe boiling water reactor of the Big Rock Point Nuclear Power Station, designed and built by General Electric, went critical for the first time. The US Atomic Energy Commission (AEC) and the plant operator, Consumers Power, had also designed the plant as a research reactor. The first studies were devoted to fuel behavior, higher burnup, and materials research. The reactor was also used for medical technology: Co-60 radiation sources were produced for the treatment of more than 120,000 cancer patients. After the accident at the Three Mile Island-2 nuclear generating unit in 1979, Big Rock Point went through an extensive backfitting phase. Personnel from numerous other American nuclear power plants were trained at the Big Rock Point simulator. The plant was permanently shut down on August 29, 1997, after more than 35 years of operation and a cumulative electric power production of 13,291 GWh. A period of five to seven years is estimated for decommissioning and demolition work up to the 'green field' stage. (orig.)

  3. Big Rock Point severe accident management strategies

    International Nuclear Information System (INIS)

    Brogan, B.A.; Gabor, J.R.

    1996-01-01

    In December 1994, the Nuclear Energy Institute (NEI) issued guidance on the formal industry position on Severe Accident Management (SAM) approved by the NEI Strategic Issues Advisory Committee on November 4, 1994. This paper summarizes how Big Rock Point (BRP) has addressed, and continues to address, SAM strategies. The historical portion of the presentation describes how the following projects identified and defined the current Big Rock Point SAM strategies: the 1981 Level 3 Probabilistic Risk Assessment; the development of the Plant Specific Technical Guidelines, from which the symptom-oriented Emergency Operating Procedures (EOPs) were developed; the Control Room Design Review; and the recent completion of the Individual Plant Evaluation (IPE). In addition to this historical account, the paper describes the present activities that continue to stress SAM strategies

  4. Risks due to fires at Big Rock Point

    International Nuclear Information System (INIS)

    Brinsfield, W.A.; Blanchard, D.P.

    1983-01-01

    The unique, older design of the Big Rock Point nuclear plant is such that fires contribute significantly to the probability of core damage predicted in the probabilistic risk assessment performed for this plant. The methodology employed to determine this contribution reflects the unique, as-constructed plant design, while systematically and logically addressing the true effect of fires on the operation of the plant and the safety of the public. As a result of the methodology utilized in the PRA, recommendations are made which minimize the risk of core damage due to fires. Included in these recommendations is a proposal for equipment and controls to be included on the Big Rock Point alternate shutdown panel

  5. RETRAN operational transient analysis of the Big Rock Point plant boiling water reactor

    International Nuclear Information System (INIS)

    Sawtelle, G.R.; Atchison, J.D.; Farman, R.F.; VandeWalle, D.J.; Bazydlo, H.G.

    1983-01-01

    Energy Incorporated used the RETRAN computer code to model and calculate nine transients at the Consumers Power Company Big Rock Point Nuclear Power Plant. RETRAN, a best-estimate, one-dimensional, homogeneous-flow, thermal-equilibrium code, is applicable to FSAR Chapter 15 transients for Conditions I through IV. The BWR analyses were performed in accordance with USNRC Standard Review Plan criteria and in response to the USNRC Systematic Evaluation Program. The RETRAN Big Rock Point model was verified by comparison to plant startup test data. This paper discusses the unique modeling techniques used in RETRAN to model this steam-drum-type BWR. Transient analysis results are also presented

  6. Integrated plant-safety assessment, Systematic Evaluation Program: Big Rock Point Plant (Docket No. 50-155)

    International Nuclear Information System (INIS)

    1983-09-01

    The Systematic Evaluation Program was initiated in February 1977 by the US Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  7. Free Release Standards Utilized at Big Rock Point

    International Nuclear Information System (INIS)

    Robert P. Wills

    2000-01-01

    The decommissioning of Consumers Energy's Big Rock Point (BRP) site involves its 75-MW boiling water reactor and all of the associated facilities. Consumers Energy is committed to restoring the site to greenfield conditions. This commitment means that when the decommissioning is complete, all former structures will have been removed, and the site will be available for future use without radiological restrictions. BRP's radiation protection management staff determined that the typical methods used to comply with U.S. Nuclear Regulatory Commission (NRC) regulations for analyzing volumetric material for radionuclides would not fulfill the demands of a facility undergoing decommissioning. The challenge at hand is to comply with regulatory requirements while putting a large-scale bulk release program into production. This report describes Consumers Energy's planned approach to the regulatory aspects of free release

  8. Big Rock Point restoration project: BWR major component removal, packaging and shipping - planning and experience

    International Nuclear Information System (INIS)

    Milner, T.; Dam, S.; Papp, M.; Slade, J.; Slimp, B.; Nurden, P.

    2001-01-01

    The Big Rock Point boiling water reactor (BWR) at Charlevoix, MI was permanently shut down on August 29th 1997. In 1999 BNFL Inc.'s Reactor Decommissioning Group (RDG) was awarded a contract by Consumers Energy (CECo) for the Big Rock Point (BRP) Major Component Removal (MCR) project. BNFL Inc. RDG has teamed with MOTA, Sargent and Lundy and MDM Services to plan and execute MCR in support of the facility restoration project. The facility restoration project will be completed by 2005. Key to the success of the project has been the integration of best available demonstrated technology into a robust and responsive project management approach, which places emphasis on safety and quality assurance in achieving project milestones linked to time and cost. To support decommissioning of the BRP MCR activities, a reactor vessel (RV) shipping container is required. Discussed in this paper is the design and fabrication of a 10 CFR Part 71 Type B container necessary to ship the BRP RV. The container to be used for transportation of the RV to the burial site was designed as an Exclusive Use Type B package for shipment and burial at the Barnwell, South Carolina (SC) disposal facility. (author)

  9. Integrated plant safety assessment. Systematic evaluation program, Big Rock Point Plant (Docket No. 50-155). Final report

    International Nuclear Information System (INIS)

    1984-05-01

    The Systematic Evaluation Program was initiated in February 1977 by the U.S. Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. The review provides (1) an assessment of how these plants compare with current licensing safety requirements relating to selected issues, (2) a basis for deciding how these differences should be resolved in an integrated plant review, and (3) a documented evaluation of plant safety when the supplement to the Final Integrated Plant Safety Assessment Report has been issued. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  10. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

    Exarhos, C.A.; Van Swam, L.F.; Wahlquist, F.P.

    1981-12-01

    This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurements of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics which might restrict the continued irradiation of the fuel

  11. Spatial distribution of radionuclides in Lake Michigan biota near the Big Rock Point Nuclear Plant

    International Nuclear Information System (INIS)

    Wahlgren, M.A.; Yaguchi, E.M.; Nelson, D.M.; Marshall, J.S.

    1974-01-01

    A survey was made of four groups of biota in the vicinity of the Big Rock Point Nuclear Plant near Charlevoix, Michigan, to determine their usefulness in locating possible sources of plutonium and other radionuclides to Lake Michigan. This 70 MW boiling-water reactor, located on the Lake Michigan shoreline, was chosen because its fuel contains recycled plutonium, and because it routinely discharges very low-level radioactive wastes into the lake. Samples of crayfish (Orconectes sp.), green algae (Chara sp. and Cladophora sp.), and an aquatic macrophyte (Potamogeton sp.) were collected in August 1973, at varying distances from the discharge and analyzed for 239,240Pu, 90Sr, and five gamma-emitting radionuclides. Comparison samples of reactor waste solution have also been analyzed for these radionuclides. Comparisons of the spatial distributions of the extremely low radionuclide concentrations in biota clearly indicated that 137Cs, 134Cs, 65Zn, and 60Co were released from the reactor; their concentrations decreased exponentially with increasing distance from the discharge. Conversely, concentrations of 239,240Pu, 95Zr, and 90Sr showed no correlation with distance, suggesting that any input from Big Rock was insignificant relative to the atmospheric origin of these isotopes. The significance of these results is discussed, particularly with respect to current public debate over the possibility of local environmental hazards associated with the use of plutonium as a nuclear fuel. (U.S.)

  12. Technical evaluation of the proposed changes in the technical specifications for emergency power sources for the Big Rock Point nuclear power plant

    International Nuclear Information System (INIS)

    Latorre, V.R.

    1979-12-01

    The technical evaluation is presented for the proposed changes to the Technical Specifications for emergency power sources for the Big Rock Point nuclear power plant. The criteria used to evaluate the acceptability of the changes include those delineated in IEEE Std 308-1974 and IEEE Std 450-1975, as endorsed by US NRC Regulatory Guide 1.129

  13. Application Study of Self-balanced Testing Method on Big Diameter Rock-socketed Piles

    Directory of Open Access Journals (Sweden)

    Qing-biao WANG

    2013-07-01

    Through a field test of the self-balanced testing method on big-diameter rock-socketed piles at the broadcasting centre building of Tai’an, this paper studies and analyzes the main steps of the method: selection of the balance position, fabrication and installation of the load cell, selection and installation of the displacement sensors, the loading steps, stability conditions, and determination of the bearing capacity. The paper summarizes the key technology and engineering experience of the self-balanced testing method for big-diameter rock-socketed piles and also analyzes the difficult technical problems that urgently need to be resolved. The conclusions of the study are of significance for the popularization and application of the self-balanced testing method and for similar projects.

  14. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    Science.gov (United States)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
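
    A minimal sketch of the ingestion pattern the abstract describes: an existing single-CPU point-cloud reader is wrapped in a map over whole files so that each worker decodes its own files. This is an illustration, not the authors' implementation; it assumes PySpark plus the laspy LAS reader installed on every worker, and the input path is hypothetical.

    ```python
    # Sketch: distribute LAS file decoding across a Spark cluster by mapping
    # an existing single-threaded reader (laspy, assumed installed on all
    # workers) over whole files. The HDFS path below is hypothetical.
    import io

    import laspy  # conventional LAS/LAZ reader, single-CPU per file
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("las-ingest").getOrCreate()
    sc = spark.sparkContext

    def read_las(path_and_bytes):
        """Decode one whole LAS file into (x, y, z) tuples on a worker."""
        _path, data = path_and_bytes
        las = laspy.read(io.BytesIO(bytes(data)))
        return list(zip(las.x, las.y, las.z))

    # binaryFiles ships each file to one worker; flatMap fans out the points.
    points = sc.binaryFiles("hdfs:///pointclouds/*.las").flatMap(read_las)
    print(points.count())
    ```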

  15. SIDELOADING – INGESTION OF LARGE POINT CLOUDS INTO THE APACHE SPARK BIG DATA ENGINE

    Directory of Open Access Journals (Sweden)

    J. Boehm

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.

  16. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    Science.gov (United States)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest neighbour algorithm
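
    The Morton-code construction in the abstract can be made concrete in a few lines. A minimal sketch; the bit order (the y bit ahead of the x bit at each level) is chosen so that it reproduces the paper's own example.

    ```python
    # Sketch: 2-D Morton (Z-order) code by interleaving the binary digits of
    # each dimension, most significant bits first.
    def morton2d(x: int, y: int, bits: int) -> int:
        code = 0
        for i in reversed(range(bits)):
            code = (code << 1) | ((y >> i) & 1)  # y bit first
            code = (code << 1) | ((x >> i) & 1)  # then x bit
        return code

    # The paper's example: (x = 1 = 01₂, y = 3 = 11₂) interleaves to 1011₂ = 11.
    assert morton2d(1, 3, bits=2) == 0b1011 == 11
    # Fewer bits per dimension -> coarser grid -> more points per partition.
    ```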

  17. PARALLEL PROCESSING OF BIG POINT CLOUDS USING Z-ORDER-BASED PARTITIONING

    Directory of Open Access Journals (Sweden)

    C. Alis

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest

  18. Recent advances in analysis and prediction of Rock Falls, Rock Slides, and Rock Avalanches using 3D point clouds

    Science.gov (United States)

    Abellan, A.; Carrea, D.; Jaboyedoff, M.; Riquelme, A.; Tomas, R.; Royan, M. J.; Vilaplana, J. M.; Gauvin, N.

    2014-12-01

    The acquisition of dense terrain information using well-established 3D techniques (e.g. LiDAR, photogrammetry) and the use of new mobile platforms (e.g. Unmanned Aerial Vehicles), together with increasingly efficient post-processing workflows for image treatment (e.g. Structure from Motion), are opening up new possibilities for analysing, modelling and predicting rock slope failures. Applications range from the monitoring of small changes at an unprecedented level of detail (e.g. sub-millimetre-scale deformation under lab-scale conditions) to the detection of slope deformation at regional scale. In this communication we show the main accomplishments of the Swiss National Foundation project "Characterizing and analysing 3D temporal slope evolution" carried out at the Risk Analysis group (Univ. of Lausanne) in close collaboration with the RISKNAT and INTERES groups (Univ. of Barcelona and Univ. of Alicante, respectively). We have recently developed a series of innovative approaches for rock slope analysis using 3D point clouds, including semi-automatic methodologies for the identification and extraction of rock-slope features such as discontinuities, type of material, rockfall occurrence and deformation. Moreover, we have improved our knowledge of progressive rupture characterization thanks to several algorithms, including the computation of 3D deformation, the use of filtering techniques on permanently based TLS, the use of rock slope failure analogies at different scales (laboratory simulations, monitoring at glacier fronts, etc.), and the modelling of the influence of external forces, such as precipitation, on the acceleration of the deformation rate. We have also been interested in the analysis of rock slope deformation prior to the occurrence of fragmental rockfalls and the interaction of this deformation with the spatial location of future events. In spite of these recent advances

  19. Supergene destruction of a hydrothermal replacement alunite deposit at Big Rock Candy Mountain, Utah: Mineralogy, spectroscopic remote sensing, stable-isotope, and argon-age evidences

    Science.gov (United States)

    Cunningham, Charles G.; Rye, Robert O.; Rockwell, Barnaby W.; Kunk, Michael J.; Councell, Terry B.

    2005-01-01

    Big Rock Candy Mountain is a prominent center of variegated altered volcanic rocks in west-central Utah. It consists of the eroded remnants of a hypogene alunite deposit that, at ∼21 Ma, replaced intermediate-composition lava flows. The alunite formed in steam-heated conditions above the upwelling limb of a convection cell that was one of at least six spaced at 3- to 4-km intervals around the margin of a monzonite stock. Big Rock Candy Mountain is horizontally zoned outward from an alunite core to respective kaolinite, dickite, and propylite envelopes. The altered rocks are also vertically zoned from a lower pyrite–propylite assemblage upward through assemblages successively dominated by hypogene alunite, jarosite, and hematite, to a flooded silica cap. This hydrothermal assemblage is undergoing natural destruction in a steep canyon downcut by the Sevier River in Marysvale Canyon. Integrated geological, mineralogical, spectroscopic remote sensing using AVIRIS data, Ar radiometric, and stable isotopic studies trace the hypogene origin and supergene destruction of the deposit and permit distinction of primary (hydrothermal) and secondary (weathering) processes. This destruction has led to the formation of widespread supergene gypsum in cross-cutting fractures and as surficial crusts, and to natrojarosite, which gives the mountain its buff coloration along ridges facing the canyon. A small spring, Lemonade Spring, with a pH of 2.6 and containing Ca, Mg, Si, Al, Fe, Mn, Cl, and SO4, also occurs near the bottom of the canyon. The 40Ar/39Ar age (21.32±0.07 Ma) of the alunite is similar to that for other replacement alunites at Marysvale. However, the age spectrum contains evidence of a 6.6-Ma thermal event that can be related to the tectonic activity responsible for the uplift that led to the downcutting of Big Rock Candy Mountain by the Sevier River. This ∼6.6 Ma event is also present in the age spectrum of supergene natrojarosite forming today, and probably

  20. When Big Ice Turns Into Water It Matters For Houses, Stores And Schools All Over

    Science.gov (United States)

    Bell, R. E.

    2017-12-01

    When ice in my glass turns to water it is not bad but when the big ice at the top and bottom of the world turns into water it is not good. This new water makes many houses, stores and schools wet. It is really bad during when the wind is strong and the rain is hard. New old ice water gets all over the place. We can not get to work or school or home. We go to the big ice at the top and bottom of the world to see if it will turn to water soon and make more houses wet. We fly over the big ice to see how it is doing. Most of the big ice sits on rock. Around the edge of the big sitting on rock ice, is really low ice that rides on top of the water. This really low ice slows down the big rock ice turning into water. If the really low ice cracks up and turns into little pieces of ice, the big rock ice will make more houses wet. We look to see if there is new water in the cracks. Water in the cracks is bad as it hurts the big rock ice. Water in the cracks on the really low ice will turn the low ice into many little pieces of ice. Then the big rock ice will turn to water. That is water in cracks is bad for the houses, schools and businesses. If water moves off the really low ice, it does not stay in the cracks. This is better for the really low ice. This is better for the big rock ice. We took pictures of the really low ice and saw water leaving. The water was not staying in the cracks. Water leaving the really low ice might be good for houses, schools and stores.

  1. Automatic extraction of discontinuity orientation from rock mass surface 3D point cloud

    Science.gov (United States)

    Chen, Jianqin; Zhu, Hehua; Li, Xiaojun

    2016-10-01

    This paper presents a new method for extracting discontinuity orientation automatically from rock mass surface 3D point cloud. The proposed method consists of four steps: (1) automatic grouping of discontinuity sets using an improved K-means clustering method, (2) discontinuity segmentation and optimization, (3) discontinuity plane fitting using Random Sample Consensus (RANSAC) method, and (4) coordinate transformation of discontinuity plane. The method is first validated by the point cloud of a small piece of a rock slope acquired by photogrammetry. The extracted discontinuity orientations are compared with measured ones in the field. Then it is applied to a publicly available LiDAR data of a road cut rock slope at Rockbench repository. The extracted discontinuity orientations are compared with the method proposed by Riquelme et al. (2014). The results show that the presented method is reliable and of high accuracy, and can meet the engineering needs.
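
    Steps (3) and (4) of the pipeline, RANSAC plane fitting and converting the fitted plane to a discontinuity orientation, can be sketched as follows. This is a numpy-only illustration under the usual geological conventions (z up, y north), not the authors' code; the distance threshold and iteration count are illustrative.

    ```python
    # Sketch: fit a plane to a segmented discontinuity patch with RANSAC,
    # then express its unit normal as dip / dip direction.
    import numpy as np

    def ransac_plane_normal(pts, n_iter=500, tol=0.01, seed=0):
        """Unit normal of the best-fitting plane of an (N, 3) point patch."""
        rng = np.random.default_rng(seed)
        best_n, best_count = None, -1
        for _ in range(n_iter):
            p0, p1, p2 = pts[rng.choice(len(pts), size=3, replace=False)]
            n = np.cross(p1 - p0, p2 - p0)
            norm = np.linalg.norm(n)
            if norm < 1e-12:          # degenerate (collinear) sample
                continue
            n /= norm
            count = np.sum(np.abs((pts - p0) @ n) < tol)  # inlier count
            if count > best_count:
                best_n, best_count = n, count
        return best_n

    def dip_and_dip_direction(n):
        """Orientation (degrees) of a plane with unit normal n; z up, y north."""
        n = n if n[2] >= 0 else -n                    # upward-pointing normal
        dip = np.degrees(np.arccos(n[2]))             # angle from horizontal
        dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0
        return dip, dip_dir
    ```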

  2. Kimberley rock art dating project

    International Nuclear Information System (INIS)

    Walsh, G.L.; Morwood, M.

    1997-01-01

    The art's additional value, unequalled by traditionally recognised artefacts, is its permanent pictorial documentation, presenting a 'window' into the otherwise intangible elements of the perceptions, vision and mind of prehistoric cultures. Unfortunately its potential in establishing the Kimberley archaeological 'big picture' still remains largely unrecognised. Some of the findings of the Kimberley Rock Art Dating Project, using AMS and optically stimulated luminescence (OSL) dating techniques, are outlined. It is estimated that these findings will encourage involvement by a greater diversity of specialist disciplines, tying their findings into levels of this art sequence as a primary reference point. The sequence represents a sound basis for selecting specific defined images for targeting detailed studies by a range of dating techniques. This effectively removes the undesirable ad hoc sampling of 'apparently old paintings', a process which must otherwise unavoidably remain the case with researchers working on most global bodies of rock art

  3. Properties of uranium and thorium in host rocks of multi-metal (Ag, Pb, U, Cu, Bi, Zn, F) Big Kanimansur deposit (Tajikistan)

    International Nuclear Information System (INIS)

    Fayziev, A.R.

    2007-01-01

    Host rocks of the multi-metal Big Kanimansur deposit contain average uranium and thorium contents that exceed clarke values by factors of 7 and 2.5, respectively. A second property of the radioactive element distribution is a low thorium-to-uranium ratio. These criteria can be used as prospecting signs for the flanks and depths of known ore fields, as well as for new areas of multi-metal mineralisation

  4. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  5. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and
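
    The opening step of such a tool, characterizing local geometry with K-Nearest Neighbor and Principal Component Analysis, typically amounts to estimating a normal for every point from its neighbourhood; coplanar regions are then groups of points whose normals agree. A minimal numpy/scipy sketch of that step alone, not the tool itself; the neighbourhood size k is illustrative.

    ```python
    # Sketch: per-point normal estimation by PCA over k nearest neighbours.
    import numpy as np
    from scipy.spatial import cKDTree

    def local_normals(pts, k=20):
        """Normals of an (N, 3) cloud from the smallest PCA eigenvector."""
        tree = cKDTree(pts)
        _dists, idx = tree.query(pts, k=k)
        normals = np.empty_like(pts)
        for i, nbrs in enumerate(idx):
            centered = pts[nbrs] - pts[nbrs].mean(axis=0)
            # eigenvector with the smallest eigenvalue of the covariance
            _w, v = np.linalg.eigh(centered.T @ centered)
            normals[i] = v[:, 0]
        return normals
    ```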

  6. Big Bang Day : Physics Rocks

    CERN Multimedia

    Brian Cox; John Barrowman; Eddie Izzard

    2008-01-01

    Is particle physics the new rock 'n' roll? The fundamental questions about the nature of the universe that particle physics hopes to answer have attracted the attention of some very high profile and unusual fans. Alan Alda, Ben Miller, Eddie Izzard, Dara O'Briain and John Barrowman all have interests in this branch of physics. Brian Cox - CERN physicist, and former member of 90's band D:Ream, tracks down some very well known celebrity enthusiasts and takes a light-hearted look at why this subject can appeal to all of us.

  7. Still Bay Point-Production Strategies at Hollow Rock Shelter and Umhlatuzana Rock Shelter and Knowledge-Transfer Systems in Southern Africa at about 80-70 Thousand Years Ago

    Science.gov (United States)

    Lombard, Marlize

    2016-01-01

    It has been suggested that technological variations associated with Still Bay assemblages of southern Africa have not been addressed adequately. Here we present a study developed to explore regional and temporal variations in Still Bay point-production strategies. We applied our approach in a regional context to compare the Still Bay point assemblages from Hollow Rock Shelter (Western Cape) and Umhlatuzana Rock Shelter (KwaZulu-Natal). Our interpretation of the point-production strategies implies inter-regional point-production conventions, but also highlights variability and intra-regional knapping strategies used for the production of Still Bay points. These strategies probably reflect flexibility in the organisation of knowledge-transfer systems at work during the later stages of the Middle Stone Age between about 80 ka and 70 ka in South Africa. PMID:27942012

  8. CERN’s Summer of Rock

    CERN Multimedia

    Katarina Anthony

    2015-01-01

    When a rock star visits CERN, they don’t just bring their entourage with them. Along for the ride are legions of fans across the world – many of whom may not be the typical CERN audience. In July alone, four big acts paid CERN a visit, sharing their experience with the world: Scorpions, The Script, Kings of Leon and Patti Smith.   @TheScript tweeted: #paleofestival we had the best time! Big love. #CERN (Image: Twitter).   It all started with the Scorpions, the classic rock band whose “Wind of Change” became an anthem in the early 1990s. On 19 July, the band braved the 35-degree heat to tour the CERN site on foot – visiting the Synchrocyclotron and the new Microcosm exhibition. The rockers were very enthusiastic about the research carried out at CERN, and talked about returning in the autumn during their next tour stop. The Scorpions visit Microcosm. Two days later, The Script rolled in. This Irish pop-rock band has been hittin...

  9. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  10. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that had been unreached before. Big data is generally characterized by factors such as volume, velocity and variety, three features that distinguish it from traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  11. Favorability for uranium in tertiary sedimentary rocks, southwestern Montana

    International Nuclear Information System (INIS)

    Wopat, M.A.; Curry, W.E.; Robins, J.W.; Marjaniemi, D.K.

    1977-10-01

    Tertiary sedimentary rocks in the basins of southwestern Montana were studied to determine their favorability for potential uranium resources. Uranium in the Tertiary sedimentary rocks was probably derived from the Boulder batholith and from silicic volcanic material. The batholith contains numerous uranium occurrences and is the most favorable plutonic source for uranium in the study area. Subjective favorability categories of good, moderate, and poor, based on the number and type of favorable criteria present, were used to classify the rock sequences studied. Rocks judged to have good favorability for uranium deposits are (1) Eocene and Oligocene strata and undifferentiated Tertiary rocks in the western Three Forks basin and (2) Oligocene rocks in the Helena basin. Rocks having moderate favorability consist of (1) Eocene and Oligocene strata in the Jefferson River, Beaverhead River, and lower Ruby River basins, (2) Oligocene rocks in the Townsend and Clarkston basins, (3) Miocene and Pliocene rocks in the Upper Ruby River basin, and (4) all Tertiary sedimentary formations in the eastern Three Forks basin, and in the Grasshopper Creek, Horse Prairie, Medicine Lodge Creek, Big Sheep Creek, Deer Lodge, Big Hole River, and Bull Creek basins. The following have poor favorability: (1) the Beaverhead Conglomerate in the Red Rock and Centennial basins, (2) Eocene and Oligocene rocks in the Upper Ruby River basin, (3) Miocene and Pliocene rocks in the Townsend, Clarkston, Smith River, and Divide Creek basins, (4) Miocene through Pleistocene rocks in the Jefferson River, Beaverhead River, and Lower Ruby River basins, and (5) all Tertiary sedimentary rocks in the Boulder River, Sage Creek, Muddy Creek, Madison River, Flint Creek, Gold Creek, and Bitterroot basins

  12. CRED REA Fish Team Stationary Point Count Surveys at Kaula Rock, Main Hawaiian Islands, 2006

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Stationary Point Counts at 4 stations at each survey site were surveyed as part of Rapid Ecological Assessments (REA) conducted at 2 sites around Kaula Rock in the...

  13. Brit Crit: Turning Points in British Rock Criticism 1960-1990

    DEFF Research Database (Denmark)

    Gudmundsson, Gestur; Lindberg, U.; Michelsen, M.

    2002-01-01

    The article examines the development of rock criticism in the United Kingdom from the perspective of a Bourdieuan field-analysis. Early British rock critics, like Nik Cohn, were international pioneers, a few years later there was a strong American influence, but British rock criticism has always had national specific traits and there have been more profound paradigm shifts than in American rock criticism. This is primarily explained by the fact that American rock criticism is more strongly connected to general cultural history, while the UK rock criticism has been more alienated from dominant culture and more linked to youth culture. However, also in the UK rock criticism has been part and parcel of the legitimation of rock culture and has moved closer to dominant fields and positions in the cultural hierarchy.

  14. Sense Things in the Big Deep Water Bring the Big Deep Water to Computers so People can understand the Deep Water all the Time without getting wet

    Science.gov (United States)

    Pelz, M.; Heesemann, M.; Scherwath, M.; Owens, D.; Hoeberechts, M.; Moran, K.

    2015-12-01

    Senses help us learn stuff about the world. We put sense things in, over, and under the water to help people understand water, ice, rocks, life and changes over time out there in the big water. Sense things are like our eyes and ears. We can use them to look up and down, right and left all of the time. We can also use them on top of or near the water to see wind and waves. As the water gets deep, we can use our sense things to see many a layer of different water that make up the big water. On the big water we watch ice grow and then go away again. We think our sense things will help us know if this is different from normal, because it could be bad for people soon if it is not normal. Our sense things let us hear big water animals talking low (but sometimes high). We can also see animals that live at the bottom of the big water and we take lots of pictures of them. Lots of the animals we see are soft and small or hard and small, but sometimes the really big ones are seen too. We also use our sense things on the bottom and sometimes feel the ground shaking. Sometimes, we get little pockets of bad smelling air going up, too. In other areas of the bottom, we feel hot hot water coming out of the rock making new rocks and we watch some animals even make houses and food out of the hot hot water that turns to rock as it cools. To take care of the sense things we use and control water cars and smaller water cars that can dive deep in the water away from the bigger water car. We like to put new things in the water and take things out of the water that need to be fixed at least once a year. Sense things are very cool because you can use the sense things with your computer too. We share everything for free on our computers, which your computer talks to and gets pictures and sounds for you. Sharing the facts from the sense things is the best part about having the sense things because we can get many new ideas about understanding the big water from anyone with a computer!

  15. Balancing on the Edge: An Approach to Leadership and Resiliency that Combines Rock Climbing with Four Key Touch Points

    Science.gov (United States)

    Winkler, Harold E.

    2005-01-01

    In this article, the author compares leadership and resiliency with rock climbing. It describes the author's personal experience on a rock climbing adventure with his family and how it required application of similar elements as that of leadership and resiliency. The article contains the following sections: (1) Being Resilient; (2) Points of…

  16. Attempt of groundwater dating using the drilled rock core. 1. Development of the rock sampling method for measurement of noble gases dissolved in interstitial water in rock

    International Nuclear Information System (INIS)

    Mahara, Yasunori

    2002-01-01

    Groundwater dating in rock of low permeability is very difficult and often impracticable, because collecting a groundwater sample from a borehole takes a very long time and requires a large investment in building and operating an in-situ groundwater sampler. If the noble gases dissolved in the interstitial groundwater of a rock core can be measured directly, groundwater residence time can be estimated far more easily. In this study, we designed and produced a high-vacuum container in which the dissolved noble gases diffuse until equilibrium is reached, and we prepared a procedure for loading the rock core into the container and for evacuating air from the sealed container. We compared noble gas concentrations obtained from rock cores with those from groundwater samples collected in situ from boreholes. The rocks measured were pumice tuff, mudstone, and hornfels, with permeabilities of 10⁻⁶ cm/s, 10⁻⁹ cm/s, and 10⁻¹¹ cm/s, respectively. Consequently, we conclude that the rock-core method is better suited than in-situ groundwater sampling to rock of low permeability. (author)

  17. Evaluation of Rock Bolt Support for Polish Hard Rock Mines

    Science.gov (United States)

    Skrzypkowski, Krzysztof

    2018-03-01

    The article presents different types of rock bolt support used in Polish ore mining. Individual point resin and expansion rock bolt supports are characterized, and the roof classes for the zinc and lead and the copper ore mines are presented. Furthermore, laboratory tests of point resin rock bolt support at a geometric scale of 1:1, with a minimal fixing length of 0.6 m, are described. Static testing of point resin rock bolt support was carried out at a laboratory test facility of the Department of Underground Mining, which simulates mine conditions for Polish ore and hard coal mining. The laboratory tests of point resin bolts were carried out especially for the ZGH Bolesław zinc and lead "Olkusz - Pomorzany" mine. The primary aim of the research was to check whether, at an anchoring length of 0.6 m achieved with one and a half resin cartridges, the "Olkusz - 20A" type bolt is able to carry the load. The second purpose of the study was to obtain the load-displacement characteristic and to determine the elastic and plastic range of the bolt. For the best simulation of mine conditions, the test station used steel cylinders with an external diameter of 0.1 m and a length of 0.6 m, with a core of rock from the roof of the underground excavations.

  18. Water resources in the Big Lost River Basin, south-central Idaho

    Science.gov (United States)

    Crosthwaite, E.G.; Thomas, C.A.; Dyer, K.L.

    1970-01-01

    The Big Lost River basin occupies about 1,400 square miles in south-central Idaho and drains to the Snake River Plain. The economy in the area is based on irrigation agriculture and stockraising. The basin is underlain by a diverse assemblage of rocks which range in age from Precambrian to Holocene. The assemblage is divided into five groups on the basis of their hydrologic characteristics: carbonate rocks, noncarbonate rocks, cemented alluvial deposits, unconsolidated alluvial deposits, and basalt. The principal aquifer is unconsolidated alluvial fill that is several thousand feet thick in the main valley. The carbonate rocks are the major bedrock aquifer. They absorb a significant amount of precipitation and, in places, are very permeable, as evidenced by large springs discharging from or near exposures of carbonate rocks. Only the alluvium, carbonate rock and, locally, the basalt yield significant amounts of water. A total of about 67,000 acres is irrigated with water diverted from the Big Lost River. The annual flow of the river is highly variable and water-supply deficiencies are common. About 1 out of every 2 years is considered a drought year. In the period 1955-68, about 175 irrigation wells were drilled to provide a supplemental water supply to land irrigated from the canal system and to irrigate an additional 8,500 acres of new land. Average annual precipitation ranged from 8 inches on the valley floor to about 50 inches at some higher elevations during the base period 1944-68. The estimated water yield of the Big Lost River basin averaged 650 cfs (cubic feet per second) for the base period. Of this amount, 150 cfs was transpired by crops, 75 cfs left the basin as streamflow, and 425 cfs left as ground-water flow. A map of precipitation and estimated values of evapotranspiration were used to construct a water-yield map. A distinctive feature of the Big Lost River basin is the large interchange of water from surface streams into the ground and from the

  19. Source rock potential of middle Cretaceous rocks in southwestern Montana

    Science.gov (United States)

    Dyman, T.S.; Palacas, J.G.; Tysdal, R.G.; Perry, W.J.; Pawlewicz, M.J.

    1996-01-01

    The middle Cretaceous in southwestern Montana is composed of a marine and nonmarine succession of predominantly clastic rocks that were deposited along the western margin of the Western Interior Seaway. In places, middle Cretaceous rocks contain appreciable total organic carbon (TOC), such as 5.59% for the Mowry Shale and 8.11% for the Frontier Formation in the Madison Range. Most samples, however, exhibit less than 1.0% TOC. The genetic or hydrocarbon potential (S1+S2) of all the samples analyzed, except one, yields less than 1 mg HC/g rock, strongly indicating poor potential for generating commercial amounts of hydrocarbons. Out of 51 samples analyzed, only one (a Thermopolis Shale sample from the Snowcrest Range) showed a moderate petroleum potential of 3.1 mg HC/g rock. Most of the middle Cretaceous samples are thermally immature to marginally mature, with vitrinite reflectance ranging from about 0.4 to 0.6% Ro. Maturity is high in the Pioneer Mountains, where vitrinite reflectance averages 3.4% Ro, and at Big Sky, Montana, where vitrinite reflectance averages 2.5% Ro. At both localities, high Ro values are due to local heat sources, such as the Pioneer batholith in the Pioneer Mountains.

  20. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. That value is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner, using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and its architecture, as well as its impact on processes worldwide.

  1. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  2. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  3. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  4. How did the ball of rock we live on get so nice?

    Science.gov (United States)

    Armstrong, K.

    2017-12-01

    We want to understand how the big ball of rock, water, and air we live on got to be as nice as it is. Some of the things that made our world so nice are what it's made of, and what happened to those things after they were all put together. When our home ball of rock was little, it got bigger by grabbing some of the other stuff that was close by in space. So our world is made of all that space stuff, which was rocks, and that stuff that turns red if you let it get wet. Most of that red stuff is deep down in the middle of the world, covered up by a lot of rocks. But there is a lot of it inside the rocks, too. That red stuff does a lot of cool stuff when it touches other things, especially the kind of air we like to breathe- that is what makes it turn red. Just one very tiny piece of the red stuff can join up with the breathing air in different ways: it can team up with none at all, or a little bit, or a lot, or even more. If we look at how much of the breathing air is teamed up with the red stuff inside the rocks, we can learn about how the rocks got there and what happened to them a long time ago. In the rocks that we can look at, there is more of the breathing air teamed up with the red stuff than we might have thought. We think that maybe that is because when more and more stuff tries to fit in the same small space, and gets pressed down, like it does deep down in our world, it can change in ways we do not expect. When our world was almost as big as it is now, it probably grabbed some space stuff that was almost as big as it was. The space stuff would have run into our world really hard, and that would have made everything really hot, hot enough to make all the rocks in the world move like water. We have an idea that maybe, when all the rocks on the world were really hot and moved around like water, deep down the red stuff and the breathing air might have got all pressed together very hard, so that more of the breathing air would team up with the red stuff than it

  5. What Happens Where the Water and the Rock Touch in Small Space Bodies

    Science.gov (United States)

    Byrne, P. K.; Regensburger, P. V.; Klimczak, C.; Bohnenstiehl, D. R.; Dombard, A. J.; Hauck, S. A., II

    2017-12-01

    There are several small space bodies that go around bigger worlds that might have a layer of water under a layer of ice. Lots of study has been done to understand the outside ice layer of these small space bodies, because the ice can tells us important things about the big water layer under it. Some of these small space bodies are very interesting because the right things for life—water, hot rock, and food—might be at the bottom of the water layer, where it touches the top of the next layer down, which is made of rock. But it is very hard to understand what this rock at the bottom of the water is like, because we can't see it. So, we are imagining what this rock is like by thinking about what the rock is like under the water layer on our own world. If hot rock comes out of the rock layer through cracks under the water, the cold of the water makes the hot rock go very cold very fast, and it makes funny rolls as it does so. This might happen on some small space bodies that are hot enough on the inside to make hot rock. We know that on our own world the rock layer under the water is wet to as far down as cracks can go, so it makes sense that this is true for small space bodies, too. We did some thinking about numbers and found out that the cracks can go a few ten hundred steps into the rock layer on small space bodies, but for bigger (well, not quite so small) space bodies, the cracks can go at least tens of ten hundred steps into the rock layer. This means that water goes into the rock layer this much, too. But get this: some small bodies are not really that small—one of them is bigger than the first world from the Sun! And on a few of these big (small) bodies, the layer of water is so heavy that the bottom of that water is pushed together from all sides and turns into a type of hot ice. This means that, for these big (small) worlds, the water can't get into the rock layer through cracks (since there is a layer of hot ice in the way), and so these bodies are

  6. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. Violated assumptions can lead to wrong statistical inferences and, consequently, wrong scientific conclusions.
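
    The spurious-correlation challenge named above is easy to demonstrate numerically: when the number of features greatly exceeds the sample size, some features correlate strongly with a pure-noise response by chance alone. A minimal sketch with simulated data (illustrative only, not from the paper):

```python
import numpy as np

# Spurious correlation in high dimensions: with many random features,
# some correlate strongly with a pure-noise response by chance alone.
rng = np.random.default_rng(4)
n, p = 50, 5000                       # small sample, huge dimensionality
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)            # independent of every column of X

# Column-wise Pearson correlations via standardized variables.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
yc = (y - y.mean()) / y.std()
corr = Xc.T @ yc / n
print(f"max |corr| with pure noise: {np.abs(corr).max():.2f}")   # ~0.5
```

    With 50 samples and 5,000 independent noise features, the largest absolute correlation typically lands near 0.5, which is large enough to mislead naive variable selection.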

  7. An Approach for Automatic Orientation of Big Point Clouds from the Stationary Scanners Based on the Spherical Targets

    Directory of Open Access Journals (Sweden)

    YAO Jili

    2015-04-01

    Full Text Available Terrestrial laser scanning (TLS) technology offers fast data acquisition, very large point clouds, and long measuring distances. However, it also has disadvantages, such as limited range for target detection, lag in point cloud processing, low automation, and poor suitability for long-distance topographic surveys. To address this, we put forward a method for detecting long-range targets when orienting big point clouds. The method first searches for the point cloud rings that contain targets, based on the engineering coordinate system. The detected rings are then divided into sectors so that targets can be detected in a very short time and their central coordinates obtained. Finally, the position and orientation parameters of the scanner are calculated, and the point clouds are converted from the scanner's own coordinate system (SOCS) into the engineering coordinate system. The method can run on ordinary computers for long-distance topographic surveys (with scanner-to-target distances ranging from 180 to 700 m) in mountainous areas, using targets with a radius of 0.162 m.
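
    The closing step of such a workflow, solving for the scanner's position and orientation from matched target centres and re-projecting the cloud, is a classical absolute-orientation problem. Below is a minimal sketch using the SVD-based Kabsch/Horn solution on hypothetical sphere-target centres; it illustrates the transformation step, not the authors' exact algorithm.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~ R @ src + t.

    src, dst: (N, 3) arrays of matched target centres (N >= 3), e.g.
    src in the scanner's own coordinate system (SOCS) and dst in the
    engineering coordinate system. Kabsch/Horn SVD solution.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical matched sphere-target centres (metres), SOCS side.
src = np.array([[10.2, 3.1, 1.5], [250.0, 40.7, 2.1],
                [480.3, -12.4, 5.8], [610.9, 88.0, 3.3]])
a = np.deg2rad(30.0)                             # synthetic ground truth
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([1000.0, 2000.0, 50.0])
dst = src @ R_true.T + t_true

R, t = rigid_transform(src, dst)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
# The whole cloud is then re-projected the same way:
# cloud_eng = cloud_socs @ R.T + t
```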

  8. Rock History and Culture

    OpenAIRE

    Gonzalez, Éric

    2013-01-01

    Two ambitious works written by French-speaking scholars tackle rock music as a research object, from different but complementary perspectives. Both are a definite must-read for anyone interested in the contextualisation of rock music in western popular culture. In Une histoire musicale du rock (i.e. A Musical History of Rock), rock music is approached from the point of view of the people – musicians and industry – behind the music. Christophe Pirenne endeavours to examine that field from a m...

  9. Crisis analytics : big data-driven crisis response

    NARCIS (Netherlands)

    Qadir, Junaid; ur Rasool, Raihan; Zwitter, Andrej; Sathiaseelan, Arjuna; Crowcroft, Jon

    2016-01-01

    Disasters have long been a scourge for humanity. With the advances in technology (in terms of computing, communications, and the ability to process and analyze big data), our ability to respond to disasters is at an inflection point. There is great optimism that big data tools can be leveraged to

  10. Disc cutter wear and rock texture in hard rock TBM tunneling

    International Nuclear Information System (INIS)

    Koizumi, Yu; Tsusaka, Kimikazu; Tanimoto, Chikaosa; Nakagawa, Shigeo; Fujita, Naoya

    2008-01-01

    Disc cutter wear in TBM tunneling is caused by initial fragmentation of a solid rock face (the primary fragmentation) and fragmentation of residual rock pieces between a cutterhead and the face (the secondary fragmentation). In two projects through sedimentary and granitic rocks, the authors investigated the relationships among the rate of cutter wear caused by the primary fragmentation, the point load index, and the grain size and content of abrasive minerals. As a result, it was found that the tensile strength and the mineral content of the rocks significantly influenced cutter wear in both projects, and thus it is necessary to take rock type into account. (author)

  11. Urban Big Data and the Development of City Intelligence

    Directory of Open Access Journals (Sweden)

    Yunhe Pan

    2016-06-01

    Full Text Available This study provides a definition for urban big data while exploring its features and its applications in China's city intelligence. The differences between city intelligence in China and the “smart city” concept in other countries are compared to highlight and contrast the unique definition and model of China's city intelligence. Furthermore, this paper examines the role of urban big data in city intelligence by showing that it not only serves as the cornerstone of this trend but also plays a core role in the diffusion of city intelligence technology and serves as an inexhaustible resource for the sustained development of city intelligence. This study also points out the challenges of shaping and developing China's urban big data. Considering the supporting and core role that urban big data plays in city intelligence, the study then expounds on the key points of urban big data, including infrastructure support, urban governance, public services, and economic and industrial development. Finally, this study points out the utility of city intelligence as an ideal policy tool for advancing the goals of China's urban development. In conclusion, it is imperative that China make full use of its unique advantages—including the nation's current state of development and resources, geographical advantages, and good human relations—in subjective and objective conditions to promote the development of city intelligence through the proper application of urban big data.

  12. Water-quality effects on phytoplankton species and density and trophic state indices at Big Base and Little Base Lakes, Little Rock Air Force Base, Arkansas, June through August, 2015

    Science.gov (United States)

    Driver, Lucas; Justus, Billy

    2016-01-01

    Big Base and Little Base Lakes are located on Little Rock Air Force Base, Arkansas, and their close proximity to a dense residential population and an active military/aircraft installation make the lakes vulnerable to water-quality degradation. The U.S. Geological Survey (USGS) conducted a study from June through August 2015 to investigate the effects of water quality on phytoplankton species and density and trophic state in Big Base and Little Base Lakes, with particular regard to nutrient concentrations. Nutrient concentrations, trophic-state indices, and the large proportion of the phytoplankton biovolume composed of cyanobacteria indicate that eutrophic conditions were prevalent in Big Base and Little Base Lakes, particularly in August 2015. Cyanobacteria densities and biovolumes measured in this study likely pose a low to moderate risk of adverse algal toxicity, and the high proportion of filamentous cyanobacteria in the lakes, in relation to other algal groups, is important from a fisheries standpoint because these algae are a poor food source for many aquatic taxa. In both lakes, total nitrogen to total phosphorus (N:P) ratios declined over the sampling period as total phosphorus concentrations increased relative to nitrogen concentrations. The N:P ratios in the August samples (20:1 and 15:1 in Big Base and Little Base Lakes, respectively) and other indications of eutrophic conditions are of concern and suggest that exposure of the two lakes to additional nutrients could cause unfavorable dissolved-oxygen conditions and increase the risk of cyanobacteria blooms and associated cyanotoxin issues.

  13. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  14. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  15. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  16. [Embracing medical innovation in the era of big data].

    Science.gov (United States)

    You, Suning

    2015-01-01

    With the worldwide advent of the big data era, the medical field inevitably has to find its place in it. This article introduces the basics of big data and points out that its advantages and disadvantages coexist. Although innovation in the medical field is a struggle, the current pattern of medicine will be changed fundamentally by big data. The article also describes how the relevant analyses are changing rapidly in the big data era, sketches the promise of digital medicine, and offers practical advice to surgeons.

  17. Lattice Boltzmann Simulations of Fluid Flow in Continental Carbonate Reservoir Rocks and in Upscaled Rock Models Generated with Multiple-Point Geostatistics

    Directory of Open Access Journals (Sweden)

    J. Soete

    2017-01-01

    Full Text Available Microcomputed tomography (μCT) and Lattice Boltzmann Method (LBM) simulations were applied to continental carbonates to quantify fluid flow. Fluid flow characteristics in these complex carbonates with multiscale pore networks are unique and the applied method allows studying their heterogeneity and anisotropy. 3D pore network models were introduced to single-phase flow simulations in Palabos, a software tool for particle-based modelling of classic computational fluid dynamics. In addition, permeability simulations were also performed on rock models generated with multiple-point geostatistics (MPS). This allowed assessing the applicability of MPS in upscaling high-resolution porosity patterns into large rock models that exceed the volume limitations of the μCT. Porosity and tortuosity control fluid flow in these porous media. Micro- and mesopores influence flow properties at larger scales in continental carbonates. Upscaling with MPS is therefore necessary to overcome volume-resolution problems of CT scanning equipment. The presented LBM-MPS workflow is applicable to other lithologies, comprising different pore types, shapes, and pore networks altogether. The lack of straightforward porosity-permeability relationships in complex carbonates highlights the necessity for a 3D approach. 3D fluid flow studies provide the best understanding of flow through porous media, which is of crucial importance in reservoir modelling.
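
    Permeability is typically recovered from such converged single-phase simulations through Darcy's law. The sketch below shows only that post-processing step, with hypothetical values standing in for LBM output; the function and the numbers are illustrative, not Palabos API.

```python
def darcy_permeability(mean_velocity, viscosity, pressure_drop, length):
    """Permeability k (m^2) from Darcy's law: v = (k / mu) * dP / L.

    mean_velocity : superficial (volume-averaged) flow velocity, m/s
    viscosity     : dynamic viscosity mu, Pa*s
    pressure_drop : pressure difference across the sample, Pa
    length        : sample length along the flow direction, m
    """
    return mean_velocity * viscosity * length / pressure_drop

# Hypothetical output of a converged LBM run on a uCT subvolume.
k = darcy_permeability(mean_velocity=2.0e-5, viscosity=1.0e-3,
                       pressure_drop=100.0, length=1.0e-3)
print(f"k = {k:.2e} m^2  (~{k / 9.869e-13:.2f} darcy)")
```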

  18. Old flying ice-rock body in space allows a glance at its inner working.

    Science.gov (United States)

    Bieler, A. M.

    2015-12-01

    I am studying old, cold bodies of rock and ice flying through space, usually far, far away from the Sun. They are even behind the last of the big 8 balls we call our home worlds. (There were 9 balls a few years ago, but then one of the balls was not considered a ball anymore by some people and he/she had to leave the group.) Because they are so far away from the Sun, they remain dark and very cold for the most part of their life. That is why even most of the very nervous stuff sticks on them ever since. With stuff I mean the little things that the Sun, the big 8 balls, we humans and everything else that is flying around the Sun is made of. The nervous ones quickly change into something wind like and can get lost. But the cold on the ice-rock bodies slows this down and they stick around. This makes those ice-rock bodies interesting to study, they did not change too much since they were made. I study news sent back from a computer controlled box flying around one of those rock-ice things that is now closer to the Sun. When the space between such a body and the Sun gets smaller, it warms up and some of the ice changes into wind like things. We find out how much of what stuff is flying away from that body and at what time. Then I and my friends put those numbers into a big ass computer to find out more on how those rock-ice bodies work. Where does the wind come from? Do they all come from the same place or only some? Is it really the Sun's fault? How many cups of ice change into wind each day? Many questions.

  19. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  20. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  1. Epidemiology in wonderland: Big Data and precision medicine.

    Science.gov (United States)

    Saracci, Rodolfo

    2018-03-01

    Big Data and precision medicine, two major contemporary challenges for epidemiology, are critically examined from two different angles. In Part 1, Big Data collected for research purposes (Big research Data) and Big Data used for research although collected for other primary purposes (Big secondary Data) are discussed in the light of the fundamental common requirement of data validity, which prevails over "bigness". Precision medicine is treated by developing the key point that high relative risks are as a rule required to make a variable, or combination of variables, suitable for predicting disease occurrence, outcome or response to treatment; the commercial proliferation of allegedly predictive tests of unknown or poor validity is commented upon. Part 2 proposes a "wise epidemiology" approach to: (a) choosing, in a context imprinted by Big Data and precision medicine, epidemiological research projects actually relevant to population health; (b) training epidemiologists; (c) investigating the impact of the influx of Big Data and computerized medicine on clinical practices and the doctor-patient relation; and (d) clarifying whether today "health" may be redefined, as some maintain, in purely technological terms.

  2. Big Rock Point Nuclear Plant. Annual operating report for 1976

    International Nuclear Information System (INIS)

    1977-01-01

    Net electrical power generated was 244,492.9 MWh with the reactor on line for 4,405 hours. Information is presented concerning operations, power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, release of radioactive materials, reportable occurrences, and fuel performance.

  3. Schools K-12, This is a point feature class of Schools within Rock County. This data does not contain religious or parochial schools, or schools affiliated with churches., Published in 2005, Rock County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Schools K-12 dataset current as of 2005. This is a point feature class of Schools within Rock County. This data does not contain religious or parochial schools, or...

  4. Application of rock mechanics in opencast mining

    Energy Technology Data Exchange (ETDEWEB)

    Desurmont, M; Feuga, B

    1979-07-01

    The significance of opencast mining in the world today is outlined. With the exception of coal, opencast workings provide approximately 80% of output. The importance of opencast mining has continued to increase over the last ten years. Access to the mineral usually necessitates the removal of large quantities of rock, and the aim is to reduce this quantity as much as possible in order to minimize the dirt/mineral ratio. For this purpose, use has been made of the operating techniques of rock mechanics in order to determine the optimum dimensions of the access trench compatible with safety requirements. The author illustrates this technique by means of three examples: the Luzenac talc workings, the Mont-Roc fluorite workings and the Big Hole at Kimberley.

  5. Big Data as a Source for Official Statistics

    Directory of Open Access Journals (Sweden)

    Daas Piet J.H.

    2015-06-01

    Full Text Available More and more data are being produced by an increasing number of electronic devices physically surrounding us and on the internet. The large amount of data and the high frequency at which they are produced have resulted in the introduction of the term ‘Big Data’. Because these data reflect many different aspects of our daily lives and because of their abundance and availability, Big Data sources are very interesting from an official statistics point of view. This article discusses the exploration of both opportunities and challenges for official statistics associated with the application of Big Data. Experiences gained with analyses of large amounts of Dutch traffic loop detection records and Dutch social media messages are described to illustrate the topics characteristic of the statistical analysis and use of Big Data.

  6. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  7. Pore-scale analysis of electrical properties in thinly bedded rock using digital rock physics

    International Nuclear Information System (INIS)

    Sun, Jianmeng; Zhao, Jianpeng; Liu, Xuefeng; Chen, Hui; Jiang, LiMing; Zhang, JinYan

    2014-01-01

    We investigated the electrical properties of laminated rock consisting of macro-porous and micro-porous layers, based on digital rock technology. Because of the bedding effect and anisotropy, the traditional Archie equations cannot adequately describe the electrical behavior of laminated rock. The RI-Sw curve of laminated rock is nonlinear; it can be divided into two linear segments with different saturation exponents. Laminated sand-shale sequences and laminated sands of different porosity or grain size will yield macroscopic electrical anisotropy. Numerical simulation and theoretical analysis lead to the conclusion that the electrical anisotropy coefficient of laminated rock is a strong function of water saturation, and that the function curve can be divided into three segments by turning points. Therefore, the electrical behavior of laminated rock should be considered in oil exploration and development. (paper)
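
    The two linear segments described above can be quantified through Archie's resistivity-index relation, RI = Sw^(-n): each segment in log-log space yields its own saturation exponent n. A minimal sketch on hypothetical data with an assumed turning point at Sw = 0.5:

```python
import numpy as np

# Hypothetical resistivity-index data for a laminated sample:
# RI = Sw**-n, with the saturation exponent n changing at Sw = 0.5.
sw = np.array([0.95, 0.85, 0.75, 0.65, 0.55, 0.45, 0.35, 0.25])
ri = np.where(sw >= 0.5, sw**-1.6, 0.5**-1.6 * (sw / 0.5)**-2.3)

def saturation_exponent(sw_seg, ri_seg):
    # log10(RI) = -n * log10(Sw), so n is minus the log-log slope.
    slope, _ = np.polyfit(np.log10(sw_seg), np.log10(ri_seg), 1)
    return -slope

high = sw >= 0.5                      # segment above the turning point
print("n (high Sw):", round(saturation_exponent(sw[high], ri[high]), 2))
print("n (low Sw): ", round(saturation_exponent(sw[~high], ri[~high]), 2))
```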

  8. Transporting radioactive rock

    International Nuclear Information System (INIS)

    Pearce, G.

    1990-01-01

    The case is made for exempting geological specimens from the IAEA Regulations for the Safe Transport of Radioactive Material. It is pointed out that many mineral collectors in Devon and Cornwall may be unwittingly infringing these regulations by taking naturally radioactive rocks and specimens containing uranium ores. Even if these collectors are aware that these rocks are radioactive, and many are not, few have the necessary equipment to monitor the activity levels. If the transport regulations were to be enforced, alarm could be generated and the regulations devalued in the event of an accident. The danger from a spill of rock specimens is negligible compared with an accident involving industrial or medical radioactive substances, yet it would require similar special treatment. (UK)

  9. Big data business models: Challenges and opportunities

    Directory of Open Access Journals (Sweden)

    Ralph Schroeder

    2016-12-01

    Full Text Available This paper, based on 28 interviews with a range of business leaders and practitioners, examines the current state of big data use in business, as well as the main opportunities and challenges presented by big data. It begins with an account of the current landscape and what is meant by big data. Next, it draws distinctions between the ways organisations use data and provides a taxonomy of big data business models. We observe a variety of different business models, depending not only on sector, but also on whether the main advantages derive from analytics capabilities or from having ready access to valuable data sources. Some major challenges emerge from this account, including data quality and protectiveness about sharing data. The conclusion discusses these challenges, and points to the tensions and differing perceptions about how data should be governed between business practitioners, the promoters of open data, and the wider public.

  10. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, have been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care specifically. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than it solves, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  11. [Three applications and the challenge of the big data in otology].

    Science.gov (United States)

    Lei, Guanxiong; Li, Jianan; Shen, Weidong; Yang, Shiming

    2016-03-01

    With the expansion of human practical activities, more and more areas are confronted with big data problems. The emergence of big data requires people to update their research paradigms and develop new technical methods. This review discusses the opportunities and challenges that big data may bring to the areas of auditory implantation, the deafness genome, and auditory pathophysiology, and points out that we need to find appropriate theories and methods to turn this kind of expectation into reality.

  12. Experimental research on the electromagnetic radiation (EMR) characteristics of cracked rock.

    Science.gov (United States)

    Song, Xiaoyan; Li, Xuelong; Li, Zhonghui; Cheng, Fuqi; Zhang, Zhibo; Niu, Yue

    2018-03-01

    Coal rock emits electromagnetic radiation (EMR) during deformation and fracture, and structural bodies exist in coal rock because of mining and geological structures. In this paper, we conducted an experimental test of the EMR characteristics of cracked rock under loading. Results show that a crack appears first at the prefabricated crack tip and then grows stably, parallel to the maximum principal stress, and that the buckling failure of the coal rock is caused by wing-crack tension. Besides, the compressive strength decreases significantly because of the precrack, and it increases with the crack angle. The EMR of intact rock increases with loading, whereas the EMR of cracked rock shows staged and fluctuating characteristics. The bigger the angle, the more obvious these staged and fluctuating characteristics; that is, the EMR becomes richer. When the crack angle is small, EMR is mainly caused by the rapid separation of electric charge due to frictional sliding. When the crack angle is big, there is another significant contribution to EMR, caused by the transient electric dipoles of crack expansion. From this, we can understand more clearly the crack extension route and its influence on the EMR characteristics and mechanism, which has important theoretical and practical significance for monitoring coal rock dynamic disasters.

  13. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  14. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be imagined away from our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, so often mentioned in relation to big data, stand for? As an introduction to

  15. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of a GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO2. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO2. Overall every sedimentary formation investigated

  16. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  17. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  18. [Utilization of Big Data in Medicine and Future Outlook].

    Science.gov (United States)

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.

  19. X-ray microtomography application in pore space reservoir rock

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, M.F.S.; Lima, I. [Nuclear Instrumentation Laboratory, COPPE/UFRJ, P.O. Box 68509, 21.941-972, Rio de Janeiro (Brazil); Borghi, L. [Geology Department, Geosciences Institute, Federal University of Rio de Janeiro, Brazil. (Brazil); Lopes, R.T., E-mail: ricardo@lin.ufrj.br [Nuclear Instrumentation Laboratory, COPPE/UFRJ, P.O. Box 68509, 21.941-972, Rio de Janeiro (Brazil)

    2012-07-15

    Characterization of porosity in carbonate rocks is important in the oil and gas industry, since major hydrocarbon fields are formed in this lithology, which constitutes a complex porous medium. In this context, this research presents a study of the pore space in limestone rocks by X-ray microtomography. Total porosity, porosity type and pore size distribution were evaluated from 3D high-resolution images. Results show that carbonate rocks have a complex pore space system, with different pore types within the same facies. - Highlights: • This study examines the porosity of carbonate rocks by 3D X-ray microtomography. • The study is useful as data input for reservoir characterization modeling. • The technique was able to reveal differences in pores, grains and mineralogy among the samples.
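
    On a segmented µCT volume, total porosity reduces to a voxel fraction, and individual pore bodies can be separated by connected-component labelling. A minimal sketch on a hypothetical binary volume (the 0.15 pore fraction and the 5 µm voxel size are assumptions):

```python
import numpy as np
from scipy import ndimage

# Hypothetical segmented uCT volume: 1 = pore voxel, 0 = grain voxel.
rng = np.random.default_rng(1)
volume = (rng.random((200, 200, 200)) < 0.15).astype(np.uint8)

porosity = volume.mean()                   # pore voxels / total voxels
labels, n_pores = ndimage.label(volume)    # 6-connected components in 3D
sizes = np.bincount(labels.ravel())[1:]    # voxel count of each pore body

voxel = 5.0e-6                             # assumed voxel edge length (m)
print(f"total porosity: {porosity:.1%}, pore bodies: {n_pores}")
print(f"largest pore volume: {sizes.max() * voxel**3:.3e} m^3")
```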

  20. Compliance Monitoring of Underwater Blasting for Rock Removal at Warrior Point, Columbia River Channel Improvement Project, 2009/2010

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, Thomas J.; Johnson, Gary E.; Woodley, Christa M.; Skalski, J. R.; Seaburg, Adam

    2011-05-10

    The U.S. Army Corps of Engineers, Portland District (USACE) conducted the 20-year Columbia River Channel Improvement Project (CRCIP) to deepen the navigation channel between Portland, Oregon, and the Pacific Ocean to allow transit of fully loaded Panamax ships (100 ft wide, 600 to 700 ft long, and draft 45 to 50 ft). In the vicinity of Warrior Point, between river miles (RM) 87 and 88 near St. Helens, Oregon, the USACE conducted underwater blasting and dredging to remove 300,000 yd3 of a basalt rock formation to reach a depth of 44 ft in the Columbia River navigation channel. The purpose of this report is to document methods and results of the compliance monitoring study for the blasting project at Warrior Point in the Columbia River.

  1. ROCK inhibitor prevents the dedifferentiation of human articular chondrocytes

    Energy Technology Data Exchange (ETDEWEB)

    Matsumoto, Emi [Department of Orthopaedic Surgery, Science of Functional Recovery and Reconstruction, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, 2-5-1 Shikatacho, Kitaku, Okayama 700-8558 (Japan); Furumatsu, Takayuki, E-mail: matino@md.okayama-u.ac.jp [Department of Orthopaedic Surgery, Science of Functional Recovery and Reconstruction, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, 2-5-1 Shikatacho, Kitaku, Okayama 700-8558 (Japan); Kanazawa, Tomoko; Tamura, Masanori; Ozaki, Toshifumi [Department of Orthopaedic Surgery, Science of Functional Recovery and Reconstruction, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, 2-5-1 Shikatacho, Kitaku, Okayama 700-8558 (Japan)

    2012-03-30

    Highlights: • ROCK inhibitor stimulates chondrogenic gene expression of articular chondrocytes. • ROCK inhibitor prevents the dedifferentiation of monolayer-cultured chondrocytes. • ROCK inhibitor enhances the redifferentiation of cultured chondrocytes. • ROCK inhibitor is useful for preparation of un-dedifferentiated chondrocytes. • ROCK inhibitor may be a useful reagent for chondrocyte-based regeneration therapy. -- Abstract: Chondrocytes lose their chondrocytic phenotypes in vitro. The Rho family GTPase ROCK, involved in organizing the actin cytoskeleton, modulates the differentiation status of chondrocytic cells. However, the optimum method to prepare a large number of un-dedifferentiated chondrocytes is still unclear. In this study, we investigated the effect of ROCK inhibitor (ROCKi) on the chondrogenic property of monolayer-cultured articular chondrocytes. Human articular chondrocytes were subcultured in the presence or absence of ROCKi (Y-27632). The expression of chondrocytic marker genes such as SOX9 and COL2A1 was assessed by quantitative real-time PCR analysis. Cellular morphology and viability were evaluated. Chondrogenic redifferentiation potential was examined by a pellet culture procedure. The expression level of SOX9 and COL2A1 was higher in ROCKi-treated chondrocytes than in untreated cells. Chondrocyte morphology varied from a spreading form to a round shape in a ROCKi-dependent manner. In addition, ROCKi treatment stimulated the proliferation of chondrocytes. The deposition of safranin O-stained proteoglycans and type II collagen was highly detected in chondrogenic pellets derived from ROCKi-pretreated chondrocytes. Our results suggest that ROCKi prevents the dedifferentiation of monolayer-cultured chondrocytes, and may be a useful reagent to maintain chondrocytic phenotypes in vitro for chondrocyte

  2. The BIG Data Center: from deposition to integration to translation.

    Science.gov (United States)

    2017-01-04

    Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  4. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  5. Permeability Evolution and Rock Brittle Failure

    OpenAIRE

    Sun Qiang; Xue Lei; Zhu Shuyun

    2015-01-01

    This paper reports an experimental study of the evolution of permeability during rock brittle failure and a theoretical analysis of the rock critical stress level. It is assumed that the rock is a strain-softening medium whose strength can be described by Weibull's distribution. Based on the two-dimensional renormalization group theory, it is found that the stress level λc (the ratio of the stress at the critical point to the peak stress) depends mainly on the homogeneity index or shape paramete...
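
    The Weibull strength assumption mentioned above is commonly written as a failure probability; one standard form (the paper's own notation may differ) is

```latex
P_f(\sigma) = 1 - \exp\left[-\left(\frac{\sigma}{\sigma_0}\right)^{m}\right]
```

    where σ0 is a scale parameter and m is the homogeneity (shape) index: the larger m, the more homogeneous the rock's strength.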

  6. Military Simulation Big Data: Background, State of the Art, and Challenges

    Directory of Open Access Journals (Sweden)

    Xiao Song

    2015-01-01

    Full Text Available Big data technology has undergone rapid development and attained great success in the business field. Military simulation (MS) is another application domain producing massive datasets created by high-resolution models and large-scale simulations. It is used to study complicated problems such as weapon systems acquisition, combat analysis, and military training. This paper first reviews several large-scale military simulations producing big data (MS big data) for a variety of usages and summarizes the main characteristics of the resulting data. We then look at the technical details involving the generation, collection, processing, and analysis of MS big data. Two frameworks are also surveyed to trace the development of the underlying software platform. Finally, we identify some key challenges and propose a framework as a basis for future work. This framework considers both the simulation and big data management at the same time, based on layered and service-oriented architectures. The objective of this review is to help interested researchers learn the key points of MS big data and provide references for tackling the big data problem and performing further research.

  7. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  8. To What Extent Can the Big Five and Learning Styles Predict Academic Achievement

    Science.gov (United States)

    Köseoglu, Yaman

    2016-01-01

    Personality traits and learning styles play defining roles in shaping academic achievement. 202 university students completed the Big Five personality traits questionnaire and the Inventory of Learning Processes Scale and self-reported their grade point averages. Conscientiousness and agreeableness, two of the Big Five personality traits, related…

  9. Big data analytics for mitigating carbon emissions in smart cities : opportunities and challenges

    NARCIS (Netherlands)

    Giest, S.N.

    2017-01-01

    The paper addresses the growing scepticism around big data use in the context of smart cities. Big data is said to transform city governments into being more efficient, effective and evidence-based. However, critics point towards the limited capacity of government to overcome the siloed structure of

  10. The source rock characters of U-rich granite

    Energy Technology Data Exchange (ETDEWEB)

    Mingyue, Feng; Debao, He [CNNC Key Laboratory of Uranium Resources Exploration and Evaluation Technology, Beijing Research Institute of Uranium Geology (China)

    2012-03-15

    This paper discusses the stratum composition, lithological association and crustal uranium content, as well as the activation, migration and concentration of uranium during each tectonic cycle in South China. The authors point out that the source rock of U-rich granite is U-rich continental crust rich in Si, Al and K. The lithological association is mainly composed of a terrestrial clastic-rock formation of mudstone and sandstone, mingled with intermediate-acidic and mafic pyroclastic rocks and carbonate formations. During tectonic movements, these rocks underwent regional metamorphism, migmatitization and granitization, and finally formed U-rich granites. (authors)

  11. The source rock characters of U-rich granite

    International Nuclear Information System (INIS)

    Feng Mingyue; He Debao

    2012-01-01

    This paper discusses the stratum composition, lithological association and crustal uranium content, as well as the activation, migration and concentration of uranium during each tectonic cycle in South China. The authors point out that the source rock of U-rich granite is U-rich continental crust rich in Si, Al and K. The lithological association is mainly composed of a terrestrial clastic-rock formation of mudstone and sandstone, mingled with intermediate-acidic and mafic pyroclastic rocks and carbonate formations. During tectonic movements, these rocks underwent regional metamorphism, migmatitization and granitization, and finally formed U-rich granites. (authors)

  12. [Big data approaches in psychiatry: examples in depression research].

    Science.gov (United States)

    Bzdok, D; Karrer, T M; Habel, U; Schneider, F

    2017-11-29

    The exploration and therapy of depression are hampered by heterogeneous etiological mechanisms and various comorbidities. With the growing trend towards big data in psychiatry, research and therapy can increasingly target the individual patient. This novel objective requires special methods of analysis. The possibilities and challenges of applying big data approaches to depression are examined in closer detail. Examples are given to illustrate the possibilities of big data approaches in depression research. Modern machine learning methods are compared to traditional statistical methods in terms of their potential in applications to depression. Big data approaches are particularly suited to the analysis of detailed observational data, the prediction of single data points or several clinical variables, and the identification of endophenotypes. A current challenge lies in the transfer of results into the clinical treatment of patients with depression. Big data approaches enable biological subtypes of depression to be identified and predictions to be made in individual patients. They have enormous potential for prevention, early diagnosis, treatment choice and prognosis of depression, as well as for treatment development.

  13. A 3D clustering approach for point clouds to detect and quantify changes at a rock glacier front

    Science.gov (United States)

    Micheletti, Natan; Tonini, Marj; Lane, Stuart N.

    2016-04-01

    Terrestrial Laser Scanners (TLS) are extensively used in geomorphology to remotely sense landforms and surfaces of any type and to derive digital elevation models (DEMs). Modern devices are able to collect many millions of points, so that working with the resulting dataset is often troublesome in terms of computational effort. Indeed, it is not unusual for raw point clouds to be filtered prior to DEM creation, so that only a subset of points is retained and the interpolation process becomes less of a burden. Whilst this procedure is in many cases necessary, it entails a considerable loss of valuable information. First, even without eliminating points, the common interpolation of points to a regular grid causes a loss of potentially useful detail. Second, it inevitably forces the transition from 3D information to only 2.5D data, where each (x,y) pair must have a unique z-value. Vector-based DEMs (e.g. triangulated irregular networks) partially mitigate these issues, but still require a set of parameters to be chosen and impose a considerable burden in terms of calculation and storage. For the reasons above, being able to perform geomorphological research directly on point clouds would be profitable. Here, we propose an approach to identify erosion and deposition patterns on a very active rock glacier front in the Swiss Alps in order to monitor sediment dynamics. The general aim is to set up a semi-automatic method to isolate mass movements using 3D feature identification directly from LiDAR data. An ultra-long-range LiDAR RIEGL VZ-6000 scanner was employed to acquire point clouds during three consecutive summers. In order to isolate single clusters of erosion and deposition we applied Density-Based Spatial Clustering of Applications with Noise (DBSCAN), previously employed successfully by Tonini and Abellan (2014) in a similar case for rockfall detection. DBSCAN requires two input parameters, strongly influencing the number, shape and size of the detected clusters: the minimum number of
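
    A minimal sketch of that clustering step, using the scikit-learn implementation of DBSCAN on hypothetical 3D change points; the eps (neighbourhood radius) and min_samples (minimum number of points) values are placeholders, not the study's calibrated parameters.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical 3D points flagged as significant change between two
# TLS epochs (e.g. |distance| above a detection threshold), in metres.
rng = np.random.default_rng(2)
deposit = rng.normal([5.0, 12.0, 2.0], 0.3, (400, 3))   # deposition lobe
erosion = rng.normal([8.0, 10.0, 1.5], 0.2, (250, 3))   # erosion niche
noise = rng.uniform([0, 0, 0], [15, 20, 4], (60, 3))    # isolated points
pts = np.vstack([deposit, erosion, noise])

# eps: neighbourhood radius (m); min_samples: density threshold. Both
# strongly influence the number, shape and size of detected clusters.
labels = DBSCAN(eps=0.5, min_samples=20).fit_predict(pts)
print("clusters found:", labels.max() + 1)
print("noise points:  ", np.sum(labels == -1))
```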

  14. Rock-fall Hazard In The Yosemite Valley, California

    Science.gov (United States)

    Guzzetti, F.; Reichenbach, P.; Wieczorek, G. F.

    Rock slides and rock falls are the most frequent slope movements in Yosemite Na- tional Park, California. In historical time (1851-2001), more than 400 rock falls and rock slides have been documented in the valley, and some of them have been mapped in detail. We present the preliminary results of an attempt to assess rockfall hazard in the Yosemite Valley using STONE, a 3-dimensional rock-fall simulation computer program. The software computes 3-dimensional rock-fall trajectories starting from a digital terrain model (DTM), the location of rock-fall release points (source areas), and maps of the dynamic rolling coefficient and of the coefficients of normal and tan- gential energy restitution. For each DTM cell the software also calculates the number of rock falls passing through the cell, the maximum rock-fall velocity and the maxi- mum flying height. For the Yosemite Valley, a DTM with a ground resolution of 10 x 10 m was prepared using topographic contour lines from USGS 1:24,000-scale maps. Rock-fall release points were identified as DTM cells having a slope steeper than 60 degrees, an assumption based on the location of historical rock falls. Maps of the nor- mal and tangential energy restitution coefficients and of the rolling friction coefficient were produced from a surficial geologic map. The availability of historical rock falls mapped in detail allowed us to check the computer program performance and to cali- brate the model parameters. Visual and statistical comparison of the model results with the mapped rock falls confirmed the accuracy of the model. The model results are also compared with a geomorphic assessment of rock-fall hazard based on potential energy referred to as a "shadow angle" approach, recently completed for the Yosemite Valley.
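
    The source-area rule used above (DTM cells steeper than 60 degrees) is straightforward to reproduce from a gridded DEM; a minimal sketch follows, with a synthetic DEM standing in for the 10 x 10 m Yosemite grid.

```python
import numpy as np

def source_cells(dem, cell_size=10.0, threshold_deg=60.0):
    """Boolean mask of potential rock-fall release cells.

    A cell counts as a source if its slope, estimated from a
    finite-difference gradient of the DEM, exceeds the threshold
    (60 degrees by default, matching the assumption above).
    """
    dzdy, dzdx = np.gradient(dem, cell_size)
    slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    return slope > threshold_deg

# Hypothetical 10 m DEM tile (elevations in metres).
rng = np.random.default_rng(3)
dem = np.cumsum(rng.normal(0.0, 8.0, (100, 100)), axis=0)
mask = source_cells(dem)
print(f"release cells: {mask.sum()} of {mask.size}")
```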

  15. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  16. Rock deformation equations and application to the study on slantingly installed disc cutter

    Science.gov (United States)

    Zhang, Zhao-Huang; Meng, Liang; Sun, Fei

    2014-08-01

    At present, the mechanical model of the interaction between a disc cutter and rock mainly concerns indentation experiments, linear cutting experiments and tunnel boring machine (TBM) on-site data. This is not in line with the actual rock-breaking movement of the disc cutter and impedes, to some extent, research on the rock-breaking mechanism, the wear mechanism and design theory. Therefore, our study focuses on the interaction between the slantingly installed disc cutter and rock, developing a model in accordance with the actual rock-breaking movement. Displacement equations are established through an analysis of the velocity vector at the rock-breaking point of the disc cutter blade; the functional relationship between the displacement parameters at the rock-breaking point and its rectangular coordinates is established through an analysis of micro-displacement vectors at the rock-breaking point, thus leading to the geometric equations of rock deformation caused by the slantingly installed disc cutter. Considering the basically linear relationship between the cutting force of disc cutters and the rock deformation before and after the leap break of rock, we express the constitutive relations of rock deformation as generalized Hooke's law and analyze the effect of the slanting installation angle of disc cutters on the rock-breaking force. We hope this will contribute to the development of TBM design theory and disc cutter installation practice.
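
    For reference, the generalized Hooke's law invoked here is the standard isotropic linear-elastic constitutive relation; the generic statement below is a reminder of that relation, not the authors' specific formulation for the slanted cutter:

    $$\varepsilon_{ij} = \frac{1+\nu}{E}\,\sigma_{ij} - \frac{\nu}{E}\,\sigma_{kk}\,\delta_{ij},$$

    where $\sigma_{ij}$ and $\varepsilon_{ij}$ are the stress and strain tensors, $E$ is Young's modulus, $\nu$ is Poisson's ratio, $\delta_{ij}$ is the Kronecker delta, and summation over the repeated index $k$ is implied.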

  17. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovations.

  18. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications. The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data.

  19. Point cloud data management (extended abstract)

    NARCIS (Netherlands)

    Van Oosterom, P.J.M.; Ravada, S.; Horhammer, M.; Martinez Rubi, O.; Ivanova, M.; Kodde, M.; Tijssen, T.P.M.

    2014-01-01

    Point cloud data are important sources for 3D geo-information. The point cloud data sets are growing in popularity and in size. Modern Big Data acquisition and processing technologies, such as laser scanning from airborne, mobile, or static platforms, dense image matching from photos, multi-beam

  20. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account.

  1. Hopi and Anasazi Alignments and Rock Art

    Science.gov (United States)

    Bates, Bryan C.

    The interaction of light and shadow on ancestral Puebloan rock art, or rock art demarcating sunrise/set horizon points that align with culturally significant dates, has long been assumed to be evidence of "intentional construct" for marking time or event by the native creator. However, anthropological rock art research requires the scientific control of cultural time, element orientation and placement, structure, and association with other rock art elements. The evaluation of five exemplars challenges the oft-held assumption that "if the interaction occurs, it therefore supports intentional construct" and thereby conveys meaning to the native culture.

  2. The Usability of Noise Level from Rock Cutting for the Prediction of Physico-Mechanical Properties of Rocks

    Science.gov (United States)

    Delibalta, M. S.; Kahraman, S.; Comakli, R.

    2015-11-01

    Because indirect tests are easier and cheaper than direct tests, the prediction of rock properties from indirect testing methods is important, especially for preliminary investigations. In this study, the predictability of physico-mechanical rock properties from the noise level measured while cutting rock with a diamond saw was investigated. A noise measurement test, uniaxial compressive strength (UCS) test, Brazilian tensile strength (BTS) test, point load strength (Is) test, density test, and porosity test were carried out on 54 different rock types in the laboratory. The results were statistically analyzed to derive estimation equations. Strong correlations between the noise level and the mechanical rock properties were found. The relations follow power functions. Increasing rock strength increases the noise level. Density and porosity also correlated strongly with the noise level. These relations follow linear functions. Increasing density increases the noise level, while increasing porosity decreases the noise level. The developed equations are valid for rocks with a compressive strength below 150 MPa. In conclusion, the physico-mechanical rock properties can be reliably estimated from the noise level measured while cutting rock with a diamond saw.
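
    Estimation equations of the power-function type reported here are commonly derived by linear regression in log-log space; the sketch below illustrates the procedure on synthetic data (all coefficients are hypothetical, not the study's).

    ```python
    # Minimal sketch: fit UCS = a * NL^b by ordinary least squares on logs.
    import numpy as np

    rng = np.random.default_rng(1)
    noise_level = rng.uniform(70.0, 100.0, 54)                     # dB, hypothetical
    ucs = 0.002 * noise_level**2.5 * rng.lognormal(0.0, 0.1, 54)   # MPa, hypothetical

    b, log_a = np.polyfit(np.log(noise_level), np.log(ucs), 1)
    print(f"UCS ≈ {np.exp(log_a):.4f} * NL^{b:.2f}")
    ```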

  3. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  4. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation before Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  5. The three-dimension model for the rock-breaking mechanism of disc cutter and analysis of rock-breaking forces

    Science.gov (United States)

    Zhang, Zhao-Huang; Sun, Fei

    2012-06-01

    To study three-dimensional rock deformation under the rolling forces of a disc cutter, a circular-grooving test with the disc cutter rolling around on the rock was carried out; the mechanical behavior of the rock under the rolling disc cutter is studied, a mechanical model of the disc cutter rolling around the groove is established, and the theory of single-point and double-angle variables is proposed. Based on this theory, the physical equations and geometric equations of rock mechanical behavior under disc cutters of tunnel boring machines (TBM) are studied, and the balance equations of the interactive forces between disc cutter and rock are established. Accordingly, formulas for the normal force, rolling force and side force of a disc cutter are derived, and their validity is studied by tests. A new method and theory is thereby proposed for studying the rock-breaking mechanism of disc cutters.

  6. Sufficient conditions for a period incrementing big bang bifurcation in one-dimensional maps

    International Nuclear Information System (INIS)

    Avrutin, V; Granados, A; Schanz, M

    2011-01-01

    Typically, a big bang bifurcation occurs for one (or higher)-dimensional piecewise-defined discontinuous systems whenever two border collision bifurcation curves collide transversely in the parameter space. At that point, two (feasible) fixed points collide with one boundary in state space and become virtual, and, in the one-dimensional case, the map becomes continuous. Depending on the properties of the map near the codimension-two bifurcation point, there exist different scenarios regarding how the infinite number of periodic orbits are born, mainly the so-called period adding and period incrementing. In our work we prove that, in order to undergo a big bang bifurcation of the period incrementing type, it is sufficient for a piecewise-defined one-dimensional map that the colliding fixed points are attractive and have associated eigenvalues of different signs

  7. Sufficient conditions for a period incrementing big bang bifurcation in one-dimensional maps

    Science.gov (United States)

    Avrutin, V.; Granados, A.; Schanz, M.

    2011-09-01

    Typically, a big bang bifurcation occurs for one (or higher)-dimensional piecewise-defined discontinuous systems whenever two border collision bifurcation curves collide transversely in the parameter space. At that point, two (feasible) fixed points collide with one boundary in state space and become virtual, and, in the one-dimensional case, the map becomes continuous. Depending on the properties of the map near the codimension-two bifurcation point, there exist different scenarios regarding how the infinite number of periodic orbits are born, mainly the so-called period adding and period incrementing. In our work we prove that, in order to undergo a big bang bifurcation of the period incrementing type, it is sufficient for a piecewise-defined one-dimensional map that the colliding fixed points are attractive and have associated eigenvalues of different signs.
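
    The sufficient condition above (attractive colliding fixed points with eigenvalues of different signs) can be explored numerically; the sketch below iterates an illustrative discontinuous piecewise-linear map with slopes 0.5 and -0.5 and reports attractor periods along a one-parameter path (the specific map and parameter values are chosen for illustration, not taken from the papers).

    ```python
    # Minimal sketch: attractor periods of a discontinuous 1D map with one
    # attracting branch of positive slope (a) and one of negative slope (b).
    import numpy as np

    def attractor_period(mu, a=0.5, b=-0.5, n_transient=2000, n_max=200):
        f = lambda x: a * x + mu if x < 0 else b * x + mu - 1.0
        x = 0.1
        for _ in range(n_transient):       # let the orbit settle
            x = f(x)
        orbit = []
        for _ in range(n_max):
            x = f(x)
            orbit.append(round(x, 9))      # round so equality tests are exact
        for p in range(1, n_max):
            if orbit[:n_max - p] == orbit[p:]:
                return p                   # smallest detected period
        return None

    for mu in np.linspace(0.05, 0.95, 10):
        print(f"mu = {mu:.2f}: period {attractor_period(mu)}")
    ```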

  8. Cosmological analogy between the big bang and a supernova

    Energy Technology Data Exchange (ETDEWEB)

    Sen, S. (Hamburg, Germany, F.R.)

    1983-10-01

    The author presents an objection to Brown's (1981) analogy between a supernova and the Big Bang. According to Brown an expanding spherical shell is quite similar to an ejected supernova shell. However, the fragmented shell of a supernova moves outward in pre-existing space. The force of repulsion which makes the fragments of the shell drift apart can be regarded as equivalent to the force of attraction of the rest of the universe on the supernova. By definition, such a force of attraction is absent in the case of the Big Bang. Energy is supposed suddenly to appear simultaneously at all points throughout the universe at the time of the Big Bang. As the universe expands, space expands too. In the relativistic cosmology, the universe cannot expand in pre-existing space.

  9. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  10. Big slow movers: a look at weathered-rock slides in Western North Carolina

    Science.gov (United States)

    Rebecca S. Latham; Richard M. Wooten; Anne C. Witt; Stephen J. Fuemmeler; Kenneth a. Gillon; Thomas J. Douglas; Jennifer B. Bauer; Barton D. Clinton

    2007-01-01

    The North Carolina Geological Survey (NCGS) is currently implementing a landslide hazard-mapping program in western North Carolina authorized by the North Carolina Hurricane Recovery Act of 2005. To date, over 2700 landslides and landslide deposits have been documented. A small number of these landslides are relatively large, slow-moving, weathered-rock slides...

  11. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management, and emphasizes the adoption and diffusion of Big Data tools and technologies in industry.

  12. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  13. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likelihood of becoming a product innovator as well as for the market success of firms’ product innovations.

  14. Phosphine from rocks: mechanically driven phosphate reduction?

    Science.gov (United States)

    Glindemann, Dietmar; Edwards, Marc; Morgenstern, Peter

    2005-11-01

    Natural rock and mineral samples released trace amounts of phosphine during dissolution in mineral acid. An order of magnitude more phosphine (average 1982 ng PH3/kg rock and maximum 6673 ng PH3/kg rock) is released from pulverized rock samples (basalt, gneiss, granite, clay, quartzitic pebbles, or marble). Phosphine correlated with the hardness and mechanical pulverization energy of the rocks. The yield of PH3 ranged from 0 to 0.01% of the total P content of the dissolved rock. Strong circumstantial evidence was gathered for reduction of phosphate in the rock via mechanochemical or "tribochemical" weathering at quartz and calcite/marble inclusions. Artificial reproduction of this mechanism, by rubbing quartz rods coated with apatite-phosphate to the point of visible triboluminescence, led to detection of more than 70 000 ng/kg PH3 in the apatite. This reaction pathway may be considered a mechano-chemical analogue of phosphate reduction by lightning or electrical discharges and may contribute to phosphine production via tectonic forces and the processing of rocks.

  15. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their super computer, and to the Large Hadron Collider built by Éric, Annie's father, they will at last be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is being hatched. Worse still, all of scientific research is in peril! Swept into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A gripping plunge into the heart of the Big Bang, drawing on the very latest theories of Stephen Hawking and today's leading scientists.

  16. Radon and rock bursts in deep mines

    International Nuclear Information System (INIS)

    Bulashevich, Yu.P.; Utkin, V.I.; Yurkov, A.K.; Nikolaev, V.V.

    1996-01-01

    Time variations of the radon concentration field were studied to ascertain the stress-strain state of the North Ural bauxite mines. It is shown that dynamic changes in the stress-strain state of the rocks prior to a rock burst bring about variations in radon concentration in the observation wells. Depending on the mutual positioning of the observation points and the rock burst epicenter, these variations differ in character: a reduction of radon concentration is observed in the near zone and an increase in the far zone.

  17. Effects of wing locations on wing rock induced by forebody vortices

    Directory of Open Access Journals (Sweden)

    Ma Baofeng

    2016-10-01

    Previous studies have shown that asymmetric vortex wakes over slender bodies exhibit a multi-vortex structure with an alternate arrangement along the body axis at high angle of attack. In this investigation, the effects of wing location along the body axis on wing rock induced by forebody vortices were studied experimentally at a subcritical Reynolds number based on the body diameter. An artificial perturbation was added onto the nose tip to fix the orientations of the forebody vortices. Particle image velocimetry was used to identify flow patterns of the forebody vortices in static situations, and time histories of wing rock were obtained using a free-to-roll rig. The results show that the wing location can significantly affect the motion patterns of wing rock owing to the variation of the multi-vortex patterns of the forebody vortices. When the wing location makes the forebody vortices form a two-vortex pattern, the wing body regularly exhibits divergence and fixed-point motions with azimuthal variations of the tip perturbation. If a three-vortex pattern exists over the wing, however, the wing-rock patterns depend on the influence of the highest vortex and the newborn vortex. When the three vortices together influence the wing flow, the wing-rock patterns exhibit regular fixed points and limit-cycle oscillations. As the wing moves backwards, the newborn vortex becomes stronger, and the wing-rock patterns become fixed points, chaotic oscillations, and limit-cycle oscillations. With further backward movement of the wing, the vortices are far away from the upper surface of the wing, and the motions exhibit divergence, limit-cycle oscillations and fixed points. For the rearmost wing location, the wing body exhibits stochastic oscillations and fixed points.

  18. A multi points ultrasonic detection method for material flow of belt conveyor

    Science.gov (United States)

    Zhang, Li; He, Rongjun

    2018-03-01

    Because single-point ultrasonic ranging suffers large detection errors when coal is unevenly distributed or large-sized, a material flow detection method for belt conveyors was designed based on multi-point ultrasonic counter ranging. The method calculates the approximate cross-sectional area of the material by locating multiple points on the surfaces of the material and the belt, and then obtains the material flow from the running speed of the belt conveyor. The test results show that the method has a smaller detection error than single-point ultrasonic ranging under the condition of big coal with uneven distribution.
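
    A minimal sketch of the underlying computation is given below, with hypothetical sensor positions and height readings: the material cross-section is approximated by trapezoidal integration of the multi-point height profile and multiplied by the belt speed to obtain volumetric flow.

    ```python
    # Minimal sketch: volumetric flow from a multi-point ultrasonic height profile.
    import numpy as np

    sensor_x = np.linspace(-0.4, 0.4, 9)   # sensor positions across the belt, m (assumed)
    height = np.array([0.00, 0.05, 0.12, 0.20, 0.24,
                       0.21, 0.14, 0.06, 0.00])  # material heights, m (hypothetical)

    # trapezoidal rule for the cross-sectional area, m^2
    area = np.sum((height[1:] + height[:-1]) / 2.0 * np.diff(sensor_x))

    belt_speed = 2.5                       # m/s (assumed)
    print(f"material flow ≈ {area * belt_speed:.3f} m^3/s")
    ```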

  19. Modelling of nuclear explosions in hard rock sites

    International Nuclear Information System (INIS)

    Brunish, W.M.; App, F.N.

    1993-01-01

    This study represents part of a larger effort to systematically model the effects of differing source region properties on ground motion from underground nuclear explosions at the Nevada Test Site. In previous work by the authors the primary emphasis was on alluvium and both saturated and unsaturated tuff. We have attempted to model events on Pahute Mesa, where either the working point medium, or some of the layers above the working point, or both, are hard rock. The complex layering at these sites, however, has prevented us from drawing unambiguous conclusions about modelling hard rock.

  20. Development of uniform hazard response spectra for rock sites considering line and point sources of earthquakes

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Kushwaha, H.S.

    2001-12-01

    Traditionally, the seismic design basis ground motion has been specified by normalised response spectral shapes and peak ground acceleration (PGA). The mean recurrence interval (MRI) used to be computed for the PGA only. It is shown that the MRI associated with such response spectra is not the same at all frequencies. The present work develops uniform hazard response spectra, i.e. spectra having the same MRI at all frequencies, for line and point sources of earthquakes, by using a large number of strong motion accelerograms recorded on rock sites. The sensitivity of the results to changes in various parameters is also presented. This work is an extension of an earlier work for areal sources of earthquakes. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)
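
    The construction of a uniform hazard spectrum can be sketched as follows: for every frequency, invert the hazard curve at the target annual exceedance rate 1/MRI. The hazard curves below are hypothetical placeholders, not the paper's results.

    ```python
    # Minimal sketch: uniform hazard spectrum from per-frequency hazard curves.
    import numpy as np

    freqs = [1.0, 5.0, 10.0]                  # spectral frequencies, Hz (assumed)
    sa_grid = np.logspace(-2, 0.5, 50)        # spectral acceleration grid, g
    # annual exceedance rate vs. amplitude, one curve per frequency (hypothetical)
    hazard = {f: 1e-2 * (0.1 / sa_grid) ** (1.5 + 0.1 * f) for f in freqs}

    def uhs(mri_years):
        """Spectral amplitude with the same MRI at every frequency."""
        target = 1.0 / mri_years
        return {f: float(np.exp(np.interp(np.log(target),
                                          np.log(rates[::-1]),      # increasing
                                          np.log(sa_grid[::-1]))))
                for f, rates in hazard.items()}

    print(uhs(475.0))   # e.g. the conventional 475-year spectrum
    ```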

  1. Using a laser measurement system for monitoring morphological changes on the Strug rock fall, Slovenia

    Directory of Open Access Journals (Sweden)

    M. Mikoš

    2005-01-01

    A medium-range, high-performance, handheld reflectorless laser measurement system was used for a morphological survey of the Strug rock fall in W Slovenia in the period from August 2003 to August 2004. The purpose was to evaluate its potential for monitoring ground surface changes in rock fall source areas and to help evaluate morphological changes by measuring distances from fixed points. In the area, 21 fixed geodetic points were established. Altogether, seven measurement sets with more than 5500 points were gathered in the rock fall area. Choosing a point cloud with a density of less than 1 point per 10 m2 on a very rough rock fall surface proved not to be a good solution. Changes over larger areas were shown by the displacements of selected large rock blocks with volumes of several m3. Because only smaller changes were observed between the individual field series, the rock fall surface generally remained unchanged. Local surface changes of the order of 1 m or more were clearly shown by measurements in selected reference cross sections. These cross sections made it possible to evaluate volumetric changes of the surface. The laser measurement system provided a good replacement for classical terrestrial geodetic survey equipment, especially for remote monitoring of morphological changes in rock fall hazard zones; the case is different, however, when fixed points are to be measured precisely.
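
    A minimal sketch of the volumetric evaluation is given below, assuming the survey points have been gridded onto a common raster for two epochs (the file names and cell size are hypothetical).

    ```python
    # Minimal sketch: erosion/deposition volumes from two gridded survey epochs.
    import numpy as np

    cell_area = 0.25                        # m^2 per cell, assumed 0.5 m grid
    dem_2003 = np.load("strug_2003.npy")    # hypothetical gridded surfaces
    dem_2004 = np.load("strug_2004.npy")

    dz = dem_2004 - dem_2003                # elevation change per cell, m
    deposition = dz[dz > 0].sum() * cell_area   # m^3 gained
    erosion = -dz[dz < 0].sum() * cell_area     # m^3 lost
    print(f"deposition ≈ {deposition:.1f} m^3, erosion ≈ {erosion:.1f} m^3")
    ```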

  2. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the ''energy dominance'' of the energy density of vacuum fluctuations in curved space-time and the presence of singularity is discussed. It is pointed out that a de-Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedman solution may describe the history of the Universe before the Big Bang. (P.L.)

  3. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  4. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  5. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpose...

  6. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Given the importance that the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics of Big Data management, in order to cover everything concerning the central topic of the research. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; presenting some of the NoSQL databases, which are those that allow processing of data with unstructured formats; and showing data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the Big Data environment.

  7. A cosmological analogy between the big bang and a supernova

    International Nuclear Information System (INIS)

    Sen, S.

    1983-01-01

    The author presents an objection to Brown's (1981) analogy between a supernova and the Big Bang. According to Brown an expanding spherical shell is quite similar to an ejected supernova shell. However, the fragmented shell of a supernova moves outward in pre-existing space. The force of repulsion which makes the fragments of the shell drift apart can be regarded as equivalent to the force of attraction of the rest of the universe on the supernova. By definition, such a force of attraction is absent in the case of the Big Bang. Energy is supposed suddenly to appear simultaneously at all points throughout the universe at the time of the Big Bang. As the universe expands, space expands too. In the relativistic cosmology, the universe cannot expand in pre-existing space. (Auth.)

  8. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement to be aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.

  9. Agronomic behavior of phosphoric rock from Bahia Inglesa using isotopic techniques. 1. Field trial with concentrated and non concentrated rock

    International Nuclear Information System (INIS)

    Pino N, I.; Casa G, L.

    1989-01-01

    With the aim of assessing the agronomic behaviour of the phosphoric rock from Bahia Inglesa, a field trial was carried out with concentrated and non-concentrated 100-mesh-sieved rock. The method of isotopic dilution was used, with P32-labeled TSP (TSP-P32) as the standard fertilizer. Total dry matter, total P by colorimetry, and P32 by liquid scintillation using the Cerenkov effect were measured. Both agronomic and isotopic parameters were analyzed. The concentrated phosphoric rock was 3.7 times better than the same non-concentrated rock. There was also a positive effect from the non-concentrated rock at the 400 kg P2O5/ha dose. This effect was attributed to a higher saturation of the P sorption points. The TSP showed a better behaviour than the phosphoric rock under study. (author)

  10. The big data phenomenon: The business and public impact

    Directory of Open Access Journals (Sweden)

    Chroneos-Krasavac Biljana

    2016-01-01

    The subject of the research in this paper is the emergence of the big data phenomenon and the application of big data technologies for business needs, with specific emphasis on marketing and trade. The purpose of the research is to make a comprehensive overview of different discussions about the characteristics, application possibilities, achievements, constraints and the future of big data development. Based on the relevant literature, the concept of big data is presented and the potential of a large impact of big data on business activities is discussed. One of the key findings indicates that the most prominent change that big data brings to the business arena is the appearance of new business models, as well as revisions of existing ones. A substantial part of the paper is devoted to marketing and marketing research, which are under the strong impact of big data. The most exciting outcomes of the research in this domain concern the new abilities in profiling customers. In addition to the vast amount of structured data which have been used in marketing for a long period, big data initiatives suggest the inclusion of semi-structured and unstructured data, opening up room for substantial improvements in customer profile analysis. Considering the usage of information communication technologies (ICT) as a prerequisite for big data project success, the concept of the Networked Readiness Index (NRI) is presented and the position of Serbia and regional countries in the NRI framework is analyzed. The main outcome of the analysis points out that Serbia, with its NRI score, took the lowest position in the region, excluding Albania. Also, Serbia is lagging behind the appropriate EU mean values regarding all observed composite indicators (pillars). Further, the analysis reveals the domains of ICT usage in Serbia which could be targeted for improvement and where incentives could be made. These domains are: political and regulatory environment, business and

  11. Preliminary geologic map of the Big Costilla Peak area, Taos County, New Mexico, and Costilla County, Colorado

    Science.gov (United States)

    Fridrich, Christopher J.; Shroba, Ralph R.; Hudson, Adam M.

    2012-01-01

    This map covers the Big Costilla Peak, New Mex.–Colo. quadrangle and adjacent parts of three other 7.5 minute quadrangles: Amalia, New Mex.–Colo., Latir Peak, New Mex., and Comanche Point, New Mex. The study area is in the southwesternmost part of that segment of the Sangre de Cristo Mountains known as the Culebra Range; the Taos Range segment lies to the southwest of Costilla Creek and its tributary, Comanche Creek. The map area extends over all but the northernmost part of the Big Costilla horst, a late Cenozoic uplift of Proterozoic (1.7-Ga and less than 1.4-Ga) rocks that is largely surrounded by down-faulted middle to late Cenozoic (about 40 Ma to about 1 Ma) rocks exposed at significantly lower elevations. This horst is bounded on the northwest side by the San Pedro horst and Culebra graben, on the northeast and east sides by the Devils Park graben, and on the southwest side by the (about 30 Ma to about 25 Ma) Latir volcanic field. The area of this volcanic field, at the north end of the Taos Range, has undergone significantly greater extension than the area to the north of Costilla Creek. The horsts and grabens discussed above are all peripheral structures on the eastern flank of the San Luis basin, which is the axial part of the (about 26 Ma to present) Rio Grande rift at the latitude of the map. The Raton Basin lies to the east of the Culebra segment of the Sangre de Cristo Mountains. This foreland basin formed during, and is related to, the original uplift of the Sangre de Cristo Mountains which was driven by tectonic contraction of the Laramide (about 70 Ma to about 40 Ma) orogeny. Renewed uplift and structural modification of these mountains has occurred during formation of the Rio Grande rift. Surficial deposits in the study area include alluvial, mass-movement, and glacial deposits of middle Pleistocene to Holocene age.

  12. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, which is related to storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which addresses big data management from both a technological and a business perspective.

  13. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  14. Liquid infiltration through the boiling-point isotherm in a desiccating fractured rock matrix

    International Nuclear Information System (INIS)

    Phillips, O.M.

    1994-01-01

    Over a long time interval, the integrity of the radioactive waste repository proposed at Yucca Mountain may be compromised by corrosion accelerated by intermittent wetting, which could occur by episodic infiltration of meteoric water from above through the fracture network. A simple two-dimensional model is constructed for the infiltration of liquid water down a fracture in a permeable rock matrix, beyond the boiling-point isotherm. The water may derive from episodic infiltration or from the condensation of steam above a desiccating region. Boiling of the water in the fracture is maintained by heat transfer from the surrounding superheated matrix blocks. There are two intrinsic length scales in this situation: (1) l_s = ρ_l·q_o·L/(k_m·β), over which the total heat flow balances that needed to evaporate the infiltrating liquid water, and (2) the thermal diffusion distance l_θ = (k_m·t)^(1/2), which increases with time after the onset of infiltration. The primary results are: (a) for two-dimensional infiltration down an isolated fracture or fault, the depth of penetration below the (undisturbed) boiling-point isotherm is given by (1/2)·π^(1/2)·(l_s·l_θ)^(1/2), and so increases as t^(1/4). Immediately following the onset of infiltration, penetration is rapid, but quickly slows. This behavior continues until l_θ (and D) become comparable with l_s. (b) With continuing infiltration down an isolated fracture or cluster of fractures, when l_θ >> l_s, the temperature distribution becomes steady and the penetration distance stabilizes at a value proportional to l_s. (c) Effects such as three-dimensionality of the liquid flow paths and flow rates, matrix infiltration, etc., appear to reduce the penetration distance
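
    The early-time scaling can be evaluated numerically; the sketch below uses illustrative parameter values (not the paper's) and shows the t^(1/4) growth of the penetration depth.

    ```python
    # Minimal sketch: D(t) = (1/2) * sqrt(pi) * sqrt(l_s * l_theta),
    # with l_theta = sqrt(k_m * t); valid while l_theta << l_s.
    import math

    l_s = 20.0      # heat-balance length scale, m (illustrative)
    k_m = 1e-6      # matrix thermal diffusivity, m^2/s (illustrative)

    def penetration_depth(t):
        """Depth (m) below the undisturbed boiling-point isotherm at time t (s)."""
        l_theta = math.sqrt(k_m * t)
        return 0.5 * math.sqrt(math.pi) * math.sqrt(l_s * l_theta)

    for days in (1, 10, 100):
        print(f"after {days:3d} days: D ≈ {penetration_depth(days * 86400.0):.2f} m")
    ```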

  15. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  16. Big Data and central banks

    Directory of Open Access Journals (Sweden)

    David Bholat

    2015-04-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  17. Uranium in the rock fragments from Lunar soil

    International Nuclear Information System (INIS)

    Komarov, A.N.; Sergeev, S.A.

    1983-01-01

    Uranium content and distribution in Lunar rock fragments 0.4-0.9 mm in size from the ''Luna-16, -20, -24'' stations were studied by the method of autoradiography. Uranium is almost absent in rock-forming minerals and is concentrated in some accessory minerals. The uranium content in microgabbro fragments from ''Luna-20 and -24'' equals (0.0n–n.0)×10^-6 g/g. The variations are not related to fragment representation. Radiographs of fragments from Lunar soil showed uranium distributions ranging from uniform (in glasses) to extremely nonuniform in some holocrystalline rocks. It was pointed out that uranium microdistributions in Lunar and Earth (effusive and magmatic) rocks have common features. In both cases rock-forming minerals do not contain an appreciable amount of uranium in the form of isomorphic admixture; uranium is highly concentrated in some accessory minerals. The difference lies in the absence in Lunar rocks of hydroxyl-containing secondary minerals, which are enriched with uranium on Earth. ''Film'' uranium micromineralization, which occurs in rocks of the Earth along the boundaries of mineral grains, is absent in Lunar rocks as well

  18. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  19. Results from the Big Spring basin water quality monitoring and demonstration projects, Iowa, USA

    Science.gov (United States)

    Rowden, R.D.; Liu, H.; Libra, R.D.

    2001-01-01

    Agricultural practices, hydrology, and water quality of the 267-km2 Big Spring groundwater drainage basin in Clayton County, Iowa, have been monitored since 1981. Land use is agricultural; nitrate-nitrogen (NO3-N) and herbicides are the resulting contaminants in groundwater and surface water. Ordovician Galena Group carbonate rocks comprise the main aquifer in the basin. Recharge to this karstic aquifer is by infiltration, augmented by sinkhole-captured runoff. Groundwater is discharged at Big Spring, where quantity and quality of the discharge are monitored. Monitoring has shown a threefold increase in groundwater nitrate-N concentrations from the 1960s to the early 1980s. The nitrate-N discharged from the basin typically is equivalent to over one-third of the nitrogen fertilizer applied, with larger losses during wetter years. Atrazine is present in groundwater all year; however, contaminant concentrations in the groundwater respond directly to recharge events, and unique chemical signatures of infiltration versus runoff recharge are detectable in the discharge from Big Spring. Education and demonstration efforts have reduced nitrogen fertilizer application rates by one-third since 1981. Relating declines in nitrate and pesticide concentrations to inputs of nitrogen fertilizer and pesticides at Big Spring is problematic. Annual recharge has varied five-fold during monitoring, overshadowing any water-quality improvements resulting from incrementally decreased inputs. © Springer-Verlag 2001.

  20. Big Bang, Blowup, and Modular Curves: Algebraic Geometry in Cosmology

    Science.gov (United States)

    Manin, Yuri I.; Marcolli, Matilde

    2014-07-01

    We introduce some algebraic geometric models in cosmology related to the ''boundaries'' of space-time: Big Bang, Mixmaster Universe, Penrose's crossovers between aeons. We suggest modelling the kinematics of the Big Bang using the algebraic geometric (or analytic) blow-up of a point x. This creates a boundary which consists of the projective space of tangent directions to x and possibly of the light cone of x. We argue that time on the boundary undergoes the Wick rotation and becomes purely imaginary. The Mixmaster (Bianchi IX) model of the early history of the universe is neatly explained in this picture by postulating that the reverse Wick rotation follows a hyperbolic geodesic connecting the imaginary time axis to the real one. Penrose's idea to see the Big Bang as a sign of crossover from ''the end of the previous aeon'' of the expanding and cooling Universe to the ''beginning of the next aeon'' is interpreted as an identification of a natural boundary of Minkowski space at infinity with the Big Bang boundary.

  1. A Review of Rock Bolt Monitoring Using Smart Sensors

    Directory of Open Access Journals (Sweden)

    Gangbing Song

    2017-04-01

    Rock bolts have been widely used as rock reinforcing members in underground coal mine roadways and tunnels. Failures of rock bolts occur as a result of overloading, corrosion, seismic burst and bad grouting, leading to catastrophic economic and personnel losses. Monitoring the health condition of the rock bolts plays an important role in ensuring the safe operation of underground mines. This work presents a brief introduction on the types of rock bolts followed by a comprehensive review of rock bolt monitoring using smart sensors. Smart sensors that are used to assess rock bolt integrity are reviewed to provide a firm perception of the application of smart sensors for enhanced performance and reliability of rock bolts. The most widely used smart sensors for rock bolt monitoring are the piezoelectric sensors and the fiber optic sensors. The methodologies and principles of these smart sensors are reviewed from the point of view of rock bolt integrity monitoring. The applications of smart sensors in monitoring the critical status of rock bolts, such as the axial force, corrosion occurrence, grout quality and resin delamination, are highlighted. In addition, several prototypes or commercially available smart rock bolt devices are also introduced.

  2. A Review of Rock Bolt Monitoring Using Smart Sensors.

    Science.gov (United States)

    Song, Gangbing; Li, Weijie; Wang, Bo; Ho, Siu Chun Michael

    2017-04-05

    Rock bolts have been widely used as rock reinforcing members in underground coal mine roadways and tunnels. Failures of rock bolts occur as a result of overloading, corrosion, seismic burst and bad grouting, leading to catastrophic economic and personnel losses. Monitoring the health condition of the rock bolts plays an important role in ensuring the safe operation of underground mines. This work presents a brief introduction on the types of rock bolts followed by a comprehensive review of rock bolt monitoring using smart sensors. Smart sensors that are used to assess rock bolt integrity are reviewed to provide a firm perception of the application of smart sensors for enhanced performance and reliability of rock bolts. The most widely used smart sensors for rock bolt monitoring are the piezoelectric sensors and the fiber optic sensors. The methodologies and principles of these smart sensors are reviewed from the point of view of rock bolt integrity monitoring. The applications of smart sensors in monitoring the critical status of rock bolts, such as the axial force, corrosion occurrence, grout quality and resin delamination, are highlighted. In addition, several prototypes or commercially available smart rock bolt devices are also introduced.

  3. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of registration, and IT-competent employees and customers, which make a leading position possible, but only if companies get ready for the next big data wave.

  4. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data, the idea that an ever-larger volume of information is being constantly recorded, suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusions obtained by statistical methods is increased when used on big data, either because of a systematic error (bias) or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but how to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.

  5. Determination of rock depth using artificial intelligence techniques

    Institute of Scientific and Technical Information of China (English)

    R. Viswanathan; Pijush Samui

    2016-01-01

    This article adopts three artificial intelligence techniques, Gaussian Process Regression (GPR), Least Squares Support Vector Machine (LSSVM) and Extreme Learning Machine (ELM), for the prediction of rock depth (d) at any point in Chennai. GPR, ELM and LSSVM are used as regression techniques, with latitude and longitude adopted as the inputs of the GPR, ELM and LSSVM models. The performance of the ELM, GPR and LSSVM models is compared. The developed ELM, GPR and LSSVM models reproduce the spatial variability of rock depth and offer robust models for the prediction of rock depth.
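
    A minimal sketch of one of the three techniques, Gaussian Process Regression on latitude/longitude, is given below with synthetic coordinates and depths (not the Chennai data set; kernel settings are illustrative).

    ```python
    # Minimal sketch: GPR of rock depth on (latitude, longitude).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(2)
    lat_lon = rng.uniform([12.9, 80.1], [13.2, 80.3], size=(100, 2))  # hypothetical sites
    depth = 5 + 20 * np.sin(30 * lat_lon[:, 0]) ** 2 + rng.normal(0, 0.5, 100)  # m, synthetic

    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.05) + WhiteKernel(),
                                   normalize_y=True).fit(lat_lon, depth)
    pred, std = gpr.predict([[13.05, 80.2]], return_std=True)
    print(f"predicted rock depth: {pred[0]:.1f} ± {std[0]:.1f} m")
    ```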

  6. Advances and Applications of Rock Physics for Hydrocarbon Exploration

    Directory of Open Access Journals (Sweden)

    Valle-Molina C.

    2012-10-01

    Full Text Available Integration of geological and geophysical information of different scales and features is the key to establishing relationships between the petrophysical and elastic characteristics of reservoir rocks. It is very important to present the fundamentals and current methodologies of rock physics analyses applied to hydrocarbon exploration to engineers and Mexican students. This work represents an effort to train oil exploration personnel through a review of the subjects of rock physics. The main aim is to show updated improvements and applications of rock physics in exploration seismology. Most of the methodologies presented in this document are related to the study of the physical and geological mechanisms that impact the elastic properties of reservoir rocks, based on rock specimen characterization and geophysical borehole information. Predictions of rock properties (lithology, porosity, fluid in the voids) can be performed using 3D seismic data that shall be properly calibrated with experimental measurements on rock cores and seismic well log data.

  7. Characteristics of business intelligence and big data in e-government

    DEFF Research Database (Denmark)

    Gaardboe, Rikke; Jonasen, Tanja Svarre; Kanstrup, Anne Marie

    2015-01-01

    Business intelligence and big data represent two different technologies within decision support systems. The present paper concerns the two concepts within the context of e-government. Thus, the purpose of the paper is to present preliminary findings regarding publication patterns and topic coverage within the two technologies by conducting a comparative literature review. A total of 281 papers published in the years 2005–2014 were included in the analysis. A rapid increase in papers regarding big data was identified, the majority being journal papers. As regards business intelligence, researchers publish in conference proceedings to a greater extent. Further, big data journal papers are published within a broader range of journal topics compared to business intelligence journal papers. The paper concludes by pointing to further analyses that will be carried out within the 281 selected...

  8. Determination of the thermal neutron absorption cross section for rock samples by a single measurement of the time decay constant

    International Nuclear Information System (INIS)

    Krynicka, E.

    1993-01-01

    A calibration method for the determination of the thermal neutron macroscopic mass absorption cross section for rock samples is presented. The standard deviation of the final results is discussed in detail. A big advantage of the presented method is that the calibration curves have been found using the results obtained for a variety of natural rock samples of different stratigraphies and lithologies, measured by Czubek's methods. An important part of the paper is a thorough analysis of the standard deviation of the final result. (author). 13 refs, 11 figs, 5 tabs
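
    The record does not reproduce the calibration formula itself, but the general shape of such a procedure - fit a calibration curve to standards of known cross section, then invert it for an unknown sample and propagate the fit uncertainty - can be sketched as follows. A linear curve is assumed purely for illustration, and all numbers are invented placeholders, not Czubek's data.

```python
import numpy as np

# Hypothetical calibration standards: measured thermal-neutron time decay
# constants (1/s) against known macroscopic mass absorption cross sections.
lam = np.array([4800.0, 5200.0, 5900.0, 6500.0, 7100.0])   # decay constants
sigma = np.array([18.5, 21.0, 25.2, 28.9, 32.4])           # known, cm^2/g

coef, cov = np.polyfit(lam, sigma, deg=1, cov=True)        # linear calibration
calib = np.poly1d(coef)

# A single measurement on an unknown rock sample is converted via the curve.
lam_sample = 6100.0
sigma_sample = calib(lam_sample)

# First-order standard deviation of the prediction from the fit covariance.
J = np.array([lam_sample, 1.0])
sigma_sd = float(np.sqrt(J @ cov @ J))
print(f"sigma = {sigma_sample:.1f} +/- {sigma_sd:.1f} cm^2/g")
```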

  9. Ionization and Corona Discharges from Stressed Rocks

    Science.gov (United States)

    Winnick, M. J.; Kulahci, I.; Cyr, G.; Tregloan-Reed, J.; Freund, F. T.

    2008-12-01

    Pre-earthquake signals have long been observed and documented, though they have not been adequately explained scientifically. These signals include air ionization, occasional flashes of light from the ground, radio frequency emissions, and effects on the ionosphere that occur hours or even days before large earthquakes. The theory that rocks function as p-type semiconductors when deviatoric stresses are applied offers a mechanism for this group of earthquake precursors. When an igneous or high-grade metamorphic rock is subjected to deviatoric stresses, peroxy bonds that exist in the rock's minerals as point defects dissociate, releasing positive hole charge carriers. The positive holes travel by phonon-assisted electron hopping from the stressed into and through the unstressed rock volume and build up a positive surface charge. At sufficiently large electric fields, especially along edges and sharp points of the rock, air molecules become field-ionized, losing an electron to the rock surface and turning into airborne positive ions. This in turn can lead to corona discharges, which manifest themselves by flashes of light and radio frequency emissions. We applied concentrated stresses to one end of a block of gabbro, 30 x 15 x 10 cm3, inside a shielded Faraday cage and observed positive ion currents through an air gap about 25 cm from the place where the stresses were applied, punctuated by short bursts, accompanied by flashes of light and radio frequency emissions characteristic of a corona discharge. These observations may serve to explain a range of pre-earthquake signals, in particular changes in air conductivity, luminous phenomena, radio frequency noise, and ionospheric perturbations.

  10. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  11. Aespoe Hard Rock Laboratory. Sensor Data Report No 23

    Energy Technology Data Exchange (ETDEWEB)

    Goudarzi, Reza; Johannesson, Lars-Erik (Clay Technology AB (Sweden))

    2010-11-15

    The Prototype Repository Test consists of two sections. The installation of the first Section of Prototype Repository was made during summer and autumn 2001 and Section 2 was installed in spring and summer 2003. This report presents data from measurements in the Prototype Repository during the period 20010917-20100601. The report is organized so that the actual measured results are shown in Appendix 1-11, where Appendix 8 deals with measurements of canister displacements (by AITEMIN), Appendix 9 deals with geo-electric measurements in the backfill (by GRS), Appendix 10 deals with stress and strain measurement in the rock (by AaF) and Appendix 11 deals with measurement of water pressure in the rock (by VBB/VIAK). The main report and Appendix 1-7 deal with the rest of the measurements. Section 1. The following measurements are made in the bentonite in each of the two instrumented deposition holes in Section 1 (1 and 3): Temperature is measured in 32 points, total pressure in 27 points, pore water pressure in 14 points and relative humidity in 37 points. Temperature is also measured by all relative humidity gauges. Every measuring point is related to a local coordinate system in the deposition hole. The following measurements are made in the backfill in Section 1. Temperature is measured in 20 points, total pressure in 18 points, pore water pressure in 23 points and relative humidity in 45 points. Temperature is also measured by all relative humidity gauges. Furthermore, water content is measured by an electric chain in one section. Every measuring point is related to a local coordinate system in the tunnel. The following measurements are made on the surface of the canisters in Section 1: Temperature is measured every meter along two fiber optic cables. Furthermore, displacements of the canister in hole 3 are measured with 6 gauges. The following measurements are made in the rock in Section 1: Temperature is measured in 37 points in boreholes in the floor. Water

  12. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  13. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  14. Potency of Agroindustrial Wastewaters to Increase the Dissolution of Phosphate Rock Fertilizers

    Directory of Open Access Journals (Sweden)

    Ainin Niswati

    2014-06-01

    Full Text Available The use of agroindustrial wastewaters is not yet maximal in Lampung Province, although they can serve as acid solvents because of their acidic properties. This study aimed to explore the agroindustrial wastewaters with the greatest potential for dissolving phosphate rock through acidulation at the laboratory scale. The experiment was arranged as a factorial. The first factor was the origin of the phosphate rock (Sukabumi, West Java, and Selagailingga, Central Lampung) and the second factor was the solvent type (agroindustrial wastewaters from pineapple, tapioca, tofu and palm oil processing, as well as the conventional acid solvents HCl, H2SO4, and CH3COOH). The incubation periods were 0, 1, 2, and 3 months. The results showed that the agroindustrial wastewater with the highest potency to solubilize phosphate rock was tofu industry wastewater, followed by the industrial wastewaters of tapioca, palm oil, and pineapple. Both the conventional acid and agroindustrial wastewater solvents had a high potency to solubilize phosphate rock; however, the highest soluble P value did not meet the SNI criteria for Quality I phosphate fertilizers because it did not reach a solubility of 80% of total P2O5, but it did qualify as phosphate fertilizer of qualities A, B, and C (SNI).

  15. The use of point load test for Dubai weak calcareous sandstones

    Directory of Open Access Journals (Sweden)

    Amr Farouk Elhakim

    2015-08-01

    Full Text Available Intact rock is typically described according to its uniaxial compressive strength (UCS). The UCS is needed in the design of geotechnical engineering problems including the stability of rock slopes and the design of shallow and deep foundations resting on and/or in rock. Accordingly, a correct measurement/evaluation of the UCS is essential to a safe and economic design. Typically, the UCS is measured using unconfined compression tests performed on cylindrical intact specimens with a minimum length-to-width ratio of 2. In several cases, especially for weak and very weak rocks, it is not possible to extract intact specimens with the needed minimum dimensions. Thus, alternative tests (e.g. the point load test, Schmidt hammer) are used to measure rock strength. The UCS is computed from the results of these tests through empirical correlations. The literature includes a plethora of these correlations, which vary widely in estimating rock strength. Thus, it is paramount to validate these correlations to check their suitability for estimating rock strength for a specific location and geology. A review of the available correlations used to estimate the UCS from point load test results is performed and summarized herein. Results of UCS, point load strength index and Young's modulus are gathered for calcareous sandstone specimens extracted from the Dubai area. A correlation for estimating the UCS from the point load strength index is proposed. Furthermore, the Young's modulus is correlated to the UCS.
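
    The correlations the abstract refers to usually take the multiplicative form UCS ≈ K * Is(50). A minimal sketch follows; the conversion factor K is strongly site- and lithology-dependent (values near 20-25 are often quoted for hard rocks, with much lower values reported for weak rocks), so the default below is a placeholder, not the correlation proposed in this paper.

```python
def ucs_from_point_load(is50_mpa: float, k: float = 10.0) -> float:
    """Estimate UCS (MPa) from the size-corrected point load index Is(50).

    The conversion factor k must be calibrated locally; the default here
    is a placeholder, not the Dubai calcareous sandstone correlation.
    """
    return k * is50_mpa

# Hypothetical weak calcareous sandstone specimen with Is(50) = 1.2 MPa.
print(f"UCS ~ {ucs_from_point_load(1.2):.1f} MPa")
```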

  16. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  17. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on the fact that we identify patterns in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  18. The geomechanical strength of carbonate rock in Kinta valley, Ipoh, Perak Malaysia

    Science.gov (United States)

    Mazlan, Nur Amanina; Lai, Goh Thian; Razib, Ainul Mardhiyah Mohd; Rafek, Abdul Ghani; Serasa, Ailie Sofyiana; Simon, Norbert; Surip, Noraini; Ern, Lee Khai; Mohamed, Tuan Rusli

    2018-04-01

    The stability of both rock cuts and underground openings is influenced by the geomechanical strength of the rock materials, while the strength characteristics are influenced by both material characteristics and the condition of weathering. This paper presents a systematic approach to quantify the rock material strength characteristics for material failure and for material & discontinuities failure, using the uniaxial compressive strength, point load strength index and Brazilian tensile strength of carbonate rocks. Statistical analysis of the results at the 95 percent confidence level showed that the mean compressive strengths for material failure and for material & discontinuities failure were 76.8 ± 4.5 and 41.2 ± 4.1 MPa, with standard deviations of 15.2 and 6.5 MPa, respectively. The point load strength indices for material failure and material & discontinuities failure were 3.1 ± 0.2 MPa and 1.8 ± 0.3 MPa, with standard deviations of 0.9 and 0.6 MPa, respectively. The Brazilian tensile strengths for material failure and material & discontinuities failure were 7.1 ± 0.3 MPa and 4.1 ± 0.3 MPa, with standard deviations of 1.4 and 0.6 MPa, respectively. The results of this research revealed that the geomechanical strengths of carbonate rock material for material & discontinuities failure deteriorate to approximately half of those for material failure.
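
    For readers who want to reproduce this kind of summary statistic, the sketch below computes a mean and its 95 percent confidence half-width from a sample of strength measurements using Student's t distribution. The input values are invented, not the paper's carbonate data.

```python
import numpy as np
from scipy import stats

# Hypothetical UCS measurements (MPa); not the paper's carbonate data.
ucs = np.array([62.1, 81.4, 75.0, 90.3, 68.7, 77.9, 85.2, 71.6])

n = len(ucs)
mean = ucs.mean()
sd = ucs.std(ddof=1)                                     # sample standard deviation
half = stats.t.ppf(0.975, df=n - 1) * sd / np.sqrt(n)    # 95 % half-width
print(f"UCS = {mean:.1f} +/- {half:.1f} MPa (sd = {sd:.1f} MPa, n = {n})")
```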

  19. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  20. Big Bang as a Critical Point

    Directory of Open Access Journals (Sweden)

    Jakub Mielczarek

    2017-01-01

    Full Text Available This article addresses the issue of possible gravitational phase transitions in the early universe. We suggest that a second-order phase transition observed in the Causal Dynamical Triangulations approach to quantum gravity may have a cosmological relevance. The phase transition interpolates between a nongeometric crumpled phase of gravity and an extended phase with classical properties. Transition of this kind has been postulated earlier in the context of geometrogenesis in the Quantum Graphity approach to quantum gravity. We show that critical behavior may also be associated with a signature change in Loop Quantum Cosmology, which occurs as a result of quantum deformation of the hypersurface deformation algebra. In the considered cases, classical space-time originates at the critical point associated with a second-order phase transition. Relation between the gravitational phase transitions and the corresponding change of symmetry is underlined.

  1. Seismic analysis of APR1400 RCS for site envelope using big mass method

    International Nuclear Information System (INIS)

    Kim, J. Y.; Jeon, J. H.; Lee, D. H.; Park, S. H.

    2002-01-01

    One of the design concepts of the APR1400 is a site envelope covering various soil sites as well as rock sites. The KSNPs are constructed on rock sites, where only translational excitations are directly transferred to the plant. On the other hand, rotational motions affect the responses of the structures in the soil cases. In this study, a Big Mass Method is used to consider rotational motions, in addition to translational ones, as excitations at the foundation, in order to obtain the seismic responses of the APR1400 RCS main components. Seismic analyses of the APR1400 excited simultaneously by translational and rotational motions were performed. The results show that the effect of soil sites is not significant for the design of the main components and supports of the RCS, but it may be considerable for the design of reactor vessel internals, piping, and nozzles, which have lower natural frequencies.
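
    The Big Mass Method itself is straightforward to demonstrate: release the support degree of freedom, attach a mass far larger than the structure, and apply a force F(t) = M_big * a_target(t); the support then tracks the target acceleration because the structural reaction forces are negligible by comparison. The sketch below applies this to a single oscillator; all parameters are illustrative, not APR1400 values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters for a single-mass structure on a released support.
M_big = 1e8    # "big" mass attached at the support DOF (kg)
m = 1.0        # structural mass (kg)
f_n = 2.0      # structural natural frequency (Hz)
k = m * (2.0 * np.pi * f_n) ** 2
c = 2.0 * 0.05 * np.sqrt(k * m)            # 5 % viscous damping

def a_g(t):
    """Target support acceleration history (placeholder sine excitation)."""
    return 0.3 * 9.81 * np.sin(2.0 * np.pi * 1.5 * t)

def rhs(t, y):
    x1, v1, x2, v2 = y                     # support DOF, structural DOF
    f_el = k * (x2 - x1) + c * (v2 - v1)   # force the structure exerts
    a1 = (M_big * a_g(t) + f_el) / M_big   # driving force F = M_big * a_g
    a2 = -f_el / m
    return [v1, a1, v2, a2]

sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-3)
# Because f_el << M_big * a_g, the support acceleration a1 tracks a_g(t):
# the big mass converts an applied force into an imposed acceleration.
```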

  2. Numerical analysis of the big bounce in loop quantum cosmology

    International Nuclear Information System (INIS)

    Laguna, Pablo

    2007-01-01

    Loop quantum cosmology (LQC) homogeneous models with a massless scalar field show that the big-bang singularity can be replaced by a big quantum bounce. To gain further insight on the nature of this bounce, we study the semidiscrete loop quantum gravity Hamiltonian constraint equation from the point of view of numerical analysis. For illustration purposes, we establish a numerical analogy between the quantum bounces and reflections in finite difference discretizations of wave equations triggered by the use of nonuniform grids or, equivalently, reflections found when solving numerically wave equations with varying coefficients. We show that the bounce is closely related to the method for the temporal update of the system and demonstrate that explicit time-updates in general yield bounces. Finally, we present an example of an implicit time-update devoid of bounces and show back-in-time, deterministic evolutions that reach and partially jump over the big-bang singularity

  3. Research of the Rock Art from the point of view of geography: the neolithic painting of the Mediterranean area of the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    Cruz Berrocal, María

    2004-12-01

    Full Text Available The rock art of the Mediterranean Arch (which includes what are conventionally called Levantine Rock Art, Schematic Rock Art and Macroschematic Rock Art, among other styles), inscribed on the World Heritage List in 1998, is studied from the point of view of the Archaeology of Landscape. The information sources used were field work, cartographic analysis and analysis in GIS, besides two rock art archives: the UNESCO Document and the Corpus of Levantine Cave Painting (Corpus de Pintura Rupestre Levantina). The initial hypothesis was that this rock art was involved in the process of neolithisation of the eastern part of Iberia, of which it is a symptom and a result, and it must be understood as an element of landscape construction. If this is true, it would have a concrete distribution in the form of locational patterns. Through statistical procedures and heuristic approaches, it has been demonstrated that there is a structure to the neolithic landscape, defined by rock art, which can be interpreted functionally and economically.

    The rock art of the Mediterranean Arch (which includes the styles conventionally known as Levantine Art, Schematic Art and Macroschematic Art, among others), designated a World Heritage Site in 1998, is studied from the point of view of its location. The information sources used were fieldwork, cartographic review and analysis in a Geographic Information System, in addition to two rock art archives: the UNESCO Document and the Corpus of Levantine Cave Painting (Corpus de Pintura Rupestre Levantina). The initial hypothesis was that this rock art is embedded in the process of neolithisation of the peninsular Levant, of which it is a symptom and a result, and must be understood as an element of landscape construction, from which it follows that it should present a determinable distribution in the form of locational patterns. By means of statistical contrasts and descriptions as well as

  4. Are Rural Costs of Living Lower? Evidence from a Big Mac Index Approach

    OpenAIRE

    Scott Loveridge; Dusan Paredes

    2015-01-01

    Rural leaders can point to low housing costs as a reason that their area should be competitive for business attraction. To what extent do rural housing costs offset transportation and other locational disadvantages in cost structures? The US lacks information to systematically answer the question. We adapt a strategy employed by The Economist in exploring purchasing power parity: the Big Mac Index. We gather information on Big Mac prices with a random sample of restaurants across the contigu...
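
    The index logic is a one-line computation: divide the local Big Mac price by a benchmark price to obtain a relative cost-of-living ratio. A toy sketch with invented prices:

```python
# Hypothetical Big Mac prices (USD) by location; invented values.
prices = {"rural_county": 3.95, "metro_benchmark": 5.10}

benchmark = prices["metro_benchmark"]
for place, price in prices.items():
    # A ratio below 1.0 implies a lower local cost of living than the
    # benchmark, the same logic The Economist applies across currencies.
    print(f"{place}: index = {price / benchmark:.3f}")
```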

  5. Big bang in a universe with infinite extension

    Energy Technology Data Exchange (ETDEWEB)

    Groen, Oeyvind [Oslo College, Department of Engineering, PO Box 4, St Olavs Pl, 0130 Oslo (Norway); Institute of Physics, University of Oslo, PO Box 1048 Blindern, 0316 Oslo (Norway)

    2006-05-01

    How can a universe coming from a point-like big bang event have infinite spatial extension? It is shown that the relativity of simultaneity is essential in answering this question. Space is finite as defined by the simultaneity of one observer, but it may be infinite as defined by the simultaneity of all the clocks participating in the Hubble flow.

  6. Big bang in a universe with infinite extension

    International Nuclear Information System (INIS)

    Groen, Oeyvind

    2006-01-01

    How can a universe coming from a point-like big bang event have infinite spatial extension? It is shown that the relativity of simultaneity is essential in answering this question. Space is finite as defined by the simultaneity of one observer, but it may be infinite as defined by the simultaneity of all the clocks participating in the Hubble flow

  7. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions related to big data. In this first part I will try to set out a few things concerning Big Data theory and

  8. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  9. Determining the Accuracy of Paleomagnetic Remanence and High-Resolution Chronostratigraphy for Sedimentary Rocks using Rock Magnetics

    Science.gov (United States)

    Kodama, K. P.

    2017-12-01

    The talk will consider two broad topics in rock magnetism and paleomagnetism: the accuracy of paleomagnetic remanence and the use of rock magnetics to measure geologic time in sedimentary sequences. The accuracy of the inclination recorded by sedimentary rocks is crucial to paleogeographic reconstructions. Laboratory compaction experiments show that inclination shallows on the order of 10˚-15˚. Corrections to the inclination can be made using the effects of compaction on the directional distribution of secular variation recorded by sediments, or the anisotropy of the magnetic grains carrying the ancient remanence. A summary of all the compaction correction studies as of 2012 shows that 85% of the sedimentary rocks studied have undergone some amount of inclination shallowing. Future work should also consider the effect of grain-scale strain on paleomagnetic remanence. High-resolution chronostratigraphy can be assigned to a sedimentary sequence by using rock magnetics to detect astronomically forced climate cycles. The power of the technique lies in relatively quick, non-destructive measurements, the objective identification of the cycles compared to facies interpretations, and the sensitivity of rock magnetics to subtle changes in sedimentary source. An example of this technique comes from using rock magnetics to identify astronomically forced climate cycles in three globally distributed occurrences of the Shuram carbon isotope excursion. The Shuram excursion may record the oxidation of the world ocean in the Ediacaran, just before the Cambrian explosion of metazoans. Using rock magnetic cyclostratigraphy, the excursion is shown to have the same duration (8-9 Myr) in southern California, south China and south Australia. Magnetostratigraphy of the rocks carrying the excursion in California and Australia shows a reversed-to-normal geomagnetic field polarity transition at the excursion's nadir, thus supporting the synchroneity of the excursion globally. Both results point to a

  10. Relationship between natural radioactivity and rock type in the Van lake basin - Turkey

    International Nuclear Information System (INIS)

    Tolluoglu, A. U.; Eral, M.; Aytas, S.

    2004-01-01

    The Van Lake basin is located in the eastern part of Turkey and essentially comprises two provinces, namely Van and Bitlis. Previous geochemical research indicated that the uranium concentrations of Van Lake water and deep sediments are 78-116 ppb and 0.1-0.5 ppm, respectively. Uranium was transported to Van Lake by rivers and streams flowing through outcrops of the Paleozoic Bitlis Massive and young Pleistocene alkaline/calcalkaline volcanic rocks. This study focused on revealing the natural radioactivity, and the secondary dispersion of radioactivity related to rock types, in surface environments of the Van Lake basin. The Van Lake basin is essentially subdivided into three parts: the eastern part is characterized by Mesozoic basic and ultrabasic rocks, the southern part is dominated by metamorphic rocks of the Bitlis Massive, and the western and northwestern parts are covered by volcanic rocks of Pleistocene age. The volcanic rocks can be subdivided into two types. The first type is mafic rocks, mainly composed of basalts. The second type is felsic rocks, represented by rhyolites, dacites and pumice tuff. Surface gamma measurements (cps) and dose rate measurements (μR/h) show different values according to rock type. Surface gamma measurement and surface dose rate values in the basaltic rocks are slightly higher than the average values (130 cps, 11 μR/h). In the felsic volcanic rocks, such as rhyolites and dacites, surface gamma measurement and surface dose rate values occasionally exceed the background. The highest values were obtained in the pumice tuffs. Rhyolitic eruptions related to Quaternary volcanic activity formed thick pumice (a natural glassy froth related to felsic volcanic rocks, exhibiting a spongy texture) sequences in the northern and western parts of the Van Lake basin. The dose rate of the pumice rocks was measured at a mean of 15 μR/h, and the highest surface gamma measurement was recorded as 200 cps. The pumice has a very large water capacity, due to the porous texture of

  11. Environmental Consequences of Big Nasty Impacts on the Early Earth

    Science.gov (United States)

    Zahnle, Kevin

    2015-01-01

    The geological record of the Archean Earth is spattered with impact spherules from a dozen or so major cosmic collisions involving Earth and asteroids or comets (Lowe, Byerly 1986, 2015). Extrapolation of the documented deposits suggests that most of these impacts were as big or bigger than the Chicxulub event that famously ended the reign of the thunder lizards. As the Archean impacts were greater, the environmental effects were also greater. The number and magnitude of the impacts is bounded by the lunar record. There are no lunar craters bigger than Chicxulub that date to Earth's mid-to-late Archean. Chance dictates that Earth experienced no more than approximately 10 impacts bigger than Chicxulub between 3.5 and 2.5 billion years ago, the biggest of which were approximately 30-100 times more energetic, comparable to the Orientale impact on the Moon (1x10^26 joules). To quantify the thermal consequences of big impacts on the old Earth, we model the global flow of energy from the impact into the environment. The model presumes that a significant fraction of the impact energy goes into ejecta that interact with the atmosphere. Much of this energy is initially in rock vapor, melt, and high speed particles. (i) The upper atmosphere is heated by ejecta as they reenter the atmosphere. The mix of hot air, rock vapor, and hot silicates cools by thermal radiation. Rock raindrops fall out as the upper atmosphere cools. (ii) The energy balance of the lower atmosphere is set by radiative exchange with the upper atmosphere and with the surface, and by evaporation of seawater. Subsequent cooling is governed by condensation of water vapor. (iii) The oceans are heated by thermal radiation and rock rain and cooled by evaporation. Surface waters become hot and salty; if a deep ocean remains it is relatively cool. Subsequently water vapor condenses to replenish the oceans with hot fresh water (how fresh depending on continental weathering, which might be rather rapid

  12. Aespoe Hard Rock Laboratory. Sensor Data Report No 23

    International Nuclear Information System (INIS)

    Goudarzi, Reza; Johannesson, Lars-Erik

    2010-11-01

    The Prototype Repository Test consists of two sections. The installation of the first Section of Prototype Repository was made during summer and autumn 2001 and Section 2 was installed in spring and summer 2003. This report presents data from measurements in the Prototype Repository during the period 20010917-20100601. The report is organized so that the actual measured results are shown in Appendix 1-11, where Appendix 8 deals with measurements of canister displacements (by AITEMIN), Appendix 9 deals with geo-electric measurements in the backfill (by GRS), Appendix 10 deals with stress and strain measurement in the rock (by AaF) and Appendix 11 deals with measurement of water pressure in the rock (by VBB/VIAK). The main report and Appendix 1-7 deal with the rest of the measurements. Section 1. The following measurements are made in the bentonite in each of the two instrumented deposition holes in Section 1 (1 and 3): Temperature is measured in 32 points, total pressure in 27 points, pore water pressure in 14 points and relative humidity in 37 points. Temperature is also measured by all relative humidity gauges. Every measuring point is related to a local coordinate system in the deposition hole. The following measurements are made in the backfill in Section 1. Temperature is measured in 20 points, total pressure in 18 points, pore water pressure in 23 points and relative humidity in 45 points. Temperature is also measured by all relative humidity gauges. Furthermore, water content is measured by an electric chain in one section. Every measuring point is related to a local coordinate system in the tunnel. The following measurements are made on the surface of the canisters in Section 1: Temperature is measured every meter along two fiber optic cables. Furthermore, displacements of the canister in hole 3 are measured with 6 gauges. The following measurements are made in the rock in Section 1: Temperature is measured in 37 points in boreholes in the floor. Water

  13. Water - rock interaction in different rock environments

    International Nuclear Information System (INIS)

    Lamminen, S.

    1995-01-01

    The study assesses the groundwater geochemistry and geological environment of 44 study sites for radioactive waste disposal. Initially, the study sites were divided by rock type into 5 groups: (1) acid - intermediate rocks, (2) mafic - ultramafic rocks, (3) gabbros, amphibolites and gneisses that contain calc-silicate (skarn) rocks, (4) carbonates and (5) sandstones. Separate assessments are made of acid - intermediate plutonic rocks and of a subgroup that comprises migmatites, granite and mica gneiss. These all belong to the group of acid - intermediate rocks. Within the mafic -ultramafic rock group, a subgroup that comprises mafic - ultramafic plutonic rocks, serpentinites, mafic - ultramafic volcanic rocks and volcanic - sedimentary schists is also evaluated separately. Bedrock groundwaters are classified by their concentration of total dissolved solids as fresh, brackish, saline, strongly saline and brine-class groundwaters. (75 refs., 24 figs., 3 tabs.)

  14. Rock index properties for geoengineering in the Paradox Basin

    International Nuclear Information System (INIS)

    O'Rourke, J.E.; Rey, P.H.; Alviti, E.; Capps, C.C.

    1986-02-01

    Previous researchers have investigated the use of a number of rapid index tests that can be used on core samples, or in situ, to determine rock properties needed for geoengineering design, or to predict construction performance in these rock types. Selected research is reviewed, and the correlations of index tests with laboratory tests of rock properties found by the earlier investigators are discussed. The selection and testing of rock core samples from the Gibson Dome No. 1 borehole in Paradox Basin are described. The samples consist primarily of non-salt rock above salt cycle 6, but include some samples of anhydrite and salt cycle 6. The index tests included the point load test, Schmidt hammer rebound test, and abrasion hardness test. Statistical methods were used to analyze the correlations of index test data with laboratory test data of rock properties for the same core. Complete statistical results and computer-generated graphics are presented; these results are discussed in relation to the work of earlier investigations for index testing of similar rock types. Generally, fair to good correlations were obtained for predicting unconfined compressive strength and Young's modulus for sandstone and siltstone, while poorer correlations were found for limestone. This may be due to the large variability of limestone properties compared to the small number of samples. Overall, the use of index tests to assess rock properties at Paradox Basin appears to be practical for some conceptual and preliminary design needs, and the technique should prove useful at any salt repository site. However, it is likely that specific correlations should be demonstrated separately for each site, and the data base for establishing the correlations should probably include at least several hundred data points for each type
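
    The correlation analysis described here amounts to regressing a laboratory property against an index-test result and reporting the goodness of fit. A minimal sketch with invented paired data (not the Gibson Dome core results):

```python
import numpy as np
from scipy import stats

# Hypothetical paired core measurements: Schmidt hammer rebound number
# versus unconfined compressive strength (MPa); invented values.
rebound = np.array([28.0, 32.0, 35.0, 39.0, 41.0, 44.0, 47.0, 50.0])
ucs = np.array([31.0, 40.0, 47.0, 58.0, 66.0, 74.0, 85.0, 95.0])

fit = stats.linregress(rebound, ucs)
print(f"UCS ~= {fit.slope:.2f} * R + {fit.intercept:.2f}")
print(f"r^2 = {fit.rvalue ** 2:.3f}")     # strength of the correlation
```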

  15. Big Rock Point Nuclear Plant. Semiannual operations report No. 22, January--June 1975

    International Nuclear Information System (INIS)

    1975-01-01

    Net electrical power generated was 50,198.2 MWH(e) with the reactor on line 922.6 hrs. Information is presented concerning power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, and abnormal occurrences. (FS)

  16. Seismic capacities of masonry walls at the big rock point nuclear generating plant

    International Nuclear Information System (INIS)

    Wesley, D.A.; Bunon, H.; Jenkins, R.B.

    1984-01-01

    An evaluation to determine the ability of selected concrete block walls in the vicinity of essential equipment to withstand seismic excitation was conducted. The seismic input to the walls was developed in accordance with the Systematic Evaluation Program (SEP) site-specific response spectra for the site. Time-history inputs to the walls were determined from the response of the turbine building complex. Analyses were performed to determine the capacities of the walls to withstand both in-plane and transverse seismic loads. Transverse load capacities were determined from time-history analyses of nonlinear two-dimensional analytical models of the walls. Separate inputs were used at the tops and bottoms of the walls to reflect the amplification through the building. The walls were unreinforced vertically with one exception, and have unsupported heights as high as 20'-8''. Also, cantilever walls as high as 11'-2'' were included in the evaluation. Factors of safety based on stability of the walls were determined for the transverse response, and on code allowable stresses (Reference 1) for the in-plane response

  17. Research of long-term mechanical displaced behavior of soft rock

    International Nuclear Information System (INIS)

    Inoue, Hiroyuki; Minami, Kosuke

    2003-01-01

    When considering a geological disposal system for high-level radioactive waste, it is important to evaluate realistically the long-term mechanical deformation behavior of the near-field bedrock, which forms the boundary condition of the engineered barrier. In this research, the following three investigations were carried out to improve the reliability of estimates of long-term deformation behavior. 1) We evaluated the sedimentary rock of Horonobe using the Okubo model while changing the hydraulic and temperature conditions. 2) We carried out a model experiment in which internal pressure was applied, in order to grasp the movement of the near-field bedrock. 3) We examined a model to evaluate this behavior. As a result, the following findings were obtained. 1) The sedimentary rock of Horonobe is prone to strength degradation under wetting and drying cycles. When the rock is saturated after drying, it breaks along potential cracks; the rock reacts sensitively to changes in moisture content. In addition, the strength varies even over small differences in depth, and this scatter strongly influenced the failure time. 2) Large plastic deformation may not follow the elasto-plastic behavior predicted by theory for the stress redistribution of the rock mass. 3) We consider the simple assumption 'm = n' (constants of the Okubo model) to be one factor producing the discrepancy between predicted and actual creep times. Therefore it is necessary to collect post-peak data and to grasp 'm/n'. In addition, it is necessary to improve 'n' in the model so that it can vary with the environment and the stress state along the way. (author)

  18. Development of a Unified Rock Bolt Model in Discontinuous Deformation Analysis

    Science.gov (United States)

    He, L.; An, X. M.; Zhao, X. B.; Zhao, Z. Y.; Zhao, J.

    2018-03-01

    In this paper, a unified rock bolt model is proposed and incorporated into the two-dimensional discontinuous deformation analysis. In the model, the bolt shank is discretized into a finite number of (modified) Euler-Bernoulli beam elements with the degrees of freedom represented at the end nodes, while the face plate is treated as solid blocks. The rock mass and the bolt shank deform independently, but interact with each other through a few anchored points. The interactions between the rock mass and the face plate are handled via general contact algorithm. Different types of rock bolts (e.g., Expansion Shell, fully grouted rebar, Split Set, cone bolt, Roofex, Garford and D-bolt) can be realized by specifying the corresponding constitutive model for the tangential behavior of the anchored points. Four failure modes, namely tensile failure and shear failure of the bolt shank, debonding along the bolt/rock interface and loss of the face plate, are available in the analysis procedure. The performance of a typical conventional rock bolt (fully grouted rebar) and a typical energy-absorbing rock bolt (D-bolt) under the scenarios of suspending loosened blocks and rock dilation is investigated using the proposed model. The reliability of the proposed model is verified by comparing the simulation results with theoretical predictions and experimental observations. The proposed model could be used to reveal the mechanism of each type of rock bolt in realistic scenarios and to provide a numerical way for presenting the detailed profile about the behavior of bolts, in particular at intermediate loading stages.
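
    As a concrete reference for the element type named here, the sketch below builds the local stiffness matrix of a 2D Euler-Bernoulli beam element with axial terms, i.e. the textbook form of the element the bolt shank is discretized into; it is not the authors' implementation.

```python
import numpy as np

def beam_element_stiffness(E: float, A: float, I: float, L: float) -> np.ndarray:
    """Local stiffness matrix of a 2D Euler-Bernoulli beam element.

    DOF order: [u1, v1, theta1, u2, v2, theta2] (axial, transverse,
    rotation at each end node). Textbook values, not the paper's code.
    """
    a = E * A / L          # axial stiffness
    b = E * I / L**3       # bending stiffness scale
    return np.array([
        [ a,      0,         0,        -a,      0,         0        ],
        [ 0,  12*b,      6*b*L,         0,  -12*b,      6*b*L       ],
        [ 0,  6*b*L,  4*b*L**2,         0,  -6*b*L,  2*b*L**2       ],
        [-a,      0,         0,         a,      0,         0        ],
        [ 0, -12*b,     -6*b*L,         0,   12*b,     -6*b*L       ],
        [ 0,  6*b*L,  2*b*L**2,         0,  -6*b*L,  4*b*L**2       ],
    ])

# e.g. a 2 m steel bolt segment, 20 mm diameter (A = pi r^2, I = pi d^4 / 64)
K = beam_element_stiffness(E=200e9, A=3.14e-4, I=7.85e-9, L=2.0)
```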

  19. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  20. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  1. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  2. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, the evidence for an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second later) and the fate of the Universe are all discussed. (U.K.)

  3. Numerical modelling of fluid-rock interactions: Lessons learnt from carbonate rocks diagenesis studies

    Science.gov (United States)

    Nader, Fadi; Bachaud, Pierre; Michel, Anthony

    2015-04-01

    Quantitative assessment of fluid-rock interactions and their impact on carbonate host rocks has recently become a very attractive research topic within academic and industrial realms. Today, a common operational workflow that aims at predicting the relevant diagenetic processes affecting the host rocks (i.e. fluid-rock interactions) consists of three main stages: i) constructing a conceptual diagenesis model including inferred preferential fluid pathways; ii) quantifying the resulting diagenetic phases (e.g. precipitating cements, dissolved and recrystallized minerals); and iii) numerical modelling of diagenetic processes. Most concepts of diagenetic processes operate at the larger, basin scale; however, the description of the diagenetic phases (products of such processes) and their association with the overall petrophysical evolution of sedimentary rocks remains at the reservoir (and even outcrop/well core) scale. Conceptual models of diagenetic processes are thereafter constructed based on studying surface-exposed rocks and well cores (e.g. petrography, geochemistry, fluid inclusions). We are able to quantify the diagenetic products with various evolving techniques and on varying scales (e.g. point-counting, 2D and 3D image analysis, XRD, micro-CT and pore network models). Geochemical modelling makes use of thermodynamic and kinetic rules as well as databases to simulate chemical reactions and fluid-rock interactions. This can be done through a 0D model, whereby a certain process is tested (e.g. the likelihood that a certain chemical reaction operates under specific conditions). The results relate to the fluids and mineral phases involved in the chemical reactions; they can be used as arguments to support or refute proposed outcomes of fluid-rock interactions. Coupling geochemical modelling with transport (reactive transport models; 1D, 2D and 3D) is another possibility, attractive as it provides forward simulations of diagenetic processes and the resulting phases. This
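
    A 0D geochemical model of the kind described, testing whether a reaction proceeds under given conditions, can be as small as one rate equation integrated in time. The sketch below integrates a first-order dissolution law toward equilibrium; the rate constant, reactive surface area and equilibrium concentration are illustrative placeholders, not values from any reactive transport code.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal 0D sketch of mineral dissolution kinetics toward equilibrium:
# dC/dt = k * (A/V) * (1 - C/C_eq); all parameter values are illustrative.
k = 1e-6        # rate constant (mol / m^2 / s)
A_V = 100.0     # reactive surface area per fluid volume (m^2 / m^3)
C_eq = 0.5      # equilibrium concentration (mol / m^3)

def rate(t, C):
    return k * A_V * (1.0 - C / C_eq)

sol = solve_ivp(rate, (0.0, 1e7), [0.0], max_step=1e5)
# sol.y[0] relaxes exponentially toward C_eq: the 0D analogue of testing
# whether a given dissolution reaction can operate under set conditions.
```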

  4. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  5. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  6. Mechanical properties of granitic rocks from Gideaa, Sweden

    International Nuclear Information System (INIS)

    Ljunggren, C.; Stephansson, O.; Alm, O.; Hakami, H.; Mattila, U.

    1985-10-01

    The elastic and mechanical properties were determined for two rock types from the Gideaa study area. Gideaa is located approximately 30 km north-east of Oernskoeldsvik, northern Sweden. The rock types tested were migmatitic gneiss and migmatitic granite. The following tests were conducted: - sound velocity measurements; - uniaxial compression tests with acoustic emission recording; - Brazilian disc tests; - triaxial tests; - three-point bending tests. Altogether, 12 rock samples were tested with each test method; six of these were migmatitic gneiss and six were migmatitic granite. The results show that the migmatitic gneiss has varying strength properties, with a low compressive strength in comparison with its high tensile strength. The migmatitic granite, on the other hand, was found to have parameter values similar to other granitic rocks. With 15 refs. (Author)

  7. Globalisation, big business and the Blair government

    OpenAIRE

    Grant, Wyn

    2000-01-01

    After reviewing definitions of globalisation, this paper suggests that the 'company state' model is becoming increasingly important in business-government relations. It is argued that Prime Minister Blair has a particular construction of globalisation which fits in well with the agenda of big international business. However, increasing tensions have arisen in the relationship between New Labour and business, reaching crisis point in May 2000. The paper concludes by suggesting that Burnham's de...

  8. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  9. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Ariel Hamlin, Nabil... Email: arkady@ll.mit.edu. Chapter 1, Cryptography for Big Data Security, 1.1 Introduction: With the amount

  10. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  11. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  12. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure like the exabyte, zettabyte, and yottabyte to express the amount of data. This growth creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  13. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Data beyond the storage capacity and processing power of conventional systems is called big data. The term is used for data sets so large or complex that traditional tools cannot handle them. Big data size is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes of data; on social networking sites, for example, the amount of data produced by people is growing rapidly every year. Big data is not only data; it has become a complete subject that includes various tools, techniques and frameworks. It reflects the epidemic growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from large datasets that are diverse, complex, and of a massive scale. It is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. A big data environment is used to capture, organize and resolve the various types of data. In this paper we describe the applications, problems and tools of big data and give an overview of big data.

  14. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  15. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  16. Improving Site Characterization for Rock Dredging using a Drilling Parameter Recorder and the Point Load Test

    Science.gov (United States)

    1994-09-01

    materials. Also, available data from drilling rates in the mining and tunneling industries (Howarth and Rowlands 1987, Somerton 1959) indicate a... selected uniform natural rock materials and several man-made rock simulants were used to obtain drilling parameter records for materials of known... Dredging Seminar, Atlantic City, NJ, May 1993. Western Dredging Association (WEDA) and Texas A&M University. Somerton, W. H. (1959). "A laboratory study of

  17. Uranium occurrence in major rock types by fission-track mapping

    International Nuclear Information System (INIS)

    Ledger, E.G.; Bomber, B.J.; Schaftenaar, W.E.; Tieh, T.T.

    1984-01-01

    Microscopic occurrence of uranium has been determined in about 50 igneous rocks from various locations, and in a genetically unrelated sandstone from south Texas. Precambrian granites from the Llano uplift of central Texas contain from a few ppm uranium (considered normal) to over 100 ppm on a whole-rock basis. In granite, uranium is concentrated in: (1) accessory minerals including zircon, biotite, allanite, Fe-Ti oxides, and altered sphene, (2) along grain boundaries and in microfractures by precipitation from deuteric fluids, and (3) as point sources (small inclusions) in quartz and feldspars. Tertiary volcanic rocks from the Davis Mountains of west Texas include diverse rock types from basalt to rhyolite. Average uranium contents increase from 1 ppm in basalts to 7 ppm in rhyolites. Concentration occurs: (1) in iron-titanium oxides, zircon, and rutile, (2) in the fine-grained groundmass as uniform and point-source concentrations, and (3) as late uranium in cavities associated with banded, silica-rich material. Uranium in ore-grade sandstone is concentrated to more than 3%. Specific occurrences include (1) leucoxene and/or anatase, (2) opaline and calcite cements, (3) mud clasts and altered volcanic rock fragments, and (4) in a few samples, silt-size uranium- and molybdenum-rich spheres. Uranium content is quite low in pyrite, marcasite, and zeolites

  18. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  19. Age and gender might influence big five factors of personality: a preliminary report in Indian population.

    Science.gov (United States)

    Magan, Dipti; Mehta, Manju; Sarvottam, Kumar; Yadav, Raj Kumar; Pandey, R M

    2014-01-01

    Age and gender are two important physiological variables which might influence the personality of an individual. The influence of age and gender on big five personality domains in an Indian population was assessed in this cross-sectional study that included 155 subjects (female = 76, male = 79) aged from 16-75 years. Big five personality factors were evaluated using the 60-item NEO-Five Factor Inventory (NEO-FFI) at a single point in time. Among the big five factors of personality, Conscientiousness was positively correlated (r = 0.195; P personality traits might change with age and are gender-dependent.

  20. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies across the ecological community. Many sensor networks have been established to collect data. For example, satellites, such as Terra and OCO-2 among others, have collected data relevant to the global carbon cycle. Thousands of manipulative field experiments have been conducted to examine feedback of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from the sensors of those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  1. The comparative analysis of rocks' resistance to forward-slanting disc cutters and traditionally installed disc cutters

    Science.gov (United States)

    Zhang, Zhao-Huang; Fei, Sun; Liang, Meng

    2016-08-01

    At present, disc cutters of a full face rock tunnel boring machine are mostly mounted in the traditional way. Practical use in engineering projects reveals that this installation method not only heavily affects the operating life of disc cutters, but also increases the energy consumption of a full face rock tunnel boring machine. To address this issue, a rock-breaking model is developed for the movement of forward-slanting disc cutters. Equations of displacement are established based on the analysis of the velocity vector of a disc cutter's rock-breaking point. Functional relations are then derived between the displacement parameters of a rock-breaking point and its coordinates through the analysis of the micro-displacement of a rock-breaking point. Thus, the geometric equations of rock deformation are derived for the forward-slanting installation of disc cutters. With a linear relationship remaining between the acting force and its deformation either before or after leap breaking, the constitutive relation of rock deformation can be expressed in the form of the generalized Hooke law, allowing a comparative analysis of the variation in rock resistance to disc cutters mounted in the forward-slanting way versus the traditional way. It is found that with the same penetration, strain of the rock in contact with forward-slanting disc cutters is clearly lower; in other words, the resistance of rock to the disc cutters is reduced. Thus wear of disc cutters resulting from friction is lowered and energy consumption is correspondingly decreased. These results are useful for the development of installation and design theory of disc cutters, and significant for breakthroughs in the design of full face rock tunnel boring machines.
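
    The constitutive step above is the generalized Hooke law for an isotropic elastic medium. As a minimal illustration of that relation (not the paper's full model), the sketch below computes principal strains from principal stresses; the elastic constants and stress values are invented for the example:

```python
import numpy as np

def hooke_principal_strains(sigma, E, nu):
    """Generalized Hooke's law for an isotropic elastic rock:
    eps_i = (sigma_i - nu * (sigma_j + sigma_k)) / E."""
    s1, s2, s3 = sigma
    return np.array([(s1 - nu * (s2 + s3)) / E,
                     (s2 - nu * (s3 + s1)) / E,
                     (s3 - nu * (s1 + s2)) / E])

# Hypothetical granite-like constants (E = 50 GPa, nu = 0.25) and an
# invented stress state under a cutter contact, in Pa.
eps = hooke_principal_strains((120e6, 40e6, 40e6), E=50e9, nu=0.25)
print(eps)  # principal strains (dimensionless)
```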

  2. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  3. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  4. Seismic activity and environment protection in rock burst areas

    International Nuclear Information System (INIS)

    Travnicek, L.; Holecko, J.; Knotek, S.

    1993-01-01

    The significance of seismic activity caused by mining in the rock burst areas of the Ostrava-Karviná district is pointed out. The need for monitoring of seismic activity at the Czech-Polish border, as required by the two-party international committee for the exploitation of coal reserves on the common border, is emphasized. The adverse effect of rock bursts on the surface is documented by examples provided by the Polish party. The measurement technique of the DPB seismic polygon, which allows evaluation of the adverse impact of rock bursts on the environment, is described. (author) 1 fig., 8 refs

  5. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  6. Thermal Inertia of Rocks and Rock Populations

    Science.gov (United States)

    Golombek, M. P.; Jakosky, B. M.; Mellon, M. T.

    2001-01-01

    The effective thermal inertia of rock populations on Mars and Earth is derived from a model of effective inertia versus rock diameter. Results allow a parameterization of the effective rock inertia versus rock abundance and bulk and fine component inertia. Additional information is contained in the original extended abstract.
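
    The effective inertias discussed above build on the standard definition of thermal inertia, I = √(kρc), with k the thermal conductivity, ρ the bulk density and c the specific heat. A minimal sketch; the material properties below are illustrative assumptions, not values from the abstract:

```python
import math

def thermal_inertia(k, rho, c):
    """Thermal inertia I = sqrt(k * rho * c),
    in SI units: J m^-2 K^-1 s^-1/2."""
    return math.sqrt(k * rho * c)

# Illustrative end members (assumed values):
print(thermal_inertia(2.0, 2900, 800))   # ~2154, dense basalt-like rock
print(thermal_inertia(0.02, 1300, 800))  # ~144, loose dust-like fines
```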

  7. The Big Bang as the Ultimate Traffic Jam

    Science.gov (United States)

    Jejjala, Vishnu; Kavic, Michael; Minic, Djordje; Tze, Chia-Hsiung

    We present a novel solution to the nature and formation of the initial state of the Universe. It derives from the physics of a generally covariant extension of matrix theory. We focus on the dynamical state space of this background-independent quantum theory of gravity and matter — an infinite-dimensional, complex, nonlinear Grassmannian. When this space is endowed with a Fubini-Study-like metric, the associated geodesic distance between any two of its points is zero. This striking mathematical result translates into a physical description of a hot, zero-entropy Big Bang. The latter is then seen as a far-from-equilibrium, large-fluctuation-driven, metastable ordered transition — a "freezing by heating" jamming transition. Moreover, the subsequent unjamming transition could provide a mechanism for inflation while rejamming may model a Big Crunch, the final state of gravitational collapse.

  8. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  9. Information base for waste repository design. Volume 3. Waste/rock interactions

    International Nuclear Information System (INIS)

    Koplick, C.M.; Pentz, D.L.; Oston, S.G.; Talbot, R.

    1979-01-01

    This report describes the important effects resulting from interaction between radioactive waste and the rock in a nuclear waste repository. The state of the art in predicting waste/rock interactions is summarized. Where possible, independent numerical calculations have been performed. Recommendations are made pointing out areas which require additional research

  10. Toward a Learning Health-care System - Knowledge Delivery at the Point of Care Empowered by Big Data and NLP.

    Science.gov (United States)

    Kaggal, Vinod C; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P; Ross, Jason L; Chaudhry, Rajeev; Buntrock, James D; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, ie, the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and in real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the big data infrastructure with two other computing environments. The big data infrastructure significantly outperformed the others in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future.

  11. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  12. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  13. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  14. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  15. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  16. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  17. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  18. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  19. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  20. Rock pushing and sampling under rocks on Mars

    Science.gov (United States)

    Moore, H.J.; Liebes, S.; Crouch, D.S.; Clark, L.V.

    1978-01-01

    Viking Lander 2 acquired samples on Mars from beneath two rocks, where living organisms and organic molecules would be protected from ultraviolet radiation. Selection of rocks to be moved was based on scientific and engineering considerations, including rock size, rock shape, burial depth, and location in a sample field. Rock locations and topography were established using the computerized interactive video-stereophotogrammetric system and plotted on vertical profiles and in plan view. Sampler commands were developed and tested on Earth using a full-size lander and surface mock-up. The use of power by the sampler motor correlates with rock movements, which were by plowing, skidding, and rolling. Provenance of the samples was determined by measurements and interpretation of pictures and positions of the sampler arm. Analytical results demonstrate that the samples were, in fact, from beneath the rocks. Results from the Gas Chromatograph-Mass Spectrometer of the Molecular Analysis experiment and the Gas Exchange instrument of the Biology experiment indicate that more adsorbed(?) water occurs in samples under rocks than in samples exposed to the sun. This is consistent with terrestrial arid environments, where more moisture occurs in near-surface soil under rocks than in surrounding soil because the net heat flow is toward the soil beneath the rock and the rock cap inhibits evaporation. Inorganic analyses show that samples of soil from under the rocks have significantly less iron than soil exposed to the sun. The scientific significance of analyses of samples under the rocks is only partly evaluated, but some facts are clear. Detectable quantities of martian organic molecules were not found in the sample from under a rock by the Molecular Analysis experiment. The Biology experiments did not find definitive evidence for Earth-like living organisms in their sample. Significant amounts of adsorbed water may be present in the martian regolith. The response of the soil

  1. Geologic map of the Big Delta B-2 quadrangle, east-central Alaska

    Science.gov (United States)

    Day, Warren C.; Aleinikoff, John N.; Roberts, Paul; Smith, Moira; Gamble, Bruce M.; Henning, Mitchell W.; Gough, Larry P.; Morath, Laurie C.

    2003-01-01

    New 1:63,360-scale geologic mapping of the Big Delta B-2 quadrangle provides important data on the structural setting and age of geologic units, as well as on the timing of gold mineralization and plutonism within the Yukon-Tanana Upland of east-central Alaska. Gold exploration has remained active throughout the region in response to the discovery of the Pogo gold deposit, which lies within the northwestern part of the quadrangle near the south bank of the Goodpaster River. Geologic mapping and associated geochronological and geochemical studies by the U.S. Geological Survey (USGS) and the Alaska Department of Natural Resources, Division of Mining and Water Management, provide baseline data to help understand the regional geologic framework. Teck Cominco Limited geologists have provided the geologic mapping for the area that overlies the Pogo gold deposit as well as logistical support, which has led to a much improved and more informative product. The Yukon-Tanana Upland lies within the Tintina province in Alaska and consists of Paleozoic and possibly older(?) supracrustal rocks intruded by Paleozoic (Devonian to Mississippian) and Cretaceous plutons. The oldest rocks in the Big Delta B-2 quadrangle are Paleozoic gneisses of both plutonic and sedimentary origin. Paleozoic deformation, potentially associated with plutonism, was obscured by intense Mesozoic deformation and metamorphism. At least some of the rocks in the quadrangle underwent tectonism during the Middle Jurassic (about 188 Ma), and were subsequently deformed in an Early Cretaceous contractional event between about 130 and 116 Ma. New U-Pb SHRIMP data presented here on zircons from the Paleozoic biotite gneisses record inherited cores that range from 363 Ma to about 2,130 Ma and have rims of euhedral Early Cretaceous metamorphic overgrowths (116 +/- 4 Ma), interpreted to record recrystallization during Cretaceous west-northwest-directed thrusting and folding. U-Pb SHRIMP dating of monazite from a Paleozoic

  2. Results of monitoring at Olkiluoto in 2004. Rock mechanics

    International Nuclear Information System (INIS)

    Riikonen, S.

    2005-09-01

    This report presents Posiva Oy's results from the rock mechanical monitoring programme for the year 2004. The monitoring programme was established for long-term monitoring of changes in the bedrock during the excavation of the ONKALO underground research facility situated on Olkiluoto island. This is the first annual report in which the rock mechanical research work is also reported from the monitoring point of view. The rock mechanical research work consists of both GPS measurements and microseismic measurements carried out on Olkiluoto island. Both kinds of measurements were performed for several years before the monitoring programme was established: GPS measurements have been carried out since 1995, and the microseismic network has operated since 2002. No significant changes were observed in the rock mechanical results from the year 2004 compared with results from previous years. It can therefore be said that, so far, ONKALO has had barely any effect on rock mechanics at Olkiluoto. The report has been compiled from the annual reports of the GPS measurements. (orig.)

  3. Uranium and thorium in rocks and minerals of Zaangarsk alkaline massif

    International Nuclear Information System (INIS)

    Zhmodin, S.M.; Gofman, A.M.; Ksenzova, V.I.; Malmova, Z.V.; Nemirovskaya, N.A.

    1981-01-01

    U and Th distribution in rocks of the massif of the alkaline-granitoid formation is studied using the methods of γ-spectrometry and neutron-fragment radiography. Predominant accumulation of U and Th in the final products of magmatic differentiation - foyaites - is established. U and Th concentrations increased sharply during the postmagmatic stage of alkaline massif formation - in pegmatites and metasomatically altered rocks (Th/U and U/K ratios can serve as criteria for the identification of such formations). The increase of the U fraction associated with accessory minerals in pegmatites and metasomatically altered rocks is pointed out. High U concentrations along microcracks are characteristic of postmagmatically altered rocks [ru

  4. Particle Physics Catalysis of Thermal Big Bang Nucleosynthesis

    International Nuclear Information System (INIS)

    Pospelov, Maxim

    2007-01-01

    We point out that the existence of metastable, τ > 10³ s, negatively charged electroweak-scale particles (X⁻) alters the predictions for lithium and other primordial elemental abundances for A > 4 via the formation of bound states with nuclei during big bang nucleosynthesis. In particular, we show that the bound states of X⁻ with helium, formed at temperatures of about T = 10⁸ K, lead to the catalytic enhancement of ⁶Li production, which is 8 orders of magnitude more efficient than the standard channel. In particle physics models where the subsequent decay of X⁻ does not lead to large nonthermal big bang nucleosynthesis effects, this directly translates to the level of sensitivity to the number density of long-lived X⁻ particles (τ > 10⁵ s) relative to entropy, n_X⁻/s ≲ 10⁻¹⁷, which is one of the most stringent probes of electroweak scale remnants known to date

  5. Interventions for treating osteoarthritis of the big toe joint.

    Science.gov (United States)

    Zammit, Gerard V; Menz, Hylton B; Munteanu, Shannon E; Landorf, Karl B; Gilheany, Mark F

    2010-09-08

    Osteoarthritis of the big toe joint of the foot (hallux limitus or rigidus) is a common and painful condition. Although several treatments have been proposed, few have been adequately evaluated. To identify controlled trials evaluating interventions for osteoarthritis of the big toe joint and to determine the optimum intervention(s). Literature searches were conducted across the following electronic databases: CENTRAL; MEDLINE; EMBASE; CINAHL; and PEDro (to 14th January 2010). No language restrictions were applied. Randomised controlled trials, quasi-randomised trials, or controlled clinical trials that assessed treatment outcomes for osteoarthritis of the big toe joint. Participants of any age or gender with osteoarthritis of the big toe joint (defined either radiographically or clinically) were included. Two authors examined the list of titles and abstracts identified by the literature searches. One content area expert and one methodologist independently applied the pre-determined inclusion and exclusion criteria to the full text of identified trials. To minimise error and reduce potential bias, data were extracted independently by two content experts. Only one trial satisfactorily fulfilled the inclusion criteria and was included in this review. This trial evaluated the effectiveness of two physical therapy programs in 20 individuals with osteoarthritis of the big toe joint. Assessment outcomes included pain levels, big toe joint range of motion and plantar flexion strength of the hallux. Mean differences at four weeks' follow-up were 3.80 points (95% CI 2.74 to 4.86) for self-reported pain, 28.30 degrees (95% CI 21.37 to 35.23) for big toe joint range of motion, and 2.80 kg (95% CI 2.13 to 3.47) for muscle strength. Although differences in outcomes between treatment and control groups were reported, the risk of bias was high. The trial failed to employ appropriate randomisation or adequate allocation concealment, used a relatively small sample and

  6. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  7. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the term “big data” are analyzed. The article proposes and describes the elements of a generalized formal model of big data, and analyzes the peculiarities of applying the proposed model components. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  8. BigWig and BigBed: enabling browsing of large distributed datasets.

    Science.gov (United States)

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and Linux and UNIX operating systems, R trees and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
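
    For programmatic access outside the browser, the third-party pyBigWig library (a separate project from the utilities named above) reads the same indexed, multi-resolution format; the file name below is hypothetical:

```python
import pyBigWig  # third-party: pip install pyBigWig

bw = pyBigWig.open("signal.bw")  # hypothetical local or remote BigWig file

print(bw.chroms())   # chromosome sizes, e.g. {'chr1': 248956422, ...}
print(bw.header())   # file-wide summary statistics

# Mean signal over chr1:0-1,000,000 in 10 bins. Served from the
# precomputed reduced-resolution levels, so only a small slice of the
# file is read -- the behavior the abstract describes.
print(bw.stats("chr1", 0, 1000000, type="mean", nBins=10))

bw.close()
```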

  9. Structure analysis - chiromancy of the rock

    International Nuclear Information System (INIS)

    Huber, A.; Huber, M.

    1989-01-01

    The reader may initially be surprised by a comparison between structure analysis and palmistry which is, in effect, a comparison between a scientific research method on the one hand and art which is equated with magical powers on the other. In the figurative sense, however, these two fields have some points in common which should help us to obtain a first impression of the nature of geological structure analysis. Chiromancy uses the lines and the form of the hand to predict the character and the future of the person in question. In the same way, geologists use rocks and rock forms to obtain information on the structure and behaviour of different formations. Structure analysis is a specialised field of geological investigation in which traces of deformation are interpreted as expressions of rock-forming forces. This article discusses how and why the character of a rock formation as well as its past, present and even future behaviour can be determined using structure analysis. (author) 11 figs

  10. Digital Rock Studies of Tight Porous Media

    Energy Technology Data Exchange (ETDEWEB)

    Silin, Dmitriy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-08-07

    This technical report summarizes some recently developed approaches to studying rock properties at the pore scale. The digital rock approach is complementary to laboratory and field studies. It can be especially helpful in situations where experimental data are uncertain, or are difficult or impossible to obtain. Digitized binary images of the pore geometries of natural rocks obtained by different imaging techniques are the input data. Computer-generated models of natural rocks can be used instead of images in cases where microtomography data are unavailable, or the resolution of the tools is insufficient to adequately characterize the features of interest. Simulations of creeping viscous flow in pores produce estimates of Darcy permeability. Maximal Inscribed Spheres calculations estimate two-phase fluid distribution in capillary equilibrium. A combination of both produces relative permeability curves. Computer-generated rock models were employed to study two-phase properties of fractured rocks, or tight sands with slit-like pores too narrow to be characterized with micro-tomography. Various scenarios can simulate different fluid displacement mechanisms, from piston-like drainage to liquid dropout at the dew point. A finite-difference discretization of the Stokes equation is developed to simulate flow in the pore space of natural rocks. The numerical schemes are capable of handling both no-slip and slippage flows. An upscaling procedure estimates the permeability by subsampling a large data set. Capillary equilibrium and capillary pressure curves are efficiently estimated with the method of maximal inscribed spheres for an arbitrary contact angle. The algorithms can handle gigabytes of data on a desktop workstation. Customized QuickHull algorithms are used to model natural rocks. Capillary pressure curves evaluated from computer-generated images mimic those obtained from microtomography data.
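
    A crude sketch of the maximal-inscribed-spheres idea, assuming a binary pore image: the Euclidean distance transform gives, at each pore voxel, the radius of the largest sphere centered there that fits inside the pore space, and thresholding that radius approximates the non-wetting phase distribution at a given capillary pressure. This simplification ignores spheres centered at neighboring voxels and inlet connectivity, both of which the full method accounts for:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def mis_drainage(pore, r_min):
    """Mark pore voxels where a sphere of radius >= r_min (in voxels)
    centered at the voxel fits entirely inside the pore space."""
    radii = distance_transform_edt(pore)  # distance to the nearest solid voxel
    return pore & (radii >= r_min)

# Toy 3-D binary image: True = pore, False = solid (a random porous
# cube standing in for real tomography data).
rng = np.random.default_rng(0)
pore = rng.random((64, 64, 64)) > 0.4

for r in (1, 2, 3):
    frac = mis_drainage(pore, r).sum() / pore.sum()
    print(f"spheres of radius >= {r}: non-wetting fraction ~ {frac:.2f}")
```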

  11. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  12. The December 2008 Crammont rock avalanche, Mont Blanc massif area, Italy

    Directory of Open Access Journals (Sweden)

    P. Deline

    2011-12-01

    Full Text Available We describe a 0.5 Mm³ rock avalanche that occurred in 2008 in the western Alps and discuss possible roles of controlling factors in the context of current climate change. The source is located between 2410 m and 2653 m a.s.l. on Mont Crammont and is controlled by a densely fractured rock structure. The main part of the collapsed rock mass was deposited at the foot of the rock wall. A smaller part travelled much farther, reaching horizontal and vertical travel distances of 3050 m and 1560 m, respectively. The mobility of the rock mass was enhanced by channelization and snow. The rock-avalanche volume was calculated by comparison of pre- and post-event DTMs, and geomechanical characterization of the detachment zone was extracted from LiDAR point cloud processing. Back analysis of the rock-avalanche runout suggests a two-stage event.

    There was no previous rock avalanche activity from the Mont Crammont ridge during the Holocene. The 2008 rock avalanche may have resulted from permafrost degradation in the steep rock wall, as suggested by seepage water in the scar after the collapse in spite of negative air temperatures, and by modelling of rock temperatures that indicates warm permafrost (T > −2 °C).
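
    The volume estimate by comparison of pre- and post-event DTMs mentioned above amounts to differencing two co-registered elevation grids and integrating over the cell area. A toy sketch, with synthetic grids standing in for the real DTMs:

```python
import numpy as np

def dtm_volumes(dtm_pre, dtm_post, cell_size):
    """Erosion and deposition volumes from two co-registered DTMs.
    Negative dz = material lost (scar), positive dz = gained (deposit)."""
    dz = dtm_post - dtm_pre
    cell_area = cell_size ** 2
    loss = -dz[dz < 0].sum() * cell_area
    gain = dz[dz > 0].sum() * cell_area
    return loss, gain

# Synthetic 2 m grids (assumption, not real survey data).
pre = np.full((200, 200), 2500.0)
post = pre.copy()
post[50:80, 50:80] -= 20.0  # hypothetical 20 m deep detachment scar
loss, gain = dtm_volumes(pre, post, cell_size=2.0)
print(f"eroded volume ~ {loss:.0f} m^3")  # 30*30 cells * 4 m^2 * 20 m = 72000
```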

  13. Addressing the Big-Earth-Data Variety Challenge with the Hierarchical Triangular Mesh

    Science.gov (United States)

    Rilee, Michael L.; Kuo, Kwo-Sen; Clune, Thomas; Oloso, Amidu; Brown, Paul G.; Yu, Honfeng

    2016-01-01

    We have implemented an updated Hierarchical Triangular Mesh (HTM) as the basis for a unified data model and an indexing scheme for geoscience data to address the variety challenge of Big Earth Data. We observe that, in the absence of variety, the volume challenge of Big Data is relatively easily addressable with parallel processing. The more important challenge in achieving optimal value with a Big Data solution for Earth Science (ES) data analysis, however, is being able to achieve good scalability with variety. With HTM unifying at least the three popular data models, i.e. Grid, Swath, and Point, used by current ES data products, data preparation time for integrative analysis of diverse datasets can be drastically reduced and better variety scaling can be achieved. In addition, since HTM is also an indexing scheme, when it is used to index all ES datasets, data placement alignment (or co-location) on the shared nothing architecture, which most Big Data systems are based on, is guaranteed and better performance is ensured. Moreover, our updated HTM encoding turns most geospatial set operations into integer interval operations, gaining further performance advantages.
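
    A minimal HTM-style point-indexing sketch, assuming the classic construction: eight spherical triangles from the octahedron, each trixel split into four children by normalized edge midpoints. The naming scheme is illustrative and not necessarily the exact encoding used by the authors:

```python
import numpy as np

def _norm(v):
    return v / np.linalg.norm(v)

def _inside(p, tri):
    # p lies inside a spherical triangle (vertices CCW seen from outside)
    # if it is on the inner side of all three great-circle edges.
    v0, v1, v2 = tri
    return (np.dot(np.cross(v0, v1), p) >= -1e-9 and
            np.dot(np.cross(v1, v2), p) >= -1e-9 and
            np.dot(np.cross(v2, v0), p) >= -1e-9)

def htm_index(lon, lat, depth=10):
    """Descend `depth` levels from the 8 octahedron faces, appending the
    child number (0-3) of the triangle containing the point each time."""
    lam, phi = np.radians(lon), np.radians(lat)
    p = np.array([np.cos(phi) * np.cos(lam),
                  np.cos(phi) * np.sin(lam),
                  np.sin(phi)])
    x, y, z = np.eye(3)
    faces = {"N0": (z, x, y), "N1": (z, y, -x), "N2": (z, -x, -y),
             "N3": (z, -y, x), "S0": (-z, y, x), "S1": (-z, -x, y),
             "S2": (-z, -y, -x), "S3": (-z, x, -y)}
    name, tri = next((k, t) for k, t in faces.items() if _inside(p, t))
    for _ in range(depth):
        v0, v1, v2 = tri
        w0, w1, w2 = _norm(v1 + v2), _norm(v2 + v0), _norm(v0 + v1)
        children = [(v0, w2, w1), (v1, w0, w2), (v2, w1, w0), (w0, w1, w2)]
        i = next(i for i, c in enumerate(children) if _inside(p, c))
        name, tri = name + str(i), children[i]
    return name

print(htm_index(lon=24.0, lat=61.0, depth=6))  # a trixel name such as 'N0...'
```

    Because nearby points share long index prefixes, trixel names (or their integer forms) sort spatially close data together, which is what enables the integer-interval set operations and data co-location described above.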

  14. Radon exhalation from granitic rocks

    International Nuclear Information System (INIS)

    Del Claro, Flávia; Paschuk, Sergei A.; Corrêa, Janine N.; Mazer, Wellington; Narloch, Danielle Cristine; Martin, Aline Cristina; Denyak, Valeriy

    2017-01-01

    Naturally occurring radionuclides such as radon (²²²Rn), its decay products and other elements from the radioactive series of uranium (²³⁸U and ²³⁵U) and thorium (²³²Th) are an important source of human exposure to natural radioactivity. The worldwide evaluation of radiobiological health effects and risks from population exposure to natural radionuclides is a growing concern. About 50% of the personal annual radiation dose is related to radionuclides such as radon (²²²Rn), thoron (²²⁰Rn), radium (²²⁶Ra), thorium (²³²Th) and potassium (⁴⁰K), which are present in modern materials commonly used in the construction of dwellings and buildings. The radioactivity of marbles and granites is of big concern since under certain conditions the radioactivity levels of these materials can be hazardous to the population and require the implementation of mitigation procedures. The present survey of ²²²Rn and ²²⁰Rn activity concentrations liberated in the air was performed using commercial Brazilian granite rocks sold on the national market as well as exported to other countries. The ²²²Rn and ²²⁰Rn measurements were performed using the AlphaGUARD instant monitor and the RAD7 detector, respectively. This study was performed at the Applied Nuclear Physics Laboratory of the Federal University of Technology – Paraná (UTFPR). The obtained radon activity concentrations in the air exhaled by the studied granite samples varied from 3±1 Bq/m³ to 2087±19 Bq/m³, which shows that some samples of granitic rocks represent a rather elevated health risk to the population. (author)
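
    When such readings are compared across different counting times, they are normally corrected for radioactive decay using the ²²²Rn half-life of about 3.82 days. A minimal sketch; the 24-hour storage scenario is hypothetical:

```python
import math

RN222_HALF_LIFE_H = 3.8235 * 24  # 222Rn half-life, in hours

def decayed_activity(a0_bq_m3, hours):
    """Activity concentration after decay: A(t) = A0 * exp(-ln2 * t / T_half)."""
    return a0_bq_m3 * math.exp(-math.log(2.0) * hours / RN222_HALF_LIFE_H)

# The highest concentration reported above, after 24 h in a sealed
# container before counting (hypothetical handling delay):
print(f"{decayed_activity(2087, 24):.0f} Bq/m^3")  # ~1741 Bq/m^3
```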

  15. Change-point analysis of geophysical time-series: application to landslide displacement rate (Séchilienne rock avalanche, France)

    Science.gov (United States)

    Amorese, D.; Grasso, J.-R.; Garambois, S.; Font, M.

    2018-05-01

    The rank-sum multiple change-point method is a robust statistical procedure designed to search for the optimal number and location of change points in an arbitrary continuous or discrete sequence of values. As such, this procedure can be used to analyse time-series data. Twelve years of robust data sets for the Séchilienne (French Alps) rockslide show a continuous increase in average displacement rate from 50 to 280 mm per month in the 2004-2014 period, followed by a strong decrease back to 50 mm per month in the 2014-2015 period. Where previous studies tentatively suggested possible kinematic phases, they relied solely on empirical threshold values. In this paper, we analyse how the use of a statistical algorithm for change-point detection helps to better understand time phases in landslide kinematics. First, we test the efficiency of the statistical algorithm on geophysical benchmark data, these data sets (stream flows and Northern Hemisphere temperatures) having already been analysed by independent statistical tools. Second, we apply the method to 12-yr daily time-series of the Séchilienne landslide, for rainfall and displacement data, from 2003 December to 2015 December, in order to quantitatively extract changes in landslide kinematics. We find two strong significant discontinuities in the weekly cumulated rainfall values: an average rainfall rate increase is resolved in 2012 April and a decrease in 2014 August. Four robust changes are highlighted in the displacement time-series (2008 May, 2009 November-December-2010 January, 2012 September and 2014 March), the 2010 one being preceded by a significant but weak rainfall rate increase (in 2009 November). Accordingly, we are able to quantitatively define five kinematic stages for the Séchilienne rock avalanche during this period. The synchronization between the rainfall and displacement rate, only resolved at the end of 2009 and beginning of 2010, corresponds to a remarkable change (fourfold
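
    As a simple stand-in for the rank-sum multiple change-point procedure, Pettitt's rank-based test locates a single change point from the same kind of sign statistics. A minimal sketch on synthetic displacement rates that mimic the 50 to 280 mm per month step described above:

```python
import numpy as np

def pettitt(x):
    """Pettitt's rank-based single change-point test. Returns the most
    likely change index and an approximate two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.sign(x[None, :] - x[:, None])  # s[i, j] = sign(x_j - x_i)
    # U_t sums s[i, j] over all pairs i <= t < j straddling the split t.
    U = np.array([s[:t + 1, t + 1:].sum() for t in range(n - 1)])
    t_hat = int(np.argmax(np.abs(U)))
    K = abs(U[t_hat])
    p = min(1.0, 2.0 * np.exp(-6.0 * K**2 / (n**3 + n**2)))
    return t_hat, p

# Synthetic monthly rates: 50 mm/month stepping up to 280 mm/month.
rng = np.random.default_rng(42)
rate = np.concatenate([rng.normal(50, 5, 60), rng.normal(280, 20, 60)])
t_hat, p = pettitt(rate)
print(f"change after index {t_hat}, p ~ {p:.1e}")  # expected near index 59
```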

  16. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Mauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  17. Thermo-hydro-mechanical behavior of fractured rock mass

    International Nuclear Information System (INIS)

    Coste, F.

    1997-12-01

    The purpose of this research is to model the thermo-hydro-mechanical behavior of a fractured rock mass in the context of a nuclear waste repository. For this, a modeling methodology was proposed and applied to a real underground site (the EDF site at Nouvelle Romanche). This methodology consists, as a first step, of determining the hydraulic and mechanical REV (representative elementary volume). Beyond the greatest of these REV, a finite element code was developed that allows all the fractures to be modeled explicitly. The homogenized mechanical properties are determined under drained and undrained boundary conditions by simulating triaxial tests that represent the rock mass subjected to loading. These simulations make it possible to study the evolution of hydraulic and mechanical properties as a function of stress state. Drained and undrained boundary conditions enable discussion of the validity of treating a fractured rock mass as an equivalent porous medium. The simulations lead to a better understanding of the behavior of fractured rock masses and show the dominant role of the shear behavior of the fractures in the homogenized hydraulic and mechanical properties. From a thermal point of view, as long as conduction is dominant, the thermal properties of the rock mass are almost the same as those of the intact rock. (author)

  18. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  19. Cataclastic effects in rock salt laboratory and in situ measurements

    International Nuclear Information System (INIS)

    Gramberg, J.; Roest, J.P.A.

    1984-01-01

    The aim of the research is to determine possible cataclastic effects in the rock salt surrounding a heated part of a vertical deep test borehole, a model for HLW disposal. Known cataclastic systems from hard rock mining and rock salt mines form the starting point for the explanation of the convergence of underground cavity walls. In rock salt, however, different elements seem to prevail: crystal plasticity and micro-cataclasis. The environmental measurements at the deep borehole have to be carried out from a distance. To this end the acoustic micro-seismic method is a suitable one. The appropriate equipment for micro-seismic cross-hole measurement was designed, constructed and tested in the laboratory as well as underground. Acoustic velocity data form a crucial point. A micro-seismic acoustic P-wave model, adapted to the process of structural changes, is developed. P-wave velocity measurements in rock salt cubes in the laboratory are described. An underground cross-hole measurement in the wall of a gallery with a semi-circular section is treated and analysed. One conclusion was that, in this case, no macro-cataclasis (systematic large fractures) is involved in the process of gallery convergence, and that the mechanism proved to be a combination of crystal plasticity and micro-cataclasis. The same mechanism might be expected to be present in the rock salt surrounding the deep HLW-disposal borehole. As a result this surrounding rock salt might be expected to be impermeable. A plan for the application of the developed equipment during the heating test on the ECN deep borehole is shown. A theory on "disking" or "rim cracks" is presented in an annex
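
    At its simplest, the cross-hole measurement described above converts a first-arrival travel time over the known borehole spacing into a P-wave velocity; a velocity drop between surveys is the indicator of micro-cataclasis. All numbers below are hypothetical:

```python
def p_wave_velocity(spacing_m, travel_time_us):
    """Cross-hole P-wave velocity from the first-arrival travel time."""
    return spacing_m / (travel_time_us * 1e-6)

# Hypothetical cross-hole pair 2.0 m apart in rock salt.
v_intact = p_wave_velocity(2.0, 445)   # ~4494 m/s, intact salt
v_damaged = p_wave_velocity(2.0, 520)  # ~3846 m/s, after microcracking
drop = 100.0 * (1.0 - v_damaged / v_intact)
print(f"velocity drop ~ {drop:.0f}% -> possible micro-cataclasis")
```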

  20. Rb/Sr and K/Ar dating of Slovakia's rocks, its possibilities and interpretation

    International Nuclear Information System (INIS)

    Cambel, B.; Bagdasaryan, G.P.; Gukasyan, P.Kh.; Veselskij, I.

    1979-01-01

    Nuclear geochronological data are evaluated and summarized into histograms. On the basis of this evaluation, the results obtained using various methods (K/Ar, Rb/Sr and U-Th-Pb) are compared and the age of rocks in the West Carpathians is determined. It is pointed out that the age determination depends on the geochemical state of the studied minerals and rocks. The obtained results confirm the Palaeozoic age of the majority of rocks of the West Carpathians. Direct nuclear geochronological evidence of Precambrian rocks has not yet been obtained. (author)

  1. Toward a Learning Health-care System – Knowledge Delivery at the Point of Care Empowered by Big Data and NLP

    Science.gov (United States)

    Kaggal, Vinod C.; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J.; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P.; Ross, Jason L.; Chaudhry, Rajeev; Buntrock, James D.; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, ie, the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and in real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the big data infrastructure with two other computing environments. The big data infrastructure significantly outperformed the others in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future. PMID:27385912

  2. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  3. Big Data and HPC collocation: Using HPC idle resources for Big Data Analytics

    OpenAIRE

    MERCIER , Michael; Glesser , David; Georgiou , Yiannis; Richard , Olivier

    2017-01-01

    Executing Big Data workloads on High Performance Computing (HPC) infrastructures has become an attractive way to improve their performance. However, the collocation of HPC and Big Data workloads is not an easy task, mainly because of their core concepts' differences. This paper focuses on the challenges related to the scheduling of both Big Data and HPC workloads on the same computing platform. In classic HPC workloads, the rigidity of jobs tends to create holes in ...

  4. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  5. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    Science.gov (United States)

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem definition to address science challenges.

  6. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  7. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  8. Big Data, jejich skladování a možnosti využití [Big Data, their storage and possibilities of use]

    OpenAIRE

    Macek, Jáchym

    2013-01-01

    The subject of this bachelor's thesis is the analysis of work with data, especially large-volume unstructured data, i.e., Big Data. The thesis is a retrieval-based informational survey drawing on questionnaires and interviews. The aim is to evaluate and introduce the Big Data theme, its storage, the tools for its management and the opportunities for its exploitation to the reader, from both a technological and a business point of view. The objective of the practical part is a survey. The thesis is divided i...

  9. Usage of energy- dispersial analysis in studying rocks melts

    Directory of Open Access Journals (Sweden)

    Kudelas Dušan

    2001-09-01

    Full Text Available EDS analysis of constituent minerals of nephelinitic basanite from the Konrádovce locality (a lava stream of the Cerová basalt formation of upper Pliocene-Pleistocene age) was carried out using the electron microscope JEOL JSM-840 and the energy-dispersive microanalyser KEVEX DELTA+ with MIRROR QUANTEX+ software. Based on the results of EDS microanalysis, the primary rock can be, from the petrographic point of view, described as nephelinitic basanite. The following constituents were determined in the primary matter and porphyritic phenocrysts: isometric grains of pyroxene-augite (point A1), grains of nepheline-kalsilite (point A2), cryptocrystalline glassy matter (point A3), grains of olivine (point A4), and microlites of basic plagioclase (point A5). The energy-dispersive analysis is fast and the full spectrum is acquired at the same time; commonly, the required time is less than one minute. Results of the measurement do not depend significantly on the topography of the sample, and it is also possible to analyze a rough surface, which makes the preparation of samples easier. A very important aspect of the mentioned method is the precision of the obtained results in identifying the chemical composition of the analyzed point, which, in a subsequent step, allows the type of mineral to be determined. EDS is a convenient and powerful supplement to microscopic studies, which are sometimes unable to distinguish exactly the complete composition of the analyzed rocks.

  10. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of text. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  11. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing......, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  12. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  13. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  14. Estimating the Wet-Rock P-Wave Velocity from the Dry-Rock P-Wave Velocity for Pyroclastic Rocks

    Science.gov (United States)

    Kahraman, Sair; Fener, Mustafa; Kilic, Cumhur Ozcan

    2017-07-01

    Seismic methods are widely used for geotechnical investigations in volcanic areas or for the determination of the engineering properties of pyroclastic rocks in the laboratory. Therefore, developing a relation between the wet- and dry-rock P-wave velocities will be helpful for engineers when evaluating the formation characteristics of pyroclastic rocks. To investigate the predictability of the wet-rock P-wave velocity from the dry-rock P-wave velocity for pyroclastic rocks, P-wave velocity measurements were conducted on 27 different pyroclastic rocks. In addition, dry-rock S-wave velocity measurements were conducted. The test results were modeled using Gassmann's and Wood's theories, and it was seen that the estimates for saturated P-wave velocity from the theories fit the measured data well. For samples having values of less than and greater than 20%, practical equations were derived for reliably estimating the wet-rock P-wave velocity as a function of the dry-rock P-wave velocity.
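
    The Gassmann modeling mentioned above maps dry-rock moduli to saturated moduli; the following is a minimal sketch of that substitution (not the authors' code), assuming the dry-rock Vp and Vs, bulk density, porosity, and mineral and fluid properties are all known. All numeric values in the example call are illustrative assumptions.

```python
import math

def gassmann_vp_wet(vp_dry, vs_dry, rho_dry, phi, k_mineral, k_fluid, rho_fluid):
    """Estimate the wet-rock P-wave velocity from dry-rock velocities via
    Gassmann's equation. Units: velocities in m/s, moduli in Pa,
    densities in kg/m^3, porosity as a fraction."""
    # Dry-rock elastic moduli from the measured velocities
    mu = rho_dry * vs_dry**2                          # shear modulus (unchanged by saturation)
    k_dry = rho_dry * vp_dry**2 - (4.0 / 3.0) * mu    # dry bulk modulus

    # Gassmann fluid substitution for the saturated bulk modulus
    b = 1.0 - k_dry / k_mineral
    k_sat = k_dry + b**2 / (phi / k_fluid + (1.0 - phi) / k_mineral
                            - k_dry / k_mineral**2)

    # Saturation adds the pore fluid's mass
    rho_sat = rho_dry + phi * rho_fluid
    return math.sqrt((k_sat + (4.0 / 3.0) * mu) / rho_sat)

# Example with assumed values for a porous pyroclastic rock
print(gassmann_vp_wet(vp_dry=2500.0, vs_dry=1500.0, rho_dry=1800.0,
                      phi=0.25, k_mineral=37e9, k_fluid=2.2e9, rho_fluid=1000.0))
```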

  15. Big data in medical science--a biostatistical view.

    Science.gov (United States)

    Binder, Harald; Blettner, Maria

    2015-02-27

    Inexpensive techniques for measurement and data storage now enable medical researchers to acquire far more data than can conveniently be analyzed by traditional methods. The expression "big data" refers to quantities on the order of magnitude of a terabyte (10(12) bytes); special techniques must be used to evaluate such huge quantities of data in a scientifically meaningful way. Whether data sets of this size are useful and important is an open question that currently confronts medical science. In this article, we give illustrative examples of the use of analytical techniques for big data and discuss them in the light of a selective literature review. We point out some critical aspects that should be considered to avoid errors when large amounts of data are analyzed. Machine learning techniques enable the recognition of potentially relevant patterns. When such techniques are used, certain additional steps should be taken that are unnecessary in more traditional analyses; for example, patient characteristics should be differentially weighted. If this is not done as a preliminary step before similarity detection, which is a component of many data analysis operations, characteristics such as age or sex will be weighted no higher than any one out of 10 000 gene expression values. Experience from the analysis of conventional observational data sets can be called upon to draw conclusions about potential causal effects from big data sets. Big data techniques can be used, for example, to evaluate observational data derived from the routine care of entire populations, with clustering methods used to analyze therapeutically relevant patient subgroups. Such analyses can provide complementary information to clinical trials of the classic type. As big data analyses become more popular, various statistical techniques for causality analysis in observational data are becoming more widely available. This is likely to be of benefit to medical science, but specific adaptations will

  16. Particle physics catalysis of thermal big bang nucleosynthesis.

    Science.gov (United States)

    Pospelov, Maxim

    2007-06-08

    We point out that the existence of metastable, tau>10(3) s, negatively charged electroweak-scale particles (X-) alters the predictions for lithium and other primordial elemental abundances for A>4 via the formation of bound states with nuclei during big bang nucleosynthesis. In particular, we show that the bound states of X- with helium, formed at temperatures of about T=10(8) K, lead to the catalytic enhancement of 6Li production, which is 8 orders of magnitude more efficient than the standard channel. In particle physics models where subsequent decay of X- does not lead to large nonthermal big bang nucleosynthesis effects, this directly translates to the level of sensitivity to the number density of long-lived X- particles (tau>10(5) s) relative to entropy, nX-/s ≲ 3x10(-17), which is one of the most stringent probes of electroweak scale remnants known to date.

  17. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  18. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data includes analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Also, the methodology and working of a system that will use this data are briefly described.

  19. AUTOMATIC EXTRACTION OF ROCK JOINTS FROM LASER SCANNED DATA BY MOVING LEAST SQUARES METHOD AND FUZZY K-MEANS CLUSTERING

    Directory of Open Access Journals (Sweden)

    S. Oh

    2012-09-01

    Full Text Available Recent development of laser scanning devices has increased the capability of representing rock outcrops in very high resolution. An accurate 3D point cloud model with rock joint information can help geologists estimate the stability of a rock slope on-site or off-site. An automatic plane extraction method was developed by computing normal directions and grouping them by similar direction. Point normals were calculated by the moving least squares (MLS) method, considering every point within a given distance so as to minimize the error to the fitting plane. Normal directions were classified into a number of dominating clusters by fuzzy K-means clustering. A region growing approach was exploited to discriminate joints in the point cloud. The overall procedure was applied to a point cloud with about 120,000 points, and successfully extracted joints with joint information. The extraction procedure was implemented to minimize the number of input parameters and to construct plane information into the existing point cloud for less redundancy and high usability of the point cloud itself.
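
    A compact sketch of the first two stages of this pipeline, with a local plane fit standing in for the full MLS fit; all parameter values and the synthetic cloud are illustrative, not the paper's data.

```python
import numpy as np
from scipy.spatial import cKDTree

def point_normals(points, k=12):
    """Estimate a unit normal per point by fitting a plane (via SVD) to its
    k nearest neighbours -- a simplified stand-in for the moving least
    squares fit described in the paper."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        centered = points[nbrs] - points[nbrs].mean(axis=0)
        # The right singular vector with the smallest singular value is the plane normal
        normals[i] = np.linalg.svd(centered)[2][-1]
    # Orient consistently so antiparallel normals fall into one cluster
    normals[normals[:, 2] < 0] *= -1
    return normals

def fuzzy_kmeans(x, n_clusters=3, m=2.0, iters=50):
    """Plain fuzzy c-means: returns cluster centers and membership weights."""
    rng = np.random.default_rng(0)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        w = u ** m
        centers = (w.T @ x) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(x[:, None, :] - centers[None], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))       # u_ij proportional to d^(-2/(m-1))
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Usage: cluster normals of a synthetic cloud into dominant joint-set directions
pts = np.random.default_rng(1).random((1000, 3))
centers, memberships = fuzzy_kmeans(point_normals(pts), n_clusters=3)
```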

  20. [Algorithms, machine intelligence, big data : general considerations].

    Science.gov (United States)

    Radermacher, F J

    2015-08-01

    We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, the performance and efficiency in the area of elementary arithmetic operations increases a thousand-fold every 20 years. Although we have not achieved the status where in the singular sense machines have become as "intelligent" as people, machines are becoming increasingly better. The Internet of Things has again helped to massively increase the efficiency of machines. Big data and suitable analytics do the same. If we let these processes simply continue, our civilization may be endangered in many instances. If the "containment" of these processes succeeds in the context of a reasonable political global governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point in time, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges.

  1. 'Escher' Rock

    Science.gov (United States)

    2004-01-01

    [Figure 1, "Chemical Changes in 'Endurance' Rocks", removed for brevity; see original site] This false-color image taken by NASA's Mars Exploration Rover Opportunity shows a rock dubbed 'Escher' on the southwestern slopes of 'Endurance Crater.' Scientists believe the rock's fractures, which divide the surface into polygons, may have been formed by one of several processes. They may have been caused by the impact that created Endurance Crater, or they might have arisen when water left over from the rock's formation dried up. A third possibility is that much later, after the rock was formed, and after the crater was created, the rock became wet once again, then dried up and developed cracks. Opportunity has spent the last 14 sols investigating Escher, specifically the target dubbed 'Kirchner,' and other similar rocks with its scientific instruments. This image was taken on sol 208 (Aug. 24, 2004) by the rover's panoramic camera, using the 750-, 530- and 430-nanometer filters. The graph shows that rocks located deeper into 'Endurance Crater' are chemically altered to a greater degree than rocks located higher up. This chemical alteration is believed to result from exposure to water. Specifically, the graph compares ratios of chemicals between the deep rock dubbed 'Escher,' and the more shallow rock called 'Virginia,' before (red and blue lines) and after (green line) the Mars Exploration Rover Opportunity drilled into the rocks. As the red and blue lines indicate, Escher's levels of chlorine relative to Virginia's went up, and sulfur down, before the rover dug a hole into the rocks. This implies that the surface of Escher has been chemically altered to a greater extent than the surface of Virginia. Scientists are still investigating the role water played in influencing this trend. These data were taken by the rover's alpha particle X-ray spectrometer.

  2. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.

  3. Big Rock Point Nuclear Plant. 23rd semiannual report of operations, July--December 1976

    International Nuclear Information System (INIS)

    1976-01-01

    Net electrical power generated was 240,333.9 MWh(e) with the reactor on line 4,316.6 hr. Information is presented concerning operation, power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, release of radioactive materials, changes, tests, experiments, and environmental monitoring

  4. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    Science.gov (United States)

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived; notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating; what is dark matter and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  6. Cuttability Assessment of Selected Rocks Through Different Brittleness Values

    Science.gov (United States)

    Dursun, Arif Emre; Gokay, M. Kemal

    2016-04-01

    Prediction of cuttability is a critical issue for successful execution of tunnel or mining excavation projects. Rock cuttability is also used to determine specific energy, which is defined as the work done by the cutting force to excavate a unit volume of yield. Specific energy is a meaningful inverse measure of cutting efficiency, since it simply states how much energy must be expended to excavate a unit volume of rock. Brittleness is a fundamental rock property applied in drilling and rock excavation, and it is one of the most crucial rock features for rock excavation. For this reason, determining relations between cuttability and brittleness will help rock engineers. This study aims to estimate the specific energy from different brittleness values of rocks by means of simple and multiple regression analyses. In this study, rock cutting, rock property, and brittleness index tests were carried out on 24 different rock samples with different strength values, including marble, travertine, and tuff, collected from sites around Konya Province, Turkey. Four previously used brittleness concepts were evaluated in this study, denoted as B1 (the ratio of compressive to tensile strength), B2 (the ratio of the difference between compressive and tensile strength to their sum), B3 (the area under the stress-strain line in relation to compressive and tensile strength), and B9 = S20, the percentage of fines produced in an impact test; B9p denotes B9 predicted from the point load strengths of rocks using multiple regression analysis. The results suggest that the proposed simple regression-based prediction models including B3, B9, and B9p outperform the other models including B1 and B2 and can be used for more accurate and reliable estimation of specific energy.
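
    The strength-based brittleness concepts above are simple ratios of the uniaxial compressive strength and the tensile strength; a minimal sketch with illustrative values in MPa. The B3 form shown is one common definition from the literature, not necessarily the exact one used in the paper.

```python
def brittleness_b1(sigma_c, sigma_t):
    """B1: ratio of uniaxial compressive to tensile strength."""
    return sigma_c / sigma_t

def brittleness_b2(sigma_c, sigma_t):
    """B2: (compressive - tensile) / (compressive + tensile)."""
    return (sigma_c - sigma_t) / (sigma_c + sigma_t)

def brittleness_b3(sigma_c, sigma_t):
    """B3: area under the strength line, commonly taken as
    sigma_c * sigma_t / 2 -- an assumption; definitions vary by author."""
    return sigma_c * sigma_t / 2.0

# Illustrative values for a medium-strength marble (assumed, MPa)
print(brittleness_b1(80.0, 6.0), brittleness_b2(80.0, 6.0), brittleness_b3(80.0, 6.0))
```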

  7. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  8. Elasticity of water-saturated rocks as a function of temperature and pressure.

    Science.gov (United States)

    Takeuchi, S.; Simmons, G.

    1973-01-01

    Compressional and shear wave velocities of water-saturated rocks were measured as a function of both pressure and temperature near the melting point of ice, to a confining pressure of 2 kb. The pore pressure was kept at about 1 bar before the water froze. The presence of a liquid phase (rather than ice) in microcracks of about 0.3% porosity affected the compressional wave velocity by about 5% and the shear wave velocity by about 10%. The calculated effective bulk modulus of the rocks changes rapidly over a narrow range of temperature near the melting point of ice, but the effective shear modulus changes gradually over a wider range of temperature. This phenomenon, termed an elastic anomaly, is attributed to the existence of liquid on the boundary between rock and ice due to local stresses and the anomalous melting of ice under pressure.

  9. Influence of various excavation techniques on the structure and physical properties of 'near-field' rock around large boreholes

    International Nuclear Information System (INIS)

    Pusch, R.

    1989-12-01

    The procedure employed in the excavation of canister deposition holes affects the structure and physical properties of the 'near-field' rock. Except for smooth blasting, the generated damage appears to be less important than the increase in 'axial' hydraulic conductivity that is caused by stress release effects, but both combine to yield significant local flow passages. This is particularly obvious where the rock structure yields steep wedges, which frequently occurs in granite. Percussion drilling is concluded to cause rich fine-fissuring to a distance of up to one centimeter from the borehole wall, and 'discing'. Richer fissuring, some generation of new fractures and growth of preexisting ones are produced within several decimeters from the borehole wall by full-face drilling. Core drilling has the least effect on the rock structure. Smooth blasting produces a particular form of regular fractures which appear to be determinants of the hydraulic conductivity of the near-field rock. Theoretically, its conductivity in the axial direction of blasted big holes or tunnels should be in the range of 10(-8) to 10(-6) m/s, which is in agreement with measurements in the Stripa mine. (orig.)

  10. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, California

    Science.gov (United States)

    Stock, G. M.; Luco, N.; Collins, B. D.; Harp, E.; Reichenbach, P.; Frankel, K. L.

    2011-12-01

    Rock falls are a considerable hazard in Yosemite Valley, California, with more than 835 rock falls and other slope movements documented since 1857. Thus, rock falls pose potentially significant risk to the nearly four million annual visitors to Yosemite National Park. Building on earlier hazard assessment work by the U.S. Geological Survey, we performed a quantitative rock-fall hazard and risk assessment for Yosemite Valley. This work was aided by several new data sets, including precise Geographic Information System (GIS) maps of rock-fall deposits, airborne and terrestrial LiDAR-based point cloud data and digital elevation models, and numerical ages of talus deposits. Using the Global Positioning System (GPS), we mapped the positions of over 500 boulders on the valley floor and measured their distance relative to the mapped base of talus. Statistical analyses of these data yielded an initial hazard zone that is based on the 90th percentile distance of rock-fall boulders beyond the talus edge. This distance was subsequently scaled (either inward or outward from the 90th percentile line) based on rock-fall frequency information derived from a combination of cosmogenic beryllium-10 exposure dating of boulders beyond the edge of the talus, and computer model simulations of rock-fall runout. The scaled distances provide the basis for a new hazard zone on the floor of Yosemite Valley. Once this zone was delineated, we assembled visitor, employee, and resident use data for each structure within the hazard zone to quantitatively assess risk exposure. Our results identify areas within the new hazard zone that may warrant more detailed study, for example rock-fall susceptibility, which can be assessed through examination of high-resolution photographs, structural measurements on the cliffs, and empirical calculations derived from LiDAR point cloud data. This hazard and risk information is used to inform placement of existing and potential future infrastructure in Yosemite Valley.
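
    The core of the hazard-zone delineation reduces to a percentile of boulder runout distances, scaled by a frequency-derived factor; a hedged sketch with made-up inputs (the study's actual distances and scaling factor are not reproduced here).

```python
import numpy as np

# Distances (m) of mapped boulders beyond the talus edge -- synthetic
# stand-ins for the >500 GPS-mapped boulders described in the abstract
rng = np.random.default_rng(42)
distances = rng.exponential(scale=25.0, size=500)

# Initial hazard line: 90th percentile of runout beyond the talus edge
p90 = np.percentile(distances, 90)

# Scale inward/outward by a site-specific rock-fall frequency factor
# (illustrative value; the study derived it from cosmogenic exposure
# dating and runout modeling)
frequency_factor = 1.15
hazard_line = p90 * frequency_factor
print(f"90th percentile: {p90:.1f} m; scaled hazard line: {hazard_line:.1f} m")
```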

  11. Musical Structure as Narrative in Rock

    Directory of Open Access Journals (Sweden)

    John Fernando Encarnacao

    2011-09-01

    Full Text Available In an attempt to take a fresh look at the analysis of form in rock music, this paper uses Susan McClary’s (2000) idea of ‘quest narrative’ in Western art music as a starting point. While much pop and rock adheres to the basic structure of the establishment of a home territory, episodes or adventures away, and then a return, my study suggests three categories of rock music form that provide alternatives to common combinations of verses, choruses and bridges through which the quest narrative is delivered. Labyrinth forms present more than the usual number of sections to confound our sense of ‘home’, and consequently of ‘quest’. Single-cell forms use repetition to suggest either a kind of stasis or to disrupt our expectations of beginning, middle and end. Immersive forms blur sectional divisions and invite more sensual and participatory responses to the recorded text. With regard to all of these alternative approaches to structure, Judy Lochhead’s (1992) concept of ‘forming’ is called upon to underline rock music forms that unfold as process, rather than map received formal constructs. Central to the argument are a couple of crucial definitions. Following Theodore Gracyk (1996), it is not songs, as such, but particular recordings that constitute rock music texts. Additionally, narrative is understood not in (direct) relation to the lyrics of a song, nor in terms of artists’ biographies or the trajectories of musical styles, but considered in terms of musical structure. It is hoped that this outline of non-narrative musical structures in rock may have applications not only to other types of music, but to other time-based art forms.

  12. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
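
    As a taste of the workflow the book covers, a minimal query against a BigQuery public dataset using the standard Python client library; this sketch only shows the basic query round-trip, and assumes Google Cloud credentials and a billing project are already configured in the environment.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Picks up project and credentials from the environment
client = bigquery.Client()

sql = """
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY total DESC
    LIMIT 5
"""

# client.query() starts the job; .result() waits for completion
for row in client.query(sql).result():
    print(row.word, row.total)
```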

  13. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  14. Determining air distribution during outbursts of gases and rocks

    Energy Technology Data Exchange (ETDEWEB)

    Struminski, A; Sikora, M; Urbanski, J [Politechnika Wroclawska (Poland). Instytut Gornictwa

    1989-01-01

    Discusses use of the KPW-1 iterative and autocorrelation method developed by A. Struminski for forecasting effects of rock bursts on ventilation systems of underground coal mines with increased content of methane or carbon dioxide in coal seams and adjacent rock strata. The method is used for predicting air flow changes caused by a rock burst accompanied by a violent outburst of gases. Directions of air flow, flow rate and concentration of gases emitted from surrounding strata to mine workings are predicted. On the basis of this prediction, the concentration of gases from a coal outburst is determined for any point in a ventilation network. The prediction method is used for assessing hazards in coal mines during and after a rock burst. Use of the method is explained using the example of the Thorez and Walbrzych coal mines. Computer programs developed for ODRA and IBM/XT computers are discussed. 6 refs.

  15. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges), which ultimately seems to be of greater importance than the sheer data volume.

  16. Magnetic mineralogy and rock magnetic properties of silicate and carbonatite rocks from Oldoinyo Lengai volcano (Tanzania)

    Science.gov (United States)

    Mattsson, H. B.; Balashova, A.; Almqvist, B. S. G.; Bosshard-Stadlin, S. A.; Weidendorfer, D.

    2018-06-01

    Oldoinyo Lengai, a stratovolcano in northern Tanzania, is most famous for being the only currently active carbonatite volcano on Earth. The bulk of the volcanic edifice is dominated by eruptive products produced by silica-undersaturated, peralkaline, silicate magmas (effusive, explosive and/or as cumulates at depth). The recent (2007-2008) explosive eruption produced the first ever recorded pyroclastic flows at this volcano, and the accidental lithics incorporated into the pyroclastic flows represent a broad variety of different rock types, comprising both extrusive and intrusive varieties, in addition to various types of cumulates. This mix of different accidental lithics provides a unique insight into the inner workings of the world's only active carbonatite volcano. Here, we focus on the magnetic mineralogy and the rock magnetic properties of a wide selection of samples spanning the spectrum of Oldoinyo Lengai rock types compositionally as well as from a textural point of view. We show that the magnetic properties of most extrusive silicate rocks are dominated by magnetite-ulvöspinel solid solutions, and that pyrrhotite plays a larger role in the magnetic properties of the intrusive silicate rocks. The natrocarbonatitic lavas, for which the volcano is best known, show distinctly different magnetic properties in comparison with the silicate rocks. This discrepancy may be explained by abundant alabandite crystals/blebs in the groundmass of the natrocarbonatitic lavas. A detailed combination of petrological/mineralogical studies with geophysical investigations is an absolute necessity in order to understand, and to better constrain, the overall architecture and inner workings of the subvolcanic plumbing system. The results presented here may also have implications for the quest to explain the genesis of the uniquely natrocarbonatitic magmas characteristic of Oldoinyo Lengai.

  17. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general theory of relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  18. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  19. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. Big data characteristics: value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of a large amount of complex heterogeneous data such as various -omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  20. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  1. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore we compare and contrast the two geometries throughout.

  2. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data......’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD......) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  3. Relações hierárquicas entre os traços amplos do Big Five [Hierarchical relationship between the broad traits of the Big Five]

    Directory of Open Access Journals (Sweden)

    Cristiano Mauro Assis Gomes

    2012-01-01

    through path analysis: a four-level hierarchical model and a non-hierarchical one. The hierarchical model showed adequate data fit, pointing to its superiority over the non-hierarchical model, which did not. Implications for the Big Five Model are discussed.

  4. Ore potential of basic rocks in Finland

    International Nuclear Information System (INIS)

    Reino, J.; Ekberg, M.; Heinonen, P.; Karppanen, T.; Hakapaeae, A.; Sandberg, E.

    1993-02-01

    The report is associated with a study programme on basic rocks, which has the aim of complementing the preliminary site investigations on a repository for TVO's (Teollisuuden Voima Oy) spent nuclear fuel. The report comprises a mining enterprise's view of the ore potential of basic plutonic rocks in Finland. The ores associated with basic plutonic rocks are globally known and constitute a significant share of the global mining industry. They comprise chromium, vanadium-titanium-iron, nickel-copper and platinum group element ores. The resources of the metals in question and their mining industry are examined globally, and a review of the use of these metals in industry is presented as well. General factors affecting the mining industry, such as metal prices, political conjunctures, transport facilities, environmental requirements and raw material sources for the Finnish smelters, have been considered from the point of view of their future effect on exploration activity and industrial development in Finland. Information on ores and mineralizations associated with Finnish basic rocks has been compiled in the report. The file comprises 4 chromium occurrences, 8 vanadium-titanium-iron occurrences, 13 PGE occurrences and 38 nickel-copper occurrences

  5. Rock fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, W.S.; Green, S.J.; Hakala, W.W.; Hustrulid, W.A.; Maurer, W.C. (eds.)

    1976-01-01

    Experts in rock mechanics, mining, excavation, drilling, tunneling and use of underground space met to discuss the relative merits of a wide variety of rock fragmentation schemes. Information is presented on novel rock fracturing techniques; tunneling using electron beams, thermocorer, electric spark drills, water jets, and diamond drills; and rock fracturing research needs for mining and underground construction. (LCL)

  6. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  7. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.
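
    As an illustration of the propensity-score analysis mentioned in the abstract above, a minimal inverse-probability-weighting sketch on synthetic data; the data, variable names, and effect size are all invented for the example, not taken from the article.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic observational data: confounder x influences both treatment and outcome
n = 5000
x = rng.normal(size=(n, 1))
treated = rng.binomial(1, 1.0 / (1.0 + np.exp(-x[:, 0])))      # confounded assignment
outcome = 2.0 * treated + 1.5 * x[:, 0] + rng.normal(size=n)   # true effect = 2.0

# Step 1: model the propensity score P(treated | x)
ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

# Step 2: inverse-probability weighting to balance the confounder
w = treated / ps + (1 - treated) / (1 - ps)
ate = (np.sum(w * treated * outcome) / np.sum(w * treated)
       - np.sum(w * (1 - treated) * outcome) / np.sum(w * (1 - treated)))
print(f"IPW estimate of treatment effect: {ate:.2f}")  # close to 2.0
```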

  8. Overview of geotechnical methods to characterize rock masses

    International Nuclear Information System (INIS)

    Heuze, F.E.

    1981-12-01

    The methods that are used to characterize discontinuous rock masses from a geotechnical point of view are summarized. Emphasis is put on providing key references on each subject. The topics of exploration, in-situ stresses, mechanical properties, thermal properties, and hydraulic properties are addressed

  9. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.
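
    The minimax screening criterion described here is straightforward to compute; a hedged sketch on synthetic scores (the study's exact scoring and reliability details may differ):

```python
import numpy as np

def minimax_screen(cluster_scores, big_five_scores, items):
    """Evaluate a candidate cluster on the minimax criterion: minimum
    multiple correlation with Big Five marker factors, maximum reliability."""
    # Multiple R: correlation of cluster scores with their best linear
    # prediction from the five factor scores
    X = np.column_stack([np.ones(len(big_five_scores)), big_five_scores])
    beta, *_ = np.linalg.lstsq(X, cluster_scores, rcond=None)
    multiple_r = np.corrcoef(X @ beta, cluster_scores)[0, 1]

    # Cronbach's alpha of the items forming the cluster
    k = items.shape[1]
    alpha = k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                           / items.sum(axis=1).var(ddof=1))
    return multiple_r, alpha

# Synthetic check: 200 respondents, a 4-item cluster, 5 factor scores
rng = np.random.default_rng(0)
items = rng.normal(size=(200, 4)) + rng.normal(size=(200, 1))  # shared variance
cluster = items.mean(axis=1)
big5 = rng.normal(size=(200, 5))
print(minimax_screen(cluster, big5, items))  # low multiple R, decent alpha
```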

  10. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    Full Text Available The term Big Data was coined to refer to the extensive heave of data that cannot be managed by traditional data handling methods or techniques. The field of Big Data plays an indispensable role in various areas, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, unseen relationships and other important information that can be utilized to make better decisions. There has been a perpetually expanding interest in big data because of its rapid growth and because it covers many areas of application. Apache Hadoop, an open-source technology written in Java that runs on the Linux operating system, was used. The primary contribution of this research is to present an effective and free solution for big data applications in a distributed environment, with its advantages, and to demonstrate its ease of use. Looking ahead, there appears to be a need for an analytical review of new developments in big data technology. Healthcare is one of the biggest concerns of the world. Big data in healthcare refers to electronic health data sets related to patient healthcare and well-being. Data in the healthcare sector is growing beyond the managing capacity of healthcare organizations and is expected to increase significantly in the coming years.
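
    To make the Hadoop approach concrete, a classic MapReduce-style word count; with Hadoop Streaming the map and reduce steps would run as separate processes across the cluster, while this self-contained sketch chains them locally for clarity. The file and path names in the comment are illustrative.

```python
#!/usr/bin/env python3
# Word count in the MapReduce style used by Hadoop Streaming. On a cluster
# the two functions below would be separate mapper/reducer scripts, e.g.:
#   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py \
#       -input /data/in -output /data/out   (paths are illustrative)
import sys
from itertools import groupby

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word."""
    for line in lines:
        for word in line.strip().split():
            yield word, 1

def reducer(pairs):
    """Reduce phase: sum counts per word. Hadoop's shuffle phase delivers
    pairs sorted by key; sorted() simulates that here."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Usage: echo "a b a" | python3 wordcount.py
    for word, total in reducer(mapper(sys.stdin)):
        print(f"{word}\t{total}")
```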

  11. Rock mechanics for hard rock nuclear waste repositories

    International Nuclear Information System (INIS)

    Heuze, F.E.

    1981-09-01

    The mined geologic burial of high level nuclear waste is now the favored option for disposal. The US National Waste Terminal Storage Program designed to achieve this disposal includes an extensive rock mechanics component related to the design of the wastes repositories. The plan currently considers five candidate rock types. This paper deals with the three hard rocks among them: basalt, granite, and tuff. Their behavior is governed by geological discontinuities. Salt and shale, which exhibit behavior closer to that of a continuum, are not considered here. This paper discusses both the generic rock mechanics R and D, which are required for repository design, as well as examples of projects related to hard rock waste storage. The examples include programs in basalt (Hanford/Washington), in granitic rocks (Climax/Nevada Test Site, Idaho Springs/Colorado, Pinawa/Canada, Oracle/Arizona, and Stripa/Sweden), and in tuff

  12. Elastic Rock Heterogeneity Controls Brittle Rock Failure during Hydraulic Fracturing

    Science.gov (United States)

    Langenbruch, C.; Shapiro, S. A.

    2014-12-01

    For the interpretation and inversion of microseismic data it is important to understand which properties of the reservoir rock control the occurrence probability of brittle rock failure and associated seismicity during hydraulic stimulation. This is especially important when inverting for key properties like permeability and fracture conductivity. Although it has become accepted that seismic events are triggered by fluid flow and the resulting perturbation of the stress field in the reservoir rock, the magnitude of stress perturbations capable of triggering failure in rocks can be highly variable. The controlling physical mechanism of this variability is still under discussion. We compare the occurrence of microseismic events at the Cotton Valley gas field to elastic rock heterogeneity, obtained from measurements along the treatment wells. The heterogeneity is characterized by scale-invariant fluctuations of elastic properties. We observe that the elastic heterogeneity of the rock formation controls the occurrence of brittle failure. In particular, we find that the density of events increases with the Brittleness Index (BI) of the rock, which is defined as a combination of Young's modulus and Poisson's ratio. We evaluate the physical meaning of the BI. By applying geomechanical investigations we characterize the influence of fluctuating elastic properties in rocks on the probability of brittle rock failure. Our analysis is based on the computation of stress fluctuations caused by the elastic heterogeneity of rocks. We find that elastic rock heterogeneity causes stress fluctuations of significant magnitude. Moreover, the stress changes necessary to open and reactivate fractures in rocks are strongly related to fluctuations of elastic moduli. Our analysis gives a physical explanation of the observed relation between elastic heterogeneity of the rock formation and the occurrence of brittle failure during hydraulic reservoir stimulations. A crucial factor for understanding
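
    A Brittleness Index combining Young's modulus and Poisson's ratio is often computed as a min-max normalized average; a hedged sketch of that common form (after Rickman et al., 2008; the paper's exact definition and normalization bounds may differ, and the values below are illustrative assumptions):

```python
def brittleness_index(e, nu, e_min=10e9, e_max=80e9, nu_min=0.1, nu_max=0.4):
    """One common Brittleness Index: the average of the min-max normalized
    Young's modulus and the inverted, normalized Poisson's ratio, scaled
    to 0-100. Moduli in Pa; the bounds here are illustrative assumptions."""
    e_term = (e - e_min) / (e_max - e_min)
    nu_term = (nu_max - nu) / (nu_max - nu_min)   # lower nu -> more brittle
    return 50.0 * (e_term + nu_term)

# Example: a stiff rock with a low Poisson's ratio scores as more brittle
print(brittleness_index(e=60e9, nu=0.15))  # about 77 on the 0-100 scale
```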

  13. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  14. Features of the distribution of uranium in igneous rocks - uranium deposits associated with igneous rocks

    International Nuclear Information System (INIS)

    Soerensen, H.

    1977-01-01

    The generally accepted main features of the distribution of uranium and thorium in igneous rocks are briefly reviewed. It is pointed out that uranium in most cases examined is strongly partitioned into the melt during consolidation of magmas and that uranium is concentrated in the most volatile-rich parts of magmas. The mode of emplacement and the consolidation of magmas control the retention or the expulsion of the volatile phase from consolidating magmas and also the distribution of uranium between magmas and the volatile phase. After a brief review of the types of uranium deposits associated with igneous rocks it is concluded that it is difficult to establish universally valid exploration criteria to be used in the search of these types of deposit. It is emphasized, however, that detailed petrological and geochemical studies may be useful in outlining exploration targets. (author)

  15. Radon exhalation from granitic rocks

    Energy Technology Data Exchange (ETDEWEB)

    Del Claro, Flávia; Paschuk, Sergei A.; Corrêa, Janine N.; Mazer, Wellington; Narloch, Danielle Cristine; Martin, Aline Cristina [Universidade Tecnológica Federal do Paraná (UTFPR), Curitiba, PR (Brazil); Denyak, Valeriy, E-mail: flaviadelclaro@gmail.com, E-mail: spaschuk@gmail.com, E-mail: janine_nicolosi@hotmail.com, E-mail: denyak@gmail.com [Instituto de Pesquisa Pelé Pequeno Príncipe (IPPP), Curitiba, PR (Brazil)

    2017-07-01

    Naturally occurring radionuclides such as radon ({sup 222}Rn), its decay products and other elements from the radioactive series of uranium ({sup 238}U and {sup 235}U) and thorium ({sup 232}Th) are an important source of human exposure to natural radioactivity. The worldwide evaluation of radiobiological health effects and risks from population exposure to natural radionuclides is a growing concern. About 50% of the annual personal radiation dose is related to radionuclides such as radon ({sup 222}Rn), thoron ({sup 220}Rn), radium ({sup 226}Ra), thorium ({sup 232}Th) and potassium ({sup 40}K), which are present in modern materials commonly used in the construction of dwellings and buildings. The radioactivity of marbles and granites is of big concern since, under certain conditions, the radioactivity levels of these materials can be hazardous to the population and require the implementation of mitigation procedures. The present survey measured the {sup 222}Rn and {sup 220}Rn activity concentrations liberated into the air by Brazilian granite rocks commercialized on the national market as well as exported to other countries. The {sup 222}Rn and {sup 220}Rn measurements were performed using the AlphaGUARD instant monitor and the RAD7 detector, respectively. This study was performed at the Applied Nuclear Physics Laboratory of the Federal University of Technology – Paraná (UTFPR). The obtained radon activity concentrations in the air exhaled by the studied granite samples varied from 3±1 Bq/m{sup 3} to 2087±19 Bq/m{sup 3}, which shows that some samples of granitic rocks represent a rather elevated health risk to the population. (author)

  16. Rock.XML - Towards a library of rock physics models

    Science.gov (United States)

    Jensen, Erling Hugo; Hauge, Ragnar; Ulvmoen, Marit; Johansen, Tor Arne; Drottning, Åsmund

    2016-08-01

    Rock physics modelling provides tools for correlating physical properties of rocks and their constituents to the geophysical observations we measure on a larger scale. Many different theoretical and empirical models exist to cover the range of different types of rocks. However, upon reviewing these, we see that they are all built around a few main concepts. Based on this observation, we propose a format for digitally storing the specifications of rock physics models, which we have named Rock.XML. It not only contains data about the various constituents, but also the theories and how they are used to combine these building blocks to make a representative model for a particular rock. The format is based on the Extensible Markup Language XML, making it flexible enough to handle complex models as well as scalable towards extending it with new theories and models. This technology has great advantages for documenting and exchanging models in an unambiguous way between people and between software. Rock.XML can become a platform for creating a library of rock physics models, making them more accessible to everyone.
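
    As an illustration only (the abstract does not publish the actual Rock.XML schema, so every element and attribute name below is hypothetical), a model specification of the kind described could be assembled with Python's standard library:

        # Sketch of building a hypothetical Rock.XML document; element and
        # attribute names are invented for illustration, not the real schema.
        import xml.etree.ElementTree as ET

        model = ET.Element("rockPhysicsModel", name="stiff-sand-example")
        # Constituents: the building blocks the abstract mentions.
        parts = ET.SubElement(model, "constituents")
        ET.SubElement(parts, "mineral", name="quartz",
                      bulkModulusGPa="36.6", shearModulusGPa="45.0")
        ET.SubElement(parts, "fluid", name="brine", bulkModulusGPa="2.8")
        # Theory: how the building blocks are combined into a rock model.
        ET.SubElement(model, "theory", name="HertzMindlin",
                      porosity="0.35", coordinationNumber="9")

        print(ET.tostring(model, encoding="unicode"))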

  17. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  18. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = {lambda}/{Delta}{lambda} = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k{sub max} = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k{sub max} = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  19. Burnable absorber-integrated Guide Thimble (BigT) - 1. Design concepts and neutronic characterization on the fuel assembly benchmarks

    International Nuclear Information System (INIS)

    Yahya, Mohd-Syukri; Yu, Hwanyeal; Kim, Yonghee

    2016-01-01

    This paper presents the conceptual designs of a new burnable absorber (BA) for the pressurized water reactor (PWR), named 'Burnable absorber-integrated Guide Thimble' (BigT). The BigT integrates BA materials into a standard guide thimble of a PWR fuel assembly. Neutronic sensitivities and practical design considerations of the BigT concept are highlighted in the first half of the paper. Specifically, the BigT concepts are characterized in view of their BA material and spatial self-shielding variations. In addition, the BigT replaceability requirement, bottom-end design specifications and thermal-hydraulic considerations are also deliberated. Meanwhile, much of the second half of the paper is devoted to demonstrating the practical viability of the BigT absorbers via comparative evaluations against conventional BA technologies in representative 17x17 and 16x16 fuel assembly lattices. In the 17x17 lattice evaluations, all three BigT variants are benchmarked against Westinghouse's existing BA technologies, while in the 16x16 assembly analyses, the BigT designs are compared against the traditional integral gadolinia-urania rod design. All analyses clearly show that the BigT absorbers perform as well as the commercial BA technologies in terms of reactivity and power peaking management. In addition, it has been shown that sufficiently high control rod worth can be obtained with the BigT absorbers in place. All neutronic simulations were completed using the Monte Carlo Serpent code with the ENDF/B-VII.0 library. (author)

  20. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education, as well as critically exploring the perils of applying a data-driven approach to education. Despite the claimed value of the...

  1. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  2. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  3. Rock burst governance of working face under igneous rock

    Science.gov (United States)

    Chang, Zhenxing; Yu, Yue

    2017-01-01

    As a typical failure phenomenon, rock burst occurs in many mines. It can not only cause the working face to cease production, but can also cause serious damage to production equipment and even result in casualties. To explore how to govern rock burst at a working face under igneous rock, the 10416 working face of a certain mine is taken as the engineering background. The supports were extensively damaged and rock burst took place as the working face advanced. This paper establishes a mechanical model and conducts theoretical analysis and calculation to predict the fracture and migration mechanism and energy release of the thick hard igneous rock above the working face, and to obtain the advancing distance of the working face when the igneous rock fractures and the critical value of the energy when rock burst occurs. Based on the specific conditions of the mine, this paper puts forward three kinds of governance measures: borehole pressure relief, coal seam water injection and blasting pressure relief.

  4. Model to Assess the Quality of Magmatic Rocks for Reliable and Sustainable Constructions

    Directory of Open Access Journals (Sweden)

    Mihaela Toderaş

    2017-10-01

    Geomechanical assessment of rocks requires knowledge of the phenomena that occur under the influence of internal and external factors at a macroscopic or microscopic scale, when rocks are submitted to different actions. To elucidate the quantitative and qualitative geomechanical behavior of rocks, knowing their geological and physical–mechanical characteristics becomes imperative. Mineralogical, petrographical and chemical analyses provided an opportunity to identify 15 types of igneous rocks (gabbro, diabases, granites, diorites, rhyolites, andesites, and basalts), divided into plutonic and volcanic rocks. In turn, these have been grouped into acidic, neutral (intermediate) and basic magmatites. A new ranking method is proposed, based on considering the rock characteristics as indicators of quantitative assessment, with a grading system of given points allowing the rocks to be framed into admissibility classes. The paper is structured into two parts, experimental and interpretation of experimental data, showing the methodology used to assess the quality of the igneous rocks analyzed, and the results of theoretical and experimental research carried out on the analyzed rock types. The proposed method constitutes an appropriate instrument for the assessment and verification of requirements regarding the quality of rocks used for sustainable construction.
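
    A minimal sketch of how such a points-based grading could work (indicator choices, point values and class boundaries below are invented for illustration; the paper's actual scheme is not reproduced in the abstract):

        # Hypothetical points-based rock quality grading: score each indicator,
        # sum the points, and map the total to an admissibility class.
        THRESHOLDS = [(80, "Class I (very good)"), (60, "Class II (good)"),
                      (40, "Class III (acceptable)"), (0, "Class IV (rejected)")]

        def grade_rock(ucs_mpa, porosity_pct, abrasion_resistance):
            points = 0
            points += 40 if ucs_mpa >= 120 else 25 if ucs_mpa >= 80 else 10
            points += 30 if porosity_pct <= 1.0 else 15 if porosity_pct <= 3.0 else 5
            points += 30 if abrasion_resistance >= 0.8 else 10
            for cutoff, label in THRESHOLDS:
                if points >= cutoff:
                    return points, label

        print(grade_rock(ucs_mpa=140, porosity_pct=0.8, abrasion_resistance=0.9))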

  5. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Big data is and will be used more in the future as a tool for everything that happens both online and offline. Online, of course, Big Data is truly at home: it is found throughout this medium, offering many advantages and serving as real help for all consumers. In this paper we discussed Big Data as a plus in developing new applications, by gathering useful information about users and their behaviour. We have also presented the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit discussed in this paper is presented in the cloud section.

  6. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is the subject of many publications. Some pay attention to the many possibilities of big data, others warn us about its consequences. This special

  7. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  8. Geochronology and correlation of Tertiary volcanic and intrusive rocks in part of the southern Toquima Range, Nye County, Nevada

    Science.gov (United States)

    Shawe, Daniel R.; Snee, Lawrence W.; Byers, Frank M.; du Bray, Edward A.

    2014-01-01

    Extensive volcanic and intrusive igneous activity, partly localized along regional structural zones, characterized the southern Toquima Range, Nevada, in the late Eocene, Oligocene, and Miocene. The general chronology of igneous activity has been defined previously. This major episode of Tertiary magmatism began with emplacement of a variety of intrusive rocks, followed by formation of nine major calderas and associated with voluminous extrusive and additional intrusive activity. Emplacement of volcanic eruptive and collapse megabreccias accompanied formation of some calderas. Penecontemporaneous volcanism in central Nevada resulted in deposition of distally derived outflow facies ash-flow tuff units that are interleaved in the Toquima Range with proximally derived ash-flow tuffs. Eruption of the Northumberland Tuff in the north part of the southern Toquima Range and collapse of the Northumberland caldera occurred about 32.3 million years ago. The poorly defined Corcoran Canyon caldera farther to the southeast formed following eruption of the tuff of Corcoran Canyon about 27.2 million years ago. The Big Ten Peak caldera in the south part of the southern Toquima Range Tertiary volcanic complex formed about 27 million years ago during eruption of the tuff of Big Ten Peak and associated air-fall tuffs. The inferred Ryecroft Canyon caldera formed in the south end of the Monitor Valley adjacent to the southern Toquima Range and just north of the Big Ten Peak caldera in response to eruption of the tuff of Ryecroft Canyon about 27 million years ago, and the Moores Creek caldera just south of the Northumberland caldera developed at about the same time. Eruption of the tuff of Mount Jefferson about 26.8 million years ago was accompanied by collapse of the Mount Jefferson caldera in the central part of the southern Toquima Range. An inferred caldera, mostly buried beneath alluvium of Big Smoky Valley southwest of the Mount Jefferson caldera, formed about 26.5 million years

  9. A Review of Data Mining with Big Data towards Its Applications in the Electronics Industry

    Directory of Open Access Journals (Sweden)

    Shengping Lv

    2018-04-01

    Data mining (DM) with Big Data has been widely used in the lifecycle of electronic products, ranging from the design and production stages to the service stage. A comprehensive analysis of DM with Big Data and a review of its application over the stages of this lifecycle will not only benefit researchers in developing strong research themes and identifying gaps in the field, but also help practitioners with DM application system development. In this paper, a brief clarification of DM-related topics is presented first. A flowchart of DM and the main content of its steps are given, in which commonly used data preparation and preprocessing approaches, DM functions and techniques, and performance indicators are summarized. Then, a comprehensive review covering 105 articles from 2007 to 2017 on DM or Big Data applications in the electronics industry is provided according to the flowchart, from various points of view such as data handling, applications of DM or Big Data at different lifecycle stages, and the software used in the applications. On this basis, a diagram of data content for different knowledge areas and a framework for DM and Big Data applications in the electronics industry are established. Finally, conclusions and future research directions are given.

  10. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced in big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been carried out. The functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  11. Systematic evaluation program, status summary report

    International Nuclear Information System (INIS)

    1983-01-01

    Status reports are presented on the systematic evaluation program for the Big Rock Point reactor, Dresden-1 reactor, Dresden-2 reactor, Ginna-1 reactor, Connecticut Yankee reactor, LACBWR reactor, Millstone-1 reactor, Oyster Creek-1 reactor, Palisades-1 reactor, San Onofre-1 reactor, and Rowe Yankee reactor

  12. Failure Mechanism of Rock Bridge Based on Acoustic Emission Technique

    Directory of Open Access Journals (Sweden)

    Guoqing Chen

    2015-01-01

    Acoustic emission (AE) technique is widely used in various fields as a reliable nondestructive examination technology. Two experimental tests were carried out in a rock mechanics laboratory: (1) small scale direct shear tests of rock bridges with different lengths and (2) a large scale landslide model with a locked section. The relationship of AE event count and record time was analyzed during the tests. AE source location technology and comparative analysis with the actual failure model were applied. It was found that, whether in the small scale test or the large scale landslide model test, the AE technique accurately located the AE source points, reflecting the generation and expansion of internal cracks in the rock samples. The large scale landslide model test with a locked section showed that a rock bridge in a rocky slope has typical brittle failure behavior. The two tests based on AE technique well revealed the rock failure mechanism in rocky slopes and clarified the cause of high speed and long distance sliding of rocky slopes.

  13. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  14. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: The big bang is replaced by a quantum bounce; the `horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  15. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, as well as the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  16. For Those About to Rock : Naislaulajat rock-genressä

    OpenAIRE

    Herranen, Linda

    2015-01-01

    For those about to rock – naislaulajat rock-genressä [women singers in the rock genre] gives its reader a comprehensive picture of women in the rock genre: the history of rock, its gendering, sexism, and the success of Finnish female singers. The material for the work was compiled from the literature on the subject and from the results of questionnaires administered to female singers working in the field. In addition, I discuss my own experiences and thoughts, so that the perspective on women in the rock genre comes across as comprehensively as possible. The idea for the topic...

  17. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, has opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  18. Proceedings of the 3. Canada-US rock mechanics symposium and 20. Canadian rock mechanics symposium : rock engineering 2009 : rock engineering in difficult conditions

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-07-01

    This conference provided a forum for geologists, mining operators and engineers to discuss the application of rock mechanics in engineering designs. Members of the scientific and engineering communities discussed challenges and interdisciplinary elements involved in rock engineering. New geological models and methods of characterizing rock masses and ground conditions in underground engineering projects were discussed along with excavation and mining methods. Papers presented at the conference discussed the role of rock mechanics in forensic engineering. Geophysics, geomechanics, and risk-based approaches to rock engineering designs were reviewed. Issues related to high pressure and high flow water conditions were discussed, and new rock physics models designed to enhance hydrocarbon recovery were presented. The conference featured 84 presentations, of which 9 have been catalogued separately for inclusion in this database. tabs., figs.

  19. Achievable Rate Estimation of IEEE 802.11ad Visual Big-Data Uplink Access in Cloud-Enabled Surveillance Applications.

    Science.gov (United States)

    Kim, Joongheon; Kim, Jong-Kook

    2016-01-01

    This paper addresses the computation procedures for estimating the impact of interference in 60 GHz IEEE 802.11ad uplink access, in order to construct a visual big-data database from randomly deployed surveillance camera sensing devices. The acquired large-scale visual information from surveillance camera devices will be used to organize the big-data database; this estimation is therefore essential for constructing a centralized cloud-enabled surveillance database. This performance estimation study captures the interference impacts on the target cloud access points from multiple interference components generated by 60 GHz wireless transmissions from nearby surveillance camera devices to their associated cloud access points. With this uplink interference scenario, the interference impacts on the main wireless transmission from a target surveillance camera device to its associated target cloud access point are measured and estimated for a number of settings, under consideration of 60 GHz radiation characteristics and antenna radiation pattern models.
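
    A toy version of such an interference computation (free-space propagation only; all powers and distances are made-up examples, and real 802.11ad modelling would add antenna patterns and blockage) might look like:

        # Toy uplink SINR at a cloud access point: one desired 60 GHz camera
        # link plus co-channel interferers, with free-space path loss only.
        import math

        def fspl_db(d_m, f_ghz=60.0):
            # Free-space path loss, d in metres, f in GHz (~68 dB at 1 m, 60 GHz).
            return 20 * math.log10(d_m) + 20 * math.log10(f_ghz) + 32.45

        def rx_dbm(p_tx_dbm, d_m):
            return p_tx_dbm - fspl_db(d_m)

        def sinr_db(p_tx_dbm, d_signal, d_interferers, noise_dbm=-84.0):
            sig_mw = 10 ** (rx_dbm(p_tx_dbm, d_signal) / 10)
            intf_mw = sum(10 ** (rx_dbm(p_tx_dbm, d) / 10) for d in d_interferers)
            noise_mw = 10 ** (noise_dbm / 10)
            return 10 * math.log10(sig_mw / (noise_mw + intf_mw))

        print(sinr_db(p_tx_dbm=10.0, d_signal=5.0, d_interferers=[20.0, 35.0]))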

  20. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher values in empathy and on the Big Five factors, with the exception of the factor neuroticism. They found correlations with empathy in the domains of openness, agreeableness, conscientiousness and extraversion. In our data, women have significantly higher values in both empathy and the Big Five...

  1. Rock index properties for geoengineering in underground development

    International Nuclear Information System (INIS)

    O'Rourke, J.E.

    1989-01-01

    This paper describes the use of index testing to obtain rock properties that are useful in the design and construction planning of an underground development for civil engineering or mining projects. The index properties discussed include: point load; Schmidt hammer hardness; abrasion hardness; and total hardness. The first two index properties correlate to uniaxial compressive strength (UCS) and Young's modulus. Discussions are given on empirical, normalized relationships of UCS to rock mass properties and the integrated use with semi-empirical, geotechnical design methods. The hardness property indices correlate to construction performance parameters and some relevant experience is cited. Examples of data are presented from an index testing program carried out primarily on siltstone, sandstone and limestone rock core samples retrieved from depths up to 1005 m (3300 ft) in a borehole drilled in the Paradox Basin in eastern Utah. The borehole coring was done for a nuclear waste repository site investigation
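
    For instance, the point load index is commonly converted to an estimated UCS through an empirical multiplier. A sketch of that conversion (the multiplier is rock-dependent; values around 20-25 are often quoted for the size-corrected index Is(50), so treat the output as an estimate, not a measurement):

        # Empirical UCS estimate from the size-corrected point load index Is(50).
        # K is rock-dependent; 22 is a commonly quoted mid-range value.
        def ucs_from_point_load(is50_mpa, k=22.0):
            return k * is50_mpa  # UCS in MPa

        print(ucs_from_point_load(4.5))  # ~99 MPa, a strong rock in this scheme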

  2. Strike Point Control on EAST Using an Isoflux Control Method

    International Nuclear Information System (INIS)

    Xing Zhe; Xiao Bingjia; Luo Zhengping; Walker, M. L.; Humphreys, D. A.

    2015-01-01

    For the advanced tokamak, particle deposition and thermal load on the divertor are a big challenge. By moving the strike points on the divertor target plates, the position of particle deposition and thermal load can be shifted. The Poloidal Field (PF) coil currents can be adjusted to achieve strike point position feedback control. Using the isoflux control method, the strike point position can be controlled by controlling the X point position. On the basis of experimental data, we establish relational expressions between the X point position and the strike point position. Benchmark experiments are carried out to validate the correctness and robustness of the control methods. The strike point position is successfully controlled following our commands in EAST operation. (paper)
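
    A minimal sketch of the control idea (the linear gain relating X point motion to strike point motion and all controller constants below are hypothetical placeholders, not EAST values):

        # Toy strike point feedback: convert the strike point error into an
        # X point correction via an assumed linear relation, then command a
        # PF coil current increment with a proportional-integral law.
        K_SP_PER_XP = 0.6   # assumed: metres of strike point per metre of X point
        KP, KI, DT = 2.0, 0.5, 0.001

        def pi_strike_point_step(sp_target, sp_measured, state):
            err = sp_target - sp_measured
            xp_correction = err / K_SP_PER_XP      # map error into X point space
            state["integral"] += xp_correction * DT
            # PF coil current increment commanded this control cycle.
            return KP * xp_correction + KI * state["integral"]

        state = {"integral": 0.0}
        print(pi_strike_point_step(sp_target=1.62, sp_measured=1.60, state=state))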

  3. An Overview on “Cloud Control” Photogrammetry in Big Data Era

    OpenAIRE

    ZHANG Zuxun; TAO Pengjie

    2017-01-01

    In the present era of big data, photogrammetric image collection modes are characterized with the progressive course of diversity, efficiency and facilitation, which are producing large sets of photogrammetric image data. They further bring the request for advanced processing with higher level of efficiency, automation and intelligence. However, the efficiency of fundamental photogrammetric processing, known as geometric positioning, is still majorly restricted to control points acquired thro...

  4. Art Rocks with Rock Art!

    Science.gov (United States)

    Bickett, Marianne

    2011-01-01

    This article discusses rock art which was the very first "art." Rock art, such as the images created on the stone surfaces of the caves of Lascaux and Altimira, is the true origin of the canvas, paintbrush, and painting media. For there, within caverns deep in the earth, the first artists mixed animal fat, urine, and saliva with powdered minerals…

  5. Radioactive safety analysis and assessment of waste rock pile site in uranium tailings

    International Nuclear Information System (INIS)

    Liu Changrong; Liu Zehua; Wang Zhiyong; Zhou Xinghuo

    2007-01-01

    Based on theoretical calculation and in-situ test results, the distribution and emissions of radioactive nuclides at uranium tailings impoundment and waste rock pile sites are analyzed in this paper. It is pointed out that 222 Rn is the main nuclide of uranium tailings impoundments and waste rock pile sites, and also the main source term of public dose. 222 Rn concentrations in the surrounding atmospheric environment and the individual dose due to Rn gradually decrease with increasing distance from the uranium tailings impoundment and waste rock pile site. Based on in-situ tests at five uranium tailings impoundment and waste rock pile sites, a decision method and safety protection distance are presented, which can be used to guide the validation and design of radioactive safety protection at uranium tailings impoundment and waste rock pile sites. (authors)

  6. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Science.gov (United States)

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins containing the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to a Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting similarity in their fold. We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.

  7. K-Ar dating on acidic rocks from the Western Aizu District, Fukushima Prefecture

    International Nuclear Information System (INIS)

    Shimada, Ikuro; Ueda, Yoshio

    1979-01-01

    K-Ar age determinations were carried out on twelve samples of various acidic rocks (six volcanic rocks, two pyroclastics and four granitic rocks) which were obtained from the western part of Aizu district. The district studied is one of the important acidic petrographic provinces in the Green tuff region of Northeast Japan, and is widely covered by the acidic volcanic rocks and pyroclastics of Neogene period. The ages of six volcanic rocks range from 8 to 23 m.y., and they are generally correlated to the stratigraphic units of the Neogene in Northeast Japan. Dating results on four granitic rocks from the Tagokura granitic body showed the age range of 39 to 65 m.y., corresponding to the Late Cretaceous to Eocene. A sample of dacitic welded tuff from the Miyako River area gave an age of 44 m.y. It is pointed out that the welded tuff may be correlated to the Late Cretaceous to Paleogene acidic igneous rocks such as Nohi rhyolites, Asahi rhyolites, Tagawa acidic rocks and others, on the basis of the age and lithofacies of the rock. However, further geological and geochronological data are necessary to settle the problem. (author)

  8. In Situ Observation of Hard Surrounding Rock Displacement at 2400-m-Deep Tunnels

    Science.gov (United States)

    Feng, Xia-Ting; Yao, Zhi-Bin; Li, Shao-Jun; Wu, Shi-Yong; Yang, Cheng-Xiang; Guo, Hao-Sen; Zhong, Shan

    2018-03-01

    This paper presents the results of in situ investigation of the internal displacement of hard surrounding rock masses within deep tunnels at China's Jinping Underground Laboratory Phase II. The displacement evolution of the surrounding rock during the entire excavation processes was monitored continuously using pre-installed continuous-recording multi-point extensometers. The evolution of excavation-damaged zones and fractures in rock masses were also observed using acoustic velocity testing and digital borehole cameras, respectively. The results show four kinds of displacement behaviours of the hard surrounding rock masses during the excavation process. The displacement in the inner region of the surrounding rock was found to be greater than that of the rock masses near the tunnel's side walls in some excavation stages. This leads to a multi-modal distribution characteristic of internal displacement for hard surrounding rock masses within deep tunnels. A further analysis of the evolution information on the damages and fractures inside the surrounding rock masses reveals the effects of excavation disturbances and local geological conditions. This recognition can be used as the reference for excavation and supporting design and stability evaluations of hard-rock tunnels under high-stress conditions.

  9. Semantic Web Technologies and Big Data Infrastructures: SPARQL Federated Querying of Heterogeneous Big Data Stores

    OpenAIRE

    Konstantopoulos, Stasinos; Charalambidis, Angelos; Mouchakis, Giannis; Troumpoukis, Antonis; Jakobitsch, Jürgen; Karkaletsis, Vangelis

    2016-01-01

    The ability to cross-link large scale data with each other and with structured Semantic Web data, and the ability to uniformly process Semantic Web and other data adds value to both the Semantic Web and to the Big Data community. This paper presents work in progress towards integrating Big Data infrastructures with Semantic Web technologies, allowing for the cross-linking and uniform retrieval of data stored in both Big Data infrastructures and Semantic Web data. The technical challenges invo...
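
    For illustration, a federated SPARQL 1.1 query of the kind this line of work enables can be issued from Python with the SPARQLWrapper package (the endpoint URLs and vocabulary below are placeholders, not the project's actual infrastructure):

        # Federated SPARQL query: the local endpoint delegates part of the
        # pattern to a remote Big Data store via the SPARQL 1.1 SERVICE keyword.
        from SPARQLWrapper import SPARQLWrapper, JSON

        sparql = SPARQLWrapper("http://localhost:8890/sparql")  # local RDF store
        sparql.setQuery("""
            SELECT ?station ?reading WHERE {
              ?station a <http://example.org/ont#SensorStation> .
              SERVICE <http://bigdata.example.org/sparql> {  # remote data store
                ?station <http://example.org/ont#latestReading> ?reading .
              }
            } LIMIT 10
        """)
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["station"]["value"], row["reading"]["value"])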

  10. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  11. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  12. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines by up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
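
    To make the inequality-join idea concrete, here is a much-simplified, single-table sketch of the sorted-array-plus-bit-array scheme (it assumes distinct attribute values; the published IEJoin adds offset arrays, tie handling and cache-friendly bit scanning):

        # Simplified IEJoin-style self-join: find all pairs (i, j) with
        # x_i < x_j and y_i > y_j, using one sort per attribute, a permutation
        # into x-rank space, and a bit array scanned in descending-y order.
        def ie_self_join(rows):                      # rows: list of (x, y)
            n = len(rows)
            by_x = sorted(range(n), key=lambda k: rows[k][0])
            x_rank = {k: r for r, k in enumerate(by_x)}
            bits = [False] * n                  # set = "already seen, larger y"
            pairs = []
            for j in sorted(range(n), key=lambda k: rows[k][1], reverse=True):
                r = x_rank[j]
                for s in range(r):              # set bits left of r belong to
                    if bits[s]:                 # tuples with smaller x, larger y
                        pairs.append((by_x[s], j))
                bits[r] = True
            return pairs

        print(ie_self_join([(1, 9), (2, 5), (3, 7)]))  # -> [(0, 2), (0, 1)]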

  13. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  14. Simulating Smoke Filling in Big Halls by Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    W. K. Chow

    2011-01-01

    Many tall halls of big space volume have been built, and more are to be built, in construction projects in the Far East, particularly Mainland China, Hong Kong, and Taiwan. Smoke is identified as the key hazard to handle. Consequently, smoke exhaust systems are specified in the fire codes in those areas. An update on applying Computational Fluid Dynamics (CFD) in smoke exhaust design is presented in this paper. Key points to note in CFD simulations of smoke filling due to a fire in a big hall are discussed. Mathematical aspects concerning the discretization of the partial differential equations and algorithms for solving the velocity-pressure linked equations are briefly outlined. Results predicted by CFD with different free boundary conditions are compared with those from room fire tests. Standards on grid size, relaxation factors, convergence criteria, and false diffusion should be set up for numerical experiments with CFD.

  15. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  16. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  17. Intelligent Test Mechanism Design of Worn Big Gear

    Directory of Open Access Journals (Sweden)

    Hong-Yu LIU

    2014-10-01

    With the continuous development of the national economy, big gears have been widely applied in the metallurgy and mining domains, where they play an important role. In practical production, big gear abrasion and breakage often take place, affecting normal production and causing unnecessary economic loss. A kind of intelligent test method for worn big gears was put forward, mainly aimed at the constraints of high production cost, long production cycle and high-intensity manual repair welding work. Transformations of the measurement equations were made for the involute spur gear: the original polar coordinate equations were transformed into rectangular coordinate equations. The big gear abrasion measurement principle is introduced, a detection principle diagram is given, and the realization of the detection route is described. The OADM12 laser sensor was selected, and detection of the big gear abrasion area was realized by the detection mechanism. Test data of an unworn gear and a worn gear were loaded into a calculation program written in Visual Basic, from which the big gear abrasion quantity can be obtained. This provides a feasible method for intelligent testing and intelligent repair welding of worn big gears.
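
    The abstract does not reproduce the transformation; for a base circle of radius r_b and roll angle t, the standard involute relations it presumably refers to are of this form:

        % Involute of a base circle of radius r_b, roll angle t.
        % Polar form (radius and polar angle of the involute point):
        \[
          r(t) = r_b\sqrt{1+t^{2}}, \qquad \varphi(t) = t - \arctan t,
        \]
        % and the equivalent rectangular (Cartesian) parametrization:
        \[
          x(t) = r_b(\cos t + t\sin t), \qquad y(t) = r_b(\sin t - t\cos t).
        \]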

  18. Provenance of radioactive placers, Big Meadow area, Valley and Boise Counties, Idaho

    International Nuclear Information System (INIS)

    Truesdell, D.; Wegrzyn, R.; Dixon, M.

    1977-02-01

    For many years, radioactive black-sand placers have been known to be present in the Bear Valley area of west-central Idaho. The largest of these is in Big Meadow, near the head of Bear Valley Creek. Presence of these placers suggests that low-grade uranium deposits might occur in rocks of the Idaho Batholith, adjacent to Bear Valley. This study was undertaken to locate the provenance of the radioactive minerals and to identify problems that need to be solved before undertaking further investigations. The principal radioactive minerals in these placers are monazite and euxenite. Other minerals include columbite, samarskite, fergusonite, xenotime, zircon, allanite, sphene, and brannerite. Only brannerite is a uranium mineral; the others contain uranium as an impurity in crystal lattices. Radiometric determinations of the concentration of uranium in stream sediments strongly indicate that the radioactive materials originate in an area drained by Casner and Howard Creeks. Equivalent uranium levels in bedrock are highest on the divide between Casner and Howard Creeks. However, this area is not known to contain low-grade uranium occurrences. Euxenite, brannerite, columbite-tantalite, samarskite, and allanite are the principal radioactive minerals that were identified in rock samples. These minerals were found in granite pegmatites, granites, and quartz monzonites. Appreciably higher equivalent uranium concentrations were also found within these rock types. The major problem encountered in this study was the difficulty in mapping bedrock because of extensive soil and glacial mantle. A partial solution to this problem might be the application of radon emanometry so that radiometric measurements would not be limited to the sparse bedrock samples

  19. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. The technological trends drive Big Data: business processes are more and more executed electronically, consumers produce more and more data themselves - e.g. in social networks - and finally ever increasing digitalization. Currently, several new trends towards new data sources and innovative data analysis appear in medicine and healthcare. From the research perspective, omics-research is one clear Big Data topic. In practice, the electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in the information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare is lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  20. High-resolution three-dimensional imaging and analysis of rock falls in Yosemite valley, California

    Science.gov (United States)

    Stock, Gregory M.; Bawden, G.W.; Green, J.K.; Hanson, E.; Downing, G.; Collins, B.D.; Bond, S.; Leslar, M.

    2011-01-01

    We present quantitative analyses of recent large rock falls in Yosemite Valley, California, using integrated high-resolution imaging techniques. Rock falls commonly occur from the glacially sculpted granitic walls of Yosemite Valley, modifying this iconic landscape but also posing significant potential hazards and risks. Two large rock falls occurred from the cliff beneath Glacier Point in eastern Yosemite Valley on 7 and 8 October 2008, causing minor injuries and damaging structures in a developed area. We used a combination of gigapixel photography, airborne laser scanning (ALS) data, and ground-based terrestrial laser scanning (TLS) data to characterize the rock-fall detachment surface and adjacent cliff area, quantify the rock-fall volume, evaluate the geologic structure that contributed to failure, and assess the likely failure mode. We merged the ALS and TLS data to resolve the complex, vertical to overhanging topography of the Glacier Point area in three dimensions, and integrated these data with gigapixel photographs to fully image the cliff face in high resolution. Three-dimensional analysis of repeat TLS data reveals that the cumulative failure consisted of a near-planar rock slab with a maximum length of 69.0 m, a mean thickness of 2.1 m, a detachment surface area of 2750 m², and a volume of 5663 ± 36 m³. Failure occurred along a surface-parallel, vertically oriented sheeting joint in a clear example of granitic exfoliation. Stress concentration at crack tips likely propagated fractures through the partially attached slab, leading to failure. Our results demonstrate the utility of high-resolution imaging techniques for quantifying far-range (>1 km) rock falls occurring from the largely inaccessible, vertical rock faces of Yosemite Valley, and for providing highly accurate and precise data needed for rock-fall hazard assessment. © 2011 Geological Society of America.

  1. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  2. Where Are the Logical Errors in the Theory of Big Bang?

    Science.gov (United States)

    Kalanov, Temur Z.

    2015-04-01

    The critical analysis of the foundations of the theory of the Big Bang is proposed. The unity of formal logic and of rational dialectics is the methodological basis of the analysis. It is argued that the starting point of the theory of the Big Bang contains three fundamental logical errors. The first error is the assumption that a macroscopic object (having qualitative determinacy) can have an arbitrarily small size and can be in the singular state (i.e., in the state that has no qualitative determinacy). This assumption implies that the transition, (macroscopic object having the qualitative determinacy) --> (singular state of matter that has no qualitative determinacy), leads to loss of the information contained in the macroscopic object. The second error is the assumption that there are the void and the boundary between matter and void. But if such a boundary existed, then it would mean that the void has dimensions and can be measured. The third error is the assumption that the singular state of matter can make a transition into the normal state without the existence of a program of qualitative and quantitative development of the matter, and without the controlling influence of another (independent) object. However, these assumptions conflict with practice and, consequently, with formal logic, rational dialectics, and cybernetics. Indeed, from the point of view of cybernetics, the transition, (singular state of the Universe) --> (normal state of the Universe), would be possible only in the case that there was a Managed Object that is outside the Universe and has full, complete, and detailed information about the Universe. Thus, the theory of Big Bang is a scientific fiction.

  3. Making big sense from big data in toxicology by read-across.

    Science.gov (United States)

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among others.

  4. Distribution of base rock depth estimated from Rayleigh wave measurement by forced vibration tests

    International Nuclear Information System (INIS)

    Hiroshi Hibino; Toshiro Maeda; Chiaki Yoshimura; Yasuo Uchiyama

    2005-01-01

    This paper shows an application of Rayleigh wave methods to a real site, performed to determine the spatial distribution of the base rock depth below the ground surface. At a certain site in the Sagami Plain in Japan, the base rock depth is assumed, according to boring investigations, to vary up to 10 m. A reasonably accurate distribution of the base rock depth was needed for pile design and construction. In order to measure Rayleigh wave phase velocity, forced vibration tests were conducted with a 500 N vertical shaker and linear arrays of three vertical sensors situated at several points in two zones around the edges of the site. Then, an inversion analysis for the soil profile was carried out with a genetic algorithm, matching the measured Rayleigh wave phase velocity with its computed counterpart. The distribution of the base rock depth obtained from the inversion was consistent with the roughly estimated inclination of the base rock obtained from the boring tests; that is, the base rock is shallow around the edges of the site and gradually inclines towards its center. By the inversion analysis, the depth of the base rock was determined to be from 5 m to 6 m at the edges of the site and 10 m at its center. The distribution of the base rock depth determined by this method showed good agreement at most of the points where boring investigations were performed. As a result, it was confirmed that forced vibration tests on the ground using Rayleigh wave methods can be a useful, practical technique for estimating surface soil profiles to a depth of up to 10 m. (authors)
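
    The inversion step pairs a forward dispersion model with a genetic-algorithm search over layer parameters. Below is a sketch of the shape of that loop for a two-layer (soil over base rock) model; `forward_phase_velocity` is a crude stand-in for a real Rayleigh-wave dispersion computation, crossover is omitted for brevity, and all parameter bounds and GA settings are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def forward_phase_velocity(depth, vs_soil, vs_rock, freqs):
        """Placeholder forward model: phase velocity moves from the rock
        velocity at long wavelengths to the soil velocity at short ones.
        A real study would solve the Rayleigh-wave dispersion equation."""
        wavelength = vs_soil / freqs
        weight = np.clip(wavelength / (2.0 * depth), 0.0, 1.0)
        return vs_soil + weight * (vs_rock - vs_soil)

    def misfit(p, freqs, observed):
        return np.sum((forward_phase_velocity(*p, freqs) - observed) ** 2)

    def ga_invert(freqs, observed, pop=60, gens=150):
        lo = np.array([1.0, 80.0, 400.0])      # depth [m], soil Vs, rock Vs [m/s]
        hi = np.array([15.0, 400.0, 1500.0])
        popu = rng.uniform(lo, hi, size=(pop, 3))
        for _ in range(gens):
            fitness = np.array([misfit(p, freqs, observed) for p in popu])
            parents = popu[np.argsort(fitness)[: pop // 2]]              # selection
            kids = parents[rng.integers(0, len(parents), pop - len(parents))]
            kids = kids + rng.normal(0.0, 0.02, kids.shape) * (hi - lo)  # mutation
            popu = np.clip(np.vstack([parents, kids]), lo, hi)
        return popu[np.argmin([misfit(p, freqs, observed) for p in popu])]

    freqs = np.linspace(5.0, 40.0, 20)
    observed = forward_phase_velocity(8.0, 180.0, 800.0, freqs)  # synthetic data
    print(ga_invert(freqs, observed))  # should recover roughly [8, 180, 800]
    ```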

  5. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  6. The Usability of Rock-Like Materials for Numerical Studies on Rocks

    Science.gov (United States)

    Zengin, Enes; Abiddin Erguler, Zeynal

    2017-04-01

    The approaches of synthetic rock material and mass are widely used by many researchers for understanding the failure behavior of different rocks. In order to model the failure behavior of a rock material, researchers take advantage of different techniques and software, but the majority of these instruments are based on the distinct element method (DEM). For modeling the failure behavior of rocks, and thus creating a fundamental synthetic rock material model, related laboratory experiments must be performed to provide strength parameters. In modelling studies, model calibration is performed using parameters of intact rocks such as porosity, grain size, modulus of elasticity and Poisson ratio. In some cases, it can be difficult or even impossible to acquire representative rock samples for laboratory experiments from heavily jointed rock masses and vuggy rocks. Considering this limitation, this study aimed to investigate the applicability of a rock-like material (e.g. concrete) for understanding and modeling the failure behavior of rock materials having complex inherent structures. For this purpose, concrete samples made from a mixture of 65% cement dust and 35% water were utilized. Accordingly, intact concrete samples representing rocks were prepared in laboratory conditions and their physical properties such as porosity, pore size and density were determined. In addition, to acquire the mechanical parameters of the concrete samples, uniaxial compressive strength (UCS) tests were also performed while simultaneously measuring strain. The measured physical and mechanical properties of these concrete samples were used to create a synthetic material, and uniaxial compressive tests were then modeled and performed using the two-dimensional discontinuum program known as Particle Flow Code (PFC2D). After the modeling studies in PFC2D, approximately similar failure mechanisms and testing results were achieved from both experimental and

  7. Modelling and analysis of canister and buffer for earthquake induced rock shear and glacial load

    International Nuclear Information System (INIS)

    Hernelind, Jan

    2010-08-01

    Existing fractures crossing a deposition hole may be activated and sheared by an earthquake. The effect of such a rock shear has been investigated by finite element calculations. The buffer material in a deposition hole acts as a cushion between the canister and the rock, which reduces the effect of a rock shear substantially. Lower density of the buffer yields a softer material and a reduced effect on the canister. However, at the high density that is suggested for a repository the stiffness of the buffer is rather high. The stiffness is also a function of the rate of shear, which means that there may be substantial damage to the canister at very high shear rates. However, the earthquake induced rock shear velocity is lower than 1 m/s, which is not considered to be very high. The rock shear has been modelled with finite element calculations with the code Abaqus. A three dimensional finite element mesh of the buffer and the canister has been created and a simulation of a rock shear has been performed. The rock shear has been assumed to take place either perpendicular to the canister at the quarter point or at an inclined angle of 22.5 deg in tension. Furthermore, horizontal shear has been studied using a vertical shear plane either at the centre or at the 1/4-point of the canister. The shear calculations have been driven to a total shear of 10 cm. The canister also has to be designed to withstand the loads caused by a thick ice sheet. Besides rock shear, the model has been used to analyse the effect of such a glacial load (either combined with rock shear or without rock shear). This report also summarizes the effect of considering creep in the copper shell.

  8. Modelling and analysis of canister and buffer for earthquake induced rock shear and glacial load

    Energy Technology Data Exchange (ETDEWEB)

    Hernelind, Jan (5T Engineering AB (Sweden))

    2010-08-15

    Existing fractures crossing a deposition hole may be activated and sheared by an earthquake. The effect of such a rock shear has been investigated by finite element calculations. The buffer material in a deposition hole acts as a cushion between the canister and the rock, which reduces the effect of a rock shear substantially. Lower density of the buffer yields a softer material and a reduced effect on the canister. However, at the high density that is suggested for a repository the stiffness of the buffer is rather high. The stiffness is also a function of the rate of shear, which means that there may be substantial damage to the canister at very high shear rates. However, the earthquake induced rock shear velocity is lower than 1 m/s, which is not considered to be very high. The rock shear has been modelled with finite element calculations with the code Abaqus. A three dimensional finite element mesh of the buffer and the canister has been created and a simulation of a rock shear has been performed. The rock shear has been assumed to take place either perpendicular to the canister at the quarter point or at an inclined angle of 22.5 deg in tension. Furthermore, horizontal shear has been studied using a vertical shear plane either at the centre or at the 1/4-point of the canister. The shear calculations have been driven to a total shear of 10 cm. The canister also has to be designed to withstand the loads caused by a thick ice sheet. Besides rock shear, the model has been used to analyse the effect of such a glacial load (either combined with rock shear or without rock shear). This report also summarizes the effect of considering creep in the copper shell.

  9. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species ofWild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  10. Contrasting Nature of Magnetic Anomalies over Thin Sections Made out of Barrandien’s Basaltic Rocks Points to their Origin

    Czech Academy of Sciences Publication Activity Database

    Kletetschka, Günther; Pruner, Petr; Schnabl, Petr; Šifnerová, Kristýna

    -, special issue (2012), pp. 69-70 ISSN 1335-2806. [Castle meeting New Trends in Geomagnetism : Paleo, rock and environmental magnetism/13./. 17.06.2012-23.06.2012, Zvolen] R&D Projects: GA ČR GAP210/10/2351 Institutional support: RVO:67985831 Keywords: magnetic anomalies * thin sections * volcanic rocks Subject RIV: DE - Earth Magnetism, Geodesy, Geography http://gauss.savba.sk/GPIweb/conferences/Castle2012/abstrCastle.pdf

  11. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article covers an analysis of ‘Big Data’, which has been discussed over the last 10 years. The reasons and factors behind the issue are revealed. It is shown that the factors creating the ‘Big Data’ issue have existed for quite a long time and, from time to time, have caused informational barriers. Such barriers were successfully overcome through science and technology. The conducted analysis characterizes the ‘Big Data’ issue as a form of information barrier. The issue may be solved correctly, and it encourages the development of scientific and computational methods.

  12. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  13. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Summary. Objectives: To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  14. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  15. Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems

    Science.gov (United States)

    Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.

    2016-12-01

    Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing the associated uncertainties. Spatial analyses, big data and otherwise, of subsurface natural and engineered systems are based on variable-resolution, discontinuous, and often point-driven data used to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store data and perform spatial analyses, efficiently consuming large geospatial data in custom analytical algorithms through the development of Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop
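
    The data-reduction idea behind the VGM, coarse cells where uncertainty is high and finer cells where the data support detail, can be illustrated without Hadoop. Below is a toy single-machine sketch that emits multi-resolution, non-overlapping, attributed cells by recursive splitting; the splitting rule and threshold are assumptions, and the actual tool implements this as a multi-step MapReduce application over much larger grids.

    ```python
    import numpy as np

    def variable_grid(values, uncertainty, size, threshold, min_cell=1):
        """Emit non-overlapping attributed cells (row, col, size, mean value,
        mean uncertainty). Blocks whose mean uncertainty is at or above the
        threshold stay coarse; better-constrained blocks are refined."""
        cells = []
        def split(r, c, s):
            u = uncertainty[r:r + s, c:c + s].mean()
            if u >= threshold or s <= min_cell:
                cells.append((r, c, s, values[r:r + s, c:c + s].mean(), u))
            else:
                h = s // 2
                for dr in (0, h):
                    for dc in (0, h):
                        split(r + dr, c + dc, h)
        split(0, 0, size)
        return cells

    rng = np.random.default_rng(2)
    vals = rng.normal(size=(16, 16))            # e.g. an interpolated surface
    unc = rng.uniform(0.0, 1.0, size=(16, 16))  # e.g. interpolation variance
    cells = variable_grid(vals, unc, size=16, threshold=0.6)
    print(len(cells), "attributed cells instead of", vals.size, "grid nodes")
    ```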

  16. Survival Times of Meter-Sized Rock Boulders on the Surface of Airless Bodies

    Science.gov (United States)

    Basilevsky, A. T.; Head, J. W.; Horz, F.; Ramsley, K.

    2015-01-01

    This study considers the survival times of meter-sized rock boulders on the surfaces of several airless bodies. As the starting point, we employ estimates of the survival times of such boulders on the surface of the Moon by [1], then discuss the role of destruction due to day-night temperature cycling, consider the meteorite bombardment environment on the considered bodies in terms of projectile flux and velocities, and finally estimate the survival times. Survival times of meter-sized rocks on the lunar surface: The survival times of hand-specimen-sized rocks exposed to the lunar surface environment were estimated based on experiments modeling the destruction of rocks by meteorite impacts, combined with measurements of the lunar surface meteorite flux (e.g., [2]). For estimates of the survival times of meter-sized lunar boulders, [1] suggested a different approach based on analysis of the spatial density of boulders on the rims of small lunar craters of known absolute age. It was found that for a few million years, only a small fraction of the boulders ejected by the cratering process are destroyed; for several tens of millions of years approx. 50% are destroyed; and for 200-300 Ma, 90 to 99% are destroyed. Following [2] and other works, [1] considered that the rocks are mostly destroyed by meteorite impacts. Destruction of rocks by thermal stress: However, the high diurnal temperature variations on the surface of the Moon and other airless bodies imply that thermal stresses may also be a cause of surface rock destruction. Delbo et al. [3] interpreted the observed presence of fine debris on the surface of small asteroids as due to thermal surface cycling. They stated that because of the very low gravity on the surface of these bodies, ejecta from meteorite impacts should leave the body, so the formation of fine debris there has to be due to thermal cycling. Based on experiments on heating-cooling of cm-scale pieces of ordinary and carbonaceous chondrites and theoretical modeling of
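
    The destruction fractions quoted above are roughly consistent with a constant-hazard (exponential) survival model. As a quick illustrative check, not part of the cited analysis, one can fit the rate to roughly 50% destruction at 50 Ma and see what it implies at 200-300 Ma:

    ```python
    import math

    # Assume exponential survival S(t) = exp(-lam * t), with the rate fitted
    # to ~50% of boulders destroyed after ~50 Ma (an illustrative reading of [1]).
    half_life_ma = 50.0
    lam = math.log(2.0) / half_life_ma      # destruction rate per Ma

    for t in (5.0, 50.0, 200.0, 300.0):
        destroyed = 1.0 - math.exp(-lam * t)
        print(f"t = {t:5.0f} Ma: {destroyed:6.1%} destroyed")
    # 200 and 300 Ma give ~94% and ~98% destroyed, within the quoted 90-99%.
    ```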

  17. Harnessing the Power of Big Data to Improve Graduate Medical Education: Big Idea or Bust?

    Science.gov (United States)

    Arora, Vineet M

    2018-06-01

    With the advent of electronic medical records (EMRs) fueling the rise of big data, the use of predictive analytics, machine learning, and artificial intelligence is touted as a transformational tool to improve clinical care. While major investments are being made in using big data to transform health care delivery, little effort has been directed toward exploiting big data to improve graduate medical education (GME). Because our current system relies on faculty observations of competence, it is not unreasonable to ask whether big data in the form of clinical EMRs and other novel data sources can answer questions of importance in GME, such as when a resident is ready for independent practice. The timing is ripe for such a transformation. A recent National Academy of Medicine report called for reforms to how GME is delivered and financed. While many agree on the need to ensure that GME meets our nation's health needs, there is little consensus on how to measure the performance of GME in meeting this goal. During a recent workshop at the National Academy of Medicine on GME outcomes and metrics in October 2017, a key theme emerged: big data holds great promise to inform GME performance at individual, institutional, and national levels. In this Invited Commentary, several examples are presented, such as using big data to inform clinical experience and provide clinically meaningful data to trainees, and using novel data sources, including ambient data, to better measure the quality of GME training.

  18. DETECTION OF SLOPE MOVEMENT BY COMPARING POINT CLOUDS CREATED BY SFM SOFTWARE

    Directory of Open Access Journals (Sweden)

    K. Oda

    2016-06-01

    Full Text Available This paper proposes a method for detecting movement between point clouds created by SfM software, without setting any on-site georeferenced points. SfM software, like Smart3DCapture, PhotoScan, and Pix4D, is convenient for non-professional operators of photogrammetry, because these systems require simply the specification of a sequence of photos and output point clouds with a colour index corresponding to the colour of the original image pixel where each point is projected. SfM software can execute aerial triangulation and create dense point clouds fully automatically. This is useful when monitoring the motion of unstable slopes, or loose rocks in slopes along roads or railroads. Most existing methods, however, use mesh-based DSMs for comparing point clouds before/after movement and cannot be applied in cases where part of the slope forms an overhang. In some cases the movement is smaller than the precision of the ground control points, and registering two point clouds with GCPs is not appropriate. The change detection method in this paper adopts the CCICP (Classification and Combined ICP) algorithm for registering point clouds before/after movement. The CCICP algorithm is a type of ICP (Iterative Closest Point) which minimizes point-to-plane and point-to-point distances simultaneously, and also rejects incorrect correspondences based on point classification by PCA (Principal Component Analysis). A precision test shows that the CCICP method can register two point clouds to within the order of 1 pixel in the original images. Ground control points set on site are useful for the initial registration of the two point clouds. If there are no GCPs at the site, initial registration is achieved by measuring feature points as ground control points in the point clouds before movement, and creating the point clouds after movement with these ground control points. When the motion is a rigid transformation, as in the case of a loose rock moving in a slope, motion including rotation can be analysed by executing CCICP for a
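
    For orientation, the point-to-point core that CCICP builds on can be written compactly: alternate nearest-neighbour matching with a closed-form (SVD/Kabsch) rigid alignment. The sketch below omits CCICP's point-to-plane term and its PCA-based rejection of incorrect correspondences, and the synthetic point clouds are invented.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def best_rigid_transform(src, dst):
        """Closed-form least-squares rigid transform (Kabsch/SVD) mapping
        corresponding src points onto dst points."""
        cs, cd = src.mean(axis=0), dst.mean(axis=0)
        H = (src - cs).T @ (dst - cd)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:     # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, cd - R @ cs

    def icp(source, target, iterations=30):
        """Basic point-to-point ICP: match each source point to its nearest
        target point, solve for the rigid transform, apply, repeat."""
        tree = cKDTree(target)
        src = source.copy()
        for _ in range(iterations):
            _, idx = tree.query(src)
            R, t = best_rigid_transform(src, target[idx])
            src = src @ R.T + t
        return src

    rng = np.random.default_rng(3)
    target = rng.uniform(size=(500, 3))
    yaw = np.deg2rad(5.0)
    R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                       [np.sin(yaw),  np.cos(yaw), 0.0],
                       [0.0,          0.0,         1.0]])
    source = (target - 0.01) @ R_true.T        # slightly moved "after" cloud
    aligned = icp(source, target)
    print(np.abs(aligned - target).max())      # residual should shrink toward 0
    ```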

  19. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  20. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  1. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches

  2. Radiogeochemical characteristic of rocks of the Crimea peninsula and some principles of sedimentation

    International Nuclear Information System (INIS)

    Gherasimov, Yu.G.

    1983-01-01

    Radiogeochemical mapping with rock sampling along profiles crossing all main structural-facies zones of the Crimea peninsula was conducted; 1000 samples were taken. Uranium determination in the samples was performed by the fluorescence method (2 g/t threshold sensitivity). The distributions of U and Th background contents in rocks of the Crimea were tabulated. Maps of the sampling of geological formations and of the distribution of U and Th background contents in rocks are given. It is shown that the radioelement content in Crimea rocks is for the most part lower than the clarke value: contents of 1.3-2.1 g/t prevail for U, and Th contents do not exceed 12 g/t. The closeness of some radiogeochemical parameters points to the formation of the terrigenous Crimea rocks from material removed from the Ukrainian shield. Reworking of the initial terrigenous material by hypergene processes led to U and Th separation, as well as to enrichment of the younger sedimentary rocks with uranium.

  3. Impact compressive and bending behaviour of rocks accompanied by electromagnetic phenomena.

    Science.gov (United States)

    Kobayashi, Hidetoshi; Horikawa, Keitaro; Ogawa, Kinya; Watanabe, Keiko

    2014-08-28

    It is well known that electromagnetic phenomena are often observed preceding earthquakes. However, the mechanism by which these electromagnetic waves are generated during the fracture and deformation of rocks has not been fully identified. Therefore, in order to examine the relationship between the electromagnetic phenomena and the mechanical properties of rocks, uniaxial compression and three-point bending tests on two kinds of rocks with different quartz content, granite and gabbro, have been carried out at quasi-static and dynamic rates. In particular, pre-cracked granite specimens were also tested in the bending tests. Using a split Hopkinson pressure bar and a ferrite-core antenna in close proximity to the specimens, both the stress-strain (load-displacement) curve and the simultaneous electromagnetic wave magnitude were measured. It was found that the dynamic compressive and bending strengths and the stress increase slope of both rocks were higher than those observed in static tests; there is therefore a strain-rate dependence in their strength and stress increase rate. The tests using the pre-cracked bending specimens showed that the intensity of the electromagnetic waves measured during crack extension increased almost proportionally to the increase of the maximum stress intensity factor of the specimens. This tendency was observed in both the dynamic and quasi-static three-point bending tests for granite. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
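
    For the pre-cracked bend specimens, the maximum stress intensity factor follows from the peak load and the specimen geometry. Below is a sketch using the standard single-edge-notched-bend (Srawley) geometry function; the specimen dimensions and load are invented, and a real evaluation should follow the relevant standard (e.g. ASTM E399).

    ```python
    import math

    def k1_senb(P, S, B, W, a):
        """Mode-I stress intensity factor for a single-edge-notched
        three-point bend specimen: P load [N], S span [m], B thickness [m],
        W depth [m], a crack length [m]. Returns K_I in Pa*sqrt(m)."""
        x = a / W
        f = (3.0 * math.sqrt(x)
             * (1.99 - x * (1.0 - x) * (2.15 - 3.93 * x + 2.7 * x * x))
             / (2.0 * (1.0 + 2.0 * x) * (1.0 - x) ** 1.5))
        return P * S / (B * W ** 1.5) * f

    # Hypothetical granite specimen: 2 kN peak load, 80 mm span,
    # 20 mm x 20 mm cross-section, 4 mm starter notch
    k1 = k1_senb(P=2000.0, S=0.08, B=0.02, W=0.02, a=0.004)
    print(f"K_I at peak load ~ {k1 / 1e6:.2f} MPa*sqrt(m)")
    ```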

  4. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Science.gov (United States)

    2013-01-17

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N259; FXRS1265030000-134-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive... significant impact (FONSI) for the environmental assessment (EA) for Big Stone National Wildlife Refuge...

  5. Terrestrial gamma ray dose rates on Ryoke granitic rocks in Ikoma Mountains

    International Nuclear Information System (INIS)

    Ikeda, Tadashi; Ueshima, Masaaki; Shibayama, Motohiko; Hiraoka, Yoshitsugu; Muslim, Dicky

    2012-01-01

    We measured the γ dose rate in the field for 16 rock bodies belonging to the Ryoke granitic rocks distributed over the Ikoma Mountains. Measurements were made at 190 spots, and the mean dose rate was 82.0 ± 21.0 nGy/h. The results of the analysis are summarized as follows. (1) The distribution of the dose rate in the Fukihata quartz diorite showed that the rock's crystallization differentiation had progressed from the south to the north. (2) The dose rate of granite tended to rise with increasing SiO2 content, but in the Iwahashiyama granite, the Takayasuyama granite, the Omichi granite and the Katakami granite, it was revealed that the dose rate was low in spite of high SiO2 content. (3) It became clear that the dose rate of the Ryoke granitic rocks from the first stage to the fourth stage was high enough to be considered as a new rock body. (4) Because the relationship between the dose rate of the rocks and the main chemical elements did not show a common characteristic, it may be that those rocks were formed from different magmas. (author)

  6. Neutron activation analysis of alternative phosphate rocks used in animal nutrition

    International Nuclear Information System (INIS)

    Canella, Artur A.; Ferreira, Walter M.

    2005-01-01

    Since the 1980's, Bovine Spongiform Encephalopathy has insidiously created a fierce battleground between farmers, scientists, environmentalists and consumers. The use of meat and bone meals is currently prohibited in ruminant feeds throughout the world. Some inorganic sources offer a combination of high phosphorus content and acceptable animal digestibility that makes them options as supplemental phosphorus sources, for instance phosphate rocks, a general term applied to minerals valued chiefly for their phosphorus content. However, phosphate rocks have long been known to contain hazardous elements, sometimes making them unsuitable for animal nutrition. Neutron Activation Analysis has supported the mineral evaluation of alternative phosphate rocks. This evaluation is the subject of an ongoing doctoral thesis carried out by the main author. The NAA method has been very efficient due to its highly sensitive and multi-elemental nature. In this paper, results for the vanadium content of three different phosphate rocks are presented. The values point out that Brazilian phosphate rocks present hazardous elements at the same levels as phosphate rocks from some countries of Africa, North America and the Middle East (Brazilian data from our study; data for the other countries from the FAO - Food and Agriculture Organization). (author)

  7. Mineral Chemistry and Geochemistry of Volcanic Rocks in The North of Pasinler (Erzurum

    Directory of Open Access Journals (Sweden)

    Oktay KILIÇ

    2009-02-01

    Full Text Available In the north of Pasinler (Erzurum), Upper Miocene-Pliocene volcanic rocks crop out. These volcanites are composed of basaltic andesite, andesite, dacite and rhyolite lavas and rhyolitic pyroclastics. The rocks show porphyritic, microlitic porphyritic, hyalo-microlitic porphyritic, vitrophyric, glomeroporphyritic, pilotaxitic and hyalopilitic textures. The investigated volcanites contain plagioclase (An29-80), olivine (Fo65-82), clinopyroxene (augite), orthopyroxene (enstatite), amphibole (Mg#: 0.57-0.71), biotite (phlogopite: 0.44-0.47, annite: 0.33-0.37), sanidine, quartz and opaque minerals (titano-magnetite and ilmenite). The volcanic rocks are calc-alkaline in character and have medium to high K contents. Major oxide and trace element variations point to open-system magmatic differentiation in the evolution of the rocks. Geochemical data indicate an important role for fractionation of phenocryst phases in the rocks during the differentiation process. However, it is considered that assimilation ± magma mixing might have accompanied this process. High LILE (K, Rb, Ba, Th) and relatively low HFSE (Nb, Ta, Hf, Zr) contents of the rocks indicate that these rocks derived from parental magmas carrying a subduction signature.

  8. Achievable Rate Estimation of IEEE 802.11ad Visual Big-Data Uplink Access in Cloud-Enabled Surveillance Applications.

    Directory of Open Access Journals (Sweden)

    Joongheon Kim

    Full Text Available This paper addresses the computation procedures for estimating the impact of interference in 60 GHz IEEE 802.11ad uplink access, used to construct a visual big-data database from randomly deployed surveillance camera sensing devices. The large-scale massive visual information acquired from the surveillance camera devices will be used to organize the big-data database, i.e., this estimation is essential for constructing a centralized cloud-enabled surveillance database. This performance estimation study captures the interference impacts on the target cloud access points from the multiple interference components generated by the 60 GHz wireless transmissions of nearby surveillance camera devices to their associated cloud access points. With this uplink interference scenario, the interference impacts on the main wireless transmission from a target surveillance camera device to its associated target cloud access point are measured and estimated under a number of settings, taking into consideration 60 GHz radiation characteristics and antenna radiation pattern models.
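
    The core of such an estimation is summing the interference powers arriving at the target access point and converting the resulting SINR into an achievable rate. A simplified free-space sketch follows; it ignores the directional antenna patterns and 60 GHz propagation specifics the paper accounts for, and every link parameter below (powers, distances, bandwidth, noise figure) is an assumption.

    ```python
    import math

    def rx_power_dbm(tx_dbm, dist_m, freq_hz=60e9):
        """Received power under free-space path loss (Friis)."""
        fspl_db = 20 * math.log10(dist_m) + 20 * math.log10(freq_hz) - 147.55
        return tx_dbm - fspl_db

    def dbm_to_mw(dbm):
        return 10.0 ** (dbm / 10.0)

    bandwidth_hz = 2.16e9                                     # one 802.11ad channel
    signal = dbm_to_mw(rx_power_dbm(10.0, 10.0))              # target camera at 10 m
    interference = sum(dbm_to_mw(rx_power_dbm(10.0, d))       # three interferers
                       for d in (25.0, 40.0, 60.0))
    noise = dbm_to_mw(-174.0 + 10 * math.log10(bandwidth_hz) + 7.0)  # 7 dB NF

    sinr = signal / (interference + noise)
    rate = bandwidth_hz * math.log2(1.0 + sinr)               # Shannon bound
    print(f"SINR = {10 * math.log10(sinr):.1f} dB, "
          f"achievable rate ~ {rate / 1e9:.2f} Gbit/s")
    ```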

  9. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig proteins.

    Directory of Open Access Journals (Sweden)

    Rajeev Raman

    Full Text Available BACKGROUND: Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. PRINCIPAL FINDINGS: We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved regions covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have a β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting the similarity of their folds. CONCLUSIONS: We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying the proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.
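
    Given the reported dissociation constants, the occupancy of a single Ca²+ site follows the standard binding isotherm theta = [Ca²+] / (Kd + [Ca²+]). A quick worked check, with the free-calcium concentrations chosen purely for illustration:

    ```python
    def fraction_bound(ca_um, kd_um):
        """Single-site binding isotherm: theta = [Ca] / (Kd + [Ca])."""
        return ca_um / (kd_um + ca_um)

    # Reported Kd range is 2-4 uM; the Ca2+ levels below are illustrative only
    for kd in (2.0, 4.0):
        for ca in (1.0, 10.0, 1000.0):
            print(f"Kd = {kd} uM, [Ca2+] = {ca:6.1f} uM -> "
                  f"{fraction_bound(ca, kd):.2f} bound")
    ```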

  10. High-level radioactive waste isolation by incorporation in silicate rock

    International Nuclear Information System (INIS)

    Schwartz, L.L.; Cohen, J.J.; Lewis, A.E.; Braun, R.L.

    1978-01-01

    A number of technical possibilities for isolating high-level radioactive materials have been theoretically investigated at various times and places. Isolating such wastes deep underground to ensure long term removal from the biosphere is one such possibility. The present concept involves as a first step creating the necessary void space at considerable depth, say 2 to 5 km, in a very-low-permeability silicate medium such as shale. Waste in dry, calcined or vitrified form is then lowered into the void space, and the access hole or shaft sealed. Energy released by the radioactive decay raises the temperature to a point where the surrounding rock begins to melt. The waste is then dissolved in it. The extent of this melt region grows until the heat generated is balanced by conduction away from the molten zone. Resolidification then begins, and ends when the radioactive decay has progressed to the point that the temperature falls below the melting point of the rock-waste solution. Calculations are presented showing the growth and resolidification process. A nuclear explosion is one way of creating the void space. (author)
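
    The balance between decay heat and conduction can be bounded with the steady-state solution for a point heat source in an infinite medium, T(r) - T0 = Q / (4 pi k r), which gives the radius inside which the rock remains molten. The numbers below (waste power, conductivity, temperatures) are illustrative assumptions, and radioactive decay, latent heat and convection are all ignored.

    ```python
    import math

    def melt_radius_m(Q_w, k_w_per_mk, T_melt_c, T_ambient_c):
        """Radius at which the steady-state temperature of a point heat source
        in an infinite medium falls to the rock melting point:
        T(r) - T_ambient = Q / (4 pi k r)  =>  r = Q / (4 pi k dT)."""
        dT = T_melt_c - T_ambient_c
        return Q_w / (4.0 * math.pi * k_w_per_mk * dT)

    # Illustrative values: 1 MW of decay heat, shale conductivity ~2 W/(m K),
    # rock melting near 1200 C, ambient rock at depth near 100 C
    r = melt_radius_m(Q_w=1.0e6, k_w_per_mk=2.0, T_melt_c=1200.0, T_ambient_c=100.0)
    print(f"steady-state molten zone radius ~ {r:.0f} m")
    ```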

  11. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they make the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future may be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions.

  12. Rock Art

    Science.gov (United States)

    Henn, Cynthia A.

    2004-01-01

    There are many interpretations for the symbols that are seen in rock art, but no decoding key has ever been discovered. This article describes one classroom's experiences with a lesson on rock art--making their rock art and developing their own personal symbols. This lesson allowed for creativity, while giving an opportunity for integration…

  13. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  14. Big geo data surface approximation using radial basis functions: A comparative study

    Science.gov (United States)

    Majdisova, Zuzana; Skala, Vaclav

    2017-12-01

    Approximation of scattered data is often a task in many engineering problems. The Radial Basis Function (RBF) approximation is appropriate for big scattered datasets in n-dimensional space. It is a non-separable approximation, as it is based on the distance between two points. This method leads to the solution of an overdetermined linear system of equations. In this paper the RBF approximation methods are briefly described, a new approach to the RBF approximation of big datasets is presented, and a comparison for different Compactly Supported RBFs (CS-RBFs) is made with respect to the accuracy of the computation. The proposed approach uses symmetry of a matrix, partitioning the matrix into blocks and data structures for storage of the sparse matrix. The experiments are performed for synthetic and real datasets.
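
    A minimal sketch of CS-RBF least-squares approximation follows, using the Wendland C2 function phi(r) = (1 - r)^4 (4r + 1) for r < 1. The reference-point grid, support radius and test surface are assumptions, and the paper's matrix-partitioning and sparse-storage machinery is not reproduced.

    ```python
    import numpy as np

    def wendland_c2(r):
        """Compactly supported Wendland C2 RBF (support radius 1)."""
        return np.where(r < 1.0, (1.0 - r) ** 4 * (4.0 * r + 1.0), 0.0)

    def rbf_fit(points, values, centers, support):
        """Least-squares CS-RBF approximation: solve the overdetermined
        system A c = f with A_ij = phi(|x_i - c_j| / support)."""
        r = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        coeffs, *_ = np.linalg.lstsq(wendland_c2(r / support), values, rcond=None)
        return coeffs

    def rbf_eval(x, centers, coeffs, support):
        r = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        return wendland_c2(r / support) @ coeffs

    rng = np.random.default_rng(4)
    pts = rng.uniform(size=(2000, 2))                   # big scattered dataset
    f = np.sin(3 * pts[:, 0]) * np.cos(2 * pts[:, 1])   # sampled surface
    gx, gy = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
    centers = np.column_stack([gx.ravel(), gy.ravel()]) # 100 reference points
    c = rbf_fit(pts, f, centers, support=0.3)
    print(np.abs(rbf_eval(pts, centers, c, 0.3) - f).max())  # max fit error
    ```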

  15. The tipping point how little things can make a big difference

    CERN Document Server

    Gladwell, Malcolm

    2002-01-01

    The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire. Just as a single sick person can start an epidemic of the flu, so too can a small but precisely targeted push cause a fashion trend, the popularity of a new product, or a drop in the crime rate. This widely acclaimed bestseller, in which Malcolm Gladwell explores and brilliantly illuminates the tipping point phenomenon, is already changing the way people throughout the world think about selling products and disseminating ideas.

  16. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  17. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    Full Text Available As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) has recently become an indispensable part of ‘Big Data’, the collection, storage, transmission and analysis of big sensor data have attracted considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, the sensor data is modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, the aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs is efficiently aggregated to reduce network resource consumption and that the sensor data privacy is effectively protected to meet the ever-growing application requirements.
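
    The recoverable-perturbation pattern at the heart of such schemes can be shown in a few lines: each node masks its reading with a pseudo-random offset derived from configuration shared with the sink, cluster heads forward only aggregated sums, and the sink regenerates and removes the offsets. This is a generic sketch of that pattern, not the Sca-PBDA protocol itself; the key handling and offset range are assumptions.

    ```python
    import random

    def node_report(node_id, reading, round_key):
        """Mask the raw reading with an offset only the sink can regenerate."""
        offset = random.Random(round_key * 100003 + node_id).uniform(-100.0, 100.0)
        return reading + offset

    def cluster_aggregate(masked_readings):
        """Cluster heads forward only the sum, never individual readings."""
        return sum(masked_readings)

    def sink_recover(total, node_ids, round_key):
        """The sink regenerates every offset from the shared key and removes them."""
        offsets = sum(random.Random(round_key * 100003 + n).uniform(-100.0, 100.0)
                      for n in node_ids)
        return total - offsets

    readings = {n: 20.0 + 0.1 * n for n in range(50)}   # hypothetical sensor values
    key = 2024
    masked = [node_report(n, v, key) for n, v in readings.items()]
    recovered = sink_recover(cluster_aggregate(masked), list(readings), key)
    print(recovered, sum(readings.values()))            # aggregate survives masking
    ```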

  18. Exploring Data in Human Resources Big Data

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2016-01-01

    Full Text Available Nowadays, social networks and informatics technologies and infrastructures are constantly developing and affect each other. In this context, the HR recruitment process has become complex and many multinational organizations have encountered selection issues. The objective of the paper is to develop a prototype system for assisting the selection of candidates for an intelligent management of human resources. Such a system can be a starting point for the efficient organization of semi-structured and unstructured data on recruitment activities. The article extends the research presented at the 14th International Conference on Informatics in Economy (IE 2015) in the scientific paper "Big Data challenges for human resources management".

  19. Modelling of nuclear explosions in hard rock sites

    International Nuclear Information System (INIS)

    Brunish, W.M.; App, F.N.

    1993-01-01

    This study represents part of a larger effort to systematically model the effects of differing source region properties on ground motion from underground nuclear explosions at the Nevada Test Site. In previous work by the authors the primary emphasis was on alluvium and both saturated and unsaturated tuff. We have attempted to model events on Pahute Mesa, where either the working point medium, or some of the layers above the working point, or both, are hard rock. The complex layering at these sites, however, has prevented us from drawing unambiguous conclusions about modelling hard rock. In order to learn more about the response of hard rock to underground nuclear explosions, we have attempted to model the PILEDRIVER event. PILEDRIVER was fired on June 2, 1966 in the granite stock of Area 15 at the Nevada Test Site. The working point was at a depth of 462.7 m and the yield was determined to be 61 kt. Numerous surface, sub-surface and free-field measurements were made and analyzed by SRI. An attempt was made to determine the contribution of spall to the teleseismic signal, but it proved unsuccessful because most of the data from below-shot-level gauges was lost. Nonetheless, there is quite a bit of good quality data from a variety of locations. We have been able to obtain relatively good agreement with the experimental PILEDRIVER waveforms. In order to do so, we had to model the granodiorite as being considerably weaker than "good quality" granite, and it had to undergo considerable weakening due to shock damage as well. In addition, the near-surface layers had to be modeled as being weak and compressible and as having a much lower sound speed than the material at depth. This is consistent with a fractured and jointed material at depth, and a weathered material near the surface.

  20. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it also comes with all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  1. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  2. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    Full Text Available In this research, some of the most promising Big Data usage domains are connected with the distinguished player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the business ecosystem's major player types disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players are explained through new Big Data opportunities and threats and by the players' responsive strategies. System dynamics was used to visualize the relationships in the provided model.

  3. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  4. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  5. The Influence Of Switching-Off The Big Lamps On The Humidity Operation Hall

    International Nuclear Information System (INIS)

    Wiranto, Slamet; Sriawan

    2001-01-01

    When there is no activity in the Operation Hall, the big lamps in this hall are switched off. Because the water trap of the ventilation system is not functioning well, the humidity of the Operation Hall increases, and at some points in time the humidity rises above the permitted limit value. To avoid this problem it is necessary to investigate the characteristics by measuring the humidity of the Operation Hall under various conditions and situations. From the characteristics, it can be determined that under normal conditions the Operation Hall big lamps should be switched off, and that 2 days before reactor start-up all the operation building lamps should be switched on for about 5 days, so that the humidity of the operation building returns to its normal value.

  6. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  7. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  8. BIG DATA IN TAMIL: OPPORTUNITIES, BENEFITS AND CHALLENGES

    OpenAIRE

    R.S. Vignesh Raj; Babak Khazaei; Ashik Ali

    2015-01-01

    This paper gives an overall introduction to big data and has tried to introduce Big Data in Tamil. It discusses the potential opportunities, benefits and likely challenges from a very Tamil and Tamil Nadu perspective. The paper has also made an original contribution by proposing 'big data' terminology in Tamil. The paper further suggests a few areas to explore using big data Tamil on the lines of the Tamil Nadu Government 'vision 2023'. Whilst big data has something to offer everyone, it ...

  9. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Urban Intersection Recognition and Construction Based on Big Trace Data

    Directory of Open Access Journals (Sweden)

    TANG Luliang

    2017-06-01

    Full Text Available Intersections are an important part of the generation and renewal of an urban traffic network. In this paper, a new method is proposed to detect urban intersections automatically from spatiotemporal big trace data. Firstly, turning point pairs are extracted by tracking the trace data collected by vehicles. Secondly, the different types of turning point pairs are clustered using a spatial growing clustering method based on angle and distance differences, and a local-connectivity clustering method is used to recognize the intersections. Finally, the intersection structure of the multi-level road network is constructed from the extent of each intersection and the turning point pairs. Taking taxi trajectory data in Wuhan city as an example, the experimental results show that the method proposed in this paper can automatically detect and recognize road intersections and their structure.
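
    The first two steps, collecting turning points and grouping them by spatial proximity, can be sketched with an off-the-shelf density clusterer standing in for the paper's spatial growing method; the coordinates, cluster radius and minimum cluster size below are invented.

    ```python
    import numpy as np
    from sklearn.cluster import DBSCAN

    # Hypothetical turning points (x, y in metres) extracted from vehicle traces:
    # dense knots of turning behaviour mark candidate intersections.
    rng = np.random.default_rng(5)
    true_centres = np.array([[0.0, 0.0], [500.0, 120.0], [260.0, 480.0]])
    turns = np.vstack([c + rng.normal(0.0, 15.0, size=(80, 2)) for c in true_centres])
    stray = rng.uniform(-100.0, 600.0, size=(40, 2))    # isolated turns along roads
    points = np.vstack([turns, stray])

    # Stand-in for the spatial growing clustering: group turning points within
    # 30 m of each other, requiring at least 20 turns to call it an intersection.
    labels = DBSCAN(eps=30.0, min_samples=20).fit_predict(points)
    for lab in sorted(set(labels) - {-1}):              # -1 marks noise points
        centre = points[labels == lab].mean(axis=0)
        print(f"intersection candidate {lab}: centre ~ {centre.round(1)}")
    ```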

  11. Big Data’s Role in Precision Public Health

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts. PMID:29594091

  12. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors and an attempt is made to tackle the problem that participation in a public inquiry is far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  13. Thermally induced rock stress increment and rock reinforcement response

    International Nuclear Information System (INIS)

    Hakala, M.; Stroem, J.; Nujiten, G.; Uotinen, L.; Siren, T.; Suikkanen, J.

    2014-07-01

    This report describes a detailed study of the effect of thermal heating by the spent nuclear fuel containers on the in situ rock stress, any potential rock failure, and associated rock reinforcement strategies for the Olkiluoto underground repository. The modelling approach and input data are presented together with repository layout diagrams. The numerical codes used to establish the effects of heating on the in situ stress field are outlined, together with the rock mass parameters, in situ stress values, radiogenic temperatures and reinforcement structures. This is followed by a study of the temperature and stress evolution during the repository's operational period and the effect of the heating on the reinforcement structures. It is found that, during excavation, the maximum principal stress is concentrated at the transition areas where the profile changes and that, due to the heating from the deposition of spent nuclear fuel, the maximum principal stress rises significantly in the tunnel arch area of NW/SW oriented central tunnels. However, it is predicted that the rock's crack damage (CD, short term strength) value of 99 MPa will not be exceeded anywhere within the model. Loads onto the reinforcement structures will come from damaged and loosened rock, which is assumed in the modelling as a free rock wedge - but this is very much a worst case scenario because there is no guarantee that rock cracking would form a free rock block. The structural capacity of the reinforcement structures is described and it is predicted that the current quantity of the rock reinforcement is strong enough to provide a stable tunnel opening during the peak of the long term stress state, with damage predicted on the sprayed concrete liner. However, the long term stability and safety can be improved through the implementation of the principles of the Observational Method. The effect of ventilation is also considered and an additional study of the radiogenic heating effect on the brittle

  14. Thermally induced rock stress increment and rock reinforcement response

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, M. [KMS Hakala Oy, Nokia (Finland); Stroem, J.; Nujiten, G.; Uotinen, L. [Rockplan, Helsinki (Finland); Siren, T.; Suikkanen, J.

    2014-07-15

    This report describes a detailed study of the effect of thermal heating by the spent nuclear fuel containers on the in situ rock stress, any potential rock failure, and associated rock reinforcement strategies for the Olkiluoto underground repository. The modelling approach and input data are presented together with repository layout diagrams. The numerical codes used to establish the effects of heating on the in situ stress field are outlined, together with the rock mass parameters, in situ stress values, radiogenic temperatures and reinforcement structures. This is followed by a study of the temperature and stress evolution during the repository's operational period and the effect of the heating on the reinforcement structures. It is found that, during excavation, the maximum principal stress is concentrated at the transition areas where the profile changes and that, due to the heating from the deposition of spent nuclear fuel, the maximum principal stress rises significantly in the tunnel arch area of NW/SW oriented central tunnels. However, it is predicted that the rock's crack damage (CD, short term strength) value of 99 MPa will not be exceeded anywhere within the model. Loads onto the reinforcement structures will come from damaged and loosened rock, which is assumed in the modelling as a free rock wedge - but this is very much a worst case scenario because there is no guarantee that rock cracking would form a free rock block. The structural capacity of the reinforcement structures is described and it is predicted that the current quantity of the rock reinforcement is strong enough to provide a stable tunnel opening during the peak of the long term stress state, with damage predicted on the sprayed concrete liner. However, the long term stability and safety can be improved through the implementation of the principles of the Observational Method. The effect of ventilation is also considered and an additional study of the radiogenic heating effect on the

  15. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have a basic knowledge of R.

  16. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have their own tribune on the topic. Perspectives and debates are flourishing while a consensual definition of big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On the one hand, techniques usually presented as specific to big data, such as machine learning techniques, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields such as artificial intelligence are often underestimated, if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention since they delineate what is at stake. In this context, forensic science is still awaiting its position papers as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary approach in forensic science and medicine that is mainly based on evidence. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  17. Through the big bang: Continuing Einstein's equations beyond a cosmological singularity

    Science.gov (United States)

    Koslowski, Tim A.; Mercati, Flavio; Sloan, David

    2018-03-01

    All measurements are comparisons. The only physically accessible degrees of freedom (DOFs) are dimensionless ratios. The objective description of the universe as a whole thus predicts only how these ratios change collectively as one of them is changed. Here we develop a description for classical Bianchi IX cosmology implementing these relational principles. The objective evolution decouples from the volume and its expansion degree of freedom. We use the relational description to investigate both vacuum dominated and quiescent Bianchi IX cosmologies. In the vacuum dominated case the relational dynamical system predicts an infinite amount of change of the relational DOFs, in accordance with the well-known chaotic behaviour of Bianchi IX. In the quiescent case the relational dynamical system evolves uniquely through the point where the decoupled scale DOFs predict the big bang/crunch. This is a non-trivial prediction of the relational description; the big bang/crunch is not the end of physics - it is instead a regular point of the relational evolution. Describing our solutions as spacetimes that satisfy Einstein's equations, we find that the relational dynamical system predicts two singular solutions of GR that are connected at the hypersurface of the singularity such that relational DOFs are continuous and the orientation of the spatial frame is inverted.

  18. Comparison of disposal concepts for rock salt and hard rock

    International Nuclear Information System (INIS)

    Papp, R.

    1998-01-01

    The study was carried out in the period 1994-1996. The goals were to prepare a draft on spent fuel disposal in hard rock and additionally a comparison with existing disposal concepts for rock salt. A cask for direct disposal of spent fuel and a repository for hard rock including a safeguards concept were conceptually designed. The results of the study confirm, that the early German decision to employ rock salt was reasonable. (orig.)

  19. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patients' care processes and of individual patients' behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the gathered data and of extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. PMID:25910540

  20. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    Science.gov (United States)

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than life time prevalence. This study was designed to assess the usability of the DSM-IV criteria based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.
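
    A minimal sketch of the validation arithmetic reported above. The counts are placeholders chosen only to reproduce the reported 99.7% sensitivity and 96.2% specificity from the stated group sizes (n=300 and n=132); they are not the study's data.

        # Hypothetical sketch: sensitivity/specificity of a screening tool
        # against clinical diagnosis. The counts below are placeholders.
        def screen_metrics(true_pos, false_neg, true_neg, false_pos):
            sensitivity = true_pos / (true_pos + false_neg)  # cases correctly flagged
            specificity = true_neg / (true_neg + false_pos)  # non-cases correctly cleared
            return sensitivity, specificity

        sens, spec = screen_metrics(true_pos=299, false_neg=1, true_neg=127, false_pos=5)
        print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")  # 99.7%, 96.2%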

  1. Mechanism of Rock Burst Occurrence in Specially Thick Coal Seam with Rock Parting

    Science.gov (United States)

    Wang, Jian-chao; Jiang, Fu-xing; Meng, Xiang-jun; Wang, Xu-you; Zhu, Si-tao; Feng, Yu

    2016-05-01

    Specially thick coal seam with complex construction, such as rock parting and alternating soft and hard coal, is called specially thick coal seam with rock parting (STCSRP), which easily leads to rock burst during mining. Based on the stress distribution of the rock parting zone, this study investigated the mechanism, engineering discriminant conditions, prevention methods, and risk evaluation method of rock burst occurrence in STCSRP by setting up a mechanical model. The main conclusions of this study are as follows. (1) When the mining face moves closer to the rock parting zone, the original non-uniform stress of the rock parting zone and the advancing stress of the mining face combine to intensify gradually the shearing action on coal near the mining face. When the shearing action reaches a certain degree, rock burst easily occurs near the mining face. (2) Rock burst occurrence in STCSRP is positively associated with mining depth, advancing stress concentration factor of the mining face, thickness of rock parting, bursting liability of coal, thickness ratio of rock parting to coal seam, and difference of elastic modulus between rock parting and coal, whereas it is negatively associated with shear strength. (3) Technologies of large-diameter drilling, coal seam water injection, and deep hole blasting can reduce the advancing stress concentration factor, the thickness of rock parting, and the difference of elastic modulus between rock parting and coal to lower the risk of rock burst in STCSRP. (4) The research result was applied to evaluate and control the risk of rock burst occurrence in STCSRP.

  2. Rollerjaw Rock Crusher

    Science.gov (United States)

    Peters, Gregory; Brown, Kyle; Fuerstenau, Stephen

    2009-01-01

    The rollerjaw rock crusher melds the concepts of jaw crushing and roll crushing long employed in the mining and rock-crushing industries. Rollerjaw rock crushers have been proposed for inclusion in geological exploration missions on Mars, where they would be used to pulverize rock samples into powders in the tens of micrometer particle size range required for analysis by scientific instruments.

  3. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang' an University School of Information Engineering, Xi' an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi' an (China)

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  4. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  5. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external conditions and competitive environment are changing, and this will necessitate a development towards “big business”, in which farms become even larger, more industrialized and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  6. Range sections as rock models for intensity rock scene segmentation

    CSIR Research Space (South Africa)

    Mkwelo, S

    2007-11-01

    Full Text Available This paper presents another approach to segmenting a scene of rocks on a conveyor belt for the purposes of measuring rock size. Rock size estimation instruments are used to monitor, optimize and control milling and crushing in the mining industry...

  7. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.
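
    For background, the big-bounce result summarized above is often expressed through the effective Friedmann equation of loop quantum cosmology; the widely quoted effective form (included here as context, not as this paper's own derivation) is

        H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right),

    where H is the Hubble rate, \rho the matter density and \rho_c a critical density of the order of the Planck density: H vanishes when \rho reaches \rho_c, so the classical big-bang singularity is replaced by a bounce.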

  8. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  9. Big data processing in the cloud - Challenges and platforms

    Science.gov (United States)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

    Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. The basic cloud computing service models are presented. Two architectures for processing big data are discussed, the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and the most difficult to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
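
    A minimal sketch of the Lambda architecture discussed above, using a word-count workload: a batch layer periodically recomputes views from the immutable master dataset, a speed layer covers events that arrived since the last batch run, and a serving layer merges both at query time. The plain-Python layers stand in for what would normally be, e.g., Hadoop jobs and a stream processor.

        # Hypothetical Lambda-architecture sketch (batch + speed + serving layers).
        from collections import Counter

        master_dataset = []      # immutable, append-only event log
        realtime_buffer = []     # events not yet absorbed by a batch run
        batch_view = Counter()   # precomputed view, rebuilt periodically

        def ingest(event):
            master_dataset.append(event)
            realtime_buffer.append(event)

        def run_batch():
            """Batch layer: recompute the view from the full master dataset."""
            global batch_view
            batch_view = Counter(master_dataset)
            realtime_buffer.clear()

        def query(key):
            """Serving layer: merge the batch view with the speed layer's delta."""
            return batch_view[key] + Counter(realtime_buffer)[key]

        for word in ["big", "data", "big"]:
            ingest(word)
        run_batch()
        ingest("big")
        print(query("big"))  # 3: two from the batch view, one from the speed layer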

  10. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  11. Victoria Stodden: Scholarly Communication in the Era of Big Data and Big Computation

    OpenAIRE

    Stodden, Victoria

    2015-01-01

    Victoria Stodden gave the keynote address for Open Access Week 2015. "Scholarly communication in the era of big data and big computation" was sponsored by the University Libraries, Computational Modeling and Data Analytics, the Department of Computer Science, the Department of Statistics, the Laboratory for Interdisciplinary Statistical Analysis (LISA), and the Virginia Bioinformatics Institute. Victoria Stodden is an associate professor in the Graduate School of Library and Information Scien...

  12. Big Data: Concept, Potentialities and Vulnerabilities

    Directory of Open Access Journals (Sweden)

    Fernando Almeida

    2018-03-01

    Full Text Available The evolution of information systems and the growth in the use of the Internet and social networks has caused an explosion in the amount of available data relevant to the activities of companies. Therefore, the treatment of these available data is vital to support operational, tactical and strategic decisions. This paper aims to present the concept of big data and the main technologies that support the analysis of large data volumes. The potential of big data is explored for nine sectors of activity: financial, retail, healthcare, transport, agriculture, energy, manufacturing, public, and media and entertainment. In addition, the main current opportunities, vulnerabilities and privacy challenges of big data are discussed. It was possible to conclude that despite the potential for big data to grow in the previously identified areas, there are still some challenges that need to be considered and mitigated, namely the privacy of information, the existence of qualified human resources to work with Big Data and the promotion of a data-driven organizational culture.

  13. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing necessity among practitioners to understand the phenomenon better, starting from a clearly stated definition. This book aims to be a starting read for executives who want (and need) to keep pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data will be explained, and a series of different strategic approaches will be provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data will be discussed, where some of them are more general - such as ethics, privacy, and ownership - while others concern more specific business situations (e.g., initial public offering, growth st...

  14. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I am involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for the industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed to information and be used to make 'smart' decisions.

  15. [Cultivation strategy and path analysis on big brand Chinese medicine for small and medium-sized enterprises].

    Science.gov (United States)

    Wang, Yong-Yan; Yang, Hong-Jun

    2014-03-01

    Small and medium-sized enterprises (SMEs) are important components of the Chinese medicine industry. However, the lack of big brands is becoming an urgent problem that is critical to the survival of SMEs. This article discusses the concept and traits of big-brand Chinese medicine from three aspects: clinical, scientific and market value. Guided by market value, highlighting clinical value, and aiming at improving scientific value in big-brand cultivation, we put forward the key points of cultivation: obtaining branded Chinese medicine with widely recognized efficacy, a good quality control system and a well-explained mechanism, which can meanwhile bring innovative improvements to the theory of Chinese medicine. According to the characteristics of SMEs, we hold the view that building a multidisciplinary research union could be considered the basic path, and we probe the implementation strategy through three stages: top-level design, skill upgrading and application.

  16. Comprehensive Interpretation of the Laboratory Experiments Results to Construct Model of the Polish Shale Gas Rocks

    Science.gov (United States)

    Jarzyna, Jadwiga A.; Krakowska, Paulina I.; Puskarczyk, Edyta; Wawrzyniak-Guz, Kamila; Zych, Marcin

    2018-03-01

    More than 70 rock samples from so-called sweet spots, i.e. the Ordovician Sa Formation and the Silurian Ja Member of the Pa Formation from the Baltic Basin (North Poland), were examined in the laboratory to determine bulk and grain density, total and effective/dynamic porosity, absolute permeability, pore diameter sizes, total surface area, and natural radioactivity. Results of the pyrolysis, i.e. TOC (Total Organic Carbon) together with S1 and S2, parameters used to determine the hydrocarbon generation potential of rocks, were also considered. Elemental composition from chemical analyses and mineral composition from XRD measurements were also included. SCAL analysis, NMR experiments and Pressure Decay Permeability measurements, together with water immersion porosimetry and the adsorption/desorption of nitrogen vapors method, were carried out along with a comprehensive interpretation of the outcomes. Simple and multiple linear statistical regressions were used to recognize mutual relationships between parameters. The observed correlations, and in some cases the large dispersion of data and discrepancies between the property values obtained from different methods, were the basis for building a shale gas rock model for well logging interpretation. The model was verified by the results of Monte Carlo modelling of the spectral neutron-gamma log response in comparison with GEM log results.
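
    A minimal sketch of the regression step described above. The parameter names follow the abstract (porosity, TOC, bulk density), but the synthetic data and the assumed linear form are illustrative only.

        # Hypothetical sketch: relate laboratory petrophysical parameters by
        # multiple linear least squares, as in the interpretation step above.
        import numpy as np

        rng = np.random.default_rng(0)
        porosity = rng.uniform(0.02, 0.12, 70)   # fractional porosity
        toc = rng.uniform(0.5, 6.0, 70)          # total organic carbon, wt%
        bulk_density = 2.71 - 1.2 * porosity - 0.02 * toc + rng.normal(0, 0.01, 70)

        # Fit: bulk_density ~ a * porosity + b * toc + c
        X = np.column_stack([porosity, toc, np.ones_like(toc)])
        coef, *_ = np.linalg.lstsq(X, bulk_density, rcond=None)
        print("fitted coefficients (a, b, c):", coef)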

  17. Slaves to Big Data. Or Are We?

    Directory of Open Access Journals (Sweden)

    Mireille Hildebrandt

    2013-10-01

    Full Text Available In this contribution, the notion of Big Data is discussed in relation to the monetisation of personal data. The claim of some proponents, as well as adversaries, that Big Data implies that ‘n = all’, meaning that we no longer need to rely on samples because we have all the data, is scrutinised and found to be both overly optimistic and unnecessarily pessimistic. A set of epistemological and ethical issues is presented, focusing on the implications of Big Data for our perception, cognition, fairness, privacy and due process. The article then looks into the idea of user-centric personal data management to investigate to what extent it provides solutions for some of the problems triggered by the Big Data conundrum. Special attention is paid to the core principle of data protection legislation, namely purpose binding. Finally, this contribution seeks to inquire into the influence of Big Data politics on self, mind and society, and asks how we can prevent ourselves from becoming slaves to Big Data.

  18. Time dependency in the mechanical properties of crystalline rocks. A literature survey

    International Nuclear Information System (INIS)

    Hagros, A.; Johansson, E.; Hudson, J.A.

    2008-09-01

    Because of the long design life, elevated temperatures, and the location at depth (high stresses), time-dependent aspects of the mechanical properties of crystalline rock are potentially important for the design and the long-term safety of the radioactive waste repository at Olkiluoto. However, time-dependent effects in rock mechanics are still one of the least understood aspects of the physical behaviour of rock masses, partly because it is difficult to conduct long-term experimental tests, either in the laboratory or in situ. Yet, the time-dependent mechanical behaviour needs to be characterised so that it can be included in the modelling studies supporting repository design. The Introduction explains the background to the literature survey and includes definitions of the terms 'creep' (increasing strain at constant stress) and 'stress relaxation' (decreasing stress at constant strain). Moreover, it is noted that the rock around an in situ excavation is loaded by the adjacent rock elements, so the time-dependent behaviour will depend on the unloading stiffness of these elements and hence will not actually be either pure creep or pure stress relaxation. The Appendix contains the results of the literature survey of reported time-dependent research as it applies to crystalline rock. A summary of each of the 38 literature items is presented in tabular form covering document number, subject area, document reference, subject matter, objectives, methodology, highlighted figures, conclusions and comments. It is concluded that the time-dependent failure strength of all rocks observed may be interpreted by sub-critical crack growth assisted by the stress corrosion mechanism. Also, certain parameters are known to affect the long-term properties: mineralogy, grain size, water/water chemistry, confining stress and loading history. At some point in the loading history of rock, the state of crack development reaches a point whereby the continued generation of

  19. From Central Asia to South Africa: In Search of Inspiration in Rock Art Studies

    Directory of Open Access Journals (Sweden)

    Rozwadowski Andrzej

    2017-06-01

    Full Text Available The paper describes the story of discovering South African rock art as an inspiration for research in a completely different part of the globe, namely Central Asia and Siberia. It refers to those aspects of African research which proved important in developing the understanding of rock art in Asia. Several aspects are addressed. First, it points to the importance of rethinking the relationship between art, myth and ethnography, which in South Africa additionally resulted in reconsidering the ontology of rock images and the very idea of reading rock art. From the latter viewpoint, the idea of the three-dimensionality of the rock art ‘text’ appeared particularly inspiring. The second issue of South African ‘origin’, which notably inspired research all over the world, concerns a new theorizing of shamanism. The paper then discusses how, and to what extent, this new theory added to the research on rock art in Siberia and Central Asia.

  20. Will Organization Design Be Affected By Big Data?

    Directory of Open Access Journals (Sweden)

    Giles Slinger

    2014-12-01

    Full Text Available Computing power and analytical methods allow us to create, collate, and analyze more data than ever before. When datasets are unusually large in volume, velocity, and variety, they are referred to as “big data.” Some observers have suggested that in order to cope with big data (a) organizational structures will need to change and (b) the processes used to design organizations will be different. In this article, we differentiate big data from relatively slow-moving, linked people data. We argue that big data will change organizational structures as organizations pursue the opportunities presented by big data. The processes by which organizations are designed, however, will be relatively unaffected by big data. Instead, organization design processes will be more affected by the complex links found in people data.

  1. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  2. Experimental Study of Bilinear Initiating System Based on Hard Rock Pile Blasting

    Directory of Open Access Journals (Sweden)

    Yusong Miao

    2017-01-01

    Full Text Available It is difficult to excavate hard rock with industrial explosives and achieve a suitable blasting effect, because the low energy utilization rate results in large rock fragments and a short blasting footage. Thus, improving the utilization ratio of the explosive energy is important. In this study, a novel bilinear initiation system based on hard rock blasting was proposed to improve the blasting effect. Furthermore, on the basis of detonation wave collision theory, frontal collision, oblique reflection, and Mach reflection during detonation wave propagation were studied. The results show that the maximum detonation pressure at the Mach reflection point, where the incident angle is 46.9°, is three times larger than the value for complete detonation of the explosive. Then, in order to analyze crack propagation under different initiation forms, a rock fracture test slot was designed; the results show that the bilinear initiation system can change the energy distribution of the explosive. Finally, a field experiment was implemented at a hard rock pile blasting project, and the experimental results show that the present system possesses a high explosive energy utilization ratio and produces small rock fragments. The results of this study can be used to improve efficiency in hard rock blasting.

  3. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of this bachelor thesis is to describe the Big Data issue and the OLAP aggregation operations for decision support that are applied to such data using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way aggregation operations are applied and with the problems of their implementation, followed by an overall evaluation of the work and the possibilities for future use of the resulting system.

  4. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  5. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  6. Evaluation of tunnel seismic prediction (TSP) result using the Japanese highway rock mass classification system for Pahang-Selangor Raw Water Transfer Tunnel

    Science.gov (United States)

    Von, W. C.; Ismail, M. A. M.

    2017-10-01

    Knowledge of the geological profile ahead of the tunnel face is significant for minimizing risk in tunnel excavation work and for cost control of preventative measures. In this mountainous area, site investigation by vertical boring is not practical for obtaining the geological profile for the Pahang-Selangor Raw Water Transfer project. Hence, the tunnel seismic prediction (TSP) method was adopted to predict the geological profile ahead of the tunnel face. In order to evaluate the TSP results, IBM SPSS Statistics 22 was used to run an artificial neural network (ANN) analysis to back-calculate predicted Rock Grade Points (JH) from actual Rock Grade Points (JH) using Vp, Vs and Vp/Vs from TSP. The results show good correlation between predicted and actual Rock Grade Points (JH). In other words, TSP can usefully predict the geological profile ahead of the tunnel face while allowing TBM excavation work to continue. Identifying weak zones or faults ahead of the tunnel face is crucial so that preventative measures can be carried out in advance for safer tunnel excavation work.
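
    A minimal sketch of the back-calculation step described above, with scikit-learn's MLPRegressor standing in for the SPSS neural network. The feature set follows the abstract (Vp, Vs and Vp/Vs); the synthetic data and network settings are illustrative assumptions.

        # Hypothetical sketch: predict Rock Grade Points (JH) from seismic
        # velocities, in the spirit of the ANN analysis described above.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(42)
        vp = rng.uniform(3000.0, 6000.0, 200)    # P-wave velocity, m/s
        vs = vp / rng.uniform(1.6, 1.9, 200)     # S-wave velocity, m/s
        X = np.column_stack([vp, vs, vp / vs])
        # Placeholder target: rock grade assumed to increase with velocity.
        rock_grade = 0.01 * vp + 0.005 * vs + rng.normal(0.0, 20.0, 200)

        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        model.fit(X, rock_grade)
        print("R^2 on training data:", model.score(X, rock_grade))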

  7. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  8. Big data and biomedical informatics: a challenging opportunity.

    Science.gov (United States)

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler of unprecedented research studies and of new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may also be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations.
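
    As one concrete illustration of the map-reduce style mentioned above, here is a minimal, self-contained word-count sketch; it mimics the map/shuffle/reduce stages conceptually and is not tied to any particular big data framework.

        # Hypothetical sketch of the map-reduce pattern: map emits key-value
        # pairs, shuffle groups them by key, reduce aggregates each group.
        from collections import defaultdict

        def map_phase(record):
            for word in record.split():
                yield word.lower(), 1

        def shuffle(pairs):
            groups = defaultdict(list)
            for key, value in pairs:
                groups[key].append(value)
            return groups

        def reduce_phase(groups):
            return {key: sum(values) for key, values in groups.items()}

        records = ["big data in biomedicine", "big data in healthcare"]
        pairs = (pair for record in records for pair in map_phase(record))
        print(reduce_phase(shuffle(pairs)))  # e.g. {'big': 2, 'data': 2, ...}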

  9. Fluid and rock interaction in permeable volcanic rock

    International Nuclear Information System (INIS)

    Lindley, J.I.

    1985-01-01

    Four types of interrelated changes (geochemical, mineralogic, isotopic, and physical) occur in Oligocene volcanic units of the Mogollon-Datil volcanic field, New Mexico. These changes resulted from the operation of a geothermal system that, through fluid-rock interaction, affected 5 rhyolite ash-flow tuffs and an intercalated basaltic andesite lava flow, causing a potassium metasomatism type of alteration. (1) Previous studies have shown enrichment of rocks in K2O by as much as 130% of their original values at the expense of Na2O and CaO, with an accompanying increase in Rb and decreases in MgO and Sr. (2) X-ray diffraction results of this study show that phenocrystic plagioclase and groundmass feldspar have been replaced with pure potassium feldspar and quartz in altered rock. Phenocrystic potassium feldspar, biotite, and quartz are unaffected. Pyroxene in basaltic andesite is replaced by iron oxide. (3) delta-18O increases for rhyolitic units from values of 8-10 permil, typical of unaltered rock, to 13-15 permil, typical of altered rock. Basaltic andesite, however, shows opposite behavior, with a delta-18O of 9 permil in unaltered rock and 6 permil in altered rock. (4) Alteration results in a density decrease. SEM revealed that replacement of plagioclase by fine-grained quartz and potassium feldspar is not a volume-for-volume replacement. Secondary porosity is created in the volcanics by the chaotic arrangement of secondary crystals.

  10. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  11. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN inaugurates, at a major public event, the Passport to the Big Bang project: a scientific tourist trail through the Pays de Gex and the Canton of Geneva. Poster and programme.

  12. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  13. Integrating R and Hadoop for Big Data Analysis

    OpenAIRE

    Bogdan Oancea; Raluca Mariana Dragoescu

    2014-01-01

    Analyzing and working with big data could be very difficult using classical means like relational database management systems or desktop software packages for statistics and visualization. Instead, big data requires large clusters with hundreds or even thousands of computing nodes. Official statistics is increasingly considering big data for deriving new statistics because big data sources could produce more relevant and timely statistics than traditional sources. One of the software tools ...

  14. Diffusivity database (DDB) for major rocks. Database for the second progress report

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Haruo

    1999-10-01

    A diffusivity database was developed for setting effective diffusion coefficients in rock matrices for the second progress report. In this database, 3 kinds of diffusion coefficients are treated: the effective diffusion coefficient (De), the apparent diffusion coefficient (Da) and the free water diffusion coefficient (Do). The database, based on literature published between 1980 and 1998, was developed considering the following points. (1) Since the Japanese geological environment is the focus of the second progress report, diffusion data were collected with an emphasis on Japanese major rocks. (2) Although 22 elements are considered to be important in the performance assessment for geological disposal, all elements and aquatic tracers are treated in this database, considering general-purpose use. (3) Since limestone, which belongs to sedimentary rock, can become a natural resource and is inappropriate as a host rock, it is omitted from this database. Rock was categorized into 4 kinds from the viewpoint of geology and mass transport: acid crystalline rock, alkaline crystalline rock, sedimentary rock (argillaceous/tuffaceous rock) and sedimentary rock (psammitic rock/sandy stone). In addition, crystalline rocks of near-neutral chemistry were categorized as alkaline crystalline rock in this database. The database is composed of sub-databases for the 4 kinds of rocks. Furthermore, the sub-databases are composed of databases for individual elements, into which, in total, 24 items are input, such as species, rock name, diffusion coefficients (De, Da, Do), and the conditions under which they were obtained (method, porewater, pH, Eh, temperature, atmosphere, etc.). As a result of the literature survey, for De values for acid crystalline rock, in total 207 data for 18 elements and one tracer (hydrocarbon) have been reported, and all data were for granitic rocks such as granite, granodiorite and biotitic granite. For alkaline crystalline rock, in total, 32
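
    A minimal sketch of what one record in such a database might look like. The field names paraphrase the items listed in the abstract (species, rock name, De/Da/Do, measurement conditions) and are illustrative, not the actual database schema.

        # Hypothetical sketch of a diffusivity database record; the fields
        # paraphrase the abstract's item list and are not the real schema.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class DiffusivityRecord:
            species: str                   # e.g. "Cs"
            rock_category: str             # one of the 4 rock categories
            rock_name: str                 # e.g. "granodiorite"
            de_m2_per_s: Optional[float]   # effective diffusion coefficient
            da_m2_per_s: Optional[float]   # apparent diffusion coefficient
            do_m2_per_s: Optional[float]   # free water diffusion coefficient
            method: str = ""               # e.g. "through-diffusion"
            temperature_c: Optional[float] = None
            reference: str = ""

        record = DiffusivityRecord("Cs", "acid crystalline rock", "granite",
                                   de_m2_per_s=1e-13, da_m2_per_s=None,
                                   do_m2_per_s=2e-9, method="through-diffusion",
                                   temperature_c=25.0, reference="hypothetical")
        print(record.species, record.de_m2_per_s)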

  15. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics has sparked a feeding frenzy, fueled by the production of many next-generation-sequencing-based data sets that seek to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  16. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided, in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  17. Thermomechanical stability of underground installations: significance of the thermophysical properties of rocks

    International Nuclear Information System (INIS)

    Mirkovich, V.

    1981-01-01

    When heat is generated in an underground installation, there are several interdependent factors, such as the rate of heat dissipation, changes in this rate with temperature, or the effects of thermal gradients and thermal expansivities, which influence the stability of the rock mass. To evaluate the thermomechanical stability of a proposed site for an underground nuclear power station, rock specimens from a 300 m deep drill core were obtained, and their thermal diffusivity and linear thermal expansion were measured between 25°C and 500°C. The thermal conductivity was also measured, in the temperature range 100-500°C. Under normal operating conditions, heat transfer to the surface of the rock mass surrounding the power installation would be low. However, in some contingencies, this heat load could become large. The results are discussed from the point of view of the stability of a rock enclosure at higher heat fluxes; they indicate that the rocks studied would, in general, not be suitable as an unprotected wall for containment of such a heat source. (author)
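
    For reference, the measured quantities are linked by a standard textbook relation (not specific to this study):

        \alpha = \frac{k}{\rho\, c_p}

    where \alpha is the thermal diffusivity, k the thermal conductivity, \rho the rock density and c_p the specific heat capacity; measuring diffusivity and conductivity on the same specimens therefore also constrains the volumetric heat capacity \rho c_p.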

  18. Modelling the effect of diffusion into the rock matrix on radionuclide migration

    International Nuclear Information System (INIS)

    Lever, D.A.; Bradbury, M.H.; Hemingway, S.J.

    1983-01-01

    Diffusion into the rock matrix is potentially an important retardation mechanism for nuclides leached from an underground radioactive waste repository in a fractured hard rock. Models of this diffusion process are discussed and incorporated into three-dimensional radionuclide migration models. Simple solutions to these models are derived for two regions: the region near to the repository, where the nuclide is diffusing into effectively infinite rock, and that much further downstream, where the concentrations in the rock and fractures are almost in equilibrium. These solutions are used to evaluate the possible impact on migration. It is shown that retardation factors in excess of 100, and reductions in the peak concentration at a given point on the flow path by three or four orders of magnitude, are possible for non-sorbed ions, which would otherwise be carried by the flow and not retarded at all. (author)
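
    A commonly used formulation of this kind of model, given here as background (a standard single-fracture advection equation coupled to one-dimensional matrix diffusion; not necessarily the authors' exact system), is

        \frac{\partial c_f}{\partial t} + v\,\frac{\partial c_f}{\partial x}
            = \frac{D_e}{b}\left.\frac{\partial c_m}{\partial z}\right|_{z=0},
        \qquad
        R_m\,\frac{\partial c_m}{\partial t} = D_p\,\frac{\partial^2 c_m}{\partial z^2},

    where c_f and c_m are the concentrations in the fracture and in the matrix, v the water velocity in the fracture, b the fracture half-aperture, D_e and D_p the effective and pore diffusion coefficients, and R_m the matrix retardation factor; the coupling term on the right of the first equation is the flux of solute lost from the fracture into the rock matrix.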

  19. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  20. Physics with Big Karl Brainstorming. Abstracts

    International Nuclear Information System (INIS)

    Machner, H.; Lieb, J.

    2000-08-01

    Before summarizing the details of the meeting, a short description of the spectrometer facility Big Karl is given. The facility is essentially a new instrument using refurbished dipole magnets from its predecessor. The large acceptance quadrupole magnets and the beam optics are new. Big Karl has a design very similar to the focussing spectrometers at MAMI (Mainz), AGOR (Groningen) and the high resolution spectrometer (HRS) in Hall A at Jefferson Laboratory, with ΔE/E = 10^-4 but at a somewhat lower maximum momentum. The focal plane detectors, consisting of multiwire drift chambers and scintillating hodoscopes, are similar. Unlike HRS, Big Karl still needs Cerenkov counters and polarimeters in its focal plane: detectors which are necessary to perform some of the experiments proposed during the brainstorming. In addition, Big Karl allows emission angle reconstruction via track measurements in its focal plane with high resolution. In the following, the physics highlights and the proposed and potential experiments are summarized. During the meeting it became obvious that the physics to be explored at Big Karl can be grouped into five distinct categories, and this summary is organized accordingly. (orig.)

  1. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    Science.gov (United States)

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Laurenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  2. Application and Prospect of Big Data in Water Resources

    Science.gov (United States)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have been created and are commonly applied in many fields. However, academic studies have only recently paid attention to Big Data applications in water resources, so water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying big data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical frame, but we define "Water Big Data" and explain its tridimensional properties, namely the time dimension, spatial dimension and intelligent dimension. Based on HBase, a classification system for Water Big Data is introduced: hydrology data, ecology data and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, are proposed. Finally, the prospect of applying big data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data Driven Decision) will be used more in water resources management in the future.
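
    Since the abstract leans on Hadoop and MapReduce, a toy illustration of the programming model may help. The sketch below runs the map, shuffle and reduce stages in plain Python over made-up station records; a real deployment would run on a Hadoop cluster, and the station names and values are invented for illustration.

        # Toy MapReduce-style aggregation of hydrology records in plain Python.
        from collections import defaultdict

        records = [("station_A", 12.4), ("station_B", 7.1), ("station_A", 13.0),
                   ("station_B", 6.8), ("station_C", 20.5)]  # (station, discharge)

        # Map: emit (key, value) pairs.
        mapped = [(station, flow) for station, flow in records]

        # Shuffle: group values by key.
        grouped = defaultdict(list)
        for station, flow in mapped:
            grouped[station].append(flow)

        # Reduce: aggregate each group (here, mean discharge per station).
        means = {station: sum(flows) / len(flows) for station, flows in grouped.items()}
        print(means)  # {'station_A': 12.7, 'station_B': 6.95, 'station_C': 20.5}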

  3. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Farming is undergoing a digital revolution. Our review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. "app"? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and allows for a focus on the material consequences of big data in society.

  4. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data, introducing novel optimization algorithms and codes capable of working in the big data setting as well as applications of big data optimization, for interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  5. Una aproximación a Big Data = An approach to Big Data

    OpenAIRE

    Puyol Moreno, Javier

    2014-01-01

    Big Data can be considered a trend in the advance of technology that has opened the door to a new approach to understanding and decision-making, and is used to describe the enormous quantities of data (structured, unstructured and semi-structured) that would take too long and cost too much to load into a relational database for analysis. Thus, the concept of Big Data applies to all information that cannot be processed or analyzed using tools...

  6. Toward a Literature-Driven Definition of Big Data in Healthcare.

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    The aim of this study was to provide a definition of big data in healthcare. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
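
    The criterion in this abstract is concrete enough to compute directly; the snippet below implements it, assuming the base-10 logarithm (the paper's notation does not fix the base explicitly here).

        import math

        def is_big_data(n, p):
            """Baro et al. criterion: a dataset is 'big' when log10(n * p) >= 7."""
            return math.log10(n * p) >= 7

        print(is_big_data(n=1_000_000, p=50))  # True:  log10(5e7) is about 7.7
        print(is_big_data(n=10_000, p=20))     # False: log10(2e5) is about 5.3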

  7. Big Data Analytic, Big Step for Patient Management and Care in Puerto Rico.

    Science.gov (United States)

    Borrero, Ernesto E

    2018-01-01

    This letter provides an overview of the application of big data in the health care system to improve quality of care, including predictive modelling for risk and resource use, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications, among others. The author delineates the tremendous potential for big data analytics and discusses how it can be successfully implemented in clinical practice, as an important component of a learning health-care system.

  8. Soil biogeochemistry in the age of big data

    Science.gov (United States)

    Cécillon, Lauric; Barré, Pierre; Coissac, Eric; Plante, Alain; Rasse, Daniel

    2015-04-01

    Data is becoming one of the key resources of the 21st century. Soil biogeochemistry is not spared by this new movement. The conservation of soils and their services has recently come onto the political agenda. However, clear knowledge of the links between soil characteristics and the various processes ensuring the provision of soil services is rare at the molecular or plot scale, and does not exist at the landscape scale. This split between society's expectations of its natural capital and scientific knowledge of the most complex material on earth has led to an increasing number of studies on soils, using an increasing number of techniques of increasing complexity, with increasing spatial and temporal coverage. From data scarcity with basic data management systems, soil biogeochemistry is now facing a proliferation of data, with few quality controls from data collection to publication and few skills to deal with it. Based on this observation, here we (1) address how big data could help in making sense of all these soil biogeochemical data, and (2) point out several shortcomings of big data that most biogeochemists will experience in their future careers. Massive storage of data is now common, and recent opportunities for cloud storage enable data sharing among researchers all over the world. The need for integrative and collaborative computational databases in soil biogeochemistry is emerging through pioneering initiatives in this direction (molTERdb; earthcube), following soil microbiologists (GenBank). We expect that a series of data storage and management systems will rapidly revolutionize the way of accessing raw biogeochemical data, published or not. Data mining techniques combined with cluster or cloud computing hold significant promise for facilitating the use of complex analytical methods, and for revealing new insights previously hidden in complex data on soil mineralogy, organic matter and biodiversity. Indeed, important scientific advances have

  9. Big Data and Biomedical Informatics: A Challenging Opportunity

    Science.gov (United States)

    2014-01-01

    Summary Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations. PMID:24853034

  10. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management, big data and information governance, by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  11. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  12. Rocks Can Wow? Yes, Rocks Can Wow!

    Science.gov (United States)

    Hardman, Sally; Luke, Sue

    2016-01-01

    Rocks and fossils appear in the National Curriculum of England science programmes of study for children in year 3 (ages 7-8). A frequently asked question is "How do you make the classification of rocks engaging?" In response to this request from a school, a set of interactive activities was designed and organised by tutors and students…

  13. Traceological analysis of a singular artefact: The rock crystal point from O Achadizo (Boiro, A Coruña, Galicia)

    Directory of Open Access Journals (Sweden)

    Juan Luis Fernández Marchena

    2016-09-01

    In this paper we present the data obtained from a use-wear study of a rock crystal tool from the O Achadizo hill fort (Boiro, A Coruña, Galicia). This tool was found in shell midden A, dated to the Second Iron Age, and is of particular importance because of its pointed morphology and the configuration evidence on its perimeter. We carried out a macroscopic and microscopic analysis to obtain as much data on this piece as possible. Macroscopically we identified retouching as well as an impact fracture, and at the microscopic level we found several series of striations on the ventral face which are not in keeping with the use of the piece as a projectile tip. We generated several "gigapixel" images of different areas of the tool in order to record the order and arrangement of these striations and to understand their origin. We identified differential orientation of the striations in the various sectors of the tool, suggesting a technical origin. The combination of the macro- and microscopic analysis of both faces has allowed us to interpret the tool functionally as a sharp element.

  14. Three dimensional rock microstructures: insights from FIB-SEM tomography

    Science.gov (United States)

    Drury, Martyn; Pennock, Gill; de Winter, Matthijs

    2016-04-01

    Most studies of rock microstructures investigate two-dimensional sections or thin slices of three-dimensional grain structures. With advances in X-ray and electron tomography methods, the 3-D microstructure can be (relatively) routinely investigated on scales from a few microns to cm. 3D studies are needed to investigate the connectivity of microstructures and to test the assumptions we use to calculate 3D properties from 2D sections. We have used FIB-SEM tomography to study the topology of melts in synthetic olivine rocks, 3D crystal growth microstructures, pore networks and subgrain structures. The technique uses a focused ion beam to make serial sections with a spacing of tens to hundreds of nanometers. Each section is then imaged or mapped using the electron beam. The 3D geometry of grains and subgrains can be investigated using orientation contrast or EBSD mapping. FIB-SEM tomography of rocks and minerals can be limited by charging of the uncoated surfaces exposed by the ion beam. The newest generation of FIB-SEMs has much improved low-voltage imaging capability, allowing high-resolution charge-free imaging. Low-kV FIB-SEM tomography is now widely used to study the connectivity of pore networks. In-situ fluids can also be studied using cryo-FIB-SEM on frozen samples, although special freezing techniques are needed to avoid artifacts produced by ice crystallization. FIB-SEM tomography is complementary, in terms of spatial resolution and sampled volume, to TEM tomography and X-ray tomography, and the combination of these methods can cover a wide range of scales. Our studies of melt topology in synthetic olivine rocks with a high melt content show that many grain boundaries are wetted by nanometre-scale melt layers that are too thin to resolve by X-ray tomography. A variety of melt layer geometries occur, consistent with several mechanisms of melt layer formation. The nature of melt geometries along triple line junctions and quadruple points can be resolved

  15. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  16. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    The objective of this paper is to assess, in light of his main works, Minsky's view and analysis of what he called "Big Government": the huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  17. Big data, little data, no data scholarship in the networked world

    CERN Document Server

    Borgman, Christine L

    2015-01-01

    "Big Data" is on the covers of Science, Nature, the Economist, and Wired magazines, on the front pages of the Wall Street Journal and the New York Times. But despite the media hyperbole, as Christine Borgman points out in this examination of data and scholarly research, having the right data is usually better than having more data; little data can be just as valuable as big data. In many cases, there are no data -- because relevant data don't exist, cannot be found, or are not available. Moreover, data sharing is difficult, incentives to do so are minimal, and data practices vary widely across disciplines. Borgman, an often-cited authority on scholarly communication, argues that data have no value or meaning in isolation; they exist within a knowledge infrastructure -- an ecology of people, practices, technologies, institutions, material objects, and relationships. After laying out the premises of her investigation -- six "provocations" meant to inspire discussion about the uses of data in scholarship -- Bor...

  18. ROCKING. A computer program for seismic response analysis of radioactive materials transport and/or storage casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1995-11-01

    The computer program ROCKING has been developed for seismic response analysis of radioactive materials transport and/or storage casks, including rocking and sliding behavior. The main features of ROCKING are as follows: (1) the cask is treated as a rigid body; (2) rocking and sliding behavior are considered; (3) impact forces are represented by a spring-dashpot model located at the impact points; (4) friction force is calculated at the interface between the cask and the floor; (5) forces of the wire ropes restraining tip-over act only as tensile loads. The paper presents the calculation model, the calculation equations, validation calculations and the user's manual. (author)
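
    Feature (3), the spring-dashpot model at the impact points, is the heart of such rigid-body impact codes. The toy time-stepping loop below is an illustrative sketch, not the ROCKING program itself: it drops a rigid cask vertically onto a spring-dashpot floor, and all parameter values are hypothetical.

        # Spring-dashpot contact: force acts only while the cask penetrates the floor.
        G = 9.81      # gravity, m/s^2
        M = 1.0e5     # cask mass, kg (hypothetical)
        K = 5.0e8     # contact spring stiffness, N/m (hypothetical)
        C = 1.0e6     # contact dashpot coefficient, N*s/m (hypothetical)

        def contact_force(y, v):
            """Impact force at the contact point; zero when out of contact."""
            if y >= 0.0:              # above the floor: no contact
                return 0.0
            return -K * y - C * v     # penetration (y < 0) pushes the cask back up

        # Explicit time stepping of a 5 cm vertical drop.
        y, v, dt = 0.05, 0.0, 1e-5
        for _ in range(200_000):      # simulate 2 s
            a = -G + contact_force(y, v) / M
            v += a * dt
            y += v * dt
        print(f"final position = {y:.4f} m")  # settles near the static penetration M*G/K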

  19. Classical propagation of strings across a big crunch/big bang singularity

    International Nuclear Information System (INIS)

    Niz, Gustavo; Turok, Neil

    2007-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z_2, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001).][P. J. Steinhardt and N. Turok, Science 296, 1436 (2002).][N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004).]. When the two planes are near each other, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example.

  20. The Information Panopticon in the Big Data Era

    Directory of Open Access Journals (Sweden)

    Martin Berner

    2014-04-01

    Taking advantage of big data opportunities is challenging for traditional organizations. In this article, we take a panoptic view of big data – obtaining information from more sources and making it visible to all organizational levels. We suggest that big data requires a transformation from command-and-control hierarchies to post-bureaucratic organizational structures wherein employees at all levels can be empowered while simultaneously being controlled. We derive propositions that show how to best exploit big data technologies in organizations.

  1. Solid images for geostructural mapping and key block modeling of rock discontinuities

    Science.gov (United States)

    Assali, Pierre; Grussenmeyer, Pierre; Villemin, Thierry; Pollet, Nicolas; Viguier, Flavien

    2016-04-01

    Rock mass characterization is obviously a key element in rock fall hazard analysis. Managing risk and determining the most adapted reinforcement method require a proper understanding of the rock mass under consideration. Description of discontinuity sets is therefore a crucial first step in the reinforcement design process. The on-field survey is then followed by structural modeling in order to extrapolate the data collected at the rock surface to the inner part of the massif. Traditional compass surveys and manual observations can undoubtedly be surpassed by dense 3D data such as LiDAR or photogrammetric point clouds. However, although the acquisition phase is fast and highly automated, managing, handling and exploiting such a great amount of collected data is an arduous task, especially for non-specialist users. In this study, we propose a combined approach using both 3D point clouds (from LiDAR or image matching) and 2D digital images, gathered into the concept of the "solid image". This product connects the advantages of classical true-color 2D digital images, accessibility and interpretability, with the particular strengths of dense 3D point clouds, i.e. geometrical completeness and accuracy. The solid image can be considered the information support for carrying out a digital survey at the surface of the outcrop without being affected by traditional deficiencies (lack of data and sampling difficulties due to inaccessible areas, safety risks in steep sectors, etc.). The computational tools presented in this paper have been implemented in a standalone software package with a graphical user interface that helps operators complete a digital geostructural survey and analysis. 3D coordinate extraction, 3D distance and area measurement, planar best-fit for discontinuity orientation, directional roughness profiles, block size estimation, and other tools have been tested on a calcareous quarry in the French Alps.
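
    The "planar best-fit for discontinuity orientation" tool lends itself to a compact illustration. The sketch below is one plausible implementation rather than the authors' code: it fits a least-squares plane to picked 3D points via SVD and converts the normal to dip direction / dip, assuming x = East, y = North, z = Up.

        import numpy as np

        def fit_plane(points):
            """Return (centroid, upward unit normal) of the best-fit plane."""
            pts = np.asarray(points, dtype=float)
            centroid = pts.mean(axis=0)
            _, _, vt = np.linalg.svd(pts - centroid)
            normal = vt[-1]            # direction of least variance
            if normal[2] < 0:          # orient the normal upward
                normal = -normal
            return centroid, normal

        def dip_direction_and_dip(normal):
            """Convert an upward unit normal to (dip direction, dip) in degrees."""
            nx, ny, nz = normal
            dip = np.degrees(np.arccos(nz))
            dip_direction = np.degrees(np.arctan2(nx, ny)) % 360.0
            return dip_direction, dip

        # Synthetic check: noisy points on a plane dipping 30 degrees toward East.
        rng = np.random.default_rng(0)
        xy = rng.uniform(-1.0, 1.0, size=(200, 2))
        z = -np.tan(np.radians(30.0)) * xy[:, 0] + rng.normal(0.0, 0.01, 200)
        _, n = fit_plane(np.column_stack([xy, z]))
        print(dip_direction_and_dip(n))  # approximately (90.0, 30.0)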

  2. WE-H-BRB-00: Big Data in Radiation Oncology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop; (2) Where do we stand in the applications of big data in radiation oncology?; and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio of panel presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient quality care and to enhance the potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: to discuss current and future sources of big data for use in radiation oncology research; to optimize our current data collection by adopting new strategies from outside radiation oncology; to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI Google Inc.

  3. WE-H-BRB-00: Big Data in Radiation Oncology

    International Nuclear Information System (INIS)

    2016-01-01

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop; (2) Where do we stand in the applications of big data in radiation oncology?; and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio of panel presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient quality care and to enhance the potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: to discuss current and future sources of big data for use in radiation oncology research; to optimize our current data collection by adopting new strategies from outside radiation oncology; to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI Google Inc.

  4. De impact van Big Data op Internationale Betrekkingen

    NARCIS (Netherlands)

    Zwitter, Andrej

    Big Data changes our daily lives, but does it also change international politics? In this contribution, Andrej Zwitter (NGIZ chair at Groningen University) argues that Big Data impacts on international relations in ways that we only now start to understand. To comprehend how Big Data influences

  5. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  6. Big data and analytics strategic and organizational impacts

    CERN Document Server

    Morabito, Vincenzo

    2015-01-01

    This book presents and discusses the main strategic and organizational challenges posed by Big Data and analytics in a manner relevant to both practitioners and scholars. The first part of the book analyzes strategic issues relating to the growing relevance of Big Data and analytics for competitive advantage, which is also attributable to empowerment of activities such as consumer profiling, market segmentation, and development of new products or services. Detailed consideration is also given to the strategic impact of Big Data and analytics on innovation in domains such as government and education and to Big Data-driven business models. The second part of the book addresses the impact of Big Data and analytics on management and organizations, focusing on challenges for governance, evaluation, and change management, while the concluding part reviews real examples of Big Data and analytics innovation at the global level. The text is supported by informative illustrations and case studies, so that practitioners...

  7. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  8. Rock mechanics related to Jurassic underburden at Valdemar oil field

    DEFF Research Database (Denmark)

    Foged, Niels

    1999-01-01

    ... It has been initiated as a feasibility study of the North Jens-1 core 12, taken in the top Jurassic clay shale, as test specimens for integrated petrological, mineralogical and rock mechanical studies. The following topics are studied: (1) pore pressure generation due to conversion of organic matter ... and deformation properties of the clay shale using the actual core material or outcrop equivalents. (3) Flushing mechanisms for oil and gas from source rocks due to possibly very high pore water pressure creating unstable conditions in deeply buried sediments. There seems to be a need for integrating the knowledge ... in a number of geosciences to the benefit of a common understanding of important reservoir mechanisms. Rock mechanics and geotechnical modelling might be key points for this understanding of reservoir geology, and these may constitute a platform for future research in the maturing and migration from the Jurassic

  9. Toward a Literature-Driven Definition of Big Data in Healthcare

    Directory of Open Access Journals (Sweden)

    Emilie Baro

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.

  10. Toward a Literature-Driven Definition of Big Data in Healthcare

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log⁡(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488

  11. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  12. Rock Physics

    DEFF Research Database (Denmark)

    Fabricius, Ida Lykke

    2017-01-01

    Rock physics is the discipline linking petrophysical properties as derived from borehole data to surface-based geophysical exploration data. It can involve interpretation of both elastic wave propagation and electrical conductivity, but in this chapter the focus is on elasticity. Rock physics is based ... on continuum mechanics, and the theory of elasticity developed for statics becomes the key to petrophysical interpretation of the velocity of elastic waves. In practice, rock physics involves interpretation of well logs, including vertical seismic profiling (VSP), and analysis of core samples. The results

  13. Mineral and rock chemistry of Mata da Corda Kamafugitic Rocks (Minas Gerais State, Brazil)

    International Nuclear Information System (INIS)

    Albuquerque Sgarbi, Patricia B. de; Valenca, Joel G.

    1995-01-01

    The volcanic rocks of the Mata da Corda Formation (Upper Cretaceous) in Minas Gerais, Brazil, are mafic potassic to ultrapotassic rocks of kamafugitic affinity containing essentially clinopyroxenes, perovskite, magnetite and occasionally olivine, phlogopite, melilite pseudomorphs and apatite. The felsic phases are kalsilite and/or leucite pseudomorphs. The rocks are classified as mafitites, leucitites and kalsilitites. Analysis of the available data on the rocks studied, based on the relevant aspects of the main proposals for the classification of alkaline mafic to ultramafic potassic rocks, leads to the conclusion that Sahama's (1974) proposal to divide potassium-rich alkaline rocks into two large families is the one that fits the Mata da Corda rocks best. According to this proposal and the data in the literature on the mineralogy and the mineral and rock chemistries of other similar occurrences, these rocks may be interpreted as alkaline potassic to ultrapotassic rocks of kamafugitic affinity. 11 figs., 5 tabs

  14. Three-Dimensional Synthetic Aperture Focusing Using a Rocking Convex Array Transducer

    DEFF Research Database (Denmark)

    Andresen, Henrik; Nikolov, Svetoslav; Pedersen, Mads Møller

    2010-01-01

    Volumetric imaging can be performed using 1-D arrays in combination with mechanical motion. Outside the elevation focus of the array, the resolution and contrast quickly degrade compared with the lateral plane, because of the fixed transducer focus. This paper shows the feasibility of using synthetic aperture focusing to enhance the elevation focus for a convex rocking array. The method uses a virtual source (VS) for defocused multi-element transmit, and another VS in the elevation focus point. This allows a direct time-of-flight to be calculated for a given 3-D point. To avoid artifacts and increase SNR at the elevation VS, a plane-wave VS approach has been implemented. Simulations and measurements using an experimental scanner with a convex rocking array show an average improvement in resolution of 26% and 33%, respectively. This improvement is also seen in in vivo measurements...
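
    The "direct time-of-flight to a given 3-D point" is the quantity the beamformer needs for each image point. The sketch below shows the basic two-way calculation through a transmit virtual source; the geometry and reference timing are simplifying assumptions of this illustration, not the authors' exact implementation.

        import numpy as np

        C = 1540.0  # assumed speed of sound in tissue, m/s

        def two_way_tof(point, vs, element, t_vs):
            """Two-way time of flight: VS -> image point -> receive element.

            t_vs is the time at which the transmit wavefront passes the virtual
            source; beyond the VS the wave is treated as spherical.
            """
            point, vs, element = map(np.asarray, (point, vs, element))
            t_transmit = t_vs + np.linalg.norm(point - vs) / C
            t_receive = np.linalg.norm(element - point) / C
            return t_transmit + t_receive

        # Example: VS 30 mm below the array centre, image point at 50 mm depth,
        # receive element 10 mm off-centre (all positions in metres).
        tof = two_way_tof(point=[0.0, 0.0, 0.05], vs=[0.0, 0.0, 0.03],
                          element=[0.01, 0.0, 0.0], t_vs=0.03 / C)
        print(f"two-way time of flight = {tof * 1e6:.2f} microseconds")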

  15. Big Data - What is it and why it matters.

    Science.gov (United States)

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time, yet despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets, and the potential for new roles for library and information workers. © 2016 Health Libraries Group.

  16. Research on information security in big data era

    Science.gov (United States)

    Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin

    2018-05-01

    Big data is becoming another hotspot in the field of information technology, after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the era of big data. This paper analyzes the challenges and causes of data security problems brought by big data, discusses the development trend of network attacks against the background of big data, and puts forward our own opinions on the development of security defense in technology, strategy and products.

  17. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    In recent years, dealing with large amounts of data originating from social media sites and mobile communications, along with data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. Thus, companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also refers to graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  18. Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Baljit Singh Khehra

    2015-03-01

    The fuzzy 2-partition entropy approach has been widely used to select threshold values for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of parameters of the membership functions such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on Big Bang–Big Crunch Optimization (BBBCO) is proposed. The new thresholding approach is called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of parameters of the membership functions that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by a theory of the evolution of the universe, namely the Big Bang and Big Crunch theory. The proposed algorithm is tested on a number of standard test images. For comparison, three other approaches, Genetic Algorithm (GA)-based, Biogeography-based Optimization (BBO)-based and recursive approaches, are also implemented. The experimental results show that the performance of the proposed algorithm is more effective than the GA-based, BBO-based and recursion-based approaches.
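
    The objective function and the optimizer are both simple enough to sketch. The snippet below computes the fuzzy 2-partition entropy of a gray-level histogram with a linear-ramp membership on [a, c] and maximizes it with a miniature Big Bang-Big Crunch loop; the membership shape, parameter ranges and BBBCO schedule are simplifying assumptions, not the paper's exact formulation.

        import numpy as np

        def fuzzy_2partition_entropy(hist, a, c):
            """Entropy of the fuzzy 2-partition induced by a ramp on [a, c]."""
            g = np.arange(hist.size, dtype=float)
            mu_dark = np.clip((c - g) / max(c - a, 1e-9), 0.0, 1.0)
            p = hist / hist.sum()
            p_dark = float(np.sum(p * mu_dark))
            p_bright = 1.0 - p_dark
            eps = 1e-12
            return -(p_dark * np.log(p_dark + eps) + p_bright * np.log(p_bright + eps))

        def bbbco_maximize(hist, pop=40, iters=60, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = 0.0, float(hist.size - 1)
            center = np.array([hi / 3.0, 2.0 * hi / 3.0])      # initial (a, c)
            best, best_fit = center, -np.inf
            for k in range(1, iters + 1):
                # Big Bang: scatter candidates around the center, shrinking with k.
                cand = center + rng.normal(0.0, (hi - lo) / k, size=(pop, 2))
                cand = np.sort(np.clip(cand, lo, hi), axis=1)  # enforce a <= c
                fit = np.array([fuzzy_2partition_entropy(hist, a, c) for a, c in cand])
                if fit.max() > best_fit:
                    best_fit, best = fit.max(), cand[fit.argmax()]
                # Big Crunch: collapse to the fitness-weighted center of mass.
                w = fit - fit.min() + 1e-9
                center = (cand * w[:, None]).sum(axis=0) / w.sum()
            a, c = best
            return (a + c) / 2.0, best_fit   # threshold at the midpoint of the ramp

        # Bimodal test histogram (synthetic image with two gray-level populations).
        rng = np.random.default_rng(1)
        levels = np.concatenate([rng.normal(60, 10, 4000), rng.normal(170, 15, 6000)])
        hist, _ = np.histogram(levels, bins=256, range=(0, 255))
        print(bbbco_maximize(hist))   # threshold lands between the two modes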

  19. Addressing Data Veracity in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Aman, Saima [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Computer Science; Chelmis, Charalampos [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering; Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering

    2014-10-27

    Big data applications such as smart electric grids, transportation, and remote environment monitoring involve geographically dispersed sensors that periodically send back information to central nodes. In many cases, data from sensors are not available at central nodes at the frequency required for real-time modeling and decision-making. This may be due to physical limitations of the transmission networks, or to consumers limiting frequent transmission of data from sensors located at their premises for security and privacy reasons. Such scenarios lead to a partial-data problem and raise the issue of data veracity in big data applications. We describe a novel solution to the problem of making short-term predictions (up to a few hours ahead) in the absence of real-time data from sensors in the Smart Grid. A key implication of our work is that by using real-time data from only a small subset of influential sensors, we are able to make predictions for all sensors. We thus reduce the communication complexity involved in transmitting sensory data in Smart Grids. We use real-world electricity consumption data from smart meters to empirically demonstrate the usefulness of our method. Our dataset consists of data collected at 15-min intervals from 170 smart meters in the USC Microgrid over 7 years, totaling 41,697,600 data points.
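
    The core idea, predicting every sensor from live readings of a few influential ones, can be illustrated with a small regression. The sketch below uses synthetic consumption data and picks the subset by correlation strength, a stand-in for the paper's actual influence-selection method; all sizes and names are illustrative.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        T, S, K = 500, 170, 10          # time steps, sensors, size of live subset

        # Synthetic consumption: a shared daily pattern plus sensor-specific noise.
        base = np.sin(np.linspace(0.0, 40.0 * np.pi, T))[:, None]
        X = base * rng.uniform(0.5, 2.0, S) + 0.1 * rng.normal(size=(T, S))

        # Pick the K sensors most correlated with the network-wide mean signal.
        corr = np.abs(np.corrcoef(X.T, X.mean(axis=1))[-1, :-1])
        subset = np.argsort(corr)[-K:]

        # Train a map from subset readings to all readings; test on held-out steps.
        train, test = slice(0, 400), slice(400, 500)
        model = LinearRegression().fit(X[train][:, subset], X[train])
        pred = model.predict(X[test][:, subset])
        rmse = np.sqrt(np.mean((pred - X[test]) ** 2))
        print(f"RMSE over {S} sensors using only {K} live sensors: {rmse:.3f}")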

  20. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  1. Addressing big data issues in Scientific Data Infrastructure

    NARCIS (Netherlands)

    Demchenko, Y.; Membrey, P.; Grosso, P.; de Laat, C.; Smari, W.W.; Fox, G.C.

    2013-01-01

    Big Data are becoming a new technology focus both in science and in industry. This paper discusses the challenges that are imposed by Big Data on the modern and future Scientific Data Infrastructure (SDI). The paper discusses the nature and definition of Big Data, which include such features as Volume,

  2. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    In everyday terms we call the current era the Modern Era; in the field of Information Technology it can also be called the era of Big Data. Our daily lives are advancing rapidly, with a thirst for data that is never quenched. The fields of science, engineering and technology are producing data at an exponential rate, leading to exabytes of data every day. Big data helps us to explore and re-invent many areas, not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis of healthcare using big data and analytics. The main purpose is to emphasize the use of the big data that is being stored all the time; it helps us look back at history, but now is the time to emphasize analysis in order to improve medication and services. Although many big data implementations happen to be in-house developments, the implementation proposed here aims at a broader scope using Hadoop, which happens to be just the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also focuses on the strengths and drawbacks compared to the conventional techniques available.

  3. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  4. About Big Data and its Challenges and Benefits in Manufacturing

    OpenAIRE

    Bogdan NEDELCU

    2013-01-01

    The aim of this article is to show the importance of Big Data and its growing influence on companies. It also shows what kinds of big data are currently generated and how much big data is estimated to be generated in future. We can also see how much companies are willing to invest in big data and how much they are currently gaining from it. The article also shows some major influences that big data has on one major segment of industry (manufacturing) and the challenges that appear.

  5. Big Data Management in US Hospitals: Benefits and Barriers.

    Science.gov (United States)

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

    Big data has been considered an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that hospitals expect a number of benefits from using big data analytics, including cost savings and business intelligence. Many hospitals using big data have also recognized challenges, including lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.

  6. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big Data presents new methods to reverse this trend and improve profitability. The benefits of Big Data and ...

  7. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints on the use of big data include the cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  8. [Relevance of big data for molecular diagnostics].

    Science.gov (United States)

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable, vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing or development of new software tools. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economy or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Ever more extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new strategies for management in order to develop data-driven individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  9. 238U And 232Th Concentration In Rock Samples using Alpha Autoradiography and Gamma Spectroscopy Techniques

    International Nuclear Information System (INIS)

    Hafez, A.F.; El-Farrash, A.H.; Yousef, H.A.

    2009-01-01

    The activity concentrations of uranium and thorium were measured for rock samples selected from the Dahab region at the southern tip of Sinai, in order to detect any harmful radiation that could affect tourists, as Dahab is becoming an economic resource thanks to its open fields of tourism in Egypt. The activity concentrations of uranium and thorium in the rock samples were measured using two techniques. The first is the α-autoradiography technique with LR-115 and CR-39 detectors, and the second is the gamma spectroscopic technique with a NaI(Tl) detector. It was found that the average activity concentrations of uranium and thorium using the α-autoradiography technique ranged from 6.41-49.31 Bq kg-1 and 4.86-40.87 Bq kg-1, respectively, and using the gamma detector ranged from 6.70-49.50 Bq kg-1 and 4.47-42.33 Bq kg-1, respectively. From the obtained data we can conclude that there is no radioactive health hazard for humans and living beings in the area under investigation. It was also found that there are no big differences between the thorium-to-uranium ratios calculated with the two techniques.

  10. Big data in psychology: A framework for research advancement.

    Science.gov (United States)

    Adjerid, Idris; Kelley, Ken

    2018-02-22

    The potential for big data to provide value for psychology is significant. However, the pursuit of big data remains an uncertain and risky undertaking for the average psychological researcher. In this article, we address some of this uncertainty by discussing the potential impact of big data on the type of data available for psychological research, addressing the benefits and most significant challenges that emerge from these data, and organizing a variety of research opportunities for psychology. Our article yields two central insights. First, we highlight that big data research efforts are more readily accessible than many researchers realize, particularly with the emergence of open-source research tools, digital platforms, and instrumentation. Second, we argue that opportunities for big data research are diverse and differ both in their fit for varying research goals, as well as in the challenges they bring about. Ultimately, our outlook for researchers in psychology using and benefiting from big data is cautiously optimistic. Although not all big data efforts are suited for all researchers or all areas within psychology, big data research prospects are diverse, expanding, and promising for psychology and related disciplines. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Recovery of uranium from phosphatic rock and its derivatives

    International Nuclear Information System (INIS)

    Romero Guzman, E.T.

    1992-01-01

    The recovery of uranium present in the manufacturing process of phosphoric acid and fertilizers has been an interesting field of study in chemistry. It is true that the recovery of uranium is not very attractive from the commercial point of view; however, phosphatic fertilizers contain an important amount of uranium, which comes from the starting material (phosphatic rock), and therefore many tons of uranium must be dispersed into the environment together with the fertilizers used in agriculture every year. They are utilized for the enrichment of the nutrients which are exhausted in the soil. In this work, uranium was identified and quantified in phosphatic rocks and in inorganic fertilizers using gamma spectroscopy, neutron activation analysis, UV/visible spectrophotometry and alpha spectroscopy. In addition, the behaviour of uranium was correlated with inorganic elements present in the samples, such as phosphorus, calcium and iron, which were determined by UV/visible spectrophotometry for phosphorus and atomic absorption spectrometry for calcium and iron. The quantity of uranium found in the phosphatic rock, phosphoric acid and fertilizers was considerable (70-200 ppm). The conditions adequate for recovering 40% of the total uranium from the phosphatic rock with the addition of leaching solutions were established. (Author)

  12. 'Big data' in pharmaceutical science: challenges and opportunities.

    Science.gov (United States)

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  13. An investigation of Laser Induced Breakdown Spectroscopy for use as a control in the laser removal of rock from fossils found at the Malapa hominin site, South Africa

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, D.E., E-mail: troberts@csir.co.za [CSIR National Laser Centre, PO Box 395, Meiring Naude Road, Pretoria 0001 (South Africa); Plessis, A. du [CSIR National Laser Centre, PO Box 395, Meiring Naude Road, Pretoria 0001 (South Africa); University of Stellenbosch, Private Bag X1, Matieland, Stellenbosch (South Africa); Steyn, J.; Botha, L.R.; Pityana, S. [CSIR National Laser Centre, PO Box 395, Meiring Naude Road, Pretoria 0001 (South Africa); Berger, L.R. [Institute for Human Evolution, School of GeoSciences, University of Witwatersrand, Private Bag 3, Wits 2050 (South Africa)

    2012-07-15

    Laser Induced Breakdown Spectroscopy (LIBS) was used to study the spectra from fossils and surrounding rock recovered from the Cradle of Humankind site at Malapa, South Africa. The objective was to find a suitable spectral line (or lines), specific to fossils, that could be used as a control signal to limit damage to fossils during high-speed laser removal of the encasing rock. The calcified clastic matrix (rock) encasing the fossils was found to emit a variety of complex LIBS spectra. Nevertheless, it proved possible to distinguish fossil from rock in a single LIBS pulse, and without significant damage to the fossil, using spectral lines of neutral phosphorus. Highlights: LIBS used to discriminate fossils from rock as a potential process-control mechanism; 2-million-year-old fossils from the Malapa hominin site found to be high in phosphorus; rock spectral lines from silicon, iron and manganese, but no phosphorus; holds great promise for process control in laser preparation of fossils; also promising for accurate identification of fossils at excavation sites.
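
    The control scheme the authors describe (monitor each pulse's spectrum for neutral phosphorus and stop cutting when it appears) can be sketched as a simple threshold loop. This is a hypothetical illustration, not the authors' implementation: the line position, integration window, threshold and the acquire/fire callables are all placeholder assumptions.

        import numpy as np

        P_LINE_NM = 213.6   # assumed neutral phosphorus (P I) line position; placeholder
        WINDOW_NM = 0.3     # integration half-width around the line
        THRESHOLD = 5000.0  # counts; would be calibrated on known fossil/rock samples

        def phosphorus_signal(wavelengths: np.ndarray, intensities: np.ndarray) -> float:
            """Integrated intensity in a narrow window around the assumed P I line."""
            mask = np.abs(wavelengths - P_LINE_NM) <= WINDOW_NM
            return float(intensities[mask].sum())

        def ablate_until_fossil(acquire_spectrum, fire_pulse, max_pulses: int = 10000) -> int:
            """Fire ablation pulses until the phosphorus signal indicates fossil."""
            for n in range(max_pulses):
                fire_pulse()
                wl, counts = acquire_spectrum()  # spectrum from this single pulse
                if phosphorus_signal(wl, counts) > THRESHOLD:
                    return n + 1                 # pulses fired before fossil detected
            return max_pulses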

  14. Big bang nucleosynthesis: The standard model and alternatives

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1991-01-01

    Big bang nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the big bang cosmological model. This paper reviews the standard homogeneous-isotropic calculation and shows how it fits the light element abundances ranging from ⁴He at 24% by mass through ²H and ³He at parts in 10⁵ down to ⁷Li at parts in 10¹⁰. Furthermore, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard scenario. Discussion is presented on the improved observational data as well as the improved neutron lifetime data. Alternate scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the conclusion that Ω_b ≅ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible < Ω_b. (orig.)
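
    For readers less familiar with the notation, Ω_b is the baryon density in units of the critical density; the standard definitions (textbook cosmology, not specific to this paper) are

        \[
          \Omega_b = \frac{\rho_b}{\rho_{\mathrm{crit}}},
          \qquad
          \rho_{\mathrm{crit}} = \frac{3 H_0^2}{8 \pi G},
        \]

    so the nucleosynthesis value Ω_b ≅ 0.06 falls far below Ω_total = 1 (hence the need for non-baryonic dark matter) while exceeding the luminous contribution Ω_visible (hence the need for dark baryons).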

  15. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, meaning the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data work ranges from data mining to data analysis and decision making, drawing out statistical rules and mathematical patterns through systematic or automated reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. Readers will discover how to capture and analyze data, and will be guided through processing-system integrity and the implementation of intelligent systems. With intelligent systems, we address the fundamental data management and visualization challenges in the effective management of dynamic, large-scale data and the efficient processing of real-time and spatio-temporal data. Advanced intelligent systems manage data monitoring, data processing and decision-making in a realistic and effective way. Considering the big size of data, the variety of data and frequent chan...

  16. Performance of diamond and point attack coal cutter picks

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Y. [CSIRO, Brisbane, Qld. (Australia). Division of Exploration and Mining

    1996-12-31

    This paper presents results of laboratory experiments and field trials of PDC (Polycrystalline Diamond Compact) and PA (Point Attack) coal cutter picks. Laboratory cutting tests included linear rock and coal cutting and turning rock cutting. The following parameters were measured to assess the performance of PDC and PA cutter picks: cutting force, normal force, specific energy consumption, yield, dust generation and ignitional characteristics (temperature rise). Field trials were conducted on a longwall shearer. The performance of both types of pick, in terms of pick life and dust generation, was assessed. 3 refs., 18 figs., 3 tabs.
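
    Of the parameters measured, specific energy has a standard worked definition: the cutting work done per unit volume of material removed. The snippet below illustrates that calculation; the force, cut length and volume figures are invented for illustration, not values from the trials.

        def specific_energy_mj_per_m3(mean_cutting_force_n: float,
                                      cut_length_m: float,
                                      volume_removed_m3: float) -> float:
            """Specific energy: work done by the cutting force per unit volume removed."""
            work_j = mean_cutting_force_n * cut_length_m
            return work_j / volume_removed_m3 / 1e6  # J/m^3 -> MJ/m^3

        # Example: 8 kN mean cutting force over a 1 m cut removing 0.4 litres of rock
        print(specific_energy_mj_per_m3(8000.0, 1.0, 0.4e-3))  # -> 20.0 MJ/m^3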

  17. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)². At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.
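
    Schematically (this is an illustrative form, not the paper's actual field variables), the expansion the abstract refers to writes a perturbation quantity Φ in even powers of the collision speed,

        \[
          \Phi = \Phi_0 + \left(\frac{V}{c}\right)^{2} \Phi_1
                 + \mathcal{O}\!\left(\left(\frac{V}{c}\right)^{4}\right),
        \]

    with the four-dimensional effective theory reproducing Φ_0 but failing at the first correction Φ_1, where the growing and decaying modes mix.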

  18. Depositional environment and source rock potential of Cenomanian and Turonian sedimentary rocks of the Tarfaya Basin, Southwest Morocco

    Energy Technology Data Exchange (ETDEWEB)

    Ghassal, B.I.; Littke, R.; Sachse, V.; Sindern, S.; Schwarzbauer, J.

    2016-07-01

    Detailed organic and inorganic geochemical analyses were used to assess the depositional environment and source rock potential of the Cenomanian and Turonian oil shale deposits in the Tarfaya Basin. This study is based on core samples from the Tarfaya Sondage-4 well, which penetrated over 300 m of Mid-Cretaceous organic matter-rich deposits. A total of 242 samples were analyzed for total organic and inorganic carbon, and selected samples for total sulfur and major elements as well as for organic petrology, Rock-Eval pyrolysis, Curie-point pyrolysis-gas chromatography-mass spectrometry and molecular geochemistry of solvent extracts. Based on major elements, the lower Cenomanian differs from the other intervals in its higher silicate and lower carbonate contents. Moreover, the molecular geochemistry suggests anoxic marine bottom-water conditions during the Cenomanian-Turonian Boundary Event (CTBE; Oceanic Anoxic Event 2: OAE2). As a proxy for the S_org/C_org ratio, the ratio of total thiophenes to total benzenes was calculated from pyrolysate compositions. The results suggest that S_org/C_org is low in the lower Cenomanian, moderate in the upper Cenomanian, very high in the CTBE and high in the Turonian samples. Rock-Eval data reveal that the lower Cenomanian is a moderately organic carbon-rich source rock with good potential to generate oil and gas upon thermal maturation. On the other hand, the samples from the upper Cenomanian to Turonian exhibit higher organic carbon content and can be classified as oil-prone source rocks. Based on Tmax data, all rocks are thermally immature. The microscopic investigations suggest dominance of submicroscopic organic matter in all samples and different contents of bituminite and alginite. The lower Cenomanian samples have little visible organic matter and no bituminite. The upper Cenomanian and CTBE samples are poor in bituminite and have rare visible organic matter, whereas the Turonian samples change
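
    The thiophenes/benzenes proxy mentioned above is an arithmetic ratio of integrated peak areas, so it is straightforward to compute once the pyrolysate compounds have been quantified. A minimal sketch with invented peak areas (the compound list and values are illustrative assumptions, not the authors' data):

        thiophene_peaks = {"thiophene": 1.2e6, "2-methylthiophene": 8.4e5,
                           "2,5-dimethylthiophene": 3.1e5}
        benzene_peaks = {"benzene": 2.3e6, "toluene": 1.9e6, "xylenes": 7.7e5}

        ratio = sum(thiophene_peaks.values()) / sum(benzene_peaks.values())
        print(f"total thiophenes / total benzenes = {ratio:.2f}")  # proxy for S_org/C_org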

  19. [Big data and their perspectives in radiation therapy].

    Science.gov (United States)

    Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste

    2017-02-01

    The concept of big data indicates a change of scale in the use of data and data aggregation into large databases through improved computer technology. One of the current challenges in the creation of big data in the context of radiation therapy is the transformation of routine care items into dark data, i.e. data not yet collected, and the fusion of databases collecting different types of information (dose-volume histograms and toxicity data for example). Processes and infrastructures devoted to big data collection should not negatively impact the doctor-patient relationship, the general process of care or the quality of the data collected. The use of big data requires a collective effort of physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond, oncology. Big data implies a new culture in order to build an appropriate infrastructure, legally and ethically. Processes and issues are discussed in this article. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.
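
    The database fusion the authors call for (for example, joining dose-volume histogram metrics with separately collected toxicity data) is, at its simplest, a keyed merge across systems. A minimal sketch under assumed table layouts and column names (all hypothetical):

        import pandas as pd

        dvh = pd.DataFrame({
            "patient_id": [101, 102, 103],
            "organ": ["rectum", "rectum", "rectum"],
            "v70_gy_percent": [12.5, 22.0, 8.3],  # % of organ volume receiving >= 70 Gy
        })
        toxicity = pd.DataFrame({
            "patient_id": [101, 102],
            "toxicity_grade": [1, 3],  # recorded in a separate system
        })

        # Left join keeps patients with DVH data even if no toxicity record exists yet
        merged = dvh.merge(toxicity, on="patient_id", how="left")
        print(merged)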

  20. Current applications of big data in obstetric anesthesiology.

    Science.gov (United States)

    Klumpner, Thomas T; Bauer, Melissa E; Kheterpal, Sachin

    2017-06-01

    The narrative review aims to highlight several recently published 'big data' studies pertinent to the field of obstetric anesthesiology. Big data has been used to study rare outcomes, to identify trends within the healthcare system, to identify variations in practice patterns, and to highlight potential inequalities in obstetric anesthesia care. Big data studies have helped define the risk of rare complications of obstetric anesthesia, such as the risk of neuraxial hematoma in thrombocytopenic parturients. Also, large national databases have been used to better understand trends in anesthesia-related adverse events during cesarean delivery as well as outline potential racial/ethnic disparities in obstetric anesthesia care. Finally, real-time analysis of patient data across a number of disparate health information systems through the use of sophisticated clinical decision support and surveillance systems is one promising application of big data technology on the labor and delivery unit. 'Big data' research has important implications for obstetric anesthesia care and warrants continued study. Real-time electronic surveillance is a potentially useful application of big data technology on the labor and delivery unit.
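
    A real-time surveillance rule of the kind described, reduced to its simplest form, is a predicate evaluated on each incoming event. The sketch below is a toy illustration only: the threshold, event format and alert text are assumptions, not clinical guidance or any cited system's logic.

        from typing import Optional

        PLATELET_THRESHOLD = 70  # x10^9/L; illustrative value only

        def check_lab_event(event: dict) -> Optional[str]:
            """Return an alert message if an incoming lab event warrants one."""
            if event.get("test") == "platelets" and event.get("value", 1e9) < PLATELET_THRESHOLD:
                return (f"ALERT patient {event['patient_id']}: platelet count "
                        f"{event['value']} below review threshold")
            return None

        print(check_lab_event({"patient_id": "A12", "test": "platelets", "value": 54}))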