WorldWideScience

Sample records for big rock point reactor

  1. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurements of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics that might restrict the continued irradiation of the fuel.

  2. Big Rock Point: An Analysis of Project Estimate Performance

    International Nuclear Information System (INIS)

    The Big Rock Point Restoration Project is well into its third year of decommissioning and restoring the site to a green-field condition. Although the project has gone well and remains on schedule and within budget, much has been learned about decommissioning cost estimates versus actual costs, as well as areas in which the estimate appears adequate and in which the estimate is challenged. These items are briefly described in this report

  3. 78 FR 61401 - Entergy Nuclear Operations, Inc.; Big Rock Point; Independent Spent Fuel Storage Installation

    Science.gov (United States)

    2013-10-03

    ... COMMISSION Entergy Nuclear Operations, Inc.; Big Rock Point; Independent Spent Fuel Storage Installation..., Inc. (ENO) on June 20, 2012, for the Big Rock Point (BRP) Independent Spent Fuel Storage Installation... Regulatory Evaluation In the Final Rule for Storage of Spent Fuel in NRC-Approved Storage Casks at...

  4. Integrated plant-safety assessment, Systematic Evaluation Program: Big Rock Point Plant (Docket No. 50-155)

    International Nuclear Information System (INIS)

    The Systematic Evaluation Program was initiated in February 1977 by the US Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  5. Integrated plant safety assessment. Systematic evaluation program, Big Rock Point Plant (Docket No. 50-155). Final report

    International Nuclear Information System (INIS)

    The Systematic Evaluation Program was initiated in February 1977 by the U.S. Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. The review provides (1) an assessment of how these plants compare with current licensing safety requirements relating to selected issues, (2) a basis for deciding how these differences should be resolved in an integrated plant review, and (3) a documented evaluation of plant safety when the supplement to the Final Integrated Plant Safety Assessment Report has been issued. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  6. Big Rock Point Nuclear Plant. 23rd semiannual report of operations, July--December 1976

    International Nuclear Information System (INIS)

    Net electrical power generated was 240,333.9 MWh(e) with the reactor on line 4,316.6 hr. Information is presented concerning operation, power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, release of radioactive materials, changes, tests, experiments, and environmental monitoring

  7. Big Bang Day : Physics Rocks

    CERN Multimedia

    Brian Cox; John Barrowman; Eddie Izzard

    2008-01-01

Is particle physics the new rock 'n' roll? The fundamental questions about the nature of the universe that particle physics hopes to answer have attracted the attention of some very high-profile and unusual fans. Alan Alda, Ben Miller, Eddie Izzard, Dara O'Briain and John Barrowman all have interests in this branch of physics. Brian Cox, CERN physicist and former member of the '90s band D:Ream, tracks down some very well-known celebrity enthusiasts and takes a light-hearted look at why this subject can appeal to all of us.

  8. Turning points in reactor design

    International Nuclear Information System (INIS)

This article provides some historical perspective on nuclear reactor design, beginning with PWR development for naval propulsion and the first commercial application at Yankee Rowe. Five turning points in reactor design and some safety problems associated with them are reviewed: (1) stability of Dresden-1, (2) ECCS, (3) PRA, (4) TMI-2, and (5) advanced passive LWR designs. While the emphasis is on the thermal-hydraulic aspects, the discussion also covers reactor systems

  9. Turning points in reactor design

    Energy Technology Data Exchange (ETDEWEB)

    Beckjord, E.S.

    1995-09-01

This article provides some historical perspective on nuclear reactor design, beginning with PWR development for naval propulsion and the first commercial application at Yankee Rowe. Five turning points in reactor design and some safety problems associated with them are reviewed: (1) stability of Dresden-1, (2) ECCS, (3) PRA, (4) TMI-2, and (5) advanced passive LWR designs. While the emphasis is on the thermal-hydraulic aspects, the discussion also covers reactor systems.

  10. CLASSIFICATION OF BIG POINT CLOUD DATA USING CLOUD COMPUTING

    OpenAIRE

    Liu, K.; J. Boehm

    2015-01-01

Point cloud data plays a significant role in various geospatial applications, as it conveys plentiful information that can be used for different types of analysis. Semantic analysis, an important example, aims to label points as belonging to different categories; in machine learning, this problem is called classification. In addition, processing point data is becoming more and more challenging due to the growing data volume. In this paper, we address point data classification in a big data context...

  11. Classification of Big Point Cloud Data Using Cloud Computing

    Science.gov (United States)

    Liu, K.; Boehm, J.

    2015-08-01

Point cloud data plays a significant role in various geospatial applications, as it conveys plentiful information that can be used for different types of analysis. Semantic analysis, an important example, aims to label points as belonging to different categories; in machine learning, this problem is called classification. In addition, processing point data is becoming more and more challenging due to the growing data volume. In this paper, we address point data classification in a big data context. The popular cluster computing framework Apache Spark is used throughout the experiments, and the promising results suggest great potential for Apache Spark in large-scale point data processing.

  12. Application Study of Self-balanced Testing Method on Big Diameter Rock-socketed Piles

    Directory of Open Access Journals (Sweden)

    Qing-biao WANG

    2013-07-01

Full Text Available Through a field test of the self-balanced testing method on large-diameter rock-socketed piles at the broadcasting centre building in Tai'an, this paper studies and analyzes the selection of the balance position, the fabrication and installation of the load cell, the selection and installation of the displacement sensors, the loading steps, the stability conditions, and the determination of bearing capacity in the self-balanced testing process. The paper summarizes the key technology and engineering experience of the self-balanced testing method for large-diameter rock-socketed piles and also analyzes the difficult technical problems that currently remain to be resolved. The conclusions of the study are of practical significance for the popularization and application of the self-balanced testing method and for similar projects.

  13. Effect of loading point position on fracture mode of rock

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

An anti-symmetric four-point bending specimen with different loading point positions was used to study the effect of loading point position on the fracture mode of rock, in order to explore a feasible method for achieving Mode Ⅱ fracture and for determining the Mode Ⅱ fracture toughness of rock, KⅡC. Numerical and experimental results show that the distance between the inner and outer loading points, L1+L2, has a great influence on the stresses at the notch tip and on the fracture mode. When L1+L2>0.5L, or when 0.1L<L1+L2<0.5L, the maximum principal stress σ1 exceeds the tensile strength σt; the ratio τmax/σ1 is then relatively low or high, respectively, and thus Mode Ⅰ or mixed-mode fracture occurs. When L1+L2<0.1L, σ1 is smaller than σt and the ratio τmax/σ1 is much higher, which facilitates the occurrence of Mode Ⅱ fracture.
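The thresholds quoted in the abstract can be condensed into a small decision function. The sketch below is an interpretation of the stated criteria (the mapping of the two upper ranges to Mode I versus mixed mode follows the sentence order of the abstract); it is not the authors' code.

```python
# Illustrative reading of the abstract's loading-position criteria.
# L1, L2: distances of the inner and outer loading points; L: specimen span.
def fracture_mode(L1, L2, L):
    s = (L1 + L2) / L
    if s > 0.5:
        return "Mode I"        # sigma_1 exceeds tensile strength, low tau_max/sigma_1
    elif s > 0.1:
        return "mixed mode"    # sigma_1 still exceeds sigma_t, higher tau_max/sigma_1
    else:
        return "Mode II"       # sigma_1 below sigma_t, shear-dominated fracture
```

For example, `fracture_mode(0.04, 0.04, 1.0)` falls in the L1+L2 < 0.1L range and returns the shear-dominated case.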

  14. High-Temperature Gas-Cooled Test Reactor Point Design

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Laboratory; Bayless, Paul David [Idaho National Laboratory; Nelson, Lee Orville [Idaho National Laboratory; Gougar, Hans David [Idaho National Laboratory; Kinsey, James Carl [Idaho National Laboratory; Strydom, Gerhard [Idaho National Laboratory; Kumar, Akansha [Idaho National Laboratory

    2016-04-01

    A point design has been developed for a 200 MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched UCO fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technological readiness level, licensing approach and costs.

  15. Recent advances in analysis and prediction of Rock Falls, Rock Slides, and Rock Avalanches using 3D point clouds

    Science.gov (United States)

    Abellan, A.; Carrea, D.; Jaboyedoff, M.; Riquelme, A.; Tomas, R.; Royan, M. J.; Vilaplana, J. M.; Gauvin, N.

    2014-12-01

The acquisition of dense terrain information using well-established 3D techniques (e.g. LiDAR, photogrammetry) and the use of new mobile platforms (e.g. Unmanned Aerial Vehicles), together with increasingly efficient post-processing workflows for image treatment (e.g. Structure from Motion), are opening up new possibilities for analysing, modeling and predicting rock slope failures. Applications span different scales, ranging from the monitoring of small changes at an unprecedented level of detail (e.g. sub-millimeter-scale deformation under lab-scale conditions) to the detection of slope deformation at regional scale. In this communication we will show the main accomplishments of the Swiss National Foundation project "Characterizing and analysing 3D temporal slope evolution" carried out by the Risk Analysis group (Univ. of Lausanne) in close collaboration with the RISKNAT and INTERES groups (Univ. of Barcelona and Univ. of Alicante, respectively). We have recently developed a series of innovative approaches for rock slope analysis using 3D point clouds; some examples include the development of semi-automatic methodologies for the identification and extraction of rock-slope features such as discontinuities, type of material, rockfall occurrence and deformation. Moreover, we have been improving our knowledge of progressive rupture characterization thanks to several algorithms; examples include the computation of 3D deformation, the use of filtering techniques on permanently based TLS, the use of rock slope failure analogies at different scales (laboratory simulations, monitoring at glacier fronts, etc.), and the modelling of the influence of external forces, such as precipitation, on the acceleration of the deformation rate. We have also been interested in the analysis of rock slope deformation prior to the occurrence of fragmental rockfalls and in the interaction of this deformation with the spatial location of future events.
In spite of these recent advances

  16. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    Science.gov (United States)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

In the geospatial domain we have now reached the point where the data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore natural to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not natively supported by the existing big data frameworks. Instead, such file formats are supported by software libraries that are restricted to single-CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications for scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
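The map-based ingestion pattern described in this abstract can be sketched without a cluster: a single-threaded reader for a binary point format is mapped over a list of files and the results flattened. The toy format and reader below are stand-ins invented for illustration (a real deployment would use a LAS/LAZ library); on Apache Spark the final line would become something like `sc.parallelize(paths).flatMap(read_points)`, so each worker runs the single-CPU reader on its own subset of files.

```python
import struct
from itertools import chain

# Toy stand-in for a binary point cloud format: each "file" is a bytes blob
# of packed (x, y, z) double-precision triples, mimicking what a real point
# cloud library would decode on a single CPU.
def write_points(points):
    return b"".join(struct.pack("<3d", *p) for p in points)

def read_points(blob):
    # Single-threaded reader, analogous to a format library call on one node.
    for off in range(0, len(blob), 24):
        yield struct.unpack_from("<3d", blob, off)

files = [write_points([(i, i + 1, i + 2) for i in range(n)]) for n in (2, 3)]

# The ingestion pattern: map the reader over the collection of files, then
# flatten the per-file point streams into one dataset.
all_points = list(chain.from_iterable(map(read_points, files)))
```

Distributing the *file list* rather than the decoded points is what lets an unmodified single-CPU library participate in a cluster-wide ingest.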

  17. Second nuclear reactor, Point Lepreau, New Brunswick

    International Nuclear Information System (INIS)

This is a report of the findings, conclusions and recommendations of the Environmental Assessment Panel appointed by the Ministers of Environment of New Brunswick and Canada to review the proposal to build a second nuclear unit at Point Lepreau, New Brunswick. The Panel's mandate was to assess the environmental and related social impacts of the proposal. The Panel concludes that the project can proceed without significant adverse effects provided certain recommendations are followed. In order to understand the impacts of Lepreau II, it was necessary to review, to the extent possible, the actual effects of Lepreau I before estimating the incremental effects of Lepreau II. In so doing, the Panel made a number of recommendations that should be implemented now. The information gathered and experience gained can be applied to Lepreau II to ensure that potential impacts are reduced to a minimum and existing concerns associated with Lepreau I can be corrected.

  18. Innovations and Enhancements for a Consortium of Big-10 University Research and Training Reactors. Final Report

    International Nuclear Information System (INIS)

    The Consortium of Big-10 University Research and Training Reactors was by design a strategic partnership of seven leading institutions. We received the support of both our industry and DOE laboratory partners. Investments in reactor, laboratory and program infrastructure, allowed us to lead the national effort to expand and improve the education of engineers in nuclear science and engineering, to provide outreach and education to pre-college educators and students and to become a key resource of ideas and trained personnel for our U.S. industrial and DOE laboratory collaborators.

  19. Solution of the reactor point kinetics equations by MATLAB computing

    Directory of Open Access Journals (Sweden)

    Singh Sudhansu S.

    2015-01-01

Full Text Available The numerical solution of the point kinetics equations in the presence of Newtonian temperature feedback has been a challenging issue in the analysis of reactor transients. The reactor point kinetics equations are a system of stiff ordinary differential equations that requires special numerical treatment. Although a plethora of numerical methods has been introduced to solve the point kinetics equations over the years, some simple and straightforward methods still work very efficiently, with extraordinary accuracy. For example, it has been shown recently that the fundamental backward Euler finite difference algorithm, for all its simplicity, is one of the most effective legacy methods. Complementing the backward Euler finite difference scheme, the present work demonstrates the application of the ordinary differential equation suite available in the MATLAB software package to solve the stiff reactor point kinetics equations with Newtonian temperature feedback effects very effectively, by analyzing various classic benchmark cases. The fair accuracy of the results implies that the MATLAB ordinary differential equation suite can serve as an alternative method for solving the reactor point kinetics equations in future applications.
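The stiff system the abstract refers to can be sketched with a one-delayed-group model and a stiff solver. The sketch below uses SciPy's BDF integrator in the role of MATLAB's ODE suite; all parameter values are illustrative assumptions, not taken from the paper or its benchmarks.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-group point kinetics with Newtonian temperature feedback.
# All parameter values are illustrative, not from the paper.
beta = 0.0065    # total delayed-neutron fraction
lam = 0.08       # one-group precursor decay constant (1/s)
Lam = 1.0e-4     # neutron generation time (s)
alpha = 1.0e-3   # temperature feedback coefficient (delta-k per K)
Kc = 1.0         # adiabatic heat-up rate (K/s per unit relative power)
rho0 = 0.003     # inserted step reactivity (delta-k)

def rhs(t, y):
    n, C, T = y                      # power, precursor density, temperature rise
    rho = rho0 - alpha * T           # Newtonian feedback
    dn = (rho - beta) / Lam * n + lam * C
    dC = beta / Lam * n - lam * C
    dT = Kc * n
    return [dn, dC, dT]

# Start from a critical steady state at unit power (dn = dC = 0 at T = 0).
y0 = [1.0, beta / (lam * Lam), 0.0]
# The equations are stiff (Lam is orders of magnitude below the delayed
# timescale), so a backward-differentiation (BDF) solver is used, playing
# the role of MATLAB's ode15s.
sol = solve_ivp(rhs, (0.0, 10.0), y0, method="BDF", rtol=1e-8, atol=1e-10)
```

With these numbers the power exhibits the prompt jump and is later turned over by the temperature feedback, the qualitative behaviour the benchmark cases probe.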

  20. Automated Rock Detection and Shape Analysis from Mars Rover Imagery and 3D Point Cloud Data

    Institute of Scientific and Technical Information of China (English)

    Kaichang Di; Zongyu Yue; Zhaoqin Liu; Shuliang Wang

    2013-01-01

A new object-oriented method has been developed for the extraction of Mars rocks from Mars rover data. It is based on a combination of Mars rover imagery and 3D point cloud data. First, Navcam or Pancam images taken by the Mars rovers are segmented into homogeneous objects with a mean-shift algorithm. Then, the objects in the segmented images are classified into small rock candidates, rock shadows, and large objects. Rock shadows and large objects are considered as the regions within which large rocks may exist. In these regions, large rock candidates are extracted through ground-plane fitting with the 3D point cloud data. Small and large rock candidates are combined and postprocessed to obtain the final rock extraction results. The shape properties of the rocks (angularity, circularity, width, height, and width-height ratio) have been calculated for subsequent geological studies.
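The ground-plane-fitting step in the pipeline above can be illustrated on synthetic data: fit a plane to the 3D points by least squares and flag points standing well above it as rock candidates. This is a minimal sketch of the general technique, with invented data and thresholds, not the authors' implementation.

```python
import numpy as np

# Synthetic stand-in for a region of rover-derived 3D points: a tilted
# ground plane with five points raised above it like a small rock.
gx, gy = np.meshgrid(np.linspace(0, 9, 10), np.linspace(0, 9, 10))
x, y = gx.ravel(), gy.ravel()
z = 0.1 * x + 0.2 * y + 1.0
z[:5] += 0.5                      # the "rock" sticks 0.5 m out of the plane

# Ground-plane fitting: least squares for z = a*x + b*y + c.
A = np.column_stack([x, y, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(A, z, rcond=None)
residual = z - A @ coef

# Points well above the fitted plane become large-rock candidates; their
# residuals approximate rock height above ground.
rock_mask = residual > 0.2
rock_heights = residual[rock_mask]
```

In the paper this step runs only inside the shadow and large-object regions found in the imagery, which keeps the plane fit from being dominated by the rocks themselves.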

  1. Rock massif observation from underground coal gasification point of view

    Directory of Open Access Journals (Sweden)

    T. Sasvári

    2009-04-01

Full Text Available The underground coal gasification (UCG) of coal seams is determined by a suitable geological structure of the area. Qualitative changes in the rock massif can also be assessed by the application of geophysical methods (electric resistivity methods and geoelectric tomography). This article shows an example of evaluating the possibilities of realizing underground coal gasification in the area of the Upper Nitra Coal Basin, in the Cígeľ and Nováky deposits, and recommends cooperation among geological, geotechnical and geophysical researchers.

  2. Field Plot and Accuracy Assessment Points for Pictured Rocks National Lakeshore Vegetation Mapping Project

    Data.gov (United States)

    National Park Service, Department of the Interior — The vegetation point data for Pictured Rocks National Lakeshore (PIRO) was developed to support two projects associated with the 2004 vegetation map, the collection...

  3. Incoherent SSI Analysis of Reactor Building using 2007 Hard-Rock Coherency Model

    International Nuclear Information System (INIS)

Many strong-earthquake recordings show the response motions at building foundations to be less intense than the corresponding free-field motions. To account for these phenomena, the concept of spatial variation, or wave incoherence, was introduced. Several approaches for its application to practical analysis and design as part of the soil-structure interaction (SSI) effect have been developed. However, conventional wave incoherency models did not reflect the characteristics of earthquake data from hard-rock sites, and their application to practical nuclear structures on hard-rock sites was not sufficiently justified. This paper focuses on the impact of the hard-rock coherency model proposed in 2007 on the incoherent SSI analysis results for a nuclear power plant (NPP) structure. A typical reactor building of a pressurized water reactor (PWR) type NPP is modeled with both surface and embedded foundations. The model is also assumed to be located on medium-hard-rock and hard-rock sites. The SSI analysis results are obtained and compared for coherent and incoherent input motions. The structural responses, considering rocking and torsion effects, are also investigated

  4. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  5. Big ambitions for small reactors as investors size up power options

    International Nuclear Information System (INIS)

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  6. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    Science.gov (United States)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

As laser scanning technology improves and costs come down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not limited to point cloud data. Voluminous amounts of high-dimensional and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilise a compute cluster composed of several commodity-grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node, hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order, which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid and then interleaving the binary representations of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail, with more bits yielding finer partitioning.
We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k-nearest-neighbour algorithm.
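The bit interleaving described in the abstract can be sketched in a few lines. The function below is an illustrative reimplementation of the 2D Morton code, not the authors' Spark code; it assigns y the higher bit of each pair so that it reproduces the worked example x = 1 = 01₂, y = 3 = 11₂ → 1011₂ = 11.

```python
def morton2d(x, y, bits=16):
    """Interleave the bits of non-negative grid coordinates x and y,
    with y supplying the higher bit of each pair."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)      # x bit -> even position
        code |= ((y >> i) & 1) << (2 * i + 1)  # y bit -> odd position
    return code

# Dropping low-order bits merges 2x2 blocks of neighbouring grid squares,
# which is how the code can act as a locality-preserving partition key.
partition_key = morton2d(1, 3) >> 2
```

Truncating by two bits per level gives the coarser-or-finer partitioning trade-off the abstract describes: fewer retained bits mean larger cells and more points per partition.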

  7. Downstream-migrating fluvial point bars in the rock record

    Science.gov (United States)

    Ghinassi, Massimiliano; Ielpi, Alessandro; Aldinucci, Mauro; Fustic, Milovan

    2016-04-01

Classical models developed for ancient fluvial point bars are based on the assumption that meander bends invariably increase their radius as meander-bend apices migrate in a direction transverse to the channel-belt axis (i.e., meander-bend expansion). However, many modern meandering rivers are also characterized by down-valley migration of the bend apex, a mechanism that takes place without a significant change in meander radius and wavelength. Downstream-migrating fluvial point bars (DMFPB) are the dominant architectural element of these types of meander belts. Yet they are poorly known from ancient fluvial channel belts, since distinguishing them from expansional point bars often requires fully 3D perspectives. This study aims to review DMFPB deposits ranging in age from Devonian to Holocene, and to discuss their main architectural and sedimentological features from published outcrop, borehole and 3D-seismic datasets. Fluvial successions hosting DMFPB mainly accumulated under low-accommodation conditions, where channel belts were affected by different degrees of morphological (e.g., valleys) or tectonic (e.g., axial drainage of shortening basins) confinement. In confined settings, bends migrate downstream along the erosion-resistant valley flanks and little or no floodplain deposit is preserved. Progressive floor aggradation (e.g., valley filling) allows meander belts with DMFPB to decrease their degree of confinement. In less confined settings, meander bends migrate downstream mainly after impinging against older, erosion-resistant channel-fill mud. By contrast, tectonic confinement is commonly associated with uplifted alluvial plains that prevented meander-bend expansion, in turn triggering downstream translation. At the scale of individual point bars, translational morphodynamics promote the preservation of downstream-bar deposits, whereas the coarser-grained upstream and central beds are less frequently preserved. However, enhanced preservation of upstream

  8. Reactor physics and safety aspects of various design options of a Russian light water reactor with rock-like fuels

    Science.gov (United States)

    Bondarenko, A. V.; Komissarov, O. V.; Kozmenkov, Ya. K.; Matveev, Yu. V.; Orekhov, Yu. I.; Pivovarov, V. A.; Sharapov, V. N.

    2003-06-01

This paper presents results of analytical studies on weapons-grade plutonium incineration in VVER (640) medium-size light water reactors using a special composition of rock-like fuel (ROX fuel) to assure long-term storage of the spent fuel without reprocessing. The main goal is to achieve a high degree of plutonium incineration in a once-through cycle. In this paper we considered two fuel compositions. In both compositions weapons-grade plutonium is used as the fissile material. Spinel (MgAl2O4) is used as the 'preserving' material assuring safe storage of the spent fuel. Besides an inert matrix, the option of rock-like fuel with thorium dioxide was studied. One of the principal problems in the realization of the proposed approach is the substantial change in the properties of the light water reactor core when passing to the use of the ROX fuel, in particular: (i) owing to the absence of 238U, the Doppler effect, which plays a crucial role in the reactor's self-regulation and limits the consequences of reactivity accidents, decreases significantly; (ii) the absence of fuel breeding on one hand, and the quest to attain the maximum plutonium burnup on the other, result in a drastic change of the fuel assembly power during the lifetime and, as a consequence, a rise in the irregularity of the power density of the fuel assemblies; (iii) both the control rod worth and the dissolved boron worth decrease, in view of the neutron spectrum hardening brought on by the larger absorption cross-section of plutonium as compared to uranium; (iv) βeff is markedly reduced. All these distinctive features are potentially detrimental to reactor nuclear safety. The principal objective of this work is to identify a variant of the fuel composition and the reactor layout which would neutralize the negative effects of the above-mentioned distinctive features.

  9. High Temperature Gas-Cooled Test Reactor Point Design: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bayless, Paul David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nelson, Lee Orville [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gougar, Hans David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-01

    A point design has been developed for a 200-MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched uranium oxycarbide fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technology readiness level, licensing approach, and costs of the test reactor point design.

  10. High Temperature Gas-Cooled Test Reactor Point Design: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bayless, Paul David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nelson, Lee Orville [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gougar, Hans David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinsey, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    A point design has been developed for a 200-MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched uranium oxycarbide fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technology readiness level, licensing approach, and costs of the test reactor point design.

  11. High Temperature Gas-Cooled Test Reactor Point Design: Summary Report

    International Nuclear Information System (INIS)

    A point design has been developed for a 200-MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched uranium oxycarbide fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technology readiness level, licensing approach, and costs of the test reactor point design.

  12. Influence of Uncertainty of Rock Properties on Seismic Responses of Reactor Buildings

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The influence of dispersion and uncertainty in the dynamic shear wave velocity and Poisson's ratio of the soil at a hard rock site on the seismic response of a reactor building was investigated. The analysis considers soil-structure interaction effects and is based on a model of the reactor building of a typical pressurized water reactor nuclear power plant (NPP). The numerical results show that, for the typical floor selected, when the dynamic shear wave velocity varies by -30% to +30% from the baseline of 1930 m/s, the horizontal response spectra peak value for the internal structure varies within ±10%, and the frequency corresponding to the spectra peak is unchanged in most cases. The vertical response spectra peak value varies within -10% to +22%, and the frequency corresponding to the spectra peak varies within -22% to +4%. The analysis indicates that the dynamic shear wave velocity and Poisson's ratio of the rock affect the seismic response of the structure, and that soil-structure interaction effects should be considered in the seismic analysis and design of NPPs even for a hard rock site.

  13. Point Movement Trace Vs. The Range Of Mining Exploitation Effects In The Rock Mass

    Science.gov (United States)

    Sroka, Anton; Knothe, Stanisław; Tajduś, Krzysztof; Misa, Rafał

    2015-12-01

    The geometric-integral theories of rock mass point movements due to mining exploitation assume a relationship between the progress of subsidence and horizontal movement. By analysing the movement trace of a point located on the surface, and the influence of the mining exploitation in the rock mass, an equation describing the relationship between the main components of the deformation state was formulated. The result is consistent with in situ observations and indicates a change in the volume of the rock mass due to mining exploitation. The analyses and in situ observations demonstrate clearly that the continuity equation adopted in many solutions in the form $\sum_{i=1}^{3} \varepsilon_{ii} = 0$ is fundamentally incorrect.

  14. Automatic extraction of discontinuity orientation from rock mass surface 3D point cloud

    Science.gov (United States)

    Chen, Jianqin; Zhu, Hehua; Li, Xiaojun

    2016-10-01

    This paper presents a new method for automatically extracting discontinuity orientation from a 3D point cloud of a rock mass surface. The proposed method consists of four steps: (1) automatic grouping of discontinuity sets using an improved K-means clustering method, (2) discontinuity segmentation and optimization, (3) discontinuity plane fitting using the Random Sample Consensus (RANSAC) method, and (4) coordinate transformation of the discontinuity plane. The method is first validated using the point cloud of a small piece of a rock slope acquired by photogrammetry, with the extracted discontinuity orientations compared against those measured in the field. It is then applied to publicly available LiDAR data of a road-cut rock slope from the Rockbench repository, and the extracted orientations are compared with those of the method proposed by Riquelme et al. (2014). The results show that the presented method is reliable, of high accuracy, and can meet engineering needs.
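
    Step (3) of the pipeline, RANSAC plane fitting, can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the authors' code: it fits a plane to a synthetic noisy point cloud and converts the plane normal to dip direction/dip angle (assuming x = east, y = north, z = up); the iteration count and inlier tolerance are arbitrary choices.

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.03, rng=None):
    """RANSAC plane fit: returns (unit normal, offset d) of the plane
    n.x = d with the most inliers.  Iteration count and inlier
    tolerance are illustrative assumptions, not tuned values."""
    rng = np.random.default_rng(rng)
    best_n, best_d, best_count = None, None, -1
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:                  # degenerate (collinear) sample
            continue
        n = n / norm
        d = n @ p0
        count = np.sum(np.abs(points @ n - d) < tol)
        if count > best_count:
            best_n, best_d, best_count = n, d, count
    return best_n, best_d

def orientation(normal):
    """Plane normal (x = E, y = N, z = up) -> (dip direction, dip) in degrees."""
    nx, ny, nz = normal if normal[2] >= 0 else -normal
    dip = np.degrees(np.arccos(nz))
    dip_dir = np.degrees(np.arctan2(nx, ny)) % 360.0   # azimuth of steepest descent
    return dip_dir, dip

# Synthetic discontinuity surface z = 0.3x + 0.1y with measurement noise.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, (500, 2))
z = 0.3 * xy[:, 0] + 0.1 * xy[:, 1] + rng.normal(0.0, 0.01, 500)
cloud = np.column_stack([xy, z])

n_fit, d_fit = ransac_plane(cloud, rng=1)
dip_dir, dip = orientation(n_fit)
```

    For this synthetic plane the recovered dip should be close to arccos(1/||(-0.3, -0.1, 1)||), about 17.5 degrees.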

  15. Small Stress Change Triggering a Big Earthquake: a Test of the Critical Point Hypothesis for Earthquakes

    Institute of Scientific and Technical Information of China (English)

    万永革; 吴忠良; 周公威

    2003-01-01

    Whether or not a small stress change can trigger a big earthquake is one of the most important problems related to the critical point hypothesis for earthquakes. We investigate global earthquakes with different focal mechanisms, which have different levels of ambient shear stress; this ambient stress level is the stress level required by the earthquakes for their occurrence. Earthquake pairs are studied to see whether the occurrence of the preceding event encourages the occurrence of the succeeding one in terms of Coulomb stress triggering. It is observed that the stress triggering effect produced by changes of Coulomb failure stress of the same order of magnitude, about 10^-2 MPa, is distinctly different for different focal mechanisms, and thus for different ambient stress levels. For non-strike-slip earthquakes with a relatively low ambient stress level, the triggering effect is more evident, while for strike-slip earthquakes with a relatively high ambient stress level, there is no evident triggering effect. This water level test provides observational support to the critical point hypothesis for earthquakes.

  16. Geology of Precambrian rocks and isotope geochemistry of shear zones in the Big Narrows area, northern Front Range, Colorado

    Science.gov (United States)

    Abbott, Jeffrey T.

    1970-01-01

    Rocks within the Big Narrows and Poudre Park quadrangles located in the northern Front Range of Colorado are Precambrian metasedimentary and metaigneous schists and gneisses and plutonic igneous rocks. These are locally mantled by extensive late Tertiary and Quaternary fluvial gravels. The southern boundary of the Log Cabin batholith lies within the area studied. A detailed chronology of polyphase deformation, metamorphism, and plutonism has been established. Early isoclinal folding (F1) was followed by a major period of plastic deformation (F2), sillimanite-microcline grade regional metamorphism, migmatization, and synkinematic Boulder Creek granodiorite plutonism (1.7 b.y.). Macroscopic doubly plunging antiformal and synformal structures were developed. P-T conditions at the peak of metamorphism were probably about 670 °C and 4.5 kbar. Water pressures may locally have differed from load pressures. The 1.4-b.y. Silver Plume granite plutonism was postkinematic and, on the basis of petrographic and field criteria, can be divided into three facies. Emplacement was by forcible injection and assimilation. Microscopic and mesoscopic folds which postdate the formation of the characteristic mineral phases during the 1.7-b.y. metamorphism are correlated with the emplacement of the Silver Plume Log Cabin batholith. Extensive retrograde metamorphism was associated with this event. A major period of mylonitization postdates Silver Plume plutonism and produced large E-W and NE trending shear zones. A detailed study of the Rb/Sr isotope geochemistry of the layered mylonites demonstrated that the mylonitization and associated recrystallization homogenized the 87Rb/86Sr ratios. Whole-rock dating techniques applied to the layered mylonites indicate a probable age of 1.2 b.y. Petrographic studies suggest that the mylonitization-recrystallization process produced hornfels facies assemblages in the adjacent metasediments. Minor Laramide faulting, mineralization and igneous activity

  17. The TITAN Reversed-Field Pinch Reactor: Design-point determination and parametric studies

    International Nuclear Information System (INIS)

    The multi-institutional TITAN study has examined the physics, technology, safety, and economics issues associated with operation of a Reversed-Field Pinch (RFP) magnetic fusion reactor at high power density. A comprehensive systems and trade study has been conducted as an integral and ongoing part of the reactor assessment. Attractive design points emerging from these parametric studies are subjected to more detailed analysis and design integration, the results of which are used to refine the parametric systems model. The design points and tradeoffs for two TITAN/RFP reactor embodiments are discussed. 14 refs

  18. Evaluation of the integrity of SEP reactor vessels

    International Nuclear Information System (INIS)

    A documented review is presented of the integrity of the 11 reactor pressure vessels covered in the Systematic Evaluation Program. This review deals primarily with the design specifications and quality assurance programs used in the vessel construction and the status of material surveillance programs, pressure-temperature operating limits, and inservice inspection programs of the applicable plants. Several generic items such as PWR overpressurization protection and BWR nozzle and safe-end cracking also are evaluated. The 11 vessels evaluated include Dresden Units 1 and 2, Big Rock Point, Haddam Neck, Yankee Rowe, Oyster Creek, San Onofre 1, LaCrosse, Ginna, Millstone 1, and Palisades

  19. Balancing on the Edge: An Approach to Leadership and Resiliency that Combines Rock Climbing with Four Key Touch Points

    Science.gov (United States)

    Winkler, Harold E.

    2005-01-01

    In this article, the author compares leadership and resiliency with rock climbing, describing his personal experience on a rock climbing adventure with his family and how it required the application of elements similar to those of leadership and resiliency. The article contains the following sections: (1) Being Resilient; (2) Points of…

  20. Potential of acoustic emissions from three point bending tests as rock failure precursors

    Institute of Scientific and Technical Information of China (English)

    Agioutantis Z.; Kaklis K.; Mavrigiannakis S.; Verigakis M.; Vallianatos F.; Saltas V.

    2016-01-01

    Development of failure in brittle materials is associated with microcracks, which release energy in the form of elastic waves called acoustic emissions. This paper presents results from acoustic emission measurements obtained during three point bending tests on Nestos marble under laboratory conditions. Acoustic emission activity was monitored using piezoelectric acoustic emission sensors, and the potential for accurate prediction of rock damage based on acoustic emission data was investigated. Damage localization was determined based on acoustic emissions generated from the critically stressed region as scattered events at stresses below and close to the strength of the material.

  1. Unioned layer for the Point of Rocks-Black Butte coal assessment area, Green River Basin, Wyoming (porbbfing.shp)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This ArcView shapefile contains a polygon representation of the spatial query layer for the Point of Rocks-Black Butte coalfield, Greater Green River Basin,...

  2. Analytical solution of the point reactor kinetics equations with temperature feedback

    International Nuclear Information System (INIS)

    Highlights: • Supercritical process in a pressurized-water reactor with 235U as the fissile material. • Solution of the point reactor kinetics equations with temperature feedback. • The linear relationship between reactivity and neutron generation time. - Abstract: In this paper the point reactor kinetics equations with one group of averaged delayed neutrons and the adiabatic feedback model are solved analytically. The relations of reactivity and neutron density with neutron lifetime are calculated. Numerical results are given for the delayed-supercritical process in a pressurized-water reactor with 235U as the fissile material under a constant step reactivity of ρ0 = β/2. Our investigation provides one of the most accurate results; the method is valid and applicable as long as the adiabatic condition of heat transfer from the fuel rods to the coolant is met.
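
    The model described above, one delayed-neutron group plus adiabatic temperature feedback, is easy to verify numerically. The sketch below uses generic thermal-reactor constants and an assumed feedback coefficient (not the paper's data) and integrates with classical RK4; under a ρ0 = β/2 step the power makes a prompt jump and then rolls over as the fuel heats up.

```python
import numpy as np

# One delayed-neutron group point kinetics with adiabatic temperature
# feedback, rho(t) = rho0 - alpha*dT.  All constants are generic
# assumed values, not the paper's data.
beta, lam, Lam = 0.0065, 0.0785, 1.0e-4   # delayed fraction, decay const [1/s], generation time [s]
rho0 = beta / 2.0                          # step reactivity, as in the abstract
alpha = 1.0e-3                             # fuel temperature coefficient [1/K] (assumed)
kheat = 1.0                                # adiabatic heating rate [K/s per unit power] (assumed)

def rhs(y):
    n, c, dT = y
    rho = rho0 - alpha * dT                # adiabatic feedback model
    return np.array([(rho - beta) / Lam * n + lam * c,
                     beta / Lam * n - lam * c,
                     kheat * n])

y = np.array([1.0, beta / (lam * Lam), 0.0])   # precursors in equilibrium at n = 1
dt, nsteps = 2.0e-4, 50_000                    # 10 s transient
n_hist = np.empty(nsteps)
for i in range(nsteps):                        # classical RK4
    k1 = rhs(y); k2 = rhs(y + 0.5 * dt * k1)
    k3 = rhs(y + 0.5 * dt * k2); k4 = rhs(y + dt * k3)
    y = y + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    n_hist[i] = y[0]
```

    The prompt jump carries the power to roughly β/(β − ρ0) = 2 times nominal; the feedback then drives the net reactivity negative, so the transient peaks and decays, which is the qualitative behaviour the analytical solution describes.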

  3. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent-continuum estimates of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which has promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density, and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploratory data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and
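
    The first stage described above, identifying coplanar surfaces with K-Nearest Neighbor and Principal Component Analysis, can be sketched as follows. This is a simplified illustration (brute-force neighbour search, arbitrary k, synthetic cloud), not the Matlab tool itself: the smallest-variance eigenvector of each local neighbourhood's covariance is taken as the facet normal, and the eigenvalue ratio gives a planarity score.

```python
import numpy as np

def knn_pca_normals(points, k=12):
    """Per-point unit normal and planarity score via k-NN + PCA.
    Brute-force neighbour search; k is an assumed, not tuned, value."""
    normals = np.empty_like(points)
    planarity = np.empty(len(points))
    for i, p in enumerate(points):
        idx = np.argsort(np.linalg.norm(points - p, axis=1))[:k]
        cov = np.cov(points[idx].T)
        evals, evecs = np.linalg.eigh(cov)            # ascending eigenvalues
        normals[i] = evecs[:, 0]                      # smallest-variance direction
        planarity[i] = 1.0 - evals[0] / evals.sum()   # ~1 on flat patches
    return normals, planarity

# Synthetic planar facet z = 0.2x with slight surface roughness.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, (300, 2))
cloud = np.column_stack([xy, 0.2 * xy[:, 0] + rng.normal(0.0, 0.002, 300)])

normals, planarity = knn_pca_normals(cloud)
n_true = np.array([-0.2, 0.0, 1.0]) / np.hypot(0.2, 1.0)
alignment = np.abs(normals @ n_true)   # |cos| of angle to the true normal
```

    In a full pipeline the per-point normals would then be pooled (e.g. by density estimation) to characterize discontinuity sets, as the abstract goes on to describe.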

  4. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for, with provision for one to twenty resident batches. The effect of exposure of each of the batches to the same neutron flux is determined.
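
    The kind of bookkeeping such a point exposure model performs can be illustrated with a drastically reduced sketch: one-group depletion of a fissile nuclide plus buildup of a lumped absorber under a constant flux, applied to two resident batches that see the same flux. All cross sections, the flux, and the cycle length below are round illustrative numbers, not PREMOR data or its thirteen-actinide chains.

```python
import numpy as np

# Reduced exposure model: deplete a fissile nuclide and build up a
# lumped fission-product absorber under a constant one-group flux.
SIG_FIS = 600e-24    # fissile absorption cross section [cm^2] (assumed)
SIG_FP  = 50e-24     # lumped absorber cross section [cm^2] (assumed)
YIELD   = 2.0        # lumped fission products per absorption (assumed)
PHI     = 3.0e13     # one-group flux [n/cm^2/s] (assumed)

def expose(n_fis, n_fp, days, steps=1000):
    """March the two nuclide densities through one exposure interval."""
    dt = days * 86400.0 / steps
    for _ in range(steps):
        burn = SIG_FIS * PHI * n_fis                  # absorptions per cm^3 per s
        n_fis -= burn * dt
        n_fp  += (YIELD * burn - SIG_FP * PHI * n_fp) * dt
    return n_fis, n_fp

# Two resident feed batches exposed to the same flux for one 300-day cycle.
batches = [(1.0e21, 0.0), (8.0e20, 5.0e19)]   # (fissile, absorber) [1/cm^3]
batches = [expose(nf, na, days=300) for nf, na in batches]
```

    Over a 300-day cycle the fissile density of the fresh batch decays by roughly exp(-σφt) ≈ 0.63 of its initial value under these assumed numbers, while the lumped absorber accumulates.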

  5. Forests and Forest Cover - TREES_BIG2005_IN: Champion Tree Locations for 2005 in Indiana (Bernardin-Lochmueller and Associates, Point Shapefile)

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — TREES_BIG2005_IN is a point shapefile showing the locations of state champion trees in Indiana. The register is updated every 5 years. Each location represents a...

  6. Optimal Power flow with valve point loading effects of cost function and mixed-integer control variables using Big-Bang and Big-Crunch optimization.

    Directory of Open Access Journals (Sweden)

    Chatti venkata Gopala krishna Rao

    2012-11-01

    This paper presents an application of Big Bang-Big Crunch (BB-BC) optimization, a nature-inspired method developed from concepts of universal evolution, to the complex static optimal power flow (OPF) problem, with the aim of minimizing the cost of thermal generating units whose cost functions are non-convex due to valve-point loading effects. The control variables that optimize the cost function, subject to the usual OPF constraints, are of continuous and discrete type (mixed-integer control variables). Mathematical programming approaches have difficulty with non-convex OPF, whereas nature-inspired heuristic methods can be applied to such problems. Desirable properties of a heuristic method are numerical simplicity, an update equation free of trial parameters, reliability, and ease of implementation in computer code; the search efficiency and reliability of most nature-inspired methods depend on the choice of trial parameters used to update the control variables as the optimization advances. BB-BC optimization has search ability on par with other popular heuristic methods but is free from the choice of trial parameters. It is applied here to obtain OPF solutions on two typical power system networks, and the results are compared with the MATLAB 7.0 pattern/random search optimization toolbox. Digital simulation results indicate that BB-BC is promising for the non-convex optimization requirements of power systems.
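
    The BB-BC update described above, collapse the population to a fitness-weighted centre of mass ("Big Crunch"), then scatter new candidates around it with a radius that shrinks each generation ("Big Bang"), has no trial parameters to tune beyond population size and iteration count. A minimal one-dimensional sketch on a toy non-convex cost with a valve-point-style ripple (an illustration, not an OPF model):

```python
import numpy as np

def cost(p):
    # Toy non-convex cost: quadratic plus a valve-point-style ripple.
    # Global minimum at p = 3.
    return (p - 3.0) ** 2 + 0.5 * np.abs(np.sin(4.0 * (p - 3.0)))

rng = np.random.default_rng(0)
lo, hi = 0.0, 10.0
pop = rng.uniform(lo, hi, 40)                 # Big Bang: random initial population

for it in range(1, 60):
    w = 1.0 / (cost(pop) + 1e-12)             # fitness weights (lower cost -> heavier)
    centre = np.sum(w * pop) / np.sum(w)      # Big Crunch: centre of mass
    noise = rng.standard_normal(pop.size)
    pop = np.clip(centre + (hi - lo) * noise / it, lo, hi)   # re-bang, shrinking radius

best = pop[np.argmin(cost(pop))]
```

    The shrinking radius (hi − lo)/it is the method's only schedule, which is the "free from trial parameters" property the paper emphasises.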

  7. Preliminary Demonstration Reactor Point Design for the Fluoride Salt-Cooled High-Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Betzler, Benjamin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carbajo, Juan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Greenwood, Michael Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Jeffrey J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Robb, Kevin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Terrell, Jerry W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    Development of the Fluoride Salt-Cooled High-Temperature Reactor (FHR) Demonstration Reactor (DR) is a necessary intermediate step to enable commercial FHR deployment through disruptive and rapid technology development and demonstration. The FHR DR will utilize known, mature technology to close remaining gaps to commercial viability. Lower risk technologies are included in the initial FHR DR design to ensure that the reactor can be built, licensed, and operated within an acceptable budget and schedule. These technologies include tristructural-isotropic (TRISO) particle fuel, replaceable core structural material, the use of that same material for the primary and intermediate loops, and tube-and-shell heat exchangers. This report provides an update on the development of the FHR DR. At this writing, the core neutronics and thermal hydraulics have been developed and analyzed. The mechanical design details are still under development and are described to their current level of fidelity. It is anticipated that the FHR DR can be operational within 10 years because of the use of low-risk, near-term technology options.

  8. Causality and entropic arguments pointing to a null Big Bang hypersurface

    Energy Technology Data Exchange (ETDEWEB)

    Minguzzi, E, E-mail: ettore.minguzzi@unifi.it [Dipartimento di Matematica Applicata, Universita degli Studi di Firenze, Via S. Marta 3, I-50139 Firenze (Italy)

    2011-09-22

    I propose a causality argument in order to solve the homogeneity (horizon) problem and the entropy problem of cosmology. The solution is based on the replacement of the spacelike Big Bang boundary with a null boundary behind which lies a chronology-violating region. This solution requires a tilting of the light cones near the null boundary and thus rests more on the behavior of the light cones, and hence on causality, than on the behavior of the scale factor (expansion). The connection of this picture with Augustine of Hippo's famous philosophical discussion of time and creation is mentioned.

  9. Innovations and enhancements in neutronic analysis of the Big-10 university research and training reactors based on the AGENT code system

    International Nuclear Information System (INIS)

    Introduction. This paper summarizes salient aspects of the 'virtual' reactor system developed at Purdue University, emphasizing efficient neutronic modeling through AGENT (Arbitrary Geometry Neutron Transport), a deterministic neutron transport code. DOE's Big-10 Innovations in Nuclear Infrastructure and Education (INIE) Consortium was launched in 2002 to enhance scholarship activities pertaining to university research and training reactors (URTRs). Existing and next-generation URTRs are powerful campus tools for nuclear engineering as well as a number of disciplines that include, but are not limited to, medicine, biology, material science, and food science. Advancing new computational environments for the analysis and configuration of URTRs is an important Big-10 INIE aim. Specifically, Big-10 INIE has pursued development of a 'virtual' reactor, an advanced computational environment to serve as a platform on which to build operations, utilization (research and education), and systemic analysis of URTR physics. The 'virtual' reactor computational system will integrate computational tools addressing the URTR core and near-core physics (transport, dynamics, fuel management and fuel configuration); thermal-hydraulics; beam-line, in-core, and near-core experiments; instrumentation and controls; and confinement/containment and security issues. Such an integrated computational environment does not currently exist. The 'virtual' reactor is designed to allow researchers and educators to configure and analyze their systems to optimize experiments, fuel locations for flux shaping, and detector selection and configuration. (authors)

  10. Compact reversed-field pinch reactors (CRFPR): sensitivity study and design-point determination

    International Nuclear Information System (INIS)

    If the costing assumptions upon which the positive assessment of conventional large superconducting fusion reactors is based prove overly optimistic, approaches that promise considerably increased system power density and reduced mass utilization will be required. These more compact reactor embodiments generally must operate with reduced shield thickness and resistive magnets. Because of the unique magnetic topology associated with the Reversed-Field Pinch (RFP), the compact reactor embodiment of this approach is particularly attractive from the viewpoint of low-field resistive coils operating with Ohmic losses that can be made small relative to the fusion power. A comprehensive system model is developed and described for a steady-state, compact RFP reactor (CRFPR). This model is used to select a unique cost-optimized design point that will be used for a conceptual engineering design. The cost-optimized CRFPR design presented herein would operate with system power densities and mass utilizations that are comparable to fission power plants and an order of magnitude more favorable than conventional approaches to magnetic fusion power. The sensitivity of the base-case design point to changes in plasma transport, profiles, beta, blanket thickness, normal versus superconducting coils, and fuel cycle (DT vs DD) is examined. The RFP approach is found to yield a point design for a high-power-density reactor that is surprisingly resilient to changes in key, but relatively unknown, physics and systems parameters.

  11. Supergene destruction of a hydrothermal replacement alunite deposit at Big Rock Candy Mountain, Utah: Mineralogy, spectroscopic remote sensing, stable-isotope, and argon-age evidences

    Science.gov (United States)

    Cunningham, C.G.; Rye, R.O.; Rockwell, B.W.; Kunk, M.J.; Councell, T.B.

    2005-01-01

    Big Rock Candy Mountain is a prominent center of variegated altered volcanic rocks in west-central Utah. It consists of the eroded remnants of a hypogene alunite deposit that, at ~21 Ma, replaced intermediate-composition lava flows. The alunite formed in steam-heated conditions above the upwelling limb of a convection cell that was one of at least six spaced at 3- to 4-km intervals around the margin of a monzonite stock. Big Rock Candy Mountain is horizontally zoned outward from an alunite core to respective kaolinite, dickite, and propylite envelopes. The altered rocks are also vertically zoned from a lower pyrite-propylite assemblage upward through assemblages successively dominated by hypogene alunite, jarosite, and hematite, to a flooded silica cap. This hydrothermal assemblage is undergoing natural destruction in a steep canyon downcut by the Sevier River in Marysvale Canyon. Integrated geological, mineralogical, spectroscopic remote sensing (using AVIRIS data), Ar radiometric, and stable isotopic studies trace the hypogene origin and supergene destruction of the deposit and permit distinction of primary (hydrothermal) and secondary (weathering) processes. This destruction has led to the formation of widespread supergene gypsum in cross-cutting fractures and as surficial crusts, and to natrojarosite, which gives the mountain its buff coloration along ridges facing the canyon. A small spring, Lemonade Spring, with a pH of 2.6 and containing Ca, Mg, Si, Al, Fe, Mn, Cl, and SO4, also occurs near the bottom of the canyon. The 40Ar/39Ar age (21.32 ± 0.07 Ma) of the alunite is similar to that of other replacement alunites at Marysvale. However, the age spectrum contains evidence of a 6.6-Ma thermal event that can be related to the tectonic activity responsible for the uplift that led to the downcutting of Big Rock Candy Mountain by the Sevier River. This ~6.6 Ma event is also present in the age spectrum of supergene natrojarosite forming today, and probably dates

  12. Brit Crit: Turning Points in British Rock Criticism 1960-1990

    DEFF Research Database (Denmark)

    Gudmundsson, Gestur; Lindberg, U.; Michelsen, M.;

    2002-01-01

    The article examines the development of rock criticism in the United Kingdom from the perspective of a Bourdieuan field-analysis. Early British rock critics, like Nik Cohn, were international pioneers; a few years later there was a strong American influence, but British rock criticism has always had nationally specific traits, and there have been more profound paradigm shifts than in American rock criticism. This is primarily explained by the fact that American rock criticism is more strongly connected to general cultural history, while UK rock criticism has been more alienated from dominant…

  13. On the numerical solution of the point reactor kinetics equations by generalized Runge-Kutta methods

    International Nuclear Information System (INIS)

    Generalized Runge-Kutta methods specifically devised for the numerical solution of stiff systems of ordinary differential equations are described. An A-stable method is employed to solve several sample point reactor kinetics problems, explicitly showing the quantities required by the method. The accuracy and speed of calculation as obtained by implementing the method in a microcomputer are found to be acceptable

  14. Sensitivity Analysis for Reactor Period Induced by Positive Reactivity Using One-point Adjoint Kinetic Equation

    Science.gov (United States)

    Chiba, G.; Tsuji, M.; Narabayashi, T.

    2014-04-01

    In order to better predict the kinetic behavior of a nuclear fission reactor, an improvement of the delayed neutron parameters is essential. The present paper specifies important nuclear data for reactor kinetics: fission yield and decay constant data of 86Ge, some bromine isotopes, 94Rb, 98mY, and some iodine isotopes. Their importance is quantified as sensitivities with the help of the adjoint kinetic equation, and it is found that they depend on the inserted reactivity (or the reactor period). Moreover, the dependence of the sensitivities on nuclear data files is also quantified using the latest files. Even when the currently evaluated data are used, there are large differences among data files from the viewpoint of delayed neutrons.

  15. Aging management program of the reactor building concrete at Point Lepreau Generating Station

    Directory of Open Access Journals (Sweden)

    Gendron T.

    2011-04-01

    In order for New Brunswick Power Nuclear (NBPN) to control the risks of degradation of the concrete reactor building at the Point Lepreau Generating Station (PLGS), the development of an aging management plan (AMP) was initiated. The intention of this plan was to determine the requirements for specific concrete structural components of the reactor building that require regular inspection and maintenance to ensure the safe and reliable operation of the plant. The document is currently in draft form and presents an integrated methodology for the application of an AMP for the concrete of the reactor building. The current AMP addresses the reactor building structure and various components, such as joint sealant and liners, that are integral to the structure; it does not include internal components housed within the structure. This paper provides background information regarding the document and the strategy developed to manage potential degradation of the concrete of the reactor building, as well as the specific programs and preventive and corrective maintenance activities initiated.

  16. The tipping point how little things can make a big difference

    CERN Document Server

    Gladwell, Malcolm

    2002-01-01

    The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire. Just as a single sick person can start an epidemic of the flu, so too can a small but precisely targeted push cause a fashion trend, the popularity of a new product, or a drop in the crime rate. This widely acclaimed bestseller, in which Malcolm Gladwell explores and brilliantly illuminates the tipping point phenomenon, is already changing the way people throughout the world think about selling products and disseminating ideas.

  17. Physics and thermal hydraulics design of a small water cooled reactor fuelled with plutonium in rock-like oxide (ROX) form

    Energy Technology Data Exchange (ETDEWEB)

    Gaultier, M.; Danguy, G. [Ecole des Applications Militaires de l ' Energie Atomique, Cherbourg (France); Perry, A. [Rolls-Royce MP, Raynesway, Derby (United Kingdom); Williams, A. [Nuclear Dept., DCEME, Military Road, Gosport, PO12 3BY (United Kingdom); Brushwood, J. [Nuclear Dept., DCEME, Military Road, Gosport, PO12 3BY (United Kingdom); Kings College, Univ. of London (United Kingdom); Thompson, A.; Beeley, P. A. [Nuclear Dept., DCEME, Military Road, Gosport, PO12 3BY (United Kingdom)

    2006-07-01

    This paper describes the Physics and Thermal Hydraulics areas of a design study for a small water-cooled reactor. The aim was to design a Pressurised Water Reactor (PWR) of maximum power 80 MWt, using a dispersed layout, capable of maximising primary natural circulation flow. The reactor fuel consists of plutonium contained in granular form within a Rock-like Oxide (ROX) pellet structure. (authors)

  18. Revisiting the Rosenbrock numerical solutions of the reactor point kinetics equation with numerous examples

    Directory of Open Access Journals (Sweden)

    Yang Xue

    2009-01-01

    The fourth-order Rosenbrock method with an automatic step size control feature is described and applied to solve the reactor point kinetics equations. A FORTRAN 90 program was developed to test the computational speed and algorithm accuracy. The results of various benchmark tests with different types of reactivity insertions show that the Rosenbrock method achieves high accuracy and efficiency and yields stable solutions.
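
    To illustrate why the Rosenbrock family suits stiff kinetics, the sketch below applies a second-order L-stable member (ROS2, γ = 1 + 1/√2) with a fixed step to the linear one-delayed-group point kinetics system; the paper's method is fourth order with automatic step control, and all parameters here are assumed generic values. The step h = 0.01 s is two orders of magnitude larger than the neutron generation time, yet the solution stays stable and tracks a matrix-exponential reference.

```python
import numpy as np

# ROS2 Rosenbrock scheme on the linear point kinetics system y' = A y,
# y = (n, C), with an assumed constant reactivity step rho < beta.
beta, lam, Lam, rho = 0.0065, 0.0785, 1.0e-4, 0.003
A = np.array([[(rho - beta) / Lam,  lam],
              [beta / Lam,         -lam]])

gamma = 1.0 + 1.0 / np.sqrt(2.0)         # gives L-stability for ROS2
h, nsteps = 0.01, 500                    # 5 s; h is ~100x the generation time
M = np.eye(2) - gamma * h * A            # constant Jacobian: formed once

y0 = np.array([1.0, beta / (lam * Lam)]) # precursor equilibrium at n = 1
y = y0.copy()
for _ in range(nsteps):
    k1 = np.linalg.solve(M, A @ y)
    k2 = np.linalg.solve(M, A @ (y + h * k1) - 2.0 * k1)
    y = y + 1.5 * h * k1 + 0.5 * h * k2

# Matrix-exponential reference solution at t = 5 s.
evals, V = np.linalg.eig(A)
y_exact = (V * np.exp(evals * h * nsteps)) @ np.linalg.solve(V, y0)
```

    An explicit method would need a step well below Λ/β to remain stable during the prompt transient; the implicit linear solves in each Rosenbrock stage remove that restriction without requiring a Newton iteration, which is the appeal of the family for point kinetics.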

  19. Numerical Solution of Fractional Neutron Point Kinetics Model in Nuclear Reactor

    OpenAIRE

    Nowak Tomasz Karol; Duzinkiewicz Kazimierz; Piotrowski Robert

    2014-01-01

    This paper presents results concerning solutions of the fractional neutron point kinetics model for a nuclear reactor. The proposed model consists of a bilinear system of fractional and ordinary differential equations. Three methods to solve the model are presented and compared. The first entails application of the discrete Grünwald-Letnikov definition of the fractional derivative in the model. The second involves building an analog scheme in the FOMCON Toolbox in the MATLAB environment. The third is the met...

  20. Shut-off rod performance study for the Point Lepreau reactor

    International Nuclear Information System (INIS)

    Shut-Off Rods (SORs) are strong neutron-absorbing reactivity devices which are inserted into the Pt. Lepreau reactor core to terminate the nuclear chain reaction. SORs are inserted automatically by Shut-Down System One (SDS-1) in the event any of its trip parameters reaches the trip set point on 2 out of the 3 electronic trip channels. SOR performance requirements for the Pt. Lepreau reactor have recently been reviewed and are reported here. In addition, the impact of rod unavailability and slow insertion characteristics on overall SDS-1 capability has been assessed. The shut-off rod system for the Pt. Lepreau reactor consists of 28 vertical rods which are normally suspended by an electromagnetic clutch assembly at the top of the reactor just outside the core. Upon receipt of a trip signal on two out of three SDS-1 trip channels, the electromagnetic clutches de-energize and allow the SORs to fall (spring assisted) into the core. In assessing the operational performance of the SOR system, the two key parameters are the number of SORs available for insertion at any given time and the rate at which the rods are inserted when they fall. During the operation of the plant the number of rods available to be inserted may change due to routine maintenance and testing or due to system impairments. The rate at which the rods drop may also change due to mechanical wear or malfunction of the clutch or suspension assembly. The purpose of this study was to define the operational performance envelope which clearly identifies the SOR performance required to ensure a safe reactor shutdown for all anticipated reactivity transients. Thus the main impetus of the study was to provide a clear technical basis for establishing operating performance requirements from which operating impairment guidelines could be derived. These guidelines are necessary to provide instructions for appropriate and timely action to the reactor operator in the event of SOR performance degradation.

  1. Development and analysis of some versions of the fractional-order point reactor kinetics model for a nuclear reactor with slab geometry

    Science.gov (United States)

    Vyawahare, Vishwesh A.; Nataraj, P. S. V.

    2013-07-01

    In this paper, we report the development and analysis of some novel versions and approximations of the fractional-order (FO) point reactor kinetics model for a nuclear reactor with slab geometry. A systematic development of the FO Inhour equation, the inverse FO point reactor kinetics model, and fractional-order versions of the constant delayed neutron rate approximation model and prompt jump approximation model is presented for the first time (for both one delayed group and six delayed groups). These models evolve from the FO point reactor kinetics model, which has been derived from the FO Neutron Telegraph Equation, considering subdiffusive neutron transport. Various observations and analysis results are reported, and the corresponding justifications are addressed using the subdiffusive framework for neutron transport. The FO Inhour equation is found to be a pseudo-polynomial with its degree depending on the order of the fractional derivative in the FO model. The inverse FO point reactor kinetics model is derived and used to find the reactivity variation required to achieve exponential and sinusoidal power variation in the core. The situation of sudden insertion of negative reactivity is analyzed using the FO constant delayed neutron rate approximation. Use of the FO model for representing the prompt jump in reactor power is advocated on the basis of subdiffusion. Comparison with the respective integer-order models is carried out for practical data. Also, it has been shown analytically that integer-order models are a special case of FO models when the order of the time-derivative is one. Development of these FO models plays a crucial role in reactor theory and operation as it is the first step towards achieving the FO control-oriented model for a nuclear reactor. The results presented here form an important step in the efforts to establish a step-by-step and systematic theory for the FO modeling of a nuclear reactor.

  2. Fractional neutron point kinetics equations for nuclear reactor dynamics – Numerical solution investigations

    International Nuclear Information System (INIS)

    Highlights: • The fractional neutron point kinetics model with six groups of delayed neutrons is developed. • The numerical and analytical results obtained for ordinary derivatives are compared. • The numerical stability of solutions is investigated. - Abstract: This paper presents results concerning numerical solutions to a fractional neutron point kinetics model for a nuclear reactor. The paper discusses and expands on results presented in Espinosa-Paredes et al. (2011). The fractional neutron point kinetics model with six groups of delayed neutron precursors was developed and a numerical solution using Edwards' method (Edwards et al., 2002) was proposed. The mathematical model was implemented in the Matlab environment and tested using a typical step input change. Experimental evaluation of numerical stability was complemented by an analysis of the effect selected model parameters have on the obtained results.

  3. Numerical Solution of Fractional Neutron Point Kinetics Model in Nuclear Reactor

    Directory of Open Access Journals (Sweden)

    Nowak Tomasz Karol

    2014-06-01

    Full Text Available This paper presents results concerning solutions of the fractional neutron point kinetics model for a nuclear reactor. The proposed model consists of a bilinear system of fractional and ordinary differential equations. Three methods to solve the model are presented and compared. The first entails application of the discrete Grünwald-Letnikov definition of the fractional derivative in the model. The second involves building an analog scheme in the FOMCON Toolbox in the MATLAB environment. The third is the method proposed by Edwards. The impact of selected parameters on the model's response was examined. The results for typical inputs were discussed and compared.
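
    The discrete Grünwald-Letnikov definition mentioned above can be sketched in a few lines. The test function, fractional order, and step count below are illustrative; the half-derivative of f(t) = t at t = 1 has the closed form 1/Γ(1.5), which the discretisation should approach as the step shrinks.

```python
import math

def gl_fractional_derivative(f, t, alpha, n=2000):
    """Grünwald-Letnikov approximation of the order-alpha derivative of f at t."""
    h = t / n
    coeff = 1.0          # c_0 = 1, with c_k = (-1)^k * binom(alpha, k)
    total = f(t)         # k = 0 term
    for k in range(1, n + 1):
        coeff *= (k - 1 - alpha) / k     # recurrence for the GL weights
        total += coeff * f(t - k * h)
    return total / h ** alpha

# Half-derivative of f(t) = t at t = 1; exact value is 1 / Gamma(1.5) = 2/sqrt(pi)
approx = gl_fractional_derivative(lambda t: t, 1.0, 0.5)
exact = 1.0 / math.gamma(1.5)
print(approx, exact)
```

The weight recurrence avoids computing binomial coefficients of a non-integer argument directly; the scheme is first-order accurate in the step size h.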

  4. Toward Microfluidic Reactors for Cell-Free Protein Synthesis at the Point-of-Care.

    Science.gov (United States)

    Timm, Andrea C; Shankles, Peter G; Foster, Carmen M; Doktycz, Mitchel J; Retterer, Scott T

    2016-02-10

    Cell-free protein synthesis (CFPS) is a powerful technology that allows for optimization of protein production without maintenance of a living system. Integrated within micro- and nanofluidic architectures, CFPS can be optimized for point-of-care use. Here, the development of a microfluidic bioreactor designed to facilitate the production of a single dose of a therapeutic protein, in a small-footprint device at the point-of-care, is described. This new design builds on the use of a long, serpentine channel bioreactor and is enhanced by integrating a nanofabricated membrane to allow exchange of materials between parallel "reactor" and "feeder" channels. This engineered membrane facilitates the exchange of metabolites, energy, and inhibitory species, and can be altered by plasma-enhanced chemical vapor deposition and atomic layer deposition to tune the exchange rate of small molecules. This allows for extended reaction times and improved yields. Further, the reaction product and higher molecular weight components of the transcription/translation machinery in the reactor channel can be retained. It has been shown that the microscale bioreactor design produces higher protein yields than conventional tube-based batch formats, and that product yields can be dramatically improved by facilitating small molecule exchange within the dual-channel bioreactor. PMID:26690885
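
    The dual-channel principle described above (a feeder channel replenishing a reactor channel across a membrane) can be illustrated with a toy two-compartment model. All rate constants, the first-order kinetics, and the yield factor are hypothetical and are not taken from the paper; the sketch only shows why membrane exchange extends the reaction relative to a closed batch.

```python
# Toy two-compartment model: substrate in a "feeder" channel crosses a
# membrane into a "reactor" channel, where the CFPS reaction consumes it to
# make protein.  All parameters are hypothetical, for illustration only.

K_EXCHANGE = 0.05   # membrane exchange rate (1/min), hypothetical
K_USE = 0.2         # substrate consumption rate (1/min), hypothetical
YIELD = 1.0         # protein per unit substrate consumed, hypothetical

def simulate(feeder_s0, reactor_s0, t_end=200.0, h=0.01):
    """Explicit Euler integration; returns the final protein amount."""
    s_f, s_r, protein = feeder_s0, reactor_s0, 0.0
    for _ in range(int(t_end / h)):
        exchange = K_EXCHANGE * (s_f - s_r)   # diffusion across the membrane
        s_f += h * (-exchange)
        s_r += h * (exchange - K_USE * s_r)
        protein += h * YIELD * K_USE * s_r
    return protein

batch = simulate(feeder_s0=0.0, reactor_s0=1.0)   # closed batch reactor
dual = simulate(feeder_s0=1.0, reactor_s0=1.0)    # feeder replenishes reactor
print(batch, dual)  # the fed reactor converts roughly twice the substrate
```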

  5. Semi-automatic characterization of fractured rock masses using 3D point clouds: discontinuity orientation, spacing and SMR geomechanical classification

    Science.gov (United States)

    Riquelme, Adrian; Tomas, Roberto; Abellan, Antonio; Cano, Miguel; Jaboyedoff, Michel

    2015-04-01

    Investigation of fractured rock masses for different geological applications (e.g. fractured reservoir exploitation, rock slope instability, rock engineering, etc.) requires a deep geometric understanding of the discontinuity sets affecting rock exposures. Recent advances in 3D data acquisition using photogrammetric and/or LiDAR techniques currently allow a quick and accurate characterization of rock mass discontinuities. This contribution presents a methodology for: (a) use of 3D point clouds for the identification and analysis of planar surfaces outcropping in a rocky slope; (b) calculation of the spacing between different discontinuity sets; (c) semi-automatic calculation of the parameters that play a key role in the Slope Mass Rating geomechanical classification. As for part (a) (discontinuity orientation), our proposal identifies and defines the algebraic equations of the different discontinuity sets of the rock slope surface by applying an analysis based on a neighbouring-points coplanarity test. Additionally, the procedure finds principal orientations by Kernel Density Estimation and identifies clusters (Riquelme et al., 2014). As a result of this analysis, each point is classified with a discontinuity set and with an outcrop plane (cluster). Regarding part (b) (discontinuity spacing), our proposal utilises the previously classified point cloud to investigate how different outcropping planes are linked in space. Discontinuity spacing is calculated for each pair of linked clusters within the same discontinuity set, and the spacing values are then analysed statistically. Finally, as for part (c), the previous results are used to calculate parameters F1, F2 and F3 of the Slope Mass Rating geomechanical classification. This analysis is carried out for each discontinuity set using the respective orientation extracted in part (a). The open access tool SMRTool (Riquelme et al., 2014) is then used to calculate F1 to F3 correction
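
    The neighbouring-points coplanarity test at the heart of step (a) can be sketched as follows. The tolerance and sample points are illustrative, and this sketch omits the Kernel Density Estimation clustering that the published method applies afterwards.

```python
# Test whether a neighbourhood of 3D points lies on a common plane: fit the
# plane through the first three points, then check the normal distance of
# every other point.  Tolerance and sample coordinates are illustrative.
import math

def plane_normal(p, q, r):
    """Unit normal of the plane through three non-collinear points."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]          # cross product u x v
    length = math.sqrt(sum(c * c for c in n))
    return [c / length for c in n]

def coplanar(points, tol=1e-6):
    """True if every point lies within tol of the plane through the first three."""
    n = plane_normal(points[0], points[1], points[2])
    p0 = points[0]
    return all(abs(sum(n[i] * (p[i] - p0[i]) for i in range(3))) <= tol
               for p in points)

# Points on the plane z = 2x + 3y, then one point pushed off the plane
flat = [(0, 0, 0), (1, 0, 2), (0, 1, 3), (1, 1, 5), (2, 1, 7)]
print(coplanar(flat))                    # True
print(coplanar(flat + [(1, 1, 6)]))      # False
```

In a full pipeline the normal of each coplanar neighbourhood would then be converted to a dip/dip-direction pair and clustered into discontinuity sets.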

  6. Spectral characteristics of Acoustic Emission of rock based on Singular point of HHT Analysis

    Directory of Open Access Journals (Sweden)

    Zhou Xiaoshan

    2016-01-01

    Full Text Available Uniaxial compression acoustic emission (AE) tests on sandstone have been studied, with Hilbert-Huang Transform (HHT) analysis applied to AE signal processing to reveal the process of rock fracture through analysis of the AE signals. The results show that HHT is a time-frequency analysis method based on principal component analysis. HHT handles singular signals very conveniently and can determine their main components, and the instantaneous frequency can be used to describe precisely the time-frequency characteristics of a singular signal; this is of great significance for revealing the frequency characteristics of AE signals. In the rock failure process, EMD decomposed the AE signal into 8 IMF components, of which IMF1~IMF4 are the main components and IMF5~IMF8 are low-frequency noise. Based on the EMD of the AE signal frequency, rock fracture was divided into three stages: the initial zone, the wave zone, and the quiet zone. This shows that noise interference must be eliminated when analysing the AE signal characteristics of rupture.

  7. Petrofabrics of High-Pressure Rocks Exhumed at the Slab-Mantle Interface from the 'Point of No Return'

    Science.gov (United States)

    Whitney, D. L.; Teyssier, C. P.; Seaton, N. C.; Fornash, K.

    2014-12-01

    The highest pressure typically recorded by metamorphic rocks exhumed from oceanic subduction zones is ~2.5±1 GPa, corresponding to the maximum decoupling depth (MDD) (80±10 km) identified in active subduction zones; beyond the MDD (the 'point of no return') exhumation is unlikely. One of the few places where rocks returned from the MDD largely unaltered is Sivrihisar, Turkey: a structurally coherent terrane of lawsonite eclogite and blueschist facies rocks in which assemblages and fabrics record P-T-fluid-deformation conditions during exhumation from ~80 to 45 km. Crystallographic fabrics and other structural features of high-pressure metasedimentary and metabasaltic rocks record transitions during exhumation. In quartzite, heterogeneous microstructures and crystallographic fabrics record deformation and dynamic recrystallization from ~2.6 GPa to ~1.5 GPa, as expressed by transition from prism c-axis patterns through progressive overprinting and activation of rhomb and basal slip. Omphacite, glaucophane, phengite, and lawsonite in quartzite remained stable during deformation. In marble, CaCO3 deformed in dislocation creep as aragonite, producing strong crystallographic fabrics. This fabric persisted through formation of calcite and destruction of the shape-preferred orientation, indicating the strength of aragonite marble. Omphacite in metabasalt and quartzite displays an L-type crystallographic fabric. Lawsonite kinematic vorticity data and other fabrics in metabasalt are consistent with exhumation involving increasing amounts of pure shear relative to simple shear and indicate strain localization and simple shear near the fault contact between the high-pressure unit and a serpentinite body. This large coaxial component multiplied the exhuming power of the subduction channel and forced rocks to return from the MDD.

  8. Approximate Solution of the Point Reactor Kinetic Equations of Average One-Group of Delayed Neutrons for Step Reactivity Insertion

    Directory of Open Access Journals (Sweden)

    S. Yamoah

    2012-04-01

    Full Text Available The understanding of the time-dependent behaviour of the neutron population in a nuclear reactor in response to either a planned or unplanned change in the reactor conditions is of great importance to the safe and reliable operation of the reactor. In this study two analytical methods have been presented to solve the point kinetics equations for an average one group of delayed neutrons. These methods, which are both approximate solutions of the point reactor kinetics equations, are compared with a numerical solution using Euler's first-order method. To obtain an accurate solution for the Euler method, a relatively small time step was chosen for the numerical solution. These methods are applied to different types of reactivity to check the validity of the analytical method by comparing the analytical results with the numerical results. From the results, it is observed that the analytical solution agrees well with the numerical solution.
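
    A minimal sketch of this kind of comparison, using the standard one-group prompt-jump approximation as the analytical solution and Euler's first-order method as the numerical one. Parameter values are illustrative, not those of the study.

```python
# Explicit Euler solution of the one-delayed-group point kinetics equations
# for a step reactivity insertion, compared against the analytical
# prompt-jump approximation.  Illustrative parameters only.
import math

BETA = 0.0065     # delayed-neutron fraction
GEN_TIME = 1e-4   # neutron generation time (s)
LAM = 0.08        # precursor decay constant (1/s)
RHO = 0.5 * BETA  # step reactivity insertion (well below prompt critical)

h = 1e-4
n = 1.0
c = BETA / (GEN_TIME * LAM)   # equilibrium precursor concentration
for _ in range(10000):        # integrate to t = 1 s
    dn = (RHO - BETA) / GEN_TIME * n + LAM * c
    dc = BETA / GEN_TIME * n - LAM * c
    n, c = n + h * dn, c + h * dc

# Prompt-jump approximation: n(t) ~ n0 * beta/(beta-rho) * exp(lam*rho*t/(beta-rho))
n_pj = BETA / (BETA - RHO) * math.exp(LAM * RHO * 1.0 / (BETA - RHO))
print(n, n_pj)  # the two agree closely for this sub-prompt-critical step
```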

  9. Azo-Dyes mixture degradation in a fixed bed biofilm reactor packed with volcanic porous rock

    Energy Technology Data Exchange (ETDEWEB)

    Contreras-Blancas, E.; Cobos-Vasconcelos, D. de los; Juarez-Ramirez, C.; Poggi-Varaldo, H. M.; Ruiz-Ordaz, N.; Galindez-Mayer, J.

    2009-07-01

    Textile industries discharge great amounts of dyes and dyeing-process auxiliaries, which pollute streams and water bodies. Several dyes, especially the ones containing the azo group, can cause harmful effects to different organisms including humans. Through bacterial and mammalian tests, azo dyes or their derived aromatic amines have shown cell genotoxicity. The purpose of this work was to evaluate the effect of air flow rate on azo-dyes mixture biodegradation by a microbial community immobilized in a packed bed reactor. (Author)

  10. Azo-Dyes mixture degradation in a fixed bed biofilm reactor packed with volcanic porous rock

    International Nuclear Information System (INIS)

    Textile industries discharge great amounts of dyes and dyeing-process auxiliaries, which pollute streams and water bodies. Several dyes, especially the ones containing the azo group, can cause harmful effects to different organisms including humans. Through bacterial and mammalian tests, azo dyes or their derived aromatic amines have shown cell genotoxicity. The purpose of this work was to evaluate the effect of air flow rate on azo-dyes mixture biodegradation by a microbial community immobilized in a packed bed reactor. (Author)

  11. Compliance Monitoring of Underwater Blasting for Rock Removal at Warrior Point, Columbia River Channel Improvement Project, 2009/2010

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, Thomas J.; Johnson, Gary E.; Woodley, Christa M.; Skalski, J. R.; Seaburg, Adam

    2011-05-10

    The U.S. Army Corps of Engineers, Portland District (USACE) conducted the 20-year Columbia River Channel Improvement Project (CRCIP) to deepen the navigation channel between Portland, Oregon, and the Pacific Ocean to allow transit of fully loaded Panamax ships (100 ft wide, 600 to 700 ft long, and draft 45 to 50 ft). In the vicinity of Warrior Point, between river miles (RM) 87 and 88 near St. Helens, Oregon, the USACE conducted underwater blasting and dredging to remove 300,000 yd3 of a basalt rock formation to reach a depth of 44 ft in the Columbia River navigation channel. The purpose of this report is to document methods and results of the compliance monitoring study for the blasting project at Warrior Point in the Columbia River.

  12. Harnessing big data for precision medicine: A panel of experts elucidates the data challenges and proposes key strategic decisions points

    Directory of Open Access Journals (Sweden)

    Carol Isaacson Barash

    2015-03-01

    Full Text Available A group of disparate translational bioinformatics experts convened at the 6th Annual Precision Medicine Partnership Meeting, October 29–30, 2014 to discuss big data challenges and key strategic decisions needed to advance precision medicine, emerging solutions, and the anticipated path to success. This article reports the panel discussion.

  13. Saddle point condition for D-³He tokamak fusion reactor

    Energy Technology Data Exchange (ETDEWEB)

    Mitarai, O. (Kumamoto Inst. of Technology, Dept. of Electrical Engineering, Ikeda 4-22-1, Kumamoto 860 (JP)); Hirose, A.; Skarsgard, H.M. (Univ. of Saskatchewan, Dept. of Physics, Saskatoon, Saskatchewan, S7N 0W0 (CA))

    1991-03-01

    In this paper the concept of a generalized ignition contour map, showing P̄_ht T_E², N T_E, and T, is used to study the ignition criterion for a D-³He fusion reactor with plasma temperature and density profiles. Direct heating scenarios to the D-³He ignition regime without the help of deuterium-tritium burning are considered. The machine size and enhancement factor for the confinement time required to reach D-³He ignition can be simply determined by comparing the height of the operation path with Goldston L-mode scaling and the height of the generalized saddle point. A confinement enhancement factor of 2 to 3 is required in the case of a large plasma current (30 to 80 MA) in a small-aspect-ratio tokamak. On the other hand, for a small plasma current (≲10 MA), large-aspect-ratio tokamak, an enhancement factor of 5 to 6 is necessary to reach ignition. Fuel dilution effects by fusion products and impurities, the confinement degradation effect due to 14-MeV protons, and the operation paths are also considered. To lower the height of the saddle point, and hence the auxiliary heating power, we optimize the fuel composition and examine operation in the hot ion mode.

  14. A 3D clustering approach for point clouds to detect and quantify changes at a rock glacier front

    Science.gov (United States)

    Micheletti, Natan; Tonini, Marj; Lane, Stuart N.

    2016-04-01

    Terrestrial Laser Scanners (TLS) are extensively used in geomorphology to remotely-sense landforms and surfaces of any type and to derive digital elevation models (DEMs). Modern devices are able to collect many millions of points, so that working on the resulting dataset is often troublesome in terms of computational effort. Indeed, it is not unusual that raw point clouds are filtered prior to DEM creation, so that only a subset of points is retained and the interpolation process becomes less of a burden. Whilst this procedure is in many cases necessary, it entails a considerable loss of valuable information. First, and even without eliminating points, the common interpolation of points to a regular grid causes a loss of potentially useful detail. Second, it inevitably causes the transition from 3D information to only 2.5D data where each (x,y) pair must have a unique z-value. Vector-based DEMs (e.g. triangulated irregular networks) partially mitigate these issues, but still require a set of parameters to be set and impose a considerable burden in terms of calculation and storage. Because of the reasons above, being able to perform geomorphological research directly on point clouds would be profitable. Here, we propose an approach to identify erosion and deposition patterns on a very active rock glacier front in the Swiss Alps to monitor sediment dynamics. The general aim is to set up a semiautomatic method to isolate mass movements using 3D-feature identification directly from LiDAR data. An ultra-long range LiDAR RIEGL VZ-6000 scanner was employed to acquire point clouds during three consecutive summers. In order to isolate single clusters of erosion and deposition we applied Density-Based Spatial Clustering of Applications with Noise (DBSCAN), previously successfully employed by Tonini and Abellan (2014) in a similar case for rockfall detection. DBSCAN requires two input parameters, strongly influencing the number, shape and size of the detected clusters: the minimum number of
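
    The clustering step can be sketched with a minimal pure-Python DBSCAN; its two input parameters are the neighbourhood radius (eps) and the minimum neighbour count for a core point (min_pts). The toy point set and parameter values below are hypothetical; the study applied the algorithm to multi-million-point TLS clouds.

```python
# Minimal DBSCAN for 3D points: label each point with a cluster id (>= 0)
# or -1 for noise.  eps/min_pts and the toy point set are hypothetical.

def dbscan(points, eps, min_pts):
    UNVISITED, NOISE = -2, -1
    labels = [UNVISITED] * len(points)

    def neighbours(i):
        xi, yi, zi = points[i]
        return [j for j, (x, y, z) in enumerate(points)
                if (x - xi) ** 2 + (y - yi) ** 2 + (z - zi) ** 2 <= eps ** 2]

    cluster = 0
    for i in range(len(points)):
        if labels[i] != UNVISITED:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:          # not a core point: mark as noise
            labels[i] = NOISE
            continue
        labels[i] = cluster               # start a new cluster at this core point
        queue = [j for j in seeds if j != i]
        while queue:                      # expand through density-reachable points
            j = queue.pop()
            if labels[j] == NOISE:        # previously-marked noise becomes a border point
                labels[j] = cluster
            if labels[j] != UNVISITED:
                continue
            labels[j] = cluster
            nb = neighbours(j)
            if len(nb) >= min_pts:        # j is also a core point: keep expanding
                queue.extend(nb)
        cluster += 1
    return labels

# Two compact patches (e.g. erosion and deposition) plus one isolated point
pts = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0.1, 0.1, 0),
       (5, 5, 5), (5.1, 5, 5), (5, 5.1, 5), (5.1, 5.1, 5),
       (10, 0, 0)]
labels = dbscan(pts, eps=0.5, min_pts=3)
print(labels)  # two clusters and one noise point
```

The brute-force neighbour search is O(n²); production implementations use a spatial index (k-d tree or octree) to handle millions of points.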

  15. Toward a Learning Health-care System - Knowledge Delivery at the Point of Care Empowered by Big Data and NLP.

    Science.gov (United States)

    Kaggal, Vinod C; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P; Ross, Jason L; Chaudhry, Rajeev; Buntrock, James D; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, i.e., the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and in real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the advantages of big data over two other environments. Big data infrastructure significantly outperformed other infrastructure in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future. PMID:27385912

  17. Particles fluidized bed receiver/reactor with a beam-down solar concentrating optics: 30-kWth performance test using a big sun-simulator

    Science.gov (United States)

    Kodama, Tatsuya; Gokon, Nobuyuki; Cho, Hyun Seok; Matsubara, Koji; Etori, Tetsuro; Takeuchi, Akane; Yokota, Shin-nosuke; Ito, Sumie

    2016-05-01

    A novel concept of a particles fluidized bed receiver/reactor with beam-down solar concentrating optics was tested using a 30-kWth window-type receiver and a big sun-simulator. A fluidized bed of quartz sand particles was created by passing air through the bottom distributor of the receiver, and about 30 kWth of high-flux visible light from 19 xenon-arc lamps of the sun-simulator was directly irradiated onto the top of the fluidized bed through a quartz window. Under this irradiation, with an average heat flux of about 950 kW/m2, the particle bed temperature at the center of the fluidized bed rose to between 1050 and 1200°C, depending on the air flow rate. The output air temperature from the receiver reached 1000-1060°C.

  18. A planar circular detector based on multiple point chemi- or bio-luminescent source within a coaxial cylindrical reactor

    International Nuclear Information System (INIS)

    An analytical method was proposed for calculating radiative fluxes incident on a planar circular detector from a volume multiple point chemi- or bio-luminescent source inside a coaxial cylindrical reactor. The method was designed for a cylindrical reactor when the surface reflections were neglected and when chemi- or bio-luminescence reaches a detector embedded in the same homogeneous optical medium as the point emitters of the volume multiple point source model. The radiative fluxes from arbitrarily distributed point emitters were expressed by one generalized quadruple-integral formula. Then some double- and single-integral formulas were obtained for calculating radiative fluxes from identically radiating point emitters uniformly distributed within the reactor. Selected results were computed and illustrated graphically. The obtained formulas are suitable for optimizing and/or calibrating the considered source-detectors systems (optical radiometers or luminometers) and determining radiative fluxes generated by chemical, biological, and physical processes leading to chemi-, bio-, radio-, and sono-luminescence for example.
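
    For a single isotropic emitter on the detector axis, the general quadruple integral collapses to a radial one that can be checked against the closed-form solid-angle result. The geometry values below are illustrative, and the sketch assumes no reflections, as in the paper.

```python
# Radiative flux collected by a coaxial circular detector from one isotropic
# on-axis point emitter: integrate irradiance cos(theta)/r^2 over the disc
# in radial rings and compare with I0 times the subtended solid angle.
# I0, H, and R are illustrative values.
import math

I0 = 1.0   # radiant intensity of the point emitter (W/sr)
H = 2.0    # emitter height above the detector plane, on axis (m)
R = 1.0    # detector radius (m)

def flux_numeric(n_rings=20000):
    """Midpoint-rule quadrature of the irradiance over the detector disc."""
    total = 0.0
    dr = R / n_rings
    for k in range(n_rings):
        rho = (k + 0.5) * dr                         # ring radius (midpoint)
        r2 = H * H + rho * rho                       # squared emitter-to-ring distance
        irradiance = I0 * (H / math.sqrt(r2)) / r2   # I0 * cos(theta) / r^2
        total += irradiance * 2.0 * math.pi * rho * dr
    return total

f_num = flux_numeric()
# Closed form: I0 times the solid angle subtended by the disc at the emitter
flux_exact = 2.0 * math.pi * I0 * (1.0 - H / math.sqrt(H * H + R * R))
print(f_num, flux_exact)
```

Summing this ring integral over arbitrarily placed emitters is the discrete analogue of the paper's volume multiple-point-source formulas.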

  19. High Temperature Gas-Cooled Test Reactor Point Design: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Laboratory; Bayless, Paul David [Idaho National Laboratory; Nelson, Lee Orville [Idaho National Laboratory; Gougar, Hans David [Idaho National Laboratory; Strydom, Gerhard [Idaho National Laboratory

    2016-03-01

    • Provide an initial summary description of the design and its main attributes:
      o Summarize the main Test Reactor attributes: reactor type, power, coolant, irradiation conditions (fast and thermal flux levels, number of test loops, positions and volumes), costs (project, operational), schedule and availability factor.
      o Identify secondary missions and power conversion options, if applicable.
      o Include statements on the envisioned attractiveness of the reactor type in relation to anticipated domestic and global irradiation services needs, citing past and current trends in reactor development and deployment.
      o Include statements on Test Reactor scalability (e.g., trade-off between size, power/flux levels and costs), prototypical conditions, overall technology maturity of the specific design and the general technology type. The intention is that this summary must be readable as a stand-alone section.

  20. Fluoride Salt-Cooled High-Temperature Demonstration Reactor Point Design

    International Nuclear Information System (INIS)

    The fluoride salt-cooled high-temperature reactor (FHR) demonstration reactor (DR) is a concept for a salt-cooled reactor with 100 megawatts of thermal output (MWt). It would use tristructural-isotropic (TRISO) particle fuel within prismatic graphite blocks. FLiBe (2 LiF-BeF2) is the reference primary coolant. The FHR DR is designed to be small, simple, and affordable. Development of the FHR DR is a necessary intermediate step to enable near-term commercial FHRs. Lower risk technologies are purposely included in the initial FHR DR design to ensure that the reactor can be built, licensed, and operated within an acceptable budget and schedule. These technologies include TRISO particle fuel, replaceable core structural material, the use of that same material for the primary and intermediate loops, and tube-and-shell primary-to-intermediate heat exchangers. Several preconceptual and conceptual design efforts that have been conducted on FHR concepts bear a significant influence on the FHR DR design. Specific designs include the Oak Ridge National Laboratory (ORNL) advanced high-temperature reactor (AHTR) with 3400/1500 MWt/megawatts of electric output (MWe), as well as a 125 MWt small modular AHTR (SmAHTR) from ORNL. Other important examples are the Mk1 pebble bed FHR (PB-FHR) concept from the University of California, Berkeley (UCB), and an FHR test reactor design developed at the Massachusetts Institute of Technology (MIT). The MIT FHR test reactor is based on a prismatic fuel platform and is directly relevant to the present FHR DR design effort. These FHR concepts are based on reasonable assumptions for credible commercial prototypes. The FHR DR concept also directly benefits from the operating experience of the Molten Salt Reactor Experiment (MSRE), as well as the detailed design efforts for a large molten salt reactor concept and its breeder variant, the Molten Salt Breeder Reactor. 
The FHR DR technology is most representative of the 3400 MWt AHTR concept, and it

  1. Fluoride Salt-Cooled High-Temperature Demonstration Reactor Point Design

    Energy Technology Data Exchange (ETDEWEB)

    Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Betzler, Benjamin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carbajo, Juan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Jeffrey J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Robb, Kevin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Terrell, Jerry W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wysocki, Aaron J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-02-01

    The fluoride salt-cooled high-temperature reactor (FHR) demonstration reactor (DR) is a concept for a salt-cooled reactor with 100 megawatts of thermal output (MWt). It would use tristructural-isotropic (TRISO) particle fuel within prismatic graphite blocks. FLiBe (2 LiF-BeF2) is the reference primary coolant. The FHR DR is designed to be small, simple, and affordable. Development of the FHR DR is a necessary intermediate step to enable near-term commercial FHRs. Lower risk technologies are purposely included in the initial FHR DR design to ensure that the reactor can be built, licensed, and operated within an acceptable budget and schedule. These technologies include TRISO particle fuel, replaceable core structural material, the use of that same material for the primary and intermediate loops, and tube-and-shell primary-to-intermediate heat exchangers. Several preconceptual and conceptual design efforts that have been conducted on FHR concepts bear a significant influence on the FHR DR design. Specific designs include the Oak Ridge National Laboratory (ORNL) advanced high-temperature reactor (AHTR) with 3400/1500 MWt/megawatts of electric output (MWe), as well as a 125 MWt small modular AHTR (SmAHTR) from ORNL. Other important examples are the Mk1 pebble bed FHR (PB-FHR) concept from the University of California, Berkeley (UCB), and an FHR test reactor design developed at the Massachusetts Institute of Technology (MIT). The MIT FHR test reactor is based on a prismatic fuel platform and is directly relevant to the present FHR DR design effort. These FHR concepts are based on reasonable assumptions for credible commercial prototypes. The FHR DR concept also directly benefits from the operating experience of the Molten Salt Reactor Experiment (MSRE), as well as the detailed design efforts for a large molten salt reactor concept and its breeder variant, the Molten Salt Breeder Reactor. The FHR DR technology is most representative of the 3400 MWt AHTR

  2. Research on the Reinforcement Scheme for the Middle Rock Pillar in Large-Section, Small Clear-Distance Tunnels

    Institute of Scientific and Technical Information of China (English)

    陈佳

    2014-01-01

    Reinforcement schemes for the middle rock pillar of a large-section, small clear-distance tunnel in poor-quality Grade V surrounding rock are discussed. It is concluded that when the distance between the left and right bores is smaller than the minimum clear distance, grouting reinforcement of the middle rock pillar markedly improves the tunnel-perimeter displacements and the internal forces in the lining, whereas reinforcement of the pillar with opposed prestressed anchor bolts is not very effective. The results provide a useful reference for reinforcing the middle rock pillar in large-section, small clear-distance tunnels.

  3. Pilot program: NRC [Nuclear Regulatory Commission] severe reactor accident incident response training manual: Overview and summary of major points

    International Nuclear Information System (INIS)

    This pilot training manual has been written to fill the need for a general text on NRC response to reactor accidents. The manual is intended to be the foundation of a course for all NRC response personnel. Overview and Summary of Major Points is the first in a series of volumes that collectively summarize the US Nuclear Regulatory Commission (NRC) emergency response during severe power reactor accidents and provide the necessary background information. This volume describes elementary perspectives on severe accidents and accident assessment. Each volume serves as the text for one course in a series of courses for NRC response personnel. These materials do not provide guidance on license requirements for NRC licensees. Each volume is accompanied by an appendix of slides that can be used to present the material; the slides are called out in the text

  4. Systematic evaluation program: status summary report

    International Nuclear Information System (INIS)

    The status of safety evaluation issues is reviewed for the following reactors: Big Rock Point reactor; Dresden-1 reactor; Dresden-2 reactor; Ginna-1 reactor; Connecticut Yankee reactor; LACBWR reactor; Millstone-1 reactor; Oyster Creek-1 reactor; Palisades-1 reactor; San Onofre-1 reactor; and Rowe Yankee reactor

  5. Heuristic optimization of a continuous flow point-of-use UV-LED disinfection reactor using computational fluid dynamics.

    Science.gov (United States)

    Jenny, Richard M; Jasper, Micah N; Simmons, Otto D; Shatalov, Max; Ducoste, Joel J

    2015-10-15

    Alternative disinfection sources such as ultraviolet (UV) light are being pursued to inactivate pathogenic microorganisms such as Cryptosporidium and Giardia, while simultaneously reducing the risk of exposure to carcinogenic disinfection by-products (DBPs) in drinking water. UV-LEDs offer a UV disinfection source that contains no mercury, has the potential for long lifetimes, is robust, and has a high degree of design flexibility. However, the increased flexibility in design options adds a substantial level of complexity when developing a UV-LED reactor, particularly with regard to reactor shape, size, spatial orientation of light, and germicidal emission wavelength. Anticipating that LEDs are the future of UV disinfection, new methods are needed for designing such reactors. In this research study, a new design paradigm using a point-of-use UV-LED disinfection reactor was evaluated. ModeFrontier, a numerical optimization platform, was coupled with COMSOL Multiphysics, a computational fluid dynamics (CFD) software package, to generate an optimized UV-LED continuous flow reactor. Three optimality conditions were considered: a single-objective analysis minimizing input supply power while achieving at least 2.0-log10 inactivation of Escherichia coli ATCC 11229, and two multi-objective analyses (one of which maximized the log10 inactivation of E. coli ATCC 11229 while minimizing the supply power). All tests were completed at a flow rate of 109 mL/min and 92% UVT (measured at 254 nm). The numerical solution for the first objective was validated experimentally using biodosimetry. The optimal design predictions displayed good agreement with the experimental data and contained several non-intuitive features, particularly in the UV-LED spatial arrangement, where the lights were unevenly populated throughout the reactor. The optimal designs may not have been developed from experienced designers due to the increased degrees of
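    The single-objective condition described above, minimizing supply power subject to a 2.0-log10 inactivation target, can be sketched with a toy first-order dose-response model. Apart from the 2-log target, every number here (the dose constant `K`, the optical efficiency, exposure time, and irradiated area) is invented for illustration, not taken from the study; a real design would use CFD-computed fluence rates rather than this algebra.

```python
# Toy search for the smallest supply power meeting a 2-log10 inactivation
# target, assuming a first-order (linear in dose) response. All parameter
# values are hypothetical illustrations, not values from the study.
TARGET_LOG = 2.0
K = 0.6  # log10 inactivation per mJ/cm^2 of delivered dose (invented)

def log_inactivation(power_mw, efficiency=0.35, area_cm2=50.0, residence_s=60.0):
    """Delivered UV dose scales with optical power; response is linear in dose."""
    dose = power_mw * efficiency * residence_s / area_cm2  # mJ/cm^2
    return K * dose

# brute-force the smallest integer power (mW) that meets the target
power = next(p for p in range(1, 1000) if log_inactivation(p) >= TARGET_LOG)
```

    A multi-objective version would instead trace the trade-off curve between inactivation and power rather than stopping at the first feasible point.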

  6. Operating experience of reactors points up need for new thermal-hydraulic inquiries

    International Nuclear Information System (INIS)

    A review of accident and pre-accident situations involving thermal-hydraulic processes in PWRs and BWRs is presented. The pre-accident events most frequently occurring in reactor operation and pertaining to thermal-hydraulic processes are discussed: water hammer, thermal fatigue, transients, overcooling, vortex formation, and power oscillations in BWRs. Intensified theoretical and experimental thermal-hydraulic studies are proposed with the aim of improving the safety and efficiency of nuclear power units

  7. Radiation damage in ferritic/martensitic steels for fusion reactors: a simulation point of view

    Science.gov (United States)

    Schäublin, R.; Baluc, N.

    2007-12-01

    Low-activation ferritic/martensitic steels are good candidates for future fusion reactors because, relative to austenitic steels, they show lower damage accumulation and moderate swelling under irradiation by the 14 MeV neutrons produced by the fusion reaction. Irradiation of these steels, e.g. EUROFER97, is known to produce hardening, loss of ductility, a shift in the ductile-to-brittle transition temperature, and a reduction of fracture toughness and creep resistance starting at the lowest doses. Helium, produced by transmutation by the 14 MeV neutrons, is known to affect mechanical properties, but its effect at the microstructure level is still unclear. The mechanisms underlying the degradation of mechanical properties are not well understood, despite numerous studies of the evolution of the microstructure under irradiation. This impedes our ability to predict the materials' behaviour at the higher doses expected in future fusion reactors. Simulations of these effects are now essential. An overview is presented of molecular dynamics simulations of the primary state of damage in iron and of the mobility of a dislocation, the vector of plasticity, in the presence of a defect.

  8. Systematic evaluation program. Status summary report

    International Nuclear Information System (INIS)

    Correspondence activities are listed that are related to the licensing of the Big Rock Point reactor; Dresden-1 reactor; Dresden-2 reactor; Ginna-1 reactor; Connecticut Yankee reactor; LACBWR reactor; Millstone-1 reactor; Oyster Creek-1 reactor; Palisades-1 reactor; San Onofre-1 reactor; and Rowe Yankee reactor

  9. Solution of Point Reactor Neutron Kinetics Equations with Temperature Feedback by Singularly Perturbed Method

    Directory of Open Access Journals (Sweden)

    Wenzhen Chen

    2013-01-01

    The singularly perturbed method (SPM) is proposed to obtain the analytical solution for the delayed supercritical process of a nuclear reactor with temperature feedback and a small inserted step reactivity. The relation between reactivity and time is derived. The neutron density (or power) and the average density of delayed neutron precursors are also presented as functions of reactivity. The variations of neutron density (or power) and temperature with time are calculated, plotted, and compared with the accurate solution and with other analytical methods. The results show that the SPM is valid and accurate over a large range and is simpler than the methods in the previous literature.
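    The delayed supercritical process with temperature feedback that this abstract describes can be illustrated numerically. The sketch below integrates one-delayed-group point kinetics with a linear negative temperature coefficient; every parameter value (`beta`, `Lam`, `lam`, `alpha`, `Kc`, the 0.003 step) is a generic illustrative number, not taken from the paper, and a plain RK4 march stands in for the paper's analytical SPM solution.

```python
# One-delayed-group point kinetics with linear temperature feedback.
# All parameter values are hypothetical illustrations, NOT from the paper.

def rhs(state, rho0, beta, Lam, lam, alpha, Kc, T0):
    """Right-hand side: neutron density n, precursor density C, temperature T."""
    n, C, T = state
    rho = rho0 + alpha * (T - T0)          # net reactivity after feedback
    dn = (rho - beta) / Lam * n + lam * C  # point-kinetics neutron balance
    dC = beta / Lam * n - lam * C          # delayed-neutron precursor balance
    dT = Kc * n                            # adiabatic heat-up
    return dn, dC, dT

def rk4_step(state, dt, *args):
    f = lambda s: rhs(s, *args)
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

beta, Lam, lam = 0.0065, 1.0e-4, 0.08  # delayed fraction, generation time (s), decay const (1/s)
rho0, alpha = 0.003, -5.0e-5           # step insertion < beta; negative temp. coefficient (1/K)
Kc, T0 = 0.5, 300.0                    # heating const (K/s per unit power), initial temp (K)

state = (1.0, beta / (Lam * lam), T0)  # start from delayed-precursor equilibrium
dt = 1.0e-3
for _ in range(int(10.0 / dt)):        # integrate 10 s of the transient
    state = rk4_step(state, dt, rho0, beta, Lam, lam, alpha, Kc, T0)

n, C, T = state
rho_final = rho0 + alpha * (T - T0)    # feedback has eaten into the inserted step
```

    Because `alpha` is negative, the temperature rise steadily reduces the net reactivity, which is exactly the self-limiting delayed supercritical behaviour the SPM treats analytically.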

  10. A refined way of solving reactor point kinetics equations for imposed reactivity insertions

    Directory of Open Access Journals (Sweden)

    Ganapol Barry D.

    2009-01-01

    We apply the concept of convergence acceleration, also known as extrapolation, to find the solution of the reactor kinetics equations (RKEs). The method features simplicity in that an approximate finite difference formulation is constructed and converged to high accuracy from knowledge of the error term. Through Romberg extrapolation, we demonstrate its high accuracy for a variety of imposed reactivity insertions found in the literature. The unique feature of the proposed algorithm, called RKE/R(omberg), is that no special attention is given to the stiffness of the RKEs. Finally, because of its simplicity and accuracy, the RKE/R algorithm is arguably the most efficient numerical solution of the RKEs developed to date.
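    The mechanism the abstract relies on, marching a low-order finite-difference scheme at successively halved steps and combining the results in a Romberg (Richardson) tableau, can be shown on a scalar model problem. This is not the authors' RKE/R algorithm applied to the stiff kinetics system; it is the same extrapolation idea demonstrated on y' = a*y, where the exact answer is known, with arbitrary values of `a` and `t_end`.

```python
# Romberg (Richardson) acceleration of a second-order finite-difference
# march, sketched on the model problem y' = a*y, y(0) = 1.
import math

def crank_nicolson(a, t_end, steps):
    """Second-order finite-difference solution; error expands in even powers of h."""
    h = t_end / steps
    growth = (1 + a * h / 2) / (1 - a * h / 2)
    y = 1.0
    for _ in range(steps):
        y *= growth
    return y

def romberg(a, t_end, levels):
    """Halve the step at each level and extrapolate the O(h^2), O(h^4), ... errors away."""
    R = [[crank_nicolson(a, t_end, 2 ** (i + 1))] for i in range(levels)]
    for i in range(1, levels):
        for j in range(1, i + 1):
            R[i].append((4 ** j * R[i][j - 1] - R[i - 1][j - 1]) / (4 ** j - 1))
    return R[-1][-1]  # bottom-right tableau entry: the accelerated estimate

a, t_end = 2.5, 1.0
exact = math.exp(a * t_end)
coarse = crank_nicolson(a, t_end, 2)   # visibly inaccurate at this step size
accelerated = romberg(a, t_end, 5)     # error shrinks by orders of magnitude
```

    The appeal mirrored in RKE/R is that each tableau column cancels one more term of the known error expansion, so high accuracy comes from a handful of cheap coarse solves.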

  11. Kimberley rock art dating project

    International Nuclear Information System (INIS)

    The art's additional value, unequalled by traditionally recognised artefacts, is its permanent pictorial documentation, presenting a 'window' into the otherwise intangible elements of the perceptions, vision, and mind of prehistoric cultures. Unfortunately, its potential in establishing the Kimberley archaeological 'big picture' still remains largely unrecognised. Some findings of the Kimberley Rock Art Dating Project, using AMS and optically stimulated luminescence (OSL) dating techniques, are outlined. It is expected that these findings will encourage involvement by a greater diversity of specialist disciplines, tying their findings into levels of this art sequence as a primary reference point. The sequence represents a sound basis for selecting specific defined images as targets for detailed study by a range of dating techniques. This effectively removes the undesirable ad hoc sampling of 'apparently old paintings', a process which must otherwise unavoidably remain the case with researchers working on most global bodies of rock art

  12. Big data

    OpenAIRE

    Thomsen, Christoffer Bolvig; Steffensen, Nikolaj; Jørgensen, Frederik Thordal; Olesen, Rasmus Bjørk; Nilsson, Martin Becker; Iramdane, Souphian

    2014-01-01

    In the recent past, a new phenomenon in digital marketing has gained increasing attention from companies: big data. Big data is a term in computer science that broadly covers the collection, storage, analysis, and interpretation of huge amounts of data from various sources. The project asks what companies should pay attention to when using big data. Companies use big data for marketing, inventory management and general business managemen...

  13. End points in discharge cleaning on TFTR [Tokamak Fusion Test Reactor]

    International Nuclear Information System (INIS)

    It has been found necessary to perform a series of first-wall conditioning steps prior to successful high power plasma operation in the Tokamak Fusion Test Reactor (TFTR). This series begins with glow discharge cleaning (GDC) and is followed by pulse discharge cleaning (PDC). During machine conditioning, the production of impurities is monitored by a Residual Gas Analyzer (RGA). PDC is made in two distinct modes: Taylor discharge cleaning (TDC), where the plasma current is kept low (15--50 kA) and of short duration (50 ms) by means of a relatively high prefill pressure and aggressive PDC, where lower prefill pressure and higher toroidal field result in higher current (200--400 kA) limited by disruptions at q(a) approx 3 at approx 250 ms. At a constant repetition rate of 12 discharges/minute, the production rate of H2O, CO, or other impurities has been found to be an unreliable measure of progress in cleaning. However, the ability to produce aggressive PDC with substantial limiter heating, but without the production of x-rays from runaway electrons, is an indication that TDC is no longer necessary after approx 10^5 pulses. During aggressive PDC, the uncooled limiters are heated by the plasma from the bakeout temperature of 150 degree C to about 250 degree C over a period of three to eight hours. This limiter heating is important to enhance the rate at which H2O is removed from the graphite limiter. 14 refs., 3 figs., 1 tab

  14. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian;

    2016-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night and real-time analysis is often desired. Thus, modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results, discuss main challenges...

  15. Petrofabrics of high-pressure rocks exhumed at the slab-mantle interface from the "point of no return" in a subduction zone (Sivrihisar, Turkey)

    Science.gov (United States)

    Whitney, Donna L.; Teyssier, Christian; Seaton, Nicholas C. A.; Fornash, Katherine F.

    2014-12-01

    The highest pressure recorded by metamorphic rocks exhumed from oceanic subduction zones is ~2.5 GPa, corresponding to the maximum decoupling depth (MDD) (80 ± 10 km) identified in active subduction zones; beyond the MDD (the "point of no return") exhumation is unlikely. The Sivrihisar massif (Turkey) is a coherent terrane of lawsonite eclogite and blueschist facies rocks in which assemblages and fabrics record P-T-fluid-deformation conditions during exhumation from ~80 to 45 km. Crystallographic fabrics and other features of high-pressure metasedimentary and metabasaltic rocks record transitions during exhumation. In quartzite, microstructures and crystallographic fabrics record deformation in the dislocation creep regime, including dynamic recrystallization during decompression, and a transition from prism slip to activation of rhomb and basal slip that may be related to a decrease in water fugacity during decompression (~2.5 to ~1.5 GPa). Phengite, lawsonite, and omphacite or glaucophane in quartzite and metabasalt remained stable during deformation, and omphacite developed an L-type crystallographic fabric. In marble, aragonite developed columnar textures with strong crystallographic fabrics that persisted during partial to complete dynamic recrystallization that was likely achieved in the stability field of aragonite (P > ~1.2 GPa). Results of kinematic vorticity analysis based on lawsonite shape fabrics are consistent with shear criteria in quartzite and metabasalt and indicate a large component of coaxial deformation in the exhuming channel beneath a simple shear dominated interface. This large coaxial component may have multiplied the exhuming power of the subduction channel and forced deeply subducted rocks to flow back from the point of no return.

  16. SEVERAL POINTS IN DYNAMIC STABILITY ANALYSIS OF ROCK SLOPE

    Institute of Scientific and Technical Information of China (English)

    李宁; 姚显春; 张承客

    2012-01-01

    Emphasis is placed on three basic problems in the dynamic stability analysis of rock slopes: (1) the mechanical characteristics of fractured rock masses under dynamic loads; (2) the dynamic response and safety assessment of slopes; and (3) the safety of slope reinforcement measures (mainly anchor cables) under dynamic loads. Drawing on long-term engineering experience and research, assessment methods for the dynamic stability of slopes are discussed, especially for high rock slopes under seismic and blasting loads. The strength characteristics of rock masses under dynamic loads and the corresponding research priorities are summarized. The dynamic responses of rock slopes are analyzed systematically, and it is proposed that dynamic stability be assessed from three aspects: the trend of the dynamic factor of safety, the dynamic fracturing and sliding of the potential sliding surface, and the particle vibration velocity at key points in the slope. Finally, scientific design principles and evaluation indices are proposed for assessing the safety of prestressed anchor cable reinforcement under dynamic loads.
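    The first assessment aspect above, the trend of the dynamic factor of safety, can be illustrated with a pseudo-static toy model: a planar slide whose limit-equilibrium factor of safety is re-evaluated through time under a sinusoidal horizontal seismic coefficient. The geometry, strength parameters, and loading below are all invented for illustration; real assessments would use site ground-motion histories and the methods discussed in the paper.

```python
# Pseudo-static factor of safety of a planar slide evaluated through time.
# All numbers are hypothetical illustrations.
import math

phi = math.radians(35.0)       # friction angle of the slip surface
c, L = 20.0, 10.0              # cohesion (kPa) and slip-surface length (m)
W = 500.0                      # block weight per metre (kN/m)
beta = math.radians(40.0)      # inclination of the slip surface

def factor_of_safety(a_h):
    """Limit-equilibrium FS with horizontal seismic coefficient a_h."""
    driving = W * math.sin(beta) + a_h * W * math.cos(beta)
    normal = W * math.cos(beta) - a_h * W * math.sin(beta)
    resisting = c * L + normal * math.tan(phi)
    return resisting / driving

# 10 s of a 2 Hz sinusoidal excitation with peak coefficient 0.25
ts = [i * 0.01 for i in range(1000)]
fs = [factor_of_safety(0.25 * math.sin(2 * math.pi * 2.0 * t)) for t in ts]

fs_min = min(fs)                                  # worst instantaneous FS
frac_below_1 = sum(f < 1.0 for f in fs) / len(fs)  # fraction of time FS < 1
```

    Tracking `fs` as a time series, rather than a single pseudo-static value, is what the "trend of the dynamic factor of safety" criterion refers to: brief dips below 1.0 indicate transient sliding episodes rather than wholesale failure.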

  17. Research of the Rock Art from the point of view of geography: the neolithic painting of the Mediterranean area of the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    Cruz Berrocal, María

    2004-12-01

    The rock art of the Mediterranean Arch (which includes what are conventionally called Levantine Rock Art, Schematic Rock Art and Macroschematic Rock Art, among other styles), inscribed on the World Heritage List in 1998, is studied from the point of view of the Archaeology of Landscape. The information sources used were field work, cartographic analysis and analysis in GIS, besides two rock art archives: the UNESCO Document and the Corpus of Levantine Cave Painting (Corpus de Pintura Rupestre Levantina). The initial hypothesis was that this rock art was involved in the process of neolithisation of the eastern part of Iberia, of which it is a symptom and a result, and that it must be understood as an element of landscape construction. If this is true, it would have a concrete distribution in the form of locational patterns. Through statistical procedures and heuristic approaches, it has been demonstrated that there is a structure to the neolithic landscape, defined by rock art, which can be interpreted functionally and economically.


  18. Accurate 3D point cloud comparison and volumetric change analysis of Terrestrial Laser Scan data in a hard rock coastal cliff environment

    Science.gov (United States)

    Earlie, C. S.; Masselink, G.; Russell, P.; Shail, R.; Kingston, K.

    2013-12-01

    Our understanding of the evolution of hard rock coastlines is limited due to the episodic nature and 'slow' rate at which changes occur. High-resolution surveying techniques, such as Terrestrial Laser Scanning (TLS), have just begun to be adopted as a method of obtaining detailed point cloud data to monitor topographical changes over short periods of time (weeks to months). However, the difficulties involved in comparing consecutive point cloud data sets in a complex three-dimensional plane, such as occlusion due to surface roughness and the positioning of the data capture point in a consistently changing environment (a beach profile), mean that comparing data sets can lead to errors in the region of 10-20 cm. Meshing techniques are often used for point cloud data analysis of simple surfaces, but for surfaces such as rocky cliff faces this technique has been found to be ineffective. Recession rates of hard rock coastlines in the UK are typically determined using aerial photography or airborne LiDAR data, yet the detail of the important changes occurring to the cliff face and toe is missed by such techniques. In this study we apply an algorithm (M3C2 - Multiscale Model to Model Cloud Comparison), initially developed for analysing fluvial morphological change, that directly compares point cloud data using surface normals that are consistent with surface roughness, and measures the change that occurs along the normal direction (Lague et al., 2013). The surface changes are analysed using a set of user-defined scales based on surface roughness and registration error. Once the correct parameters are defined, the volumetric cliff face changes are calculated by integrating the mean distance between the point clouds. The analysis has been undertaken at two hard rock sites identified for their active erosion, located on the UK's south west peninsula at Porthleven in south west Cornwall and Godrevy in north Cornwall. Alongside TLS point cloud data, in
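    The core of the M3C2 idea, measuring change along a locally estimated surface normal rather than point-to-nearest-point, can be sketched on synthetic data. The sketch below fits a normal by local PCA and projects the offset between the two clouds onto it; it omits M3C2's projection cylinders, multi-scale normal selection, and confidence intervals, and the planar cloud geometry, search radius, and 0.30 m "erosion" offset are all invented.

```python
# Simplified M3C2-style change detection on synthetic planar clouds.
# Geometry, radius, and the 0.30 m offset are hypothetical illustrations.
import numpy as np

rng = np.random.default_rng(42)

def normal_direction_change(ref, comp, core, radius=0.6):
    """For each core point: PCA normal from `ref` neighbours, then the offset
    between local means of `comp` and `ref` projected onto that normal."""
    changes = []
    for p in core:
        near_ref = ref[np.linalg.norm(ref - p, axis=1) < radius]
        near_comp = comp[np.linalg.norm(comp - p, axis=1) < radius]
        if len(near_ref) < 3 or len(near_comp) < 3:
            changes.append(np.nan)
            continue
        eigvals, eigvecs = np.linalg.eigh(np.cov(near_ref.T))
        normal = eigvecs[:, 0]        # smallest-variance direction = normal
        if normal[2] < 0:             # orient consistently (+z)
            normal = -normal
        changes.append(normal @ (near_comp.mean(axis=0) - near_ref.mean(axis=0)))
    return np.array(changes)

# synthetic "cliff face": a rough plane, and the same plane offset 0.30 m
xy = rng.uniform(0.0, 5.0, size=(5000, 2))
ref = np.column_stack([xy, 0.02 * rng.standard_normal(5000)])
xy2 = rng.uniform(0.0, 5.0, size=(5000, 2))
comp = np.column_stack([xy2, 0.30 + 0.02 * rng.standard_normal(5000)])

change = normal_direction_change(ref, comp, core=ref[::100])
mean_change = float(np.nanmean(change))  # should recover roughly 0.30 m
```

    Because the comparison is made along the normal, surface roughness and uneven point spacing perturb the estimate far less than a naive nearest-neighbour distance would.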

  19. Heart tissue of harlequin (hq)/Big Blue mice has elevated reactive oxygen species without significant impact on the frequency and nature of point mutations in nuclear DNA

    Energy Technology Data Exchange (ETDEWEB)

    Crabbe, Rory A. [Department of Biology, University of Western Ontario, London, Ontario, N6A 5B7 (Canada); Hill, Kathleen A., E-mail: khill22@uwo.ca [Department of Biology, University of Western Ontario, London, Ontario, N6A 5B7 (Canada)

    2010-09-10

    Age is a major risk factor for heart disease, and cardiac aging is characterized by elevated mitochondrial reactive oxygen species (ROS) with compromised mitochondrial and nuclear DNA integrity. To assess links between increased ROS levels and mutations, we examined in situ levels of ROS and cII mutation frequency, pattern and spectrum in the heart of harlequin (hq)/Big Blue mice. The hq mouse is a model of premature aging with mitochondrial dysfunction and increased risk of oxidative stress-induced heart disease with the means for in vivo mutation detection. The hq mutation produces a significant downregulation in the X-linked apoptosis-inducing factor gene (Aif) impairing both the antioxidant and oxidative phosphorylation functions of AIF. Brain and skin of hq disease mice have elevated frequencies of point mutations in nuclear DNA and histopathology characterized by cell loss. Reports of associated elevations in ROS in brain and skin have mixed results. Herein, heart in situ ROS levels were elevated in hq disease compared to AIF-proficient mice (p < 0.0001) yet, mutation frequency and pattern were similar in hq disease, hq carrier and AIF-proficient mice. Heart cII mutations were also assessed 15 days following an acute exposure to an exogenous ROS inducer (10 mg paraquat/kg). Acute paraquat exposure with a short mutant manifestation period was insufficient to elevate mutation frequency or alter mutation pattern in the post-mitotic heart tissue of AIF-proficient mice. Paraquat induction of ROS requires mitochondrial complex I and thus is likely compromised in hq mice. Results of this preliminary survey and the context of recent literature suggest that determining causal links between AIF deficiency and the premature aging phenotypes of specific tissues is better addressed with assay of mitochondrial ROS and large-scale changes in mitochondrial DNA in specific cell types.

  20. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

    Demand for, and the spurt in, the collection and accumulation of data has coined the new term “Big Data”. Data is generated massively, accidentally, incidentally, and through people's interactions. This big data must be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists, and a variety of other intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia, and every space where large groups of people leave digital traces and deposit data. Given the rise of big data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions, and its biases. Big data, which refers to data sets that are too big to be handled using existing database management tools, is emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology, and presents a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as solutions that can be applied to them. Big data differs from other data in five characteristics: volume, variety, value, velocity, and complexity. The article focuses on some current and future cases and causes for big data.

  1. Big Data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin;

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  2. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel;

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments that are contained in any big data practice. Secondly, it suggests a research agenda built around a set of sub-themes that each deserve dedicated scrutiny when studying the interplay between big...

  3. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo;

    and locations, having a diverse knowledge set and capable of tackling more and more complex problems. This poses the question of whether Big Egos continue to dominate in this rising paradigm of big science. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize a stochastic actor oriented model (SAOM) to analyze both the network-endogenous mechanisms and the individual agency driving the collaboration network, and further whether being a Big Ego in Big Science translates into increased performance. Our findings suggest that the selection of collaborators is not based on preferential attachment, but rather on an assortativity effect, creating not merely a rich-get-richer effect but an elitist network with high entry barriers. In this acclaimed democratic and collaborative environment of Big Science, the elite closes in on itself. We propose this tendency to be even more explicit in other...

  4. Big Data

    OpenAIRE

    Sabater Picañol, Jordi

    2013-01-01

    The project was developed at the company Everis and was the result of a collaboration with another student from the Barcelona Faculty of Informatics, Robert Serrat Morros. The Big Data concept is currently attracting growing interest from companies, which see in it a significant competitive advantage. This project seeks to justify that growing interest, starting from the most basic concepts of Big Data.

  5. Thermal-maturity trends within Franciscan rocks near Big Sur, California: Implications for offset along the San Gregorio-San Simeon-Hosgri fault zone

    Science.gov (United States)

    Underwood, Michael B.; Laughland, Matthew M.; Shelton, Kevin L.; Sedlock, Richard L.

    1995-09-01

    Conventional neotectonic interpretations place the Lucia and Point Sur subterranes of the Franciscan subduction complex on opposite sides of the San Gregorio-San Simeon-Hosgri dextral fault system and connect that system through the Sur fault zone. Our reconstructed paleotemperature contours, however, are not offset across the San Simeon segment, so differential displacement between the subterranes after peak heating appears to have been negligible. One explanation is that dextral slip on the faults has totaled only 5-10 km. A second possibility is that a discrete Hosgri-San Simeon segment extends offshore of the amalgamated Point Sur and Lucia subterranes and that an en echelon stepover transfers dextral slip eastward to the San Gregorio-Palo Colorado segment. In either case, the Sur fault zone appears to play a relatively insignificant role in the late Cenozoic tectonic evolution of central California.

  6. Big Data

    OpenAIRE

    Prachi More; Latika Chaudhary; Sangita Panmand; Prof. Nilesh Shah

    2013-01-01

    The surge in the collection and accumulation of data has coined the new term "Big Data". Data is massively generated, accidentally, incidentally, and through the interactions of people, and this Big Data must be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists, and a variety of other intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google,...

  7. Comparing Two Photo-Reconstruction Methods to Produce High Density Point Clouds and DEMs in the Corral del Veleta Rock Glacier (Sierra Nevada, Spain

    Directory of Open Access Journals (Sweden)

    Álvaro Gómez-Gutiérrez

    2014-06-01

    Full Text Available In this paper, two methods based on computer vision are presented in order to produce dense point clouds and high resolution DEMs (digital elevation models of the Corral del Veleta rock glacier in Sierra Nevada (Spain. The first one is a semi-automatic 3D photo-reconstruction method (SA-3D-PR based on the Scale-Invariant Feature Transform algorithm and the epipolar geometry theory that uses oblique photographs and camera calibration parameters as input. The second method is fully automatic (FA-3D-PR and is based on the recently released software 123D-Catch that uses the Structure from Motion and MultiView Stereo algorithms and needs as input oblique photographs and some measurements in order to scale and geo-reference the resulting model. The accuracy of the models was tested using as benchmark a 3D model registered by means of a Terrestrial Laser Scanner (TLS. The results indicate that both methods can be applied to micro-scale study of rock glacier morphologies and processes with average distances to the TLS point cloud of 0.28 m and 0.21 m, for the SA-3D-PR and the FA-3D-PR methods, respectively. The performance of the models was also tested by means of the dimensionless relative precision ratio parameter resulting in figures of 1:1071 and 1:1429 for the SA-3D-PR and the FA-3D-PR methods, respectively. Finally, Digital Elevation Models (DEMs of the study area were produced and compared with the TLS-derived DEM. The results showed average absolute differences with the TLS-derived DEM of 0.52 m and 0.51 m for the SA-3D-PR and the FA-3D-PR methods, respectively.
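    The accuracy test described above reduces to computing, for each point of the photo-reconstructed cloud, the distance to its nearest neighbour in the TLS benchmark cloud and averaging. A minimal sketch of that computation (the synthetic clouds below are illustrative, not the study's data; a KD-tree would replace the brute-force search for real point clouds):

    ```python
    import numpy as np

    def mean_cloud_distance(cloud, reference):
        """Mean distance from each point in `cloud` to its nearest
        neighbour in the `reference` (e.g. TLS) point cloud.
        Brute force: fine for small illustrative clouds."""
        diffs = cloud[:, None, :] - reference[None, :, :]
        dists = np.sqrt((diffs ** 2).sum(axis=-1))
        return dists.min(axis=1).mean()

    # Illustrative example: a reference cloud and a slightly noisy copy,
    # mimicking a photo-reconstruction compared against a TLS benchmark.
    rng = np.random.default_rng(0)
    reference = rng.uniform(0.0, 10.0, size=(500, 3))
    cloud = reference + rng.normal(0.0, 0.05, size=reference.shape)
    print(mean_cloud_distance(cloud, reference))
    ```

    With per-axis noise of 0.05 m the mean distance lands near the 0.2-0.3 m figures reported in the abstract, which is why such average distances are a natural headline accuracy metric.
    
    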

  8. Toward a Learning Health-care System – Knowledge Delivery at the Point of Care Empowered by Big Data and NLP

    Science.gov (United States)

    Kaggal, Vinod C.; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J.; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P.; Ross, Jason L.; Chaudhry, Rajeev; Buntrock, James D.; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, ie, the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the advantages of big data over two other environments. Big data infrastructure significantly outperformed other infrastructure in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future.

  10. Study of temperature distribution of fuel, clad and coolant in the VVER-1000 reactor core during group-10 control rod scram by using diffusion and point kinetic methods

    International Nuclear Information System (INIS)

    In this paper, the temperature distributions of fuel, clad, and coolant during a scram of the group-10 control rods are studied and calculated by two different methods (point kinetics and diffusion) for the Bushehr Nuclear Power Plant (Iran), which has a VVER-1000 reactor core. Ten groups of control rods are used in the Bushehr core; of these, the group-10 control rods inject the largest amount of negative reactivity. We describe the impacts of the negative reactivity caused by a complete or partial scram of the group-10 control rods on the thermo-neutronic parameters of the VVER-1000 reactor core. These calculations and their results provide a sound understanding of the role of this control element in optimal control of the reactor core, and a reliable simulation based on them could support monitoring the VVER-1000 core with a smart control system. For a more accurate survey, and to compare the results of the two calculation approaches (point kinetics and diffusion), the temperature distributions of the fuel elements and coolant during and after the group-10 control rod scram are also calculated with the COSTANZA-R,Z code (in which the neutronic calculations are based on the diffusion model), using the WIMS code at different areas and temperatures to obtain the physical constants and temperature coefficients required by COSTANZA-R,Z for the VVER-1000 core of Bushehr NPP. (author)

  11. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional, challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science: the current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation, our successes and failures, and how we are planning in the next decade to create a workable and adaptable solution to support big data science.

  12. Experimental Study of Big Row Spacing Cultivation of Tomato Using Straw Biological Reactor Technology

    Institute of Scientific and Technical Information of China (English)

    王继涛; 张翔; 温学萍; 赵玮; 俞风娟; 汪金山

    2015-01-01

    The application of straw biological reactor technology can effectively improve the environmental factors within the facility, slow the occurrence of disease, and improve yield and benefit. But this technology requires considerable work and effort in the ditching process. In order to reduce the labor and production inputs of the technology, an experimental study on big row spacing cultivation of tomato using straw biological reactor technology was conducted. The results showed that, compared with the control, in the ditching, straw burying, ridging, dripline laying, and planting steps alone, 35.7% of the labor per hectare and 16,810.5 yuan/hm2 of cost could be saved, the marketing time could be advanced by 5 days, the yield could be increased by 26.68%, and the incidence of pests and diseases could be lowered significantly. Considering the comprehensive growth potential in the field and the indoor test data, it is suggested that big row spacing cultivation of tomato using straw biological reactor technology should be extended and applied over large areas in Ningxia.

  13. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  14. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  15. Analytical Solution of the Point Reactor Kinetics Equations for One-Group of Delayed Neutrons for a Discontinuous Linear Reactivity Insertion

    Directory of Open Access Journals (Sweden)

    S. Yamoah

    2012-11-01

    Full Text Available The understanding of the time-dependent behaviour of the neutron population in a nuclear reactor in response to either a planned or unplanned change in the reactor conditions is of great importance to the safe and reliable operation of the reactor. It is therefore important to understand the response of the neutron density and how it relates to the speed of lifting control rods. In this study, an analytical solution of point reactor kinetic equations for one-group of delayed neutrons is developed to calculate the change in neutron density when reactivity is linearly introduced discontinuously. The formulation presented in this study is validated with numerical solution using the Euler method. It is observed that for higher speed, r = 0.0005 the Euler method predicted higher values than the method presented in this study. However with r = 0.0001, the Euler method predicted lower values than the method presented in this study except for t = 1.0 s and 5.0 s. The results obtained have shown to be compatible with the numerical method.
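    The numerical benchmark mentioned above, one-group point kinetics under a linear reactivity ramp integrated with the explicit Euler method, can be sketched as follows. The ramp rate r = 0.0001 matches a value quoted in the abstract, but its units and the kinetic parameters (beta, lam, Lambda) are illustrative assumptions, not the paper's:

    ```python
    # One-group point kinetics:
    #   dn/dt = ((rho - beta)/Lambda) * n + lam * c
    #   dc/dt = (beta/Lambda) * n - lam * c
    # integrated with the explicit Euler method.
    beta = 0.0065        # delayed-neutron fraction (assumed)
    lam = 0.08           # precursor decay constant, 1/s (assumed)
    Lambda = 1.0e-3      # neutron generation time, s (assumed)
    r = 0.0001           # linear reactivity insertion rate, dk/k per s

    def euler_point_kinetics(t_end, dt=1.0e-4):
        n = 1.0                          # normalized neutron density
        c = beta * n / (lam * Lambda)    # precursor density at equilibrium
        t = 0.0
        while t < t_end:
            rho = r * t                  # reactivity ramp started at t = 0
            dn = ((rho - beta) / Lambda) * n + lam * c
            dc = (beta / Lambda) * n - lam * c
            n += dt * dn
            c += dt * dc
            t += dt
        return n

    print(euler_point_kinetics(5.0))  # neutron density rises above 1.0
    ```

    Starting from delayed-precursor equilibrium, the neutron density grows monotonically under the positive ramp, which is the behaviour the analytical solution is validated against.
    
    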

  16. Release plan for Big Pete

    International Nuclear Information System (INIS)

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  17. Antigravity and the big crunch/big bang transition

    OpenAIRE

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.(Princeton Center for Theoretical Science, Princeton University, Princeton, NJ, 08544, USA); Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition...

  18. Deep crescentic features caused by subglacial boulder point pressure on jointed rock; an example from Virkisjökull, SE Iceland

    Science.gov (United States)

    Krabbendam, M.; Bradwell, T.; Everest, J.

    2012-04-01

    A variety of subglacially formed, erosional crescentic features (e.g. crescentic gouges, lunate fractures) have been widely reported on deglaciated bedrock surfaces. They are characterised by a conchoidal fracture that dips in the same direction as the palaeo-ice flow direction, and a steeper fracture that faces against the ice flow. They are generally interpreted as being formed by point pressure exerted by large boulders entrained in basal ice. They are significant in that they record palaeo-ice flow even if shallower glacial striae are obliterated by post-glacial weathering [1, 2, 3]. This contribution reports on deep scallop-shaped, crescentic depressions observed on abraded surfaces of recently deglaciated roche moutonnées and whalebacks. The depressions at Virkisjökull are cut into smoothed, abraded surfaces festooned with abundant glacial striae. Differences with previously reported crescentic features are: the scallop-shaped depressions are considerably deeper (5-20 cm); and the steep fracture facing ice flow coincides in all cases with a pre-existing joint that cuts the entire whaleback. The steep joints thus developed before the conchoidal fracture, whilst in reported crescentic features they develop after it. We suggest the following formation mechanism. A boulder encased in basal ice exerts continuous pressure on its contact point as it moves across the ice-bedrock contact. This sets up a stress field in the bedrock that does not necessarily exceed the intact rock strength (other crescentic features are rare to absent at Virkisjökull). However, as the stress field migrates (with the transported boulder) and encounters a subvertical, pre-existing joint, stress concentrations build up that do exceed the intact rock strength, resulting in a new (conchoidal) fracture 'spalling' off a thick, scallop-shaped fragment. The significance of the deep scallop-shaped crescentic depressions is that, in common with other crescentic features, they...

  19. Rock History and Culture

    OpenAIRE

    Gonzalez, Éric

    2013-01-01

    Two ambitious works written by French-speaking scholars tackle rock music as a research object, from different but complementary perspectives. Both are a definite must-read for anyone interested in the contextualisation of rock music in western popular culture. In Une histoire musicale du rock (i.e. A Musical History of Rock), rock music is approached from the point of view of the people – musicians and industry – behind the music. Christophe Pirenne endeavours to examine that field from a m...

  20. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since then emerged to one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that have been unreached before. Big data is generally characterized by 4 factors. Volume, velocity and variety. These three factors distinct it from the traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  1. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  2. International conference on opportunities and challenges for water cooled reactors in the 21. century. PowerPoint presentations

    International Nuclear Information System (INIS)

    Water Cooled Reactors have been the keystone of the nuclear industry in the 20th Century. As we move into the 21st Century and face new challenges such as the threat of climate change or the large growth in world energy demand, nuclear energy has been singled out as one of the sources that could substantially and sustainably contribute to power the world. As the nuclear community worldwide looks into the future with the development of advanced and innovative reactor designs and fuel cycles, it becomes important to explore the role Water Cooled Reactors (WCRs) will play in this future. To support the future role of WCRs, substantial design and development programmes are underway in a number of Member States to incorporate additional technology improvements into advanced nuclear power plants (NPPs) designs. One of the key features of advanced nuclear reactor designs is their improved safety due to a reduction in the probability and consequences of accidents and to an increase in the operator time allowed to better assess and properly react to abnormal events. A systematic approach and the experience of many years of successful operation have allowed designers to focus their design efforts and develop safer, more efficient and more reliable designs, and to optimize plant availability and cost through improved maintenance programs and simpler operation and inspection practices. Because many of these advanced WCR designs will be built in countries with no previous nuclear experience, it is also important to establish a forum to facilitate the exchange of information on the infrastructure and technical issues associated with the sustainable deployment of advanced nuclear reactors and its application for the optimization of maintenance of operating nuclear power plants. This international conference seeks to be all-inclusive, bringing together the policy, economic and technical decision-makers and the stakeholders in the nuclear industry such as operators, suppliers

  3. Big Data

    OpenAIRE

    Eils, Jürgen

    2013-01-01

    The worldwide volume of data doubles roughly every two years, and progress in analysing it is breathtaking. Thousands of scientists needed more than a decade to decode the complete human genome for the first time; that has changed dramatically, says Jürgen Eils, head of the data-management group at the German Cancer Research Center (Deutsches Krebsforschungszentrum) in Heidelberg ... This contribution on Big Data appeared in the broadcast series "Campus-Report", a series of reports on current...

  4. Online stress corrosion crack and fatigue usages factor monitoring and prognostics in light water reactor components: Probabilistic modeling, system identification and data fusion based big data analytics approach

    Energy Technology Data Exchange (ETDEWEB)

    Mohanty, Subhasish M. [Argonne National Lab. (ANL), Argonne, IL (United States); Jagielo, Bryan J. [Argonne National Lab. (ANL), Argonne, IL (United States); Oakland Univ., Rochester, MI (United States); Iverson, William I. [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Illinois at Urbana-Champaign, Champaign, IL (United States); Bhan, Chi Bum [Argonne National Lab. (ANL), Argonne, IL (United States); Pusan National Univ., Busan (Korea, Republic of); Soppet, William S. [Argonne National Lab. (ANL), Argonne, IL (United States); Majumdar, Saurin M. [Argonne National Lab. (ANL), Argonne, IL (United States); Natesan, Ken N. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-10

    Nuclear reactors in the United States account for roughly 20% of the nation's total electric energy generation, and maintaining their safety with regard to key component structural integrity is critical not only for the long-term use of such plants but also for the safety of personnel and the public living around the plant. Early detection of damage signatures, such as stress corrosion cracking and thermal-mechanical-loading-related material degradation in safety-critical components, is a necessary requirement for the long-term and safe operation of nuclear power plant systems.

  5. Comparative study on nutrient removal of agricultural non-point source pollution for three filter media filling schemes in eco-soil reactors.

    Science.gov (United States)

    Du, Fuyi; Xie, Qingjie; Fang, Longxiang; Su, Hang

    2016-08-01

    Nutrients (nitrogen and phosphorus) from agricultural non-point source (NPS) pollution have been increasingly recognized as a major contributor to the deterioration of water quality in recent years. The purpose of this article is to investigate the discrepancies in the interception of nutrients from agricultural NPS pollution by eco-soil reactors using different filling schemes. Parallel laboratory-scale eco-soil reactors were created and filled with filter media such as grit, zeolite, limestone, and gravel. Three filling schemes were adopted: increasing-sized filling (I-filling), decreasing-sized filling (D-filling), and blend-sized filling (B-filling). The systems were operated intermittently with simulated rainstorm runoff. Nutrient removal efficiency, biomass accumulation, and the vertical dissolved oxygen (DO) distribution were used to assess the performance of the eco-soil. The results showed that the B-filling reactor presented an ideal DO profile for partial nitrification-denitrification across the eco-soil, and the B-filling was the most stable of the three fillings in the trend of biofilm accumulation with depth. Simultaneous and highest removals of NH4(+)-N (57.74-70.52%), total nitrogen (43.69-54.50%), and total phosphorus (42.50-55.00%) were obtained in the B-filling, demonstrating the efficiency of the blend filling scheme of eco-soil for oxygen transfer and biomass accumulation to cope with agricultural NPS pollution. PMID:27441855
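    The removal percentages quoted above are standard influent-to-effluent reductions. A minimal sketch of the calculation (the concentrations below are illustrative, not the study's measurements):

    ```python
    def removal_efficiency(c_in, c_out):
        """Percent removal of a nutrient between influent concentration
        c_in and effluent concentration c_out (same units, e.g. mg/L)."""
        return 100.0 * (c_in - c_out) / c_in

    # Illustrative influent/effluent NH4+-N concentrations (mg/L).
    print(removal_efficiency(10.0, 3.5))  # -> 65.0
    ```
    
    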

  6. CRITERIA FOR ROCK ENGINEERING FAILURE

    Institute of Scientific and Technical Information of China (English)

    ZHU Deren; ZHANG Yuzhuo

    1995-01-01

    A great number of underground rock projects are maintained in rock masses that are subject to damage and progressive failure. In many cases, the rock engineering remains under normal working conditions even though the rock has already failed to some extent. This paper introduces two different concepts: rock failure and rock engineering failure. Rock failure is defined as a mechanical state in which an applicable characteristic is changed or lost. Rock engineering failure, however, is an engineering state in which an applicable function is changed or lost. Failure of the surrounding rock is the major cause of rock engineering failure. The criterion for rock engineering failure depends on the limits of the applicable functions, and each rock engineering failure state corresponds to a point in the rock failure state. In this paper, a description of the rock engineering failure criterion is given using a simple mechanical equation or expression. It is expected that the study of rock engineering failure criteria will prove an optimal approach, combining research on rock mechanics with rock engineering problems.

  7. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications. The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa...

  8. Big queues

    CERN Document Server

    Ganesh, Ayalvadi; Wischik, Damon

    2004-01-01

    Big Queues aims to give a simple and elegant account of how large deviations theory can be applied to queueing problems. Large deviations theory is a collection of powerful results and general techniques for studying rare events, and has been applied to queueing problems in a variety of ways. The strengths of large deviations theory are these: it is powerful enough that one can answer many questions which are hard to answer otherwise, and it is general enough that one can draw broad conclusions without relying on special case calculations.

  9. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  10. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and, consequently, wrong scientific conclusions.
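    The spurious-correlation point above can be illustrated with a short simulation: when a response is compared against many completely independent features, the largest sample correlation grows with the number of features even though no true relationship exists. A minimal sketch (sample sizes and seed are arbitrary choices, not from the article):

    ```python
    import numpy as np

    def max_spurious_corr(n_samples, n_features, seed=0):
        # All feature columns and the response are independent noise,
        # so any observed correlation is purely spurious.
        rng = np.random.default_rng(seed)
        X = rng.standard_normal((n_samples, n_features))
        y = rng.standard_normal(n_samples)
        corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)]
        return max(corrs)

    small = max_spurious_corr(50, 10)
    large = max_spurious_corr(50, 1000)
    print(small, large)  # the maximum spurious correlation grows with dimension
    ```

    This is exactly the high-dimensionality hazard the abstract names: screening many features against an outcome will always surface strong-looking but meaningless correlations.
    
    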

  11. Study on Automatic Regulating of Reactor Axial Power Distribution based on Double-Point Reactor Model

    Institute of Scientific and Technical Information of China (English)

    刘玉燕; 李玉红; 王恒

    2014-01-01

    Given that the axial power distribution of second-generation pressurized water reactor nuclear power plants is mostly adjusted manually by operators, an automatic control strategy for reactor power and axial power distribution is proposed that combines rod-position adjustment with boric acid concentration regulation. A double-point reactor model that can describe temperature feedback, xenon effects, and the axial power distribution, together with the designed control system model, is established in SIMULINK, and simulation experiments are performed for both slow and fast ramp power reductions and rises. The simulation results show that the proposed control strategy has good control quality: the load-tracking overshoot is less than 1%, and the time for the axial power difference to enter the axial-offset reference band is less than 5000 seconds.
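    The reference-band criterion used above is typically stated in terms of the axial offset (axial power difference). A minimal sketch of that quantity and the band check (the power split and band half-width are illustrative assumptions, not plant data):

    ```python
    def axial_offset(p_top, p_bottom):
        """AO = (P_top - P_bottom) / (P_top + P_bottom), in percent."""
        return 100.0 * (p_top - p_bottom) / (p_top + p_bottom)

    def within_reference_band(ao_percent, target=0.0, half_width=5.0):
        """True if the axial offset lies inside the reference band
        around `target` (band half-width is an assumed value)."""
        return abs(ao_percent - target) <= half_width

    # Illustrative top/bottom core power fractions.
    ao = axial_offset(p_top=0.52, p_bottom=0.48)
    print(ao, within_reference_band(ao))  # -> 4.0 True
    ```

    A controller like the one described would drive this AO value back inside the band using rod position for fast corrections and boron concentration for slow ones.
    
    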

  12. Big Data: Overview

    OpenAIRE

    Gupta, Richa; Gupta, Sunny; Singhal, Anuradha

    2014-01-01

    Big data is data that exceeds the processing capacity of traditional databases; the data is too big to be processed by a single machine. New and innovative methods are required to process and store such large volumes of data. This paper provides an overview of big data, its importance in our lives, and some technologies for handling big data.

  13. STUDY OF FACTORS AFFECTING CUSTOMER BEHAVIOUR USING BIG DATA TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Prabin Sahoo

    2014-10-01

    Full Text Available Big data technology has been gaining momentum recently. There are several articles, books, blogs, and discussions on various facets of big data technology. The study in this paper focuses on big data as a concept, offers insights into the 3 Vs (Volume, Velocity, and Variety), and demonstrates their significance with respect to the factors that can be processed using big data for studying the behaviour of online customers.

  14. Big data=Big marketing?!

    Institute of Scientific and Technical Information of China (English)

    肖明超

    2012-01-01

    When the Internet first emerged, a popular saying went: "On the Internet, nobody knows you're a dog." More than twenty years later, that line has long been consigned to history. Driven by technology, and with the rapid growth of mobile Internet, social networks, and e-commerce, consumers' "footprints" have become ever easier to track: their attention online, their behavioural trails, conversations, preferences, and purchase histories can all be captured, and consumers now live almost transparently in an "Age of Big Data." Not only is data becoming more available, but artificial intelligence (AI) techniques, including natural language processing, pattern recognition, and machine learning, are making data ever easier for computers to understand…

  15. Geological and geotechnical aspects of the foundation pit of Kaiga atomic power plant reactor building 2, Kaiga, Uttara Kannada district, Karnataka

    International Nuclear Information System (INIS)

    In India, nuclear power plants are constructed per the guidelines laid down by the IAEA and AERB. Before concrete is poured into a reactor building pit, the pit is systematically mapped and lithostructural maps are prepared for its base and side walls. The constraints noticed are carefully addressed with geotechnical solutions and remedies to make the foundation safe for the entire reactor life. The pit of Kaiga Reactor Building II was accordingly mapped systematically over its circular base and side walls. Geo-engineering measures were suggested, such as scraping out loose foliated schistose patches, scooping out soft altered zones and filling them by grouting, and rock-bolting rock segments with major joints and fractures to stop seepage points. (author)

  16. Derivation of Pal-Bell equations for two-point reactors, and its application to correlation measurements at KUCA

    International Nuclear Information System (INIS)

    A probability is defined for an event in which m neutrons exist at time t_f in core I of a coupled-core system, originating from a neutron injected into core I at an earlier time t; we call it P_{I,I,m}(t_f|t). Similarly, P_{I,II,m}(t_f|t) is defined as the probability for m neutrons to exist in core II of the system at time t_f, originating from a neutron injected into core I at time t. A system of coupled equations is then derived for the generating functions G_{I,j}(z, t_f|t) = Σ_m P_{I,j,m}(t_f|t) z^m, where j = I, II. By similar procedures, equations are derived for the generating functions associated with the joint probability of the following events: a given combination of numbers of neutrons is detected during a given series of detection time intervals by a detector inserted in one of the cores. These two systems of equations can be regarded as a two-point version of the Pal-Bell equations. As applications of these formulations, analysis formulae for correlation measurements, namely (1) the Feynman-alpha experiment and (2) the Rossi-alpha experiment of Orndoff type, are derived, and their feasibility is verified by experiments carried out at KUCA. (author)
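At its core, the Feynman-alpha analysis mentioned above is a variance-to-mean computation over counting gates. The sketch below shows only that bookkeeping, on synthetic uncorrelated (Poisson-like) detection times, so it illustrates the estimator rather than reactor physics: a multiplying system would give Y > 0, while uncorrelated data gives Y near 0.

```python
# Feynman-alpha ("variance-to-mean") bookkeeping sketch: from a list of
# detection times, compute Y(T) = Var[counts per gate]/Mean[counts per gate] - 1
# for a gate width T. Synthetic uncorrelated data is used for illustration.
import random

def feynman_Y(times, gate, t_total):
    n_gates = int(t_total / gate)
    counts = [0] * n_gates
    for t in times:
        i = int(t / gate)
        if i < n_gates:
            counts[i] += 1
    mean = sum(counts) / n_gates
    var = sum((c - mean) ** 2 for c in counts) / (n_gates - 1)
    return var / mean - 1.0

random.seed(1)
t_total, n_events = 1000.0, 50000      # seconds of counting, total detections
times = [random.uniform(0.0, t_total) for _ in range(n_events)]
y = feynman_Y(times, gate=0.1, t_total=t_total)   # close to 0 for Poisson data
```

In a real measurement Y(T) is computed for a range of gate widths T and fitted to Y(T) = Y_inf * (1 - (1 - exp(-αT))/(αT)) to extract the prompt decay constant α.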

  17. KREEP Rocks

    Institute of Scientific and Technical Information of China (English)

    邹永廖; 徐琳; 欧阳自远

    2004-01-01

    KREEP rocks, with high contents of K, REE and P, were first recognized in Apollo 12 samples, and it was later confirmed that KREEP rock fragments occur in all of the Apollo sample suites, particularly those of Apollo 12 and 14. The KREEP rocks distributed on the lunar surface are very important objects for studying the evolution of the Moon, as well as for evaluating the prospects of utilizing the REE they contain. Based on previous studies and lunar exploration data, the authors analyzed the chemical and mineralogical characteristics of KREEP rocks, the Th abundance of lunar surface materials, and the correlation between Th and REE abundances in KREEP rocks; studied the distribution regions of KREEP rocks on the lunar surface; and further evaluated the prospects for utilizing their REE.

  18. Rock Stars

    Institute of Scientific and Technical Information of China (English)

    张国平

    2000-01-01

    Around the world young people are spending unbelievable sums of money to listen to rock music. Forbes Magazine reports that at least fifty rock stars have incomes between two million and six million dollars per year.

  19. Rock Finding

    Science.gov (United States)

    Rommel-Esham, Katie; Constable, Susan D.

    2006-01-01

    In this article, the authors discuss a literature-based activity that helps students discover the importance of making detailed observations. In an inspiring children's classic book, "Everybody Needs a Rock" by Byrd Baylor (1974), the author invites readers to go "rock finding," laying out 10 rules for finding a "perfect" rock. In this way, the…

  20. Optimization Method for Distribution Network Repair Stagnation Points Based on Big Data Technology

    Institute of Scientific and Technical Information of China (English)

    陆如; 范宏; 周献远

    2015-01-01

    Emergency repair is an important task in distribution network operation, and scientific, efficient management and implementation methods are vital for improving the reliability and service quality of the distribution network. A method based on Hadoop is proposed to solve the optimization problem of distribution network emergency repair stagnation points. The factors that affect the efficiency of distribution network emergency repair are analyzed comprehensively, an optimization model for repair stagnation points is built, and data mining techniques for processing big data are introduced to improve the efficiency of the model analysis. In addition, reasonable and effective allocation of repair resources is achieved through quick and accurate estimation of fault time and fault location, together with comprehensive analysis and location of repair points and states, thus improving the service quality and efficiency of emergency repair.

  1. Adobe photoshop quantification (PSQ) rather than point-counting: A rapid and precise method for quantifying rock textural data and porosities

    Science.gov (United States)

    Zhang, Xuefeng; Liu, Bo; Wang, Jieqiong; Zhang, Zhe; Shi, Kaibo; Wu, Shuanglin

    2014-08-01

    Commonly used petrological quantification methods are visual estimation, point counting, and image analysis. In this article, an Adobe Photoshop-based method (PSQ) is recommended for quantifying rock textural data and porosities. Adobe Photoshop provides versatile tools for selecting an area of interest, and the pixel count of a selection can be read and used to calculate its area percentage. Photoshop can therefore be used to rapidly quantify textural components, such as the content of grains, cements, and porosities, including total porosity and porosities of different genetic types. This method is named Adobe Photoshop Quantification (PSQ). The workflow of the PSQ method is introduced using oolitic dolomite samples from the Triassic Feixianguan Formation, northeastern Sichuan Basin, China, as an example. The method was tested by comparison with Folk's and Shvetsov's "standard" diagrams. In both cases there is close agreement between the "standard" percentages and those determined by the PSQ method, with small counting and operator errors, small standard deviations, and high confidence levels. The porosities quantified by PSQ were evaluated against those determined by the whole-rock helium gas expansion method to test specimen errors. The results show that the PSQ porosities correlate well with those determined by the conventional helium gas expansion method. The generally small discrepancies (mostly ranging from -3% to 3%) are caused by microporosity, which leads to a systematic underestimation of about 2%, and/or by macroporosity, which causes underestimation or overestimation in different cases. Adobe Photoshop can thus be used to quantify rock textural components and porosities. The method has been shown to be precise and accurate, and it is time-saving compared with the usual methods.
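The pixel-counting idea behind PSQ reduces to a ratio: the area percentage of a selected component is its pixel count over the total pixel count. A minimal sketch, with a small synthetic binary "image" standing in for a Photoshop selection:

```python
# Area percentage from pixel counts, the arithmetic underlying PSQ.
# The mask below is a synthetic stand-in for a Photoshop selection:
# 1 = selected ("pore") pixel, 0 = unselected.

def area_percent(mask):
    total = sum(len(row) for row in mask)            # all pixels in the image
    selected = sum(sum(row) for row in mask)         # pixels in the selection
    return 100.0 * selected / total

# 4x5 image with 6 pore pixels selected -> 30% porosity
mask = [
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
]
porosity = area_percent(mask)   # 30.0
```

In Photoshop itself the two counts come from the histogram of a selection versus the full image; the arithmetic is the same.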

  2. Analysis of factors determining the neutral point position of fully grouted rock bolts

    Institute of Scientific and Technical Information of China (English)

    朱训国; 杨庆

    2009-01-01

    The neutral point theory is an important theory in underground engineering reinforcement. At present, the formula used to determine the neutral point position contains some unreasonable elements. Building on earlier work, the neutral point theory is further consolidated and improved here. Based on a developed analytical model and the condition that the frictional resistance is zero at the neutral point, the factors affecting the neutral point position are analyzed in detail and their correlations with it obtained. The analysis reveals that the hydrostatic primary stress, bolt length, and bolt spacing have no influence on the neutral point position, while the tunnel radius, the Young's moduli of rock and bolt, and the bolt diameter influence it markedly. Among these, the neutral point position varies linearly with the tunnel radius and the bolt diameter, but as an exponential function of the Young's moduli of rock and bolt: it decreases exponentially with increasing rock mass modulus and increases exponentially with increasing bolt modulus. From this factor analysis, a general functional relation between the neutral point position and the relevant parameters is obtained, providing a useful reference for further study of the neutral point theory.

  3. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  4. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  5. Comparison of lactate sampling sites for rock climbing.

    Science.gov (United States)

    Fryer, S; Draper, N; Dickson, T; Blackwell, G; Winter, D; Ellis, G

    2011-06-01

    Comparisons of capillary blood lactate concentrations pre- and post-climb have featured in the protocols of many rock climbing studies, with most researchers obtaining samples from the fingertip. The nature of rock climbing, however, places a comparatively high physiological load on the forearms and fingertips. Indeed, the fingertips are continually required for gripping, which makes pre-climb sampling at this site problematic. The purpose of our study was to examine differences in capillary blood lactate concentrations between samples taken at the fingertip and the first (big) toe in a rock climbing context. 10 participants (9 males and 1 female) completed climbing bouts at 3 different wall angles (91°, 100° and 110°). Capillary blood samples were taken simultaneously from the fingertip and first toe pre- and post-climb. A limits-of-agreement plot revealed all data points to be well within the upper and lower bounds of the 95% population confidence interval. Subsequent regression analysis revealed a strong relationship (R² = 0.94, y = 0.940x + 0.208) between fingertip and first-toe capillary blood lactate concentrations. The findings from our study suggest that the toe offers a valid alternative site for capillary blood lactate analysis in a rock climbing context.
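The two statistics the abstract reports, a least-squares fit between the paired sites and limits of agreement, can be computed as follows. The paired values here are synthetic placeholders (in mmol/L), not the study's data, so the numbers only illustrate the method.

```python
# Agreement analysis for paired measurements from two sampling sites:
# least-squares fit toe = slope*finger + intercept, plus Bland-Altman
# style 95% limits of agreement on the paired differences.
# The data below is synthetic, for illustration only.

finger = [1.2, 2.5, 3.1, 4.0, 5.2, 6.8, 7.5, 8.1]
toe    = [1.4, 2.6, 3.0, 4.1, 5.0, 6.9, 7.3, 8.4]

n = len(finger)
mx, my = sum(finger) / n, sum(toe) / n
sxx = sum((x - mx) ** 2 for x in finger)
sxy = sum((x - mx) * (y - my) for x, y in zip(finger, toe))
slope = sxy / sxx                 # near 1.0 when the sites agree
intercept = my - slope * mx

diffs = [y - x for x, y in zip(finger, toe)]
bias = sum(diffs) / n             # mean toe-minus-finger difference
sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement
```

A slope near 1 with a small intercept and narrow limits of agreement is the pattern the study reports for fingertip versus toe.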

  6. Nuclear Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hogerton, John

    1964-01-01

    This pamphlet describes how reactors work; discusses reactor design; and describes research, teaching, and materials-testing reactors; production reactors; reactors for electric power generation; reactors for supplying heat; reactors for propulsion; reactors for space; reactor safety; and the reactors of tomorrow. The appendix discusses the characteristics of U.S. civilian power reactor concepts and lists some U.S. reactor power projects, with location, type, capacity, owner, and startup date.

  7. CERN Rocks

    CERN Multimedia

    2004-01-01

    The 15th CERN Hardronic Festival took place on 17 July on the terrace of Rest 3 (Prévessin). Over 1000 people, from CERN and other International Organizations, came to enjoy the warm summer night, and to watch the best of the World's High Energy music. Jazz, rock, pop, country, metal, blues, funk and punk blasted out from 9 bands from the CERN Musiclub and Jazz club, alternating on two stages in a non-stop show.  The night reached its hottest point when The Canettes Blues Band got everybody dancing to sixties R&B tunes (pictured). Meanwhile, the bars and food vans were working at full capacity, under the expert management of the CERN Softball club, who were at the same time running a Softball tournament in the adjacent "Higgs Field". The Hardronic Festival is the main yearly CERN music event, and it is organized with the support of the Staff Association and the CERN Administration.

  8. The Big Group of People Looking at How to Control Putting the Parts of the Air That Are the Same as What You Breathe Out Into Small Spaces in Rocks

    Energy Technology Data Exchange (ETDEWEB)

    Stack, Andrew

    2013-07-18

    Representing the Nanoscale Control of Geologic CO2 (NCGC), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE energy. The mission of NCGC is to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to the injection and storage of carbon dioxide (CO2) in subsurface reservoirs.

  9. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  10. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  11. Mining "big data" using big data services

    OpenAIRE

    Reips, UD; Matzat, U Uwe

    2014-01-01

    While many colleagues within science are fed up with the “big data fad”, empirical analyses we conducted for the current editorial actually show an inconsistent picture: we use big data services to determine whether there really is an increase in writing about big data or even widespread use of the term. Google Correlate (http://www.google.com/trends/correlate/), the first free tool we are presenting here, doesn’t list the term, showing that number of searches for it are below an absolute min...

  12. Results of new petrologic and remote sensing studies in the Big Bend region

    Science.gov (United States)

    Benker, Stevan Christian

    The initial section of this manuscript involves the South Rim Formation, a series of 32.2-32 Ma comenditic quartz trachytic-rhyolitic volcanics and associated intrusives that erupted and were emplaced in Big Bend National Park, Texas. Magmatic parameters have only been interpreted for one of the two diverse petrogenetic suites comprising this formation. Here, new mineralogic data for the South Rim Formation rocks are presented. Magmatic parameters interpreted from these data assist in deciphering lithospheric characteristics during the mid-Tertiary. Results indicate low temperatures (…). Fledermaus 3D three-dimensional visualization software was used to drape Google Earth horizontal positions over a National Elevation Dataset (NED) digital elevation model (DEM) in order to adopt a large set of elevation data. A vertical position accuracy of 1.63 meters RMSE was determined between 268 Google Earth data points and the NED. Since the determined accuracies were considerably lower than those reported in previous investigations, we devoted a later portion of this investigation to testing Google Earth-NED data in paleo-surface modeling of the Big Bend region. An 18 x 30 kilometer area in easternmost Big Bend Ranch State Park was selected to create a post-Laramide paleo-surface model via interpolation of approximately 2900 Google Earth-NED data points representing sections of an early Tertiary

  13. From Big Crunch to Big Bang

    OpenAIRE

    Khoury, Justin; Ovrut, Burt A.; Seiberg, Nathan; Steinhardt, Paul J.(Princeton Center for Theoretical Science, Princeton University, Princeton, NJ, 08544, USA); Turok, Neil

    2001-01-01

    We consider conditions under which a universe contracting towards a big crunch can make a transition to an expanding big bang universe. A promising example is 11-dimensional M-theory in which the eleventh dimension collapses, bounces, and re-expands. At the bounce, the model can reduce to a weakly coupled heterotic string theory and, we conjecture, it may be possible to follow the transition from contraction to expansion. The possibility opens the door to new classes of cosmological models. F...

  14. Flow characteristics of caved ore and rock in the multiple draw-point condition

    Institute of Scientific and Technical Information of China (English)

    孙浩; 金爱兵; 高永涛; 孟新秋

    2015-01-01

    Based on particle flow theory and the PFC3D code, a draw model was constructed to study the flow characteristics of caved ore and rock under multiple draw-point conditions and to visualize the evolving forms of the isolated extraction zone (IEZ) and the ridge hangover body. The suitability and reliability of this draw model for studying the flow of caved ore and rock were validated by comparing the simulated results with existing research conclusions. Owing to interactions among the draw-points, the IEZ's form in the multiple draw-point condition shows varying degrees of deviation, including interlacing and deficiency, so it is not a regular ellipsoid. In both the isolated and multiple draw-point conditions, the height of the IEZ evolves in two stages: in the initial drawing stage it increases rapidly and exponentially, with the growth rate gradually decreasing as the drawn tonnage increases; thereafter it increases linearly with the drawn tonnage. The ore-loss ratio decreases with increasing draw-point size and caved-ore layer height, and increases with increasing draw-point spacing. When adjacent draw-points interact, planar drawing leaves less residual ore than vertical-face drawing, and the caved ore-rock contact surface descends in an approximately horizontal state.

  15. Reactivity changes in hybrid thermal-fast reactor systems during fast core flooding

    International Nuclear Information System (INIS)

    A new space-dependent kinetic model in the adiabatic approximation, with local feedback reactivity parameters for reactivity determination in coupled systems, is proposed in this thesis. It is applied to accident calculations for the 'HERBE' fast-thermal reactor system and compared with the usual point-kinetics model with core-averaged parameters. The advantages of the new model, namely a more realistic picture of reactor kinetics and dynamics during a large local reactivity perturbation under the same heat transfer conditions, are underlined. The calculated reactivity parameters of the new model are verified in experiments performed on the 'HERBE' coupled core. The model has shown that the 'HERBE' safety system can shut the reactor down safely and quickly even with a high power-trip setpoint and even under a substantial partial failure of the reactor safety system (author)

  16. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  17. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  18. Antigravity and the big crunch/big bang transition

    CERN Document Server

    Bars, Itzhak; Steinhardt, Paul J; Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  19. Why R&D for Generation IV reactors should be subsidised? A strictly economic point of view based real option theory

    International Nuclear Information System (INIS)

    Generation IV fast reactors make better use of natural uranium than current reactors. Since there is a significant risk that the uranium market could come under pressure before the end of the 21st century, fast reactors could play a vital 'sustainability' role by making the resource usable on longer time scales. However, given their likely higher investment costs compared with previous generations, their competitiveness is not guaranteed. This paper assesses the R&D budget worth devoting to developing this technology, on the assumption that it would be deployed to counterbalance a large rise in the price of uranium. The deployment decision thus depends on the technology's relative competitiveness, which is determined by its overcost and the uranium price. A model based on real options theory shows that the budget worth allocating to R&D is positive even for overcost and uranium-price forecasts unfavourable to fast reactors. (author)
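The real-options logic in the abstract is that R&D buys the option, not the obligation, to deploy fast reactors if uranium prices rise; an option on an uncertain payoff has positive value even when the expected payoff of unconditional deployment is negative. A one-period binomial toy model makes this concrete; all numbers below are illustrative assumptions, not the paper's calibration.

```python
# One-period binomial real-option sketch: deployment pays off only in the
# state where uranium prices have risen enough to offset the fast-reactor
# overcost. The option value bounds the R&D budget worth spending.
# All figures are hypothetical illustrations (in M$).

def option_value(v_up, v_down, p_up, discount):
    # Holding the option, we deploy only when the benefit is positive,
    # so each state's payoff is max(benefit, 0).
    payoff_up = max(v_up, 0.0)
    payoff_down = max(v_down, 0.0)
    return discount * (p_up * payoff_up + (1 - p_up) * payoff_down)

# Benefit of deployment = avoided uranium cost - fast-reactor overcost:
# positive only in the uranium-price-spike state.
budget_ceiling = option_value(v_up=800.0, v_down=-300.0, p_up=0.3, discount=0.9)
# -> 0.9 * (0.3 * 800 + 0.7 * 0) = 216.0
```

Note that unconditional deployment would have a negative expected value here (0.3·800 − 0.7·300 = −30 before discounting), yet the option is worth 216, which is the sense in which R&D can be justified even under forecasts unfavourable to fast reactors.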

  20. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap, in real time, the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  1. Matrix Big Brunch

    OpenAIRE

    Bedford, J; Papageorgakis, C.; Rodriguez-Gomez, D.; Ward, J.

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  2. ANALYSIS OF BIG DATA

    OpenAIRE

    Anshul Sharma; Preeti Gulia

    2014-01-01

    Big Data is data that is too large, grows too fast, or does not fit into traditional architectures. Within such data there can be valuable information that can be discovered through data analysis [1]. Big data is a collection of complex and large data sets that are difficult to process and mine for patterns and knowledge using traditional database management tools or data processing and mining systems. Big Data is data whose scale, diversity and complexity require new architecture, technique...

  3. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on the patterns we identify in the data, rather than trying to understand the underlying causes in more detail. It highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  4. Pre-Big Bang, vacuum and noncyclic cosmologies

    OpenAIRE

    Gonzalez-Mestres, L.

    2011-01-01

    WMAP and Planck open the way to unprecedented Big Bang phenomenology, potentially allowing to test the standard Big Bang model as well as less conventional approaches including noncyclic pre-Big Bang cosmologies that would incorporate a new fundamental scale beyond the Planck scale and, possibly, new ultimate constituents of matter. Alternatives to standard physics can be considered from a cosmological point of view concerning vacuum structure, the nature of space-time, the origin and evoluti...

  5. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has the digital infrastructure, the registration culture, and the IT-competent employees and customers to take a leading position, but only if its companies get ready for the next big data wave.

  6. Big Boss Interval Games

    NARCIS (Netherlands)

    Alparslan-Gok, S.Z.; Brânzei, R.; Tijs, S.H.

    2008-01-01

    In this paper big boss interval games are introduced and various characterizations are given. The structure of the core of a big boss interval game is explicitly described and plays an important role relative to interval-type bi-monotonic allocation schemes for such games. Specifically, each element

  7. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled. PMID:26173222

  9. Crisis analytics : big data-driven crisis response

    NARCIS (Netherlands)

    Qadir, Junaid; ur Rasool, Raihan; Zwitter, Andrej; Sathiaseelan, Arjuna; Crowcroft, Jon

    2016-01-01

    Disasters have long been a scourge for humanity. With the advances in technology (in terms of computing, communications, and the ability to process and analyze big data), our ability to respond to disasters is at an inflection point. There is great optimism that big data tools can be leveraged to p

  10. Big Data, Big Knowledge: Big Data for Personalized Healthcare.

    OpenAIRE

    Viceconti, M.; Hunter, P.; Hose, R.

    2015-01-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine soluti...

  11. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority. PMID:26218867

  13. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  14. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  15. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  16. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes a need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. This growth creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). The challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years; it is also recognized in the business world and, increasingly, in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  17. Research present situation and analysis on classification of rock drillability

    Institute of Scientific and Technical Information of China (English)

    Zhang Zhi-hong; Ma Qin-yong

    2001-01-01

    Rock drillability reflects how easily or with what difficulty a drill bit fragments rock. At present, rock drillability classification indexes include uniaxial compressive strength, point load strength, fracture stress during chiseling, drilling speed, chiseling specific work, acoustic parameters, cutting magnitude, and so on. Each index reflects rock drillability, but none is comprehensive on its own. It is feasible to evaluate rock drillability from multiple indexes using methods such as fuzzy mathematics.
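The multi-index fuzzy evaluation the abstract alludes to can be sketched as a standard fuzzy comprehensive evaluation. All index names, weights, and membership values below are illustrative assumptions for one hypothetical rock sample, not data from the paper.

```python
# Fuzzy comprehensive evaluation of rock drillability (illustrative sketch).

# Drillability grades (columns of the membership matrix).
grades = ["easy", "medium", "hard"]

# Membership degrees of each index in each grade, e.g. judged from
# measured values of three drillability indexes (assumed numbers).
R = [
    [0.1, 0.3, 0.6],  # uniaxial compressive strength
    [0.2, 0.5, 0.3],  # point load strength
    [0.3, 0.4, 0.3],  # drilling speed
]

# Relative importance of the three indexes (weights sum to 1).
w = [0.5, 0.3, 0.2]

# Weighted-average composition: b_j = sum_i w_i * r_ij
b = [sum(w[i] * R[i][j] for i in range(len(w))) for j in range(len(grades))]

# Maximum-membership principle picks the final drillability grade.
grade = grades[b.index(max(b))]
print(b, grade)
```

Because each row of R and the weight vector both sum to 1, the composite memberships b also sum to 1; the sample is assigned to the grade with the largest composite membership.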

  18. Rock Paintings.

    Science.gov (United States)

    Jones, Julienne Edwards

    1998-01-01

    Discusses the integration of art and academics in a fifth-grade instructional unit on Native American culture. Describes how students studied Native American pictographs, designed their own pictographs, made their own tools, and created rock paintings of their pictographs using these tools. Provides a list of references on Native American…

  19. Intellektuaalne rock

    Index Scriptorium Estoniae

    2007-01-01

    British singer-songwriter and actress Toyah Willcox, together with Bill Rieflin of R.E.M. and Pat Mastelotto of King Crimson, performs with the bands The Humans and Tuner on 25 October at Rock Café in Tallinn and on 27 October at St. John's Church in Tartu.

  20. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, often mentioned in relation to big data, stand for? By way of introduction to thi

  1. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  2. Big Data ethics

    Directory of Open Access Journals (Sweden)

    Andrej Zwitter

    2014-11-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with specific and knowable outcomes, towards actions by many unaware that they may have taken actions with unintended consequences for anyone. Responses will require a rethinking of ethical choices, the lack thereof and how this will guide scientists, governments, and corporate agencies in handling Big Data. This essay elaborates on the ways Big Data impacts on ethical conceptions.

  3. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  4. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  5. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along...

  6. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects ... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  7. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  8. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  9. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove two theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; we therefore compare and contrast the two geometries throughout.

  10. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects ... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  11. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter and Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available in the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what big data is, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.
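Analysing a large collection on a cluster of inexpensive machines, as the abstract describes, typically follows the map-reduce pattern: partition the data, compute partial results in parallel, then merge them. The toy sketch below is an assumption-laden illustration (word counting, with a local process pool standing in for a cluster), not code from the article.

```python
# Toy map-reduce: count word frequencies across a partitioned collection.
from collections import Counter
from multiprocessing import Pool

def count_words(chunk):
    """Map step: count word frequencies in one chunk of documents."""
    c = Counter()
    for doc in chunk:
        c.update(doc.lower().split())
    return c

if __name__ == "__main__":
    docs = ["Big data is big", "data clusters discover patterns",
            "patterns in big data"]
    chunks = [docs[0:1], docs[1:2], docs[2:3]]    # partition the collection
    with Pool(3) as pool:
        partials = pool.map(count_words, chunks)  # map step runs in parallel
    total = sum(partials, Counter())              # reduce step: merge counts
    print(total.most_common(3))
```

Real systems distribute the same two steps across thousands of machines and add fault tolerance, but the partition/map/merge structure is the same.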

  12. "Big Data": Big Knowledge Gaps in the Field of Internet Science

    Directory of Open Access Journals (Sweden)

    Ulf-Dietrich Reips

    2012-01-01

    Research on so-called ‘Big Data’ has received considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as ‘small world’ properties. Much less is known about the underlying micro-processes leading to these properties. The models used by Big Data researchers are usually inspired by mathematical ease of exposition. We propose to follow, in addition, a different strategy that leads to knowledge about micro-processes that match actual online behavior. This knowledge can then be used for the selection of mathematically tractable models of online network formation and evolution. Insight from social and behavioral research is needed for pursuing this strategy of knowledge generation about micro-processes. Accordingly, our proposal points to a unique role that social scientists could play in Big Data research. ...

  13. Testing Big Bang Nucleosynthesis

    OpenAIRE

    Steigman, Gary

    1996-01-01

    Big Bang Nucleosynthesis (BBN), along with the cosmic background radiation and the Hubble expansion, is one of the pillars of the standard, hot, big bang cosmology since the primordial synthesis of the light nuclides (D, $^3$He, $^4$He, $^7$Li) must have occurred during the early evolution of a universe described by this model. The overall consistency between the predicted and observed abundances of the light nuclides, each of which spans a range of some nine orders of magnitude, provides impr...

  14. Sharing big biomedical data

    OpenAIRE

    Toga, Arthur W.; Dinov, Ivo D.

    2015-01-01

    Background: The promise of Big Biomedical Data may be offset by the enormous challenges in handling, analyzing, and sharing it. In this paper, we provide a framework for developing practical and reasonable data sharing policies that incorporate the sociological, financial, technical and scientific requirements of a sustainable Big Data dependent scientific community. Findings: Many biomedical and healthcare studies may be significantly impacted by using large, heterogeneous and incongruent data...

  15. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Given the importance that the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; finally, it sought to identify the most relevant characteristics of Big Data management, so as to cover everything concerning the central topic of the research. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; presenting some of the NoSQL databases, which are the ones that allow processing data in unstructured formats; and showing data models and technologies for analyzing them, closing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research is a first step toward understanding the Big Data landscape.

  16. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    The objective of this paper is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  17. Geochronology of the Suomenniemi rapakivi granite complex revisited: Implications of point-specific errors on zircon U-Pb and refined λ87 on whole-rock Rb-Sr

    Directory of Open Access Journals (Sweden)

    Rämö, O.T.

    2015-06-01

    Multi-grain isotope dilution and secondary ion microprobe zircon U-Pb as well as whole-rock Rb-Sr isotope dilution data on the late Paleoproterozoic Suomenniemi rapakivi granite complex (exposed on the northern flank of the Wiborg batholith in southeastern Finland) are discussed in the light of point-specific errors on Pb/U and proposed new values of the decay constant of 87Rb, λ87. U-Pb zircon data on hornblende granite and biotite granite of the main metaluminous to marginally peraluminous granite fractionation series of the Suomenniemi batholith indicate crystallization in the 1644-1640 Ma range, with a preferred age of 1644±4 Ma. A cross-cutting hornblende-clinopyroxene-fayalite granite is probably slightly younger, as are quartz-feldspar porphyry dikes (1634±4 Ma) that cut both the main granite series and the metamorphic Svecofennian country rocks of the Suomenniemi batholith. Recalculation of whole-rock Rb-Sr data published on the main granite series of the batholith by Rämö (1999) implies errorchron ages of 1635±10 Ma and 1630±10 Ma and a magmatic 87Sr/86Sri of 0.7062±0.0024. This relatively high initial ratio is indicative of a major Proterozoic crustal source component in the granites of the batholith. The main granite series of the batholith probably cooled relatively rapidly to and below the closure temperature of the Rb-Sr isotope system, with little subsequent subsolidus adjustment. The three discrete silicic magmatic phases of the batholith (the main granite series, the hornblende-clinopyroxene-fayalite granite, and the quartz-feldspar porphyry dikes) were all probably emplaced before the main volume of rapakivi granite (the Wiborg batholith proper) in southeastern Finland. The Suomenniemi batholith thus represents an early magmatic precursor to the classic Wiborg batholith and was emplaced clearly before the massive rise of isotherms associated with the ascent and crystallization of the magmas that formed the bulk of the Wiborg

  18. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-01

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell.

  19. Nuclear power in rock. Principal report

    International Nuclear Information System (INIS)

    In September 1975 the Swedish Government directed the Swedish State Power Board to study the question of siting nuclear power plants in rock. The study accounted for in this report aims at clarifying the advantages and disadvantages of siting a nuclear power plant in rock, compared to siting at ground level, with regard to reactor safety, protection in war, and sabotage. The need for nuclear power production during war situations and the decommissioning of nuclear power plants after the end of operation are also dealt with. (author)

  20. Rock stresses (Grimsel rock laboratory)

    International Nuclear Information System (INIS)

    In the research and development project 'Rock Stress Measurements', the BGR has developed and tested several test devices and methods at the GTS for use in boreholes at a depth of 200 m and has carried out rock mechanical and engineering geological investigations for the evaluation and interpretation of the stress measurements. For the first time, a computer for data processing was installed in the borehole together with the BGR probe. Laboratory tests on hollow cylinders were made to study the stress-deformation behavior. To validate and interpret the measurement results, some test methods were modelled using the finite-element method. The dilatometer tests yielded high values of Young's modulus, whereas laboratory tests showed lower values with a distinct deformation anisotropy. Stress measurements with the BGR probe yielded horizontal stresses higher than the theoretical overburden pressure and vertical stresses which agree well with the theoretical overburden pressure. These results are comparable to the results of the hydraulic fracturing tests, whereas stresses obtained with CSIR triaxial cells are generally lower. The detailed geological mapping of the borehole indicated relationships between stress and geology. With regard to borehole depth, different zones could be distinguished in terms of rock structure, joint frequency, joint orientation, and orientation of microfissures, as well as stress magnitude, stress direction, and degree of deformation anisotropy. (orig./HP)

  1. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework; (referred to below as the Advanced Concepts component of the Phase I efforts) and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}.
Overall every sedimentary formation investigated

  2. Aspects of the Flora and Vegetation of the “Izvorul Bigăr” Nature Reserve (South-Western Romania

    Directory of Open Access Journals (Sweden)

    Ilinca M. IMBREA

    2009-06-01

    Full Text Available The “Izvorul Bigăr” Nature Reserve is located in south-western Romania. The aim of the present paper is to describe some aspects of the flora and vegetation around the Bigăr spring. The analysis of the plant association was carried out using the method of the Central-European phytocoenological school. The vegetation around the Bigăr spring and waterfall is dominated by compact beech forests with a frequently reduced grassy layer and soil rich in humus. On the banks of the watercourse and on the rocks around the spring there are species specific to flooding plains of the beech sub-stratum, and also thermophilous and xerophilous species, many of them Balkan, Pontic or Mediterranean elements. The phytocoenoses we analysed belong to the Phyllitidi - Fagetum Vida (1959 1963 association. The association is characteristic of shady and moist slopes with soils rich in humus formed on a lime substratum, sometimes with surface rocks. The species with high abundance-dominance values are: Fagus sylvatica, Fraxinus ornus, Acer pseudoplatanus, Tilia cordata, Hedera helix, Asplenium scolopendrium, Arum orientale, Asarum europaeum, Cardamine bulbifera, Lunaria annua, Polypodium vulgare. Such species as Carpinus orientalis, Cotinus coggygria, Fraxinus ornus, Ruscus hypoglossum and Syringa vulgaris point to the thermophilous character of the forests in southern Banat.

  3. Permeability Evolution and Rock Brittle Failure

    Directory of Open Access Journals (Sweden)

    Sun Qiang

    2015-08-01

    Full Text Available This paper reports an experimental study of the evolution of permeability during rock brittle failure and a theoretical analysis of the rock critical stress level. It is assumed that the rock is a strain-softening medium whose strength can be described by Weibull’s distribution. Based on the two-dimensional renormalization group theory, it is found that the stress level λc (the ratio of the stress at the critical point to the peak stress) depends mainly on the homogeneity index or shape parameter m in the Weibull distribution for the rock. Experimental results show that the evolution of permeability is closely related to the rock deformation stages: permeability increases rapidly with the growth of cracks and their surface areas (i.e., at the onset of the fracture coalescence point) and reaches its maximum at rock failure. Both the experimental and analytical results show that this point of rapid increase in permeability on the permeability-pressure curve corresponds to the critical point on the stress-strain curve; for rock compression, the stress at this point is approximately 80% of the peak strength. Thus, monitoring the evolution of permeability may provide a new means of identifying the critical point of rock brittle fracture.
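The role of the Weibull shape parameter m can be illustrated with a toy strain-softening model (an illustrative sketch only, not the paper's renormalization-group calculation; the damage law, E and eps0 below are assumptions for the example): if element strengths are Weibull-distributed, the surviving fraction at strain ε is exp(-(ε/ε0)^m), so the nominal stress s(ε) = E·ε·exp(-(ε/ε0)^m) peaks at ε = ε0·m^(-1/m), and the shape of the pre-peak curve is controlled by m alone.

```python
import math

def nominal_stress(eps, E=1.0, eps0=1.0, m=3.0):
    """Toy damage model: elastic stress times the Weibull surviving fraction."""
    return E * eps * math.exp(-((eps / eps0) ** m))

def peak_strain(eps0=1.0, m=3.0):
    """Analytical peak of the toy model: ds/deps = 0  =>  eps = eps0 * m**(-1/m)."""
    return eps0 * m ** (-1.0 / m)

# The peak location (and hence the normalized softening curve) depends only on m.
for m in (1.5, 3.0, 6.0):
    ep = peak_strain(m=m)
    # Numerically confirm the analytical point is a maximum.
    assert nominal_stress(ep, m=m) >= nominal_stress(ep * 0.99, m=m)
    assert nominal_stress(ep, m=m) >= nominal_stress(ep * 1.01, m=m)
```

Varying m shifts where the curve peaks, consistent with the abstract's point that the critical stress level depends mainly on the homogeneity index m.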

  4. Nuclear reactor PBMR and cogeneration; Reactor nuclear PBMR y cogeneracion

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez S, J. R.; Alonso V, G., E-mail: ramon.ramirez@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    In recent years the costs of nuclear reactor designs for electricity generation have increased, currently to around 5,000 USD per installed kW, so a big nuclear plant requires investments on the order of billions of dollars. Reactors designed as low-power modules seek to lighten the initial investment of a big reactor by dividing the power into parts and dividing the components into modules to lower production costs; construction can then proceed one module at a time, deferring the long-term investment and therefore reducing the investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, since the thermal energy of the reactor can feed other processes such as water desalination, steam generation for process industries such as petrochemicals, or even the possible production of hydrogen for use as fuel. This work describes the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor. (Author)

  5. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. PMID:24183925

  6. Big and Small

    CERN Document Server

    Ekers, R D

    2010-01-01

    Technology leads discovery in astronomy, as in all other areas of science, so growth in technology leads to the continual stream of new discoveries which makes our field so fascinating. Derek de Solla Price had analysed the discovery process in science in the 1960s and he introduced the terms 'Little Science' and 'Big Science' as part of his discussion of the role of exponential growth in science. I will show how the development of astronomical facilities has followed this same trend from 'Little Science' to 'Big Science' as a field matures. We can see this in the discoveries resulting in Nobel Prizes in astronomy. A more detailed analysis of discoveries in radio astronomy shows the same effect. I include a digression to look at how science progresses, comparing the roles of prediction, serendipity, measurement and explanation. Finally I comment on the differences between the 'Big Science' culture in Physics and in Astronomy.

  7. Big Data and Peacebuilding

    Directory of Open Access Journals (Sweden)

    Sanjana Hattotuwa

    2013-11-01

    Full Text Available Any peace process is an exercise in the negotiation of big data. From centuries old communal hagiography to the reams of official texts, media coverage and social media updates, peace negotiations generate data. Peacebuilding and peacekeeping today are informed by, often respond and contribute to big data. This is no easy task. As recently as a few years ago, before the term big data embraced the virtual on the web, what informed peace process design and implementation was in the physical domain – from contested borders and resources to background information in the form of text. The move from analogue, face-to-face negotiations to online, asynchronous, web-mediated negotiations – which can still include real world meetings – has profound implications for how peace is strengthened in fragile democracies.

  8. Big Crater as Viewed by Pathfinder Lander

    Science.gov (United States)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft. The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft. This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be. Mars Pathfinder is the second in NASA's Discovery
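The co-adding described in the last paragraph (enlarge each frame, then stack) can be sketched generically; this nearest-neighbour upscale-and-average is an assumed simplification for illustration, not the actual IMP/Photoshop processing:

```python
def upscale(frame, factor):
    """Nearest-neighbour enlargement of a 2D list of pixels by an integer factor."""
    return [[px for px in row for _ in range(factor)]
            for row in frame for _ in range(factor)]

def coadd(frames, factor=5):
    """Enlarge each frame, then average pixel-wise (a simple super-resolution stack)."""
    big = [upscale(f, factor) for f in frames]
    h, w = len(big[0]), len(big[0][0])
    return [[sum(f[r][c] for f in big) / len(big) for c in range(w)]
            for r in range(h)]

# Two tiny 2x2 'frames' standing in for the individual filter exposures.
frames = [[[1, 2], [3, 4]], [[3, 4], [5, 6]]]
stacked = coadd(frames, factor=2)
# stacked is 4x4; each coarse pixel holds the mean of the two input frames.
```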

  9. N Reactor

    Data.gov (United States)

    Federal Laboratory Consortium — The last of Hanford's nine plutonium production reactors to be built was the N Reactor. This reactor was called a dual purpose...

  10. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their supercomputer, and to the Large Hadron Collider built by Éric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is afoot. Worse, all of scientific research is in peril! Drawn into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A fascinating plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and today's greatest scientists.

  11. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  12. Networks & big data

    OpenAIRE

    Litvak, Nelly; van der Meulen, P.

    2015-01-01

    Once a year, the NWO cluster Stochastics – Theoretical and Applied Research (STAR) organises a STAR Outreach Day, a one-day event around a theme that is of a broad interest to the stochastics community in the Netherlands. The last Outreach Day took place at Eurandom on 12 December 2014. The theme of the day was ‘Networks & Big Data’. The topic is very timely. The Vision document 2025 of the PlatformWiskunde Nederland (PWN) mentions big data as one of the six “major societal and scientific tre...

  13. Primordial Big Bang Nucleosynthesis

    OpenAIRE

    Olive, Keith A.

    1999-01-01

    Big Bang Nucleosynthesis is the theory of the production of the the light element isotopes of D, He3, He4, and Li7. After a brief review of the essential elements of the standard Big Bang model at a temperature of about 1 MeV, the theoretical input and predictions of BBN are discussed. The theory is tested by the observational determinations of the light element abundances and the current status of these observations is reviewed. Concordance of standard model and the related observations is f...

  14. A Novel Burnable Absorber Concept for PWR: BigT (Burnable Absorber-Integrated Guide Thimble)

    Energy Technology Data Exchange (ETDEWEB)

    Yahya, Mohdsyukri; Kim, Yonghee [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Chung, Chang Kyu [KEPCO Engineering and Construction Company, Daejeon (Korea, Republic of)

    2014-05-15

    A new burnable absorber (BA) design for the Pressurized Water Reactor (PWR) named 'Burnable absorber-Integrated control rod Guide Thimble' (BigT) was recently proposed. Unlike conventional BA technologies, the BigT integrates BA materials directly into the guide thimble while still allowing insertion of a control rod (CR). In addition, the BigT offers a variety of design flexibilities. This paper presents the essential BigT design concepts and the lattice neutronic characteristics of the new absorber. Preliminary lattice analyses of the BigT absorber-loaded WH 17x17 fuel assembly show the high potential of the concept: it performs well in comparison with commercial burnable absorber technologies, especially in managing reactivity depletion and the peaking factor, and a sufficiently high control rod worth can still be obtained with the BigT absorbers in place. It is expected that with such performance and design flexibility, any loading pattern and core management objective, including a soluble-boron-free PWR, can potentially be fulfilled with the BigT absorbers. A future study involving full 3D reactor core simulations with the BigT absorbers should verify this hypothesis.

  15. Water resources in the Big Lost River Basin, south-central Idaho

    Science.gov (United States)

    Crosthwaite, E.G.; Thomas, C.A.; Dyer, K.L.

    1970-01-01

    The Big Lost River basin occupies about 1,400 square miles in south-central Idaho and drains to the Snake River Plain. The economy in the area is based on irrigation agriculture and stockraising. The basin is underlain by a diverse assemblage of rocks which range in age from Precambrian to Holocene. The assemblage is divided into five groups on the basis of their hydrologic characteristics: carbonate rocks, noncarbonate rocks, cemented alluvial deposits, unconsolidated alluvial deposits, and basalt. The principal aquifer is unconsolidated alluvial fill that is several thousand feet thick in the main valley. The carbonate rocks are the major bedrock aquifer. They absorb a significant amount of precipitation and, in places, are very permeable, as evidenced by large springs discharging from or near exposures of carbonate rocks. Only the alluvium, carbonate rocks and, locally, the basalt yield significant amounts of water. A total of about 67,000 acres is irrigated with water diverted from the Big Lost River. The annual flow of the river is highly variable and water-supply deficiencies are common. About 1 out of every 2 years is considered a drought year. In the period 1955-68, about 175 irrigation wells were drilled to provide a supplemental water supply to land irrigated from the canal system and to irrigate an additional 8,500 acres of new land. Average annual precipitation ranged from 8 inches on the valley floor to about 50 inches at some higher elevations during the base period 1944-68. The estimated water yield of the Big Lost River basin averaged 650 cfs (cubic feet per second) for the base period. Of this amount, 150 cfs was transpired by crops, 75 cfs left the basin as streamflow, and 425 cfs left as ground-water flow. A map of precipitation and estimated values of evapotranspiration were used to construct a water-yield map. A distinctive feature of the Big Lost River basin is the large interchange of water from surface streams into the ground and from the
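The quoted water budget balances exactly; a trivial check of the abstract's figures (all in cfs, cubic feet per second, for the 1944-68 base period):

```python
# Big Lost River basin average water-yield balance, figures from the abstract.
yield_total = 650          # estimated average basin water yield
crop_transpiration = 150   # transpired by crops
streamflow_out = 75        # left the basin as streamflow
groundwater_out = 425      # left the basin as ground-water flow

# The three outflow components account for the full estimated yield.
assert crop_transpiration + streamflow_out + groundwater_out == yield_total
```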

  16. [Utilization of Big Data in Medicine and Future Outlook].

    Science.gov (United States)

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of Japan: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.
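The clustering step described above can be sketched with a plain k-means; the feature vectors here are hypothetical per-area rates invented for the example, not real DPC data:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: assign each point to its nearest centroid, then recompute centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties out
                centroids[i] = tuple(sum(xs) / len(xs) for xs in zip(*members))
    return centroids, clusters

# Hypothetical feature vectors for five medical areas
# (e.g., standardized admission rates for two disease groups).
areas = [(0.10, 0.20), (0.15, 0.22), (0.90, 0.80), (0.88, 0.85), (0.50, 0.10)]
centroids, clusters = kmeans(areas, k=2)
```

Real analyses would use many more features per area (one per disease category) and compare against hierarchical clustering or self-organizing maps, as the article does.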

  18. Inference of fitness values and putative appearance time points for evolvable self-replicating molecules from time series of occurrence frequencies in an evolution reactor.

    Science.gov (United States)

    Aita, Takuyo; Ichihashi, Norikazu; Yomo, Tetsuya

    2016-07-21

    We have established a translation-coupled RNA replication system within a cell-like compartment, and conducted an experimental evolution of the RNA molecules in the system. Then, we obtained a time series of occurrence frequencies of 91 individual genotypes through random sampling and next-generation sequencing. The time series showed a complex clonal interference and a polymorphic population called the "quasispecies". By fitting a deterministic kinetic model of evolvable simple self-replicators to the time series, we estimated the fitness value and "putative appearance time point" for each of the 91 major genotypes identified, where the putative appearance time point is defined as a certain time point at which a certain mutant genotype is supposed to appear in the deterministic kinetic model. As a result, the kinetic model was well fitted and additionally we confirmed that the estimated fitness values for 11 genotypes were considerably close to the experimentally measured ones (Ichihashi et al., 2015). In this sequel paper, with the theoretical basis of the deterministic kinetic model, we present the details of inference of the fitness values and putative appearance time points for the 91 genotypes. It may be possible to apply this methodology to other self-replicating molecules, viruses and bacteria. PMID:27091052
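A common way to estimate relative fitness from genotype frequency time series (a generic sketch, not the authors' full kinetic model with putative appearance time points) is to fit the log frequency ratio of a mutant to a reference genotype against time; under simple exponential competition its slope equals the fitness difference:

```python
import math

def fitness_difference(times, freq_mut, freq_ref):
    """Least-squares slope of log(f_mut / f_ref) versus time:
    under exponential competition this equals the mutant's fitness advantage."""
    y = [math.log(a / b) for a, b in zip(freq_mut, freq_ref)]
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(y) / n
    num = sum((t - tbar) * (v - ybar) for t, v in zip(times, y))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

# Synthetic check: mutant replicates at rate 1.3, reference at 1.0 (arbitrary units),
# and only their relative frequencies in the population are observed.
times = [0, 1, 2, 3, 4]
ref = [math.exp(1.0 * t) for t in times]
mut = [math.exp(1.3 * t) for t in times]
total = [a + b for a, b in zip(ref, mut)]
f_ref = [a / s for a, s in zip(ref, total)]
f_mut = [a / s for a, s in zip(mut, total)]
print(round(fitness_difference(times, f_mut, f_ref), 3))  # → 0.3
```

With many competing genotypes and late-appearing mutants, as in the study, the fit must also treat the appearance time of each genotype as a parameter, which is what the deterministic kinetic model does.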

  19. Water - rock interaction in different rock environments

    International Nuclear Information System (INIS)

    The study assesses the groundwater geochemistry and geological environment of 44 study sites for radioactive waste disposal. Initially, the study sites were divided by rock type into 5 groups: (1) acid - intermediate rocks, (2) mafic - ultramafic rocks, (3) gabbros, amphibolites and gneisses that contain calc-silicate (skarn) rocks, (4) carbonates and (5) sandstones. Separate assessments are made of acid - intermediate plutonic rocks and of a subgroup that comprises migmatites, granite and mica gneiss. These all belong to the group of acid - intermediate rocks. Within the mafic -ultramafic rock group, a subgroup that comprises mafic - ultramafic plutonic rocks, serpentinites, mafic - ultramafic volcanic rocks and volcanic - sedimentary schists is also evaluated separately. Bedrock groundwaters are classified by their concentration of total dissolved solids as fresh, brackish, saline, strongly saline and brine-class groundwaters. (75 refs., 24 figs., 3 tabs.)

  20. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  1. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our combined resource includes all necessary areas of Space for grades five to eight. Get the big picture about the Solar System, Galaxies and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, Comets, Asteroids, Meteoroids, Stars and Constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  2. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  3. Governing Big Data

    Directory of Open Access Journals (Sweden)

    Andrej J. Zwitter

    2014-04-01

    Full Text Available 2.5 quintillion bytes of data are created every day through pictures, messages, GPS data, etc. "Big Data" is seen simultaneously as the new Philosopher's Stone and Pandora's Box: a source of great knowledge and power, but equally, the root of serious problems.

  4. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  5. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  6. Big is beautiful

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2007-06-08

    Although big solar systems are both effective and architecturally pleasing, they are still not widespread in Germany. Recently, politicians reacted by improving funding conditions. In order to prevent planning errors, planners and fitters must be better trained, and standardisation of systems must be enhanced. (orig.)

  7. Big Data and Cycling

    NARCIS (Netherlands)

    Romanillos, Gustavo; Zaltz Austwick, Martin; Ettema, Dick; De Kruijf, Joost

    2016-01-01

    Big Data has begun to create significant impacts in urban and transport planning. This paper covers the explosion in data-driven research on cycling, most of which has occurred in the last ten years. We review the techniques, objectives and findings of a growing number of studies we have classified

  8. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  9. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: Photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  10. Big ideas: innovation policy

    OpenAIRE

    Van Reenen, John

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  11. NEW THEORY IN TUNNEL STABILITY CONTROL OF SOFT ROCK - MECHANICS OF SOFT ROCK ENGINEERING

    Institute of Scientific and Technical Information of China (English)

    何满朝

    1996-01-01

    Tunnel stability control is a difficult problem world-wide. To solve it, the new theory of soft rock engineering mechanics has been established. Some key points, such as the definition and classification of soft rock, the mechanical deformation mechanism of a soft rock tunnel, the critical support technique for a soft rock tunnel, and the new theory of soft rock tunnel stability control, are proposed in this paper.

  13. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  14. Pre-big bang geometric extensions of inflationary cosmologies

    CERN Document Server

    Klein, David

    2016-01-01

    Robertson-Walker cosmologies within a large class are geometrically extended to larger spacetimes that include spacetime points with zero and negative cosmological times. In the extended spacetimes, the big bang is lightlike, and though singular, it inherits some geometric structure from the original spacetime. Spacelike geodesics are continuous across the cosmological time zero submanifold, which is parameterized by the radius of Fermi space slices, i.e., by the proper distances along spacelike geodesics from a comoving observer to the big bang. The continuous extension of the metric, and the continuously differentiable extension of the leading Fermi metric coefficient g_{\tau\tau} of the observer, restrict the geometry of spacetime points with pre-big bang cosmological time coordinates. In our extensions the big bang is two dimensional in a certain sense, consistent with some findings in quantum gravity.

  15. Big Data: Survey, Technologies, Opportunities, and Challenges

    OpenAIRE

    Nawsher Khan; Ibrar Yaqoob; Ibrahim Abaker Targio Hashem; Zakira Inayat; Waleed Kamaleldin Mahmoud Ali; Muhammad Alam; Muhammad Shiraz; Abdullah Gani

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information...

  16. Big is beautiful for c. h. p

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-30

    The CEGB in the United Kingdom has retreated to rural surroundings for siting power stations, but this poses no obstacle to the development of combined heat and power. It is pointed out that the cost of transporting hot water across country is not a problem, provided only that the operation is on a large scale. Factors supporting the decision for a big city to become a pioneer in installing a combined heat and power scheme are discussed. (MCW)

  17. The Concept of the Use of the Marine Reactor Plant in Small Electric Grids

    International Nuclear Information System (INIS)

    This report considers aspects of using marine nuclear reactors to supply small, non-interconnected power systems, as well as isolated settlements and mining enterprises located in regions with an undeveloped infrastructure. Small modular nuclear power plants have recently been proposed for these purposes. The plant power required for small electric grids ranges from 1 to several tens of MWe. A module can be assembled and tested at a machine-building plant and then delivered, ready for operation, to the working site by some means of transport, for instance a barge. After a certain period the module can be transported to a repair shop, and to a storage point at the end of its operating life. Marine nuclear reactors, by virtue of their power, compactness, mass and size, are ideal prototypes for such modules. For instance, the floating power unit currently under construction, intended for operation in the Russian North, is based on the reactor plants of nuclear icebreakers. The reliability and safety of ship reactors are confirmed by approximately 180 reactor-years of trouble-free operation. Unlike a big stationary nuclear plant working in base-load mode, a power unit with a marine reactor is fully capable of working in load-following mode. In contrast with a nuclear icebreaker reactor, it is advisable to increase the core lifetime and to reduce the enrichment of the uranium. This requires fuel compositions of higher uranium capacity and a new core design; in particular, a transition from the channel-type core traditional for ship reactors to a cassette design is possible. Other directions of evolution of ship reactors are possible that do not touch the basic, practice-proven design decisions but promote the self-protection properties of the plant; among these is a reduction of the volumetric power density of the core. (author)

  18. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using conventional data-handling applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, conflicts etc., we require bigger data sets compared with smaller ones. Big data is hard to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. This paper presents observations on the Hadoop architecture, different tools used for big data, and its security issues.

  19. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Prof. Shubhada Talegaon

    2015-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from big amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis, and social networking analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data-mining techniques. The purpose of this paper is to understand various techniques to analyze data.

  20. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  1. Think Small Go Big

    Institute of Scientific and Technical Information of China (English)

    汤维维

    2006-01-01

    Before it was founded, Vepoo went through three start-up pivots. In their own words, moving from "think big go small" to "think small go big" took a year. During that time they exhausted their initial seed funding; fortunately, dawn broke at the very last moment.

  2. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  3. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday and belletristic. Volume 8 presents, in an accessible way, relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  4. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday and belletristic. Volume 7, besides an introduction, treats many current aspects of quantum mechanics (e.g. "beaming") and electrodynamics (e.g. electrosmog), as well as the climate problem and chaos theory.

  5. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday and belletristic. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  6. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then moves from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday and belletristic. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  7. Big Data Refinement

    OpenAIRE

    Boiten, Eerke Albert

    2016-01-01

    "Big data" has become a major area of research and associated funding, as well as a focus of utopian thinking. In the still growing research community, one of the favourite optimistic analogies for data processing is that of the oil refinery, extracting the essence out of the raw data. Pessimists look for their imagery to the other end of the petrol cycle, and talk about the "data exhausts" of our society. Obviously, the refinement community knows how to do "refining". This paper explores...

  8. Reactor Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ait Abderrahim, A

    2001-04-01

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised.

  9. Progress of Research on Demonstration Fast Reactor Main Pipe Material

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The main characteristics of the sodium pipe system in the demonstration fast reactor are high temperature, thin walls, and large diameter, in contrast to the high-pressure, thick-walled piping of the pressurized water reactor system, and the system is long-term

  10. Measuring Public Acceptance of Nuclear Technology with Big data

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seugkook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying cheap electricity produced by them, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows target marketing when policy is executed.

  11. Measuring Public Acceptance of Nuclear Technology with Big data

    International Nuclear Information System (INIS)

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying cheap electricity produced by them, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows target marketing when policy is executed

  12. Reactor safeguards

    CERN Document Server

    Russell, Charles R

    2013-01-01

    Reactor Safeguards provides information for all who are interested in the subject of reactor safeguards. Much of the material is descriptive although some sections are written for the engineer or physicist directly concerned with hazards analysis or site selection problems. The book opens with an introductory chapter on radiation hazards, the construction of nuclear reactors, safety issues, and the operation of nuclear reactors. This is followed by separate chapters that discuss radioactive materials, reactor kinetics, control and safety systems, containment, safety features for water reactor

  13. Reactor operation

    CERN Document Server

    Shaw, J

    2013-01-01

    Reactor Operation covers the theoretical aspects and design information of nuclear reactors. This book is composed of nine chapters that also consider their control, calibration, and experimentation.The opening chapters present the general problems of reactor operation and the principles of reactor control and operation. The succeeding chapters deal with the instrumentation, start-up, pre-commissioning, and physical experiments of nuclear reactors. The remaining chapters are devoted to the control rod calibrations and temperature coefficient measurements in the reactor. These chapters also exp

  14. From Big Bang to Big Crunch and Beyond

    OpenAIRE

    Elitzur, S.; Giveon, A.; Kutasov, D.; Rabinovici, E.

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a ``big bang'' singularity, expands and then contracts to a ``big crunch'' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spaceti...

  15. Big Crunch-based omnidirectional light concentrators

    CERN Document Server

    Smolyaninov, Igor I

    2014-01-01

    Omnidirectional light concentration remains an unsolved problem despite such important practical applications as the design of efficient mobile photovoltaic cells. Optical black hole designs developed recently offer a partial solution to this problem. However, even these solutions are not truly omnidirectional since they do not exhibit a horizon, and at large enough incidence angles light may be trapped in quasi-stationary orbits around such imperfect optical black holes. Here we propose and realize experimentally another gravity-inspired design of a broadband omnidirectional light concentrator based on the cosmological Big Crunch solutions. By mimicking the Big Crunch spacetime via a corresponding effective optical metric we make sure that every photon world line terminates in a single point.

  16. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through an online learning environment hosted by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
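
    Eratosthenes' calculation described in this record can be sketched in a few lines. The stick length, shadow length, and site distance below are illustrative placeholders, not measurements from the project:

    ```python
    import math

    # Eratosthenes' method: if two sites lie roughly on the same meridian, the
    # difference in the sun's noon shadow angles equals their difference in
    # latitude, so the full 360-degree circle scales from the distance between them.

    def circumference_km(angle_a_deg, angle_b_deg, distance_km):
        """Estimate Earth's circumference from two noon shadow angles (degrees)
        and the north-south distance between the two sites (km)."""
        delta = abs(angle_a_deg - angle_b_deg)
        return 360.0 / delta * distance_km

    # Shadow angle of a vertical stick: angle = atan(shadow length / stick length).
    stick_m, shadow_m = 1.0, 0.126                         # illustrative values
    angle_a = math.degrees(math.atan2(shadow_m, stick_m))  # about 7.2 degrees
    angle_b = 0.0                       # sun directly overhead at the other site
    print(round(circumference_km(angle_a, angle_b, 800)))  # roughly 40,000 km
    ```

    With Eratosthenes' rounded figures (a 7.2-degree shadow angle and roughly 800 km between Alexandria and Syene) the formula gives exactly 40,000 km, close to the accepted 40,075 km equatorial circumference.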

  17. Big Data as a Source for Official Statistics

    Directory of Open Access Journals (Sweden)

    Daas Piet J.H.

    2015-06-01

    Full Text Available More and more data are being produced by an increasing number of electronic devices physically surrounding us and on the internet. The large amount of data and the high frequency at which they are produced have resulted in the introduction of the term ‘Big Data’. Because these data reflect many different aspects of our daily lives and because of their abundance and availability, Big Data sources are very interesting from an official statistics point of view. This article discusses the exploration of both opportunities and challenges for official statistics associated with the application of Big Data. Experiences gained with analyses of large amounts of Dutch traffic loop detection records and Dutch social media messages are described to illustrate the topics characteristic of the statistical analysis and use of Big Data.

  18. Big Red Telephone, Gone

    Institute of Scientific and Technical Information of China (English)

    Toni Piech

    2006-01-01

    The Chinese big red telephones looked exactly as I imagined the ones servicing the direct emergency line between the Kremlin and the White House during the cold-war era would have looked. But here in China, every kiosk seemed to have such a device in the 1990s, and anyone could use it for just 0.2 yuan. The government did not just install public phones on street corners; they let small-business owners participate in telecommunication. Supply and demand were juggled by a kind of Hutong capitalism.

  19. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  20. Big and little OER

    OpenAIRE

    Weller, Martin

    2010-01-01

    Much of the attention around OERs has been on institutional projects which make explicit learning content available. These can be classified as ‘big OER’, but another form of OER is that of small scale, individually produced resources using web 2.0 type services, which are classified as ‘little OER’. This paper examines some of the differences between the use of these two types of OER to highlight issues in open education. These include attitudes towards reputation, the intentionality of the ...

  1. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but also complex. Companies, institutions, healthcare systems, etc. all use piles of data which are further used for creating reports in order to ensure continuity of the services they have to offer. The processing behind the results these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  2. Big Bang Nucleosynthesis Calculation

    CERN Document Server

    Kurki-Suonio, H

    2001-01-01

    I review standard big bang nucleosynthesis and some versions of nonstandard BBN. The abundances of the primordial isotopes D, He-3, and Li-7 produced in standard BBN can be calculated as a function of the baryon density with an accuracy of about 10%. For He-4 the accuracy is better than 1%. The calculated abundances agree fairly well with observations, but the baryon density of the universe cannot be determined with high precision. Possibilities for nonstandard BBN include inhomogeneous and antimatter BBN and nonzero neutrino chemical potentials.

  3. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  4. Dynamic experimental study on rock meso-cracks growth by digital image processing technique

    Institute of Scientific and Technical Information of China (English)

    朱珍德; 倪骁慧; 王伟; 李双蓓; 赵杰; 武沂泉

    2008-01-01

    A new meso-mechanical testing scheme based on SEM was developed to carry out experiments on the microfracturing process of rocks. The microfracturing process of a pre-cracked marble sample of surrounding rock from the submerged Long-big tunnel in the Jinping Cascade II Hydropower Station under uniaxial compression was recorded using this testing scheme. According to stereology theory, the propagation and coalescence of cracks at the meso-scale were quantitatively investigated with digital technology. The basic geometric information of rock microcracks, such as area, angle, length, width, and perimeter, was thus obtained from binary images after segmentation. The failure mechanism of the specimen under uniaxial compression was studied with this quantitative information from both macroscopic and microscopic points of view. The results show that the image of the microfracturing process of the specimen can be observed and recorded digitally. During the damage of the specimen, the distribution of microcracks in the specimen remains subject to an exponential distribution, with some microcracks concentrated in certain regions. Finally, the change law of the fractal dimension of the local elements in the marble sample under different external load conditions is obtained by means of statistical calculation of the fractal dimension.
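
    The abstract does not state which estimator was used, but box counting is a standard way to compute a fractal dimension from a segmented binary crack image. A minimal pure-Python sketch (using synthetic pixel sets rather than SEM data) might look like:

    ```python
    import math

    def box_counting_dimension(pixels, scales=(1, 2, 4, 8, 16)):
        """Estimate the fractal dimension of a set of (row, col) crack pixels
        by box counting: count occupied boxes N(s) at several box sizes s and
        fit the slope of log N(s) against log(1/s)."""
        points = []
        for s in scales:
            occupied = {(r // s, c // s) for r, c in pixels}  # boxes holding a pixel
            points.append((math.log(1.0 / s), math.log(len(occupied))))
        # Least-squares slope of the log-log points estimates the dimension.
        n = len(points)
        mx = sum(x for x, _ in points) / n
        my = sum(y for _, y in points) / n
        return (sum((x - mx) * (y - my) for x, y in points)
                / sum((x - mx) ** 2 for x, _ in points))

    # Sanity checks: a filled square is 2-dimensional, a straight line 1-dimensional.
    square = {(r, c) for r in range(64) for c in range(64)}
    line = {(32, c) for c in range(64)}
    print(round(box_counting_dimension(square), 2))  # 2.0
    print(round(box_counting_dimension(line), 2))    # 1.0
    ```

    An irregular crack network typically yields an intermediate value between 1 and 2, which is the kind of statistic the study tracks against external load.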

  5. [Three applications and the challenge of the big data in otology].

    Science.gov (United States)

    Lei, Guanxiong; Li, Jianan; Shen, Weidong; Yang, Shiming

    2016-03-01

    With the expansion of human practical activities, more and more areas have suffered from big data problems. The emergence of big data requires people to update the research paradigm and develop new technical methods. This review discussed that big data might bring opportunities and challenges in the area of auditory implantation, the deafness genome, and auditory pathophysiology, and pointed out that we needed to find appropriate theories and methods to make this kind of expectation into reality. PMID:27033583

  6. Research reactors

    International Nuclear Information System (INIS)

    This article proposes an overview of research reactors, i.e. nuclear reactors of less than 100 MW. Generally, these reactors are used as neutron generators for basic research in matter sciences and for technological research in support of power reactors. The author gives an overview of the general design of research reactors in terms of core size, number of fissions, neutron flux, and spatial neutron distribution. He points out that this design is a compromise between a sufficiently compact core, a sufficient experimental volume, and high enough power densities without affecting neutron performance or experimental use. The author evokes the safety framework (the same regulations as for power reactors, more constraining measures after Fukushima, international bodies). He presents the main characteristics and operation of the two families which represent almost all research reactors: firstly, heavy water reactors (photos, drawings and figures illustrate different examples); and secondly, light water moderated and cooled reactors, with a distinction between open-core pool reactors like Melusine and Triton, pool reactors with containment, and experimental fast breeder reactors (Rapsodie, the Russian BOR 60, the Chinese CEFR). The author describes the main uses of research reactors: basic research, applied and technological research, safety tests, production of radioisotopes for medicine and industry, analysis of elements present as traces at very low concentrations, non-destructive testing, and doping of monocrystalline silicon ingots. The author then discusses the relationship between research reactors and non-proliferation, and finally evokes perspectives (the decreasing number of research reactors in the world, the Jules Horowitz project)

  7. Reactor physics and reactor computations

    International Nuclear Information System (INIS)

    Mathematical methods and computer calculations for nuclear and thermonuclear reactor kinetics, reactor physics, neutron transport theory, core lattice parameters, waste treatment by transmutation, breeding, nuclear and thermonuclear fuels are the main interests of the conference

  8. Status of Japanese university reactors

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, Yoshiaki [Research Reactor Institute, Kyoto Univ., Kumatori, Osaka (Japan)

    1999-08-01

    Status of Japanese university reactors, their role and value in research and education, and the spent fuel problem are presented. Some of the reactors are now faced by severe difficulties in continuing their operation services. The point of measures to solve the difficulties is suggested. (author)

  9. Can Pleasant Goat and Big Big Wolf Save China's Animation Industry?

    Institute of Scientific and Technical Information of China (English)

    Guo Liqin

    2009-01-01

    "My dream husband is Big Big Wolf," claimed Miss Fang, a young lady who works in KPMG's Beijing office. This Big Big Wolf is a lovely cartoon wolf appearing in Pleasant Goat and Big Big Wolf, an animated series produced independently in China.

  10. CERN’s Summer of Rock

    CERN Document Server

    Katarina Anthony

    2015-01-01

    When a rock star visits CERN, they don’t just bring their entourage with them. Along for the ride are legions of fans across the world – many of whom may not be the typical CERN audience. In July alone, four big acts paid CERN a visit, sharing their experience with the world: Scorpions, The Script, Kings of Leon and Patti Smith.   @TheScript tweeted: #paleofestival we had the best time! Big love. #CERN (Image: Twitter).   It all started with the Scorpions, the classic rock band whose “Wind of Change” became an anthem in the early 1990s. On 19 July, the band braved the 35-degree heat to tour the CERN site on foot – visiting the Synchrocyclotron and the new Microcosm exhibition. The rockers were very enthusiastic about the research carried out at CERN, and talked about returning in the autumn during their next tour stop. The Scorpions visit Microcosm. Two days later, The Script rolled in. This Irish pop-rock band has been hittin...

  11. ATLAS: civil engineering Point 1

    CERN Multimedia

    2000-01-01

    The ATLAS experimental area is located at Point 1, just across from the main CERN entrance, in the commune of Meyrin. There, people are busy finishing the various infrastructure for ATLAS. Real underground video. Nice view from the surface down to the cavern from the pit side - all the big machines look very small. The film has original working sound.

  12. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and programme.

  13. Performance elements in Big Brother

    OpenAIRE

    Radman, Korana

    2009-01-01

    Big Brother offers its audience an "ultimate reality" ensured by round-the-clock surveillance by television cameras, a point that has been debated since the show first aired in Europe and worldwide. With this in mind, this paper approaches Big Brother from the perspective of performance studies, attempting to identify some of its possible performances.

  14. The Oklo reactors

    International Nuclear Information System (INIS)

    The Oklo reactors comprise up to nine 235U-depleted zones in a uranium ore deposit in the Republic of Gabon in West Africa. The depletion in fissile U-235 has been proved to have been caused by nuclear chain reactions. The study of the Oklo phenomenon indicates that very efficient retardation mechanisms may operate in nature - at least under special conditions. A closer study of these processes ought to be made to establish the limits of their occurrence. The Oklo sandstone formation would today probably be considered unacceptable as a host rock for a repository. (EG)

  15. Structure and geomorphology of the "big bend" in the Hosgri-San Gregorio fault system, offshore of Big Sur, central California

    Science.gov (United States)

    Johnson, S. Y.; Watt, J. T.; Hartwell, S. R.; Kluesner, J. W.; Dartnell, P.

    2015-12-01

    The right-lateral Hosgri-San Gregorio fault system extends mainly offshore for about 400 km along the central California coast and is a major structure in the distributed transform margin of western North America. We recently mapped a poorly known 64-km-long section of the Hosgri fault offshore Big Sur between Ragged Point and Pfeiffer Point using high-resolution bathymetry, tightly spaced single-channel seismic-reflection and coincident marine magnetic profiles, and reprocessed industry multichannel seismic-reflection data. Regionally, this part of the Hosgri-San Gregorio fault system has a markedly more westerly trend (by 10° to 15°) than parts farther north and south, and thus represents a transpressional "big bend." Through this "big bend," the fault zone is never more than 6 km from the shoreline and is a primary control on the dramatic coastal geomorphology that includes high coastal cliffs, a narrow (2- to 8-km-wide) continental shelf, a sharp shelfbreak, and a steep (as much as 17°) continental slope incised by submarine canyons and gullies. Depth-converted industry seismic data suggest that the Hosgri fault dips steeply to the northeast and forms the eastern boundary of the asymmetric (deeper to the east) Sur Basin. Structural relief on Franciscan basement across the Hosgri fault is about 2.8 km. Locally, we recognize five discrete "sections" of the Hosgri fault based on fault trend, shallow structure (e.g., disruption of young sediments), seafloor geomorphology, and coincidence with high-amplitude magnetic anomalies sourced by ultramafic rocks in the Franciscan Complex. From south to north, section lengths and trends are as follows: (1) 17 km, 312°; (2) 10 km, 322°; (3) 13 km, 317°; (4) 3 km, 329°; (5) 21 km, 318°. Through these sections, the Hosgri surface trace includes several right steps that vary from a few hundred meters to about 1 km wide, none wide enough to provide a barrier to continuous earthquake rupture.

  16. Solving point reactor neutron kinetic equations by using the neutron generation time method

    Institute of Scientific and Technical Information of China (English)

    蔡光明; 阮良成

    2012-01-01

    As the point reactor neutron kinetic equations are stiff, it is difficult to solve them rapidly with adequate accuracy and stability. Thanks to advances in modern computer technology, the equations can be solved directly by means of the neutron generation time method, and we wrote a calculation code in C++ to apply it. Benchmark problems and kinetic/inverse-kinetic comparison calculations verified that the model and code are accurate and stable, and the computing time is also acceptable.
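    The generation-time stepping idea in this record can be illustrated with a minimal sketch (not the authors' C++ code): one-delayed-group point kinetics advanced with explicit time steps much shorter than the neutron generation time. All constants below are illustrative textbook-style assumptions.

```python
# One-delayed-group point reactor kinetics, stepped on the generation-time
# scale. All constants are illustrative assumptions.
beta = 0.0065       # delayed neutron fraction
lam = 0.08          # precursor decay constant (1/s)
Lambda = 1.0e-4     # neutron generation time (s)

def step(n, C, rho, dt):
    """Advance neutron density n and precursor concentration C by dt."""
    dn = ((rho - beta) / Lambda * n + lam * C) * dt
    dC = (beta / Lambda * n - lam * C) * dt
    return n + dn, C + dC

# Critical reactor (rho = 0) started at equilibrium: power stays flat.
n, C = 1.0, beta / (lam * Lambda)
for _ in range(100_000):                    # 0.1 s at dt = 1e-6 s
    n, C = step(n, C, rho=0.0, dt=1.0e-6)
n_crit = n

# Small positive reactivity step (rho < beta): prompt jump, then slow rise.
for _ in range(100_000):
    n, C = step(n, C, rho=0.001, dt=1.0e-6)
print(f"n after 0.1 s critical: {n_crit:.6f}, after +100 pcm: {n:.4f}")
```

    Because an explicit step must resolve the prompt time scale (of order Lambda), this is exactly where the stiffness mentioned in the abstract bites; the sketch only shows the structure of the equations, not a production solver.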

  17. The Rise of Big Data in Neurorehabilitation.

    Science.gov (United States)

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  18. Triennial technical report - 1986, 1987, 1988 - Instituto de Engenharia Nuclear (IEN) -Dept. of Reactors (DERE)

    International Nuclear Information System (INIS)

    The research activities developed during 1986, 1987 and 1988 by the Reactor Department (DERE) of the Brazilian Nuclear Energy Commission (CNEN) are summarized. The principal aim of the Department is the study and development of fast reactors and thermal research reactors. The DERE also assists the CNEN in areas related to the analysis of power reactor structures, teaches Reactor Physics and Engineering at the University, and provides professional training for the Nuclear Engineering Institute. To carry out its research activities the DERE has three big facilities: the Argonauta reactor, the CTS-1 sodium circuit, and a water circuit. (M.I.)

  19. Avoiding a Big Catastrophe

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Before last October, the South China tiger had almost slipped into mythical status, as it had been absent for so long from the public eye. In the previous 20-plus years, these tigers could not be found in the wild in China, and the number of those in captivity was only around 60. The species - a direct descendant of the earliest tigers, thought to have originated in China 2 million years ago - is functionally extinct, according to experts. The big cat's return to the media spotlight was completely unexpected. On October 12, 2007, a digital picture, showing a wild South China tiger

  20. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility to use dark matter mass and its interaction cross section as a smoking gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new venue for achieving the observed relic abundance in which a significantly smaller amount of dark matter--compared to the standard cosmology--is produced and survives until today, diluted only by the cosmic expansion since the radiation dominated era. Once DM mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  1. BIG DATA AND STATISTICS

    Science.gov (United States)

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies.

  2. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  3. Big Bang Darkleosynthesis

    CERN Document Server

    Krnjaic, Gordan

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above \\Lambda_{QCD}, which generically yields large (>>MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, dark nuclei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses signifi...

  4. Reactor simulator development. Workshop material

    International Nuclear Information System (INIS)

    The International Atomic Energy Agency (IAEA) has established a programme in nuclear reactor simulation computer programs to assist its Member States in education and training. The objective is to provide, for a variety of advanced reactor types, insight and practice in reactor operational characteristics and their response to perturbations and accident situations. To achieve this, the IAEA arranges for the supply or development of simulation programs and training material, sponsors training courses and workshops, and distributes documentation and computer programs. This publication consists of course material for workshops on the development of such reactor simulators. Participants in the workshops are provided with instruction and practice in the development of reactor simulation computer codes using a model development system that assembles integrated codes from a selection of pre-programmed and tested sub-components. This provides insight and understanding into the construction and assumptions of the codes that model the design and operational characteristics of various power reactor systems. The main objective is to demonstrate simple nuclear reactor dynamics with hands-on simulation experience. Using one of the modular development systems, CASSIM™, a simple point kinetic reactor model is developed, followed by a model that simulates the Xenon/Iodine concentration response to changes in reactor power. Lastly, absorber and adjuster control rods and a liquid zone model are developed to control reactivity. The built model is used to demonstrate reactor behavior in sub-critical, critical and supercritical states, and to observe the impact of malfunctions of various reactivity control mechanisms on reactor dynamics. Using a PHWR simulator, participants practice typical procedures for a reactor startup and approach to criticality. This workshop material consists of an introduction to systems used for developing reactor simulators, an overview of the dynamic simulation
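    The Xenon/Iodine exercise mentioned above boils down to two coupled balance equations; a hedged sketch (the yields, decay constants and flux are assumed textbook-style values, not taken from the workshop material) reproduces the classic post-shutdown xenon peak:

```python
# I-135/Xe-135 balance after a reactor trip (illustrative constants only).
LAM_I = 2.87e-5                    # I-135 decay constant (1/s)
LAM_X = 2.09e-5                    # Xe-135 decay constant (1/s)
GAMMA_I, GAMMA_X = 0.061, 0.003    # assumed fission yields
SIGMA_X = 2.6e-18                  # Xe-135 absorption cross section (cm^2)
PHI = 3.0e13                       # pre-shutdown flux (n/cm^2/s)
FISSION_RATE = 1.0                 # fission rate density, arbitrary units

# Equilibrium concentrations at full power.
I = GAMMA_I * FISSION_RATE / LAM_I
X = (GAMMA_I + GAMMA_X) * FISSION_RATE / (LAM_X + SIGMA_X * PHI)
x_eq = X

# After shutdown (flux = 0) xenon first builds up from iodine decay and
# then decays away: the peak that constrains restart timing.
dt, peak = 60.0, X
for _ in range(int(48 * 3600 / dt)):       # 48 hours, 1-minute steps
    dI = -LAM_I * I * dt
    dX = (LAM_I * I - LAM_X * X) * dt
    I += dI
    X += dX
    peak = max(peak, X)
print(f"post-trip xenon peak is {peak / x_eq:.2f}x the equilibrium level")
```

    This is the behaviour a trainee would observe on the simulator: after a trip, xenon rises for several hours before decaying back below its equilibrium level.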

  5. Exploring Data in Human Resources Big Data

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2016-01-01

    Full Text Available Nowadays, social networks and informatics technologies and infrastructures are constantly developing and affect each other. In this context, the HR recruitment process became complex and many multinational organizations have encountered selection issues. The objective of the paper is to develop a prototype system for assisting the selection of candidates for an intelligent management of human resources. Such a system can be a starting point for the efficient organization of semi-structured and unstructured data on recruitment activities. The article extends the research presented at the 14th International Conference on Informatics in Economy (IE 2015 in the scientific paper "Big Data challenges for human resources management".

  6. Moon base reactor system

    Science.gov (United States)

    Chavez, H.; Flores, J.; Nguyen, M.; Carsen, K.

    1989-01-01

    The objective of our reactor design is to supply a lunar-based research facility with 20 MW(e). The fundamental layout of this lunar-based system includes the reactor, power conversion devices, and a radiator. The additional aim of this reactor is a longevity of 12 to 15 years. The reactor is a liquid metal fast breeder that has a breeding ratio very close to 1.0. The geometry of the core is cylindrical. The metallic fuel rods are of beryllium oxide enriched with varying degrees of uranium, with a beryllium core reflector. The liquid metal coolant chosen was natural lithium. After the liquid metal coolant leaves the reactor, it goes directly into the power conversion devices. The power conversion devices are Stirling engines. The heated coolant acts as a hot reservoir to the device. It then enters the radiator to be cooled and reenters the Stirling engine acting as a cold reservoir. The engines' operating fluid is helium, a highly conductive gas. These Stirling engines are hermetically sealed. Although natural lithium produces a lower breeding ratio, it does have a larger temperature range than sodium. Lithium is also corrosive to steel, which is why the container material must be carefully chosen. One option is to use an expensive alloy of cerium and zirconium. The radiator must be made of a highly conductive material whose melting point temperature is not exceeded in the reactor and whose structural strength can withstand meteor showers.
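    As a rough check on the figures in this record, the thermal power the reactor and radiator must handle for 20 MW(e) can be estimated from the Stirling engines' reservoir temperatures. The temperatures and fraction-of-Carnot figure below are illustrative assumptions, not values from the design study:

```python
# Back-of-envelope sizing for a 20 MW(e) lunar reactor with Stirling
# conversion. Temperatures and the efficiency fraction are assumptions.
P_electric = 20.0e6      # required electrical output (W)
T_hot = 1200.0           # assumed lithium temperature into the engines (K)
T_cold = 600.0           # assumed radiator return temperature (K)

eta_carnot = 1.0 - T_cold / T_hot      # ideal limit between the reservoirs
eta_engine = 0.5 * eta_carnot          # assume engines reach half of Carnot
P_thermal = P_electric / eta_engine    # reactor thermal power required
P_rejected = P_thermal - P_electric    # heat the radiator must reject

print(f"Carnot limit {eta_carnot:.0%}, assumed efficiency {eta_engine:.0%}")
print(f"reactor: {P_thermal / 1e6:.0f} MW(t), radiator: {P_rejected / 1e6:.0f} MW(t)")
```

    Under these assumptions the reactor must produce roughly four times its electrical rating as heat, which is why the radiator dominates the mass budget of designs like this one.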

  7. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    soil C in the partnership region, and to design a risk/cost effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed this quarter--a literature review/database to assess the soil carbon on rangelands, and the draft protocols, contracting options for soil carbon trading. To date, there has been little research on soil carbon on rangelands, and since rangeland constitutes a major land use in the Big Sky region, this is important in achieving a better understanding of terrestrial sinks. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO{sub 2} concentrations. Progress on other deliverables is noted in the PowerPoint presentations. A series of meetings held during the second quarter have laid the foundations for assessing the issues surrounding the implementation of a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. Finally, the education and outreach efforts have resulted in a comprehensive plan and process which serves as a guide for implementing the outreach activities under Phase I. While we are still working on the public website, we have made many presentations to stakeholders and policy makers, connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, we have laid plans for integration of our outreach efforts with the students, especially at the tribal colleges and at the universities involved in our partnership

  8. Reactor building

    International Nuclear Information System (INIS)

    The whole reactor building is accommodated in a shaft and is sealed level with the earth's surface by a building ceiling, which provides protection against penetration due to external effects. The building ceiling is supported on walls of the reactor building, which line the shaft and transfer the vertical components of forces to the foundations. The thickness of the walls is designed to withstand horizontal pressure waves in the floor. The building ceiling has an opening above the reactor, which must be closed by cover plates. Operating equipment for the reactor can be situated above the building ceiling. (orig./HP)

  9. My Pet Rock

    Science.gov (United States)

    Lark, Adam; Kramp, Robyne; Nurnberger-Haag, Julie

    2008-01-01

    Many teachers and students have experienced the classic pet rock experiment in conjunction with a geology unit. A teacher has students bring in a "pet" rock found outside of school, and the students run geologic tests on the rock. The tests include determining relative hardness using Mohs scale, checking for magnetization, and assessing luster.…

  10. The rock diet

    OpenAIRE

    Fordyce, Fiona; Johnson, Chris

    2002-01-01

    You may think there is little connection between rocks and our diet, indeed a serving of rocks may sound very unappetising! But rocks are a vital source of the essential elements and minerals we need to keep us healthy, such as calcium for healthy teeth and bones.

  11. A view on big data and its relation to Informetrics

    Institute of Scientific and Technical Information of China (English)

    Rousseau, Ronald

    2012-01-01

    Purpose: Big data offer a huge challenge. Their very existence leads to the contradiction that the more data we have, the less accessible they become, as the particular piece of information one is searching for may be buried among terabytes of other data. In this contribution we discuss the origin of big data and point to three challenges when big data arise: data storage, data processing and generating insights. Design/methodology/approach: Computer-related challenges can be expressed by the CAP theorem, which states that it is only possible to simultaneously provide any two of the three following properties in distributed applications: consistency (C), availability (A) and partition tolerance (P). As an aside we mention Amdahl's law and its application to scientific collaboration. We further discuss data mining in large databases and knowledge representation for handling the results of data mining exercises. We further offer a short informetric study of the field of big data, and point to the ethical dimension of the big data phenomenon. Findings: There still are serious problems to overcome before the field of big data can deliver on its promises. Implications and limitations: This contribution offers a personal view, focusing on the information science aspects, but much more can be said about software aspects. Originality/value: We express the hope that information scientists, including librarians, will be able to play their full role within the knowledge discovery, data mining and big data communities, leading to exciting developments, the reduction of scientific bottlenecks and really innovative applications.
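    Amdahl's law, mentioned above in the collaboration context, caps the achievable speedup when part of a task stays serial; a short illustration (the 95% parallel fraction is an arbitrary example):

```python
def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    """Upper bound on speedup when only parallel_fraction of the work scales."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)

# Even 95%-parallel work gives well under 16x on 16 workers, and no
# worker count can beat the 1 / (1 - 0.95) = 20x asymptote.
s16 = amdahl_speedup(0.95, 16)
s_inf = amdahl_speedup(0.95, 10**9)
print(f"16 workers: {s16:.2f}x, practically unlimited workers: {s_inf:.2f}x")
```

    The same arithmetic explains the data-processing challenge the record raises: adding machines to a big data pipeline helps only as far as its serial fraction allows.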

  12. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  13. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  14. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    Full Text Available The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current level of big data development and the things it can do, as well as the things that may be done in the near future. The paper focuses on explaining to non-technical and non-database-related technical specialists what big data basically is, and presents the three most important V's, as well as the new ones, the most important solutions used by companies like Google or Amazon, and some interesting perceptions on this subject.

  15. The challenges of big data

    Science.gov (United States)

    2016-01-01

    ABSTRACT The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  16. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    Full Text Available We are now in Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from that complex data which can’t be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining of large datasets using distributed algorithms.

  17. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest.

  18. Mechanic behavior of unloading fractured rock mass

    Institute of Scientific and Technical Information of China (English)

    YIN Ke; ZHANG Yongxing; WU Hanhui

    2003-01-01

    Under the tension and shear conditions related to the unloading of rock mass, a jointed rock mass model based on linear elastic fracture mechanics is established. From the model, equations for the stresses, strains and displacements of the region influenced by the crack but relatively far away from it (where the distance between the point of interest and the center of the crack is greater than the length of the crack) are derived. These equations are important for evaluating the deformation of cracked rock. A comparison between the computational results of the theoretical equations and observed data from unloading tests demonstrates that they are applicable to actual engineering.

  19. Quantum Fields in a Big Crunch/Big Bang Spacetime

    OpenAIRE

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the Big Crunch/Big Bang transition postulated in the ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it re-expands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interacti...

  20. Sailing through the big crunch-big bang transition

    OpenAIRE

    Bars, Itzhak; Steinhardt, Paul; Turok, Neil

    2013-01-01

    In a recent series of papers, we have shown that theories with scalar fields coupled to gravity (e.g., the standard model) can be lifted to a Weyl-invariant equivalent theory in which it is possible to unambiguously trace the classical cosmological evolution through the transition from big crunch to big bang. The key was identifying a sufficient number of finite, Weyl-invariant conserved quantities to uniquely match the fundamental cosmological degrees of freedom across the transition. In so ...

  1. Detecting and understanding big events in big cities

    OpenAIRE

    Furletti, Barbara; Trasarti, Roberto; Gabrielli, Lorenzo; Smoreda, Zbigniew; Vanhoof, Maarten; Ziemlicki, Cezary

    2015-01-01

    Recent studies have shown the great potential of big data, such as mobile phone location data, to model human behavior. Big data allow people's presence in a territory to be analyzed in a fast and effective way compared with classical surveys (diaries or questionnaires). One of the drawbacks of these collection systems is the incompleteness of the users' traces; people are localized only when they are using their phones. In this work we define a data mining method for identifying people presence an...

  2. Small Punch Test on Before and Post Irradiated Domestic Reactor Pressure Steel

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Problems may arise when applying standard specimens to study the properties of irradiated reactor materials because of their big dimensions, e.g.: the inner temperature gradient of the specimen is high when irradiated, the radiation

  3. Hey, big spender

    International Nuclear Information System (INIS)

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 billion and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in

  4. Tipping points? Ethnic composition change in Dutch big city neighbourhoods

    NARCIS (Netherlands)

    Ong, C.

    2014-01-01

    Micro-level studies using individual and household data have shown that residential location choices are influenced by neighbourhood ethnic composition. Using three conurbation samples in the Netherlands - Amsterdam metropolitan area, Rotterdam-The Hague metropolitan area, and the country's largest

  5. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  6. Le Big Bang en laboratoire

    CERN Multimedia

    Roy, Christelle

    2006-01-01

    Physicists have been dreaming of it for 30 years; thanks to huge particle accelerators, they have been able to observe matter as it was a few instants after the Big Bang (three different articles in 10 pages)

  7. Big Data Technology Literature Review

    OpenAIRE

    Bar-sinai, Michael

    2015-01-01

    A short overview of various algorithms and technologies that are helpful for big data storage and manipulation. Includes pointers to papers for further reading, and, where applicable, pointers to open source projects implementing a described storage type.

  8. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short-range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking-gun evidence for dark nuclei.

  9. Big Data and Ambulatory Care

    OpenAIRE

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2014-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an ov...

  10. Big Data Analytics in Healthcare

    OpenAIRE

    Ashwin Belle; Raghuram Thiagarajan; S. M. Reza Soroushmehr; Fatemeh Navidi; Daniel A Beard; Kayvan Najarian

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is sti...

  11. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  12. Reactor accident-big impacts but small possibilities

    International Nuclear Information System (INIS)

    Accidents are unfortunate incidents that happen in our lives. The government provides facilities and programs to reduce accidents; people also take a variety of initiatives so that accidents can be avoided, and every family and its members are constantly vigilant to protect against accidents. Some industries with relatively simple operations record more accidents than other industries that are more complex and sophisticated. The author relates this fact to accidents in the power generation sector, where, according to the author, accidents are very rare and can be grouped as isolated cases. This article also comments on the two major accidents in nuclear power generation, Chernobyl and Three Mile Island. The author also hopes that the progress of current and future technology can overcome this problem and convince the public that nuclear energy is safe and low risk.

  13. Operating experiences of the research reactors

    International Nuclear Information System (INIS)

    Nuclear research reactors are devices of wide importance, being used for different scientific research tasks, for testing and improving reactor systems and components, for the production of radioisotopes, for the purposes of defence, for staff training and for other purposes. There are three research reactors in Yugoslavia: RA, RB and TRIGA. Reactors RA and RB at the 'Boris Kidric' Institute of Nuclear Sciences are of the heavy water type, with powers of 6500 and 10 kW and maximum thermal neutron fluxes of 10{sup 14} and 10{sup 11} n/cm{sup 2}s, respectively. The TRIGA reactor at the 'Jozef Stefan' Institute in Ljubljana has a power of 250 kW and a maximum thermal neutron flux of 10{sup 13} n/cm{sup 2}s. Reactors RA and RB use Soviet fuel in the form of uranium dioxide (80% enriched) and metallic uranium (2%); in addition, the RB reactor also operates with natural uranium. The TRIGA reactor uses American uranium fuel, 70% and 20% enriched, the uranium being mixed homogeneously with the moderator (ZrH). Experience in handling and controlling the fuel before irradiation in the reactor, in the reactor and afterwards is extensive and valuable, involving the commercial arrangements with foreign producers, optimal burnup in the reactor and fuel treatment after reactor irradiation. Twenty years of operating experience of these reactors is of great importance, especially having in mind the number of trained staff. Maintenance of reactor systems and fluids in continuous operation is valuable experience from the point of view of water reactor utilization. The cobalt decontamination of the RA reactor primary cycle and other events connected with nuclear and radiation security for all three reactors are also specially emphasized. Owing to our research reactors, numerous theoretical, numerical and experimental methods have been developed for nuclear and other analyses and design of research and power reactors, as well as methods for radiation control and protection. (author)

  14. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    soil C in the partnership region, and to design a risk/cost effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long-term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed this quarter--a literature review/database to assess soil carbon on rangelands, and draft protocols and contracting options for soil carbon trading. To date, there has been little research on soil carbon on rangelands, and since rangeland constitutes a major land use in the Big Sky region, this is important in achieving a better understanding of terrestrial sinks. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO{sub 2} concentrations. Progress on other deliverables is noted in the PowerPoint presentations. A series of meetings held during the second quarter have laid the foundations for assessing the issues surrounding the implementation of a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. Finally, the education and outreach efforts have resulted in a comprehensive plan and process which serves as a guide for implementing the outreach activities under Phase I. While we are still working on the public website, we have made many presentations to stakeholders and policy makers, and established connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. 
In addition, we have laid plans for integration of our outreach efforts with the students, especially at the tribal colleges and at the universities involved in our partnership

  15. Compact Reactor

    International Nuclear Information System (INIS)

    Weyl's Gauge Principle of 1929 has been used to establish Weyl's Quantum Principle (WQP), which requires that the Weyl scale factor be unity. It has been shown that the WQP requires the following: quantum mechanics must be used to determine system states; the electrostatic potential must be non-singular and quantified; interactions between particles with different electric charges (i.e. electron and proton) do not obey Newton's Third Law at sub-nuclear separations; and nuclear particles may be much different than expected using the standard model. These WQP requirements lead to a potential fusion reactor wherein deuterium nuclei are preferentially fused into helium nuclei. Because the deuterium nuclei are preferentially fused into helium nuclei at temperatures and energies lower than specified by the standard model, there is no harmful radiation as a byproduct of this fusion process. Therefore, a reactor using this reaction does not need any shielding to contain such radiation. The energy released by each reaction and the absence of shielding make the deuterium-plus-deuterium-to-helium (DDH) reactor very compact when compared to other reactors, both fission and fusion types. Moreover, the potential energy output per reactor weight and the absence of harmful radiation make the DDH reactor an ideal candidate for space power. The logic by which the WQP requires the above conditions, making the prediction of DDH possible, is summarized. The details of the DDH reaction will be presented, along with the specifics of why the DDH reactor may be made to cause two deuterium nuclei to preferentially fuse to a helium nucleus. The presentation will also indicate the calculations needed to predict the reactor temperature as a function of fuel loading, reactor size and desired output, and will include the progress achieved to date

  16. Pre-Big Bang, vacuum and noncyclic cosmologies

    CERN Document Server

    Gonzalez-Mestres, Luis

    2012-01-01

    WMAP and Planck open the way to unprecedented Big Bang phenomenology, potentially allowing tests of the standard Big Bang model as well as less conventional approaches, including noncyclic pre-Big Bang cosmologies that would incorporate a new fundamental scale beyond the Planck scale and, possibly, new ultimate constituents of matter. Alternatives to standard physics can be considered from a cosmological point of view concerning vacuum structure, the nature of space-time, the origin and evolution of our Universe, the validity of quantum field theory and conventional symmetries, solutions to the cosmological constant problem, inflationary scenarios, dark matter and dark energy, the interpretation of string-like theories... Lorentz-like symmetries for the properties of matter (standard or superbradyonic) can then be naturally stable space-time configurations resulting from general cosmological scenarios that incorporate physics beyond the Planck scale and describe the formation and evolution of the present vacuum...

  17. Hungry for Rocks

    Science.gov (United States)

    2004-01-01

    This image from the Mars Exploration Rover Spirit hazard identification camera shows the rover's perspective just before its first post-egress drive on Mars. On Sunday, the 15th martian day, or sol, of Spirit's journey, engineers drove Spirit approximately 3 meters (10 feet) toward its first rock target, a football-sized, mountain-shaped rock called Adirondack (not pictured). In the foreground of this image are 'Sashimi' and 'Sushi' - two rocks that scientists considered investigating first. Ultimately, these rocks were not chosen because their rough and dusty surfaces are ill-suited for grinding.

  18. Fast reactors and nuclear nonproliferation

    International Nuclear Information System (INIS)

    Problems are discussed regarding the resistance of the fast reactor nuclear fuel cycle to proliferation risk, which arises from the potential use in military programs of the knowledge, technologies and materials gained from peaceful nuclear power applications. The advantages of fast reactors in establishing a more reliable nonproliferation regime in the closed nuclear fuel cycle are addressed, in comparison with the existing fully open and partially closed fuel cycles of thermal reactors. Advantages and shortcomings are also discussed, from the nonproliferation point of view, of starting fast reactors with plutonium from thermal reactor spent fuel and with enriched uranium fuel, followed by a gradual transition to using their own plutonium as fuel. (author)

  19. NEUTRONIC REACTOR

    Science.gov (United States)

    Anderson, H.L.

    1960-09-20

    A nuclear reactor is described comprising fissionable material dispersed in graphite blocks, helium filling the voids of the blocks and the spaces therebetween, and means other than the helium in thermal conductive contact with the graphite for removing heat.

  20. NUCLEAR REACTOR

    Science.gov (United States)

    Miller, H.I.; Smith, R.C.

    1958-01-21

    This patent relates to nuclear reactors of the type which use a liquid fuel, such as a solution of uranyl sulfate in ordinary water, which acts as the moderator. The reactor is comprised of a spherical vessel having a diameter of about 12 inches substantially surrounded by a reflector of beryllium oxide. Conventional control rods and safety rods are operated in slots in the reflector outside the vessel to control the operation of the reactor. An additional means for increasing the safety factor of the reactor, by raising the ratio of delayed neutrons to prompt neutrons, is provided; it consists of a soluble sulfate salt of beryllium dissolved in the liquid fuel in the proper proportion to obtain the desired result.

  1. Chemical Reactors.

    Science.gov (United States)

    Kenney, C. N.

    1980-01-01

    Describes a course, including content, reading list, and presentation on chemical reactors at Cambridge University, England. A brief comparison of chemical engineering education between the United States and England is also given. (JN)

  2. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of carbon sources and sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  3. NUCLEAR REACTOR

    Science.gov (United States)

    Anderson, C.R.

    1962-07-24

    A fluidized bed nuclear reactor and a method of operating such a reactor are described. In the design, means are provided for flowing a liquid moderator upwardly through the center of a bed of pellets of a neutron-fissionable material at such a rate as to obtain particulate fluidization while constraining the lower portion of the bed into a conical shape. A smooth circulation of particles rising in the center and falling at the outside of the bed is thereby established. (AEC)

  4. Nuclear reactor

    International Nuclear Information System (INIS)

    In order to reduce neutron embrittlement of the pressure vessel of an LWR, blanked-off elements with the same dimensions as the fuel elements are fitted at the edge of the reactor core. They are parallel to each other and to the edge of the reactor, taking the place of fuel rods, and are plates of neutron-absorbing material (stainless steel, boron steel, borated Al). (HP)

  5. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  6. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871

  7. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process. PMID:27068058

  8. Dual of Big-bang and Big-crunch

    OpenAIRE

    Bak, Dongsu

    2006-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills description, these are not singular at all, as the coupling goes to zero in the N=4 Super Yang-Mills theory. The cosmological sing...

  9. Soft rocks in Argentina

    Institute of Scientific and Technical Information of China (English)

    Giambastiani, Mauricio

    2014-01-01

    Soft rocks are a still fairly unexplored chapter in rock mechanics. Within this category are clastic sedimentary rocks and pyroclastic volcanic rocks of low to moderate lithification (consolidation, cementation, newly formed minerals), as well as chemical sedimentary rocks and metamorphic rocks formed by minerals with Mohs hardness less than 3.5, such as limestone, gypsum, halite and sylvite among the former, and phyllites, graphitic schist, chloritic shale, talc, etc. among the latter. They also include any type of rock that has suffered alteration processes (hydrothermal or weathering). In Argentina the study of low-strength rocks has not received much attention, despite their extensive outcrops in the Andes and their great impact on design criteria. Correlation of geomechanical properties (UCS, deformability) with physical indices (porosity, density, etc.) has shown promising results that deserve further study. There are many studies and engineering projects in Argentina in soft rock geological environments, some cited in the text (Chihuído dam, N. Kirchner dam, J. Cepernic dam, etc.) and others, such as the International Tunnel in the Province of Mendoza (Corredor Bioceánico), which will require a valuable contribution from rock mechanics. The lack of consistency between some of the physical and mechanical parameters explored in studies in the country may be due to an insufficient amount of information and/or the non-standardization of criteria for testing materials. It is understood that more and better academic and professional efforts to improve techniques will result in a better understanding of the geomechanics of weak rocks.

  10. Macro mechanical parameters' size effect of surrounding rock of Shuibuya project's underground power station

    Institute of Scientific and Technical Information of China (English)

    GUO Zhi-hua; ZHOU Chuang-bing; ZHOU Huo-ming; SHENG Qian; LENG Xian-lun

    2005-01-01

    Scale effect is one of the important aspects in research on the macro mechanical parameters of rock mass. From a new point of view, by means of laboratory and field rock mechanics tests, establishment of an E~Vp relation, classification of the engineering rock mass, numerical simulation tests, and back analysis based on displacement monitoring results for the surrounding rock, the size effect of the rock mass deformation modulus of the surrounding rock of the Shuibuya Project's underground power station was studied. It is shown that this scale effect is obvious: the rock mass deformation modulus at stabilization is 20% of that of the intact rock. Finally, the relation between the rock mass deformation modulus and the scale of research was established.

  11. CLOUD COMPUTING WITH BIG DATA: A REVIEW

    OpenAIRE

    Anjali; Er. Amandeep Kaur; Mrs. Shakshi

    2016-01-01

    Big data is a collection of huge quantities of data, and big data analytics is the process of examining such large amounts of data. Big data and cloud computing are hot issues in information technology, and big data is one of the main problems today. Researchers are focusing on how to handle huge amounts of data with cloud computing and how to achieve adequate security for big data in the cloud. To handle the big data problem, the Hadoop framework is used, in which data is fragmented and executed in parallel....
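    The fragment-and-parallelize pattern the review attributes to Hadoop can be sketched in miniature. The following is an illustrative, hypothetical pure-Python word count (not actual Hadoop code): the input is split into fragments, each fragment is mapped in parallel, and the partial results are reduced into one answer; a thread pool stands in for a cluster of nodes.

```python
# Hypothetical sketch of Hadoop-style map/reduce over fragmented data.
from collections import Counter
from multiprocessing.dummy import Pool  # thread-based stand-in for cluster nodes

def map_chunk(chunk):
    """Map step: count words within one data fragment."""
    return Counter(word for line in chunk for word in line.split())

def word_count(lines, fragments=4, workers=2):
    # Fragment the input, as Hadoop splits a file into blocks.
    size = max(1, len(lines) // fragments)
    chunks = [lines[i:i + size] for i in range(0, len(lines), size)]
    with Pool(workers) as pool:
        partials = pool.map(map_chunk, chunks)  # parallel map over fragments
    # Reduce step: merge the partial counts into a single result.
    total = Counter()
    for partial in partials:
        total += partial
    return total
```

    For example, `word_count(["big data big", "cloud data"])["big"]` returns 2; in a real Hadoop deployment the fragments would live on HDFS and the map tasks would run on separate machines.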

  12. Research of dynamic mechanical performance of cement rock

    Institute of Scientific and Technical Information of China (English)

    WANG Qiang; WANG Tong; WANG Xiang-lin

    2007-01-01

    As Daqing Oilfield is developing oil layers with big potential, the requirement for the quality of well cementation is higher than ever before. Cement rock is a brittle material containing a great number of microcracks and defects. In order to reduce damage to the cement ring and improve the sealing property at the interface, it is necessary to conduct research on the modification of the available cement rock. According to the principle of super mixed composite materials, various fillers are added to the ingredients of the cement rock, and the dynamic fracture toughness of the cement rock changes under the influence of the filler. In order to study the damage mechanism of the cement ring during perforation and carry out comprehensive experiments on preventing and resisting channelling, comprehensive experimental equipment used to simulate perforation and multifunctional equipment for testing the dynamic properties of the material were designed. An experimental study of the dynamic mechanical performance of the original and some improved cement rocks was carried out, together with experiments simulating well cementation and perforation. A standard for the dynamic mechanical performance of cement rock with good impact resistance, and the mechanical properties of some improved cement rocks, are also given.

  13. Starch Big Bang!

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Big Bang, also called "the great explosion", refers to the process by which the universe, starting at its birth from a primordial state of extremely high density and temperature, has been expanding ever since. In other words, starting from the Big Bang, our present universe gradually took shape. OK, starting from this issue, "少电" will set off a Big Bang on Weibo: a starch explosion! How exactly will it explode? I think that, having seen the layout of this page, you have already figured out most of it, right?

  14. Multiwavelength astronomy and big data

    Science.gov (United States)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage and analysis). Present astronomical databases and archives contain billions of objects observed at various wavelengths, both galactic and extragalactic, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for discovery of astronomical objects and accumulation of observational data for further analysis, interpretation, and achieving scientific results. We review the main characteristics of astronomical surveys, compare photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss the Big Data in astronomy and related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to have a possibly overall understanding on the Universe, cosmic numbers and their relationship to modern computational facilities.

  15. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  16. Big Data: Astronomical or Genomical?

    Directory of Open Access Journals (Sweden)

    Zachary D Stephens

    2015-07-01

    Full Text Available Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  17. Big Data Analytics in Healthcare

    Directory of Open Access Journals (Sweden)

    Ashwin Belle

    2015-01-01

    Full Text Available The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  18. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  19. [Algorithms, machine intelligence, big data : general considerations].

    Science.gov (United States)

    Radermacher, F J

    2015-08-01

We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, the performance and efficiency in the area of elementary arithmetic operations increases a thousand-fold every 20 years. Although machines have not yet become as "intelligent" as people in the singular sense, machines are becoming increasingly better. The Internet of Things has again helped to massively increase the efficiency of machines. Big data and suitable analytics do the same. If we let these processes simply continue, our civilization may be endangered in many instances. If the "containment" of these processes succeeds in the context of a reasonable political global governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point in time, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges.
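As a quick sanity check of the thousand-fold figure above: with the commonly assumed doubling period of roughly two years (an assumption supplied here, not a number from the abstract), 20 years gives ten doublings:

```python
# Exponential growth under Moore's Law: performance doubles every
# `doubling_period` years, so 20 years at a 2-year doubling period
# yields 2**10 = 1024, i.e. roughly a thousand-fold increase.
def growth_factor(years, doubling_period=2.0):
    return 2.0 ** (years / doubling_period)

factor = growth_factor(20)  # 1024.0
```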

  20. Analyzing Big Data with Dynamic Quantum Clustering

    CERN Document Server

    Weinstein, M; Hume, A; Sciau, Ph; Shaked, G; Hofstetter, R; Persi, E; Mehta, A; Horn, D

    2013-01-01

    How does one search for a needle in a multi-dimensional haystack without knowing what a needle is and without knowing if there is one in the haystack? This kind of problem requires a paradigm shift - away from hypothesis driven searches of the data - towards a methodology that lets the data speak for itself. Dynamic Quantum Clustering (DQC) is such a methodology. DQC is a powerful visual method that works with big, high-dimensional data. It exploits variations of the density of the data (in feature space) and unearths subsets of the data that exhibit correlations among all the measured variables. The outcome of a DQC analysis is a movie that shows how and why sets of data-points are eventually classified as members of simple clusters or as members of - what we call - extended structures. This allows DQC to be successfully used in a non-conventional exploratory mode where one searches data for unexpected information without the need to model the data. We show how this works for big, complex, real-world dataset...
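DQC itself evolves data points under a quantum-mechanical potential built from a density estimate; as a loose one-dimensional illustration of the underlying idea it exploits — that clusters are regions of high density separated by gaps — here is a toy sketch (not the DQC algorithm):

```python
# Toy density-based grouping (NOT DQC): sorted 1-D points that lie within
# `eps` of their neighbour join the same cluster; a larger gap starts a
# new cluster. Clusters thus correspond to dense runs separated by voids.
def density_clusters(points, eps=1.0):
    clusters = []
    for p in sorted(points):
        if clusters and p - clusters[-1][-1] <= eps:
            clusters[-1].append(p)  # close to previous point: same cluster
        else:
            clusters.append([p])    # gap larger than eps: new cluster
    return clusters

data = [0.1, 0.3, 0.2, 5.0, 5.2, 9.9]
clusters = density_clusters(data)  # three well-separated groups
```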

  1. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

This thesis examines social customer relationship management (social CRM) and the benefits that big data can bring to it. Social CRM is a new term, unfamiliar to many. The research is motivated by the scarcity of prior work on the topic, the complete absence of Finnish-language research, and the potentially essential role social CRM may play in companies' operations in the future. Studies of big data often concentrate on its technical side rather than on applica...

  2. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

The term "Big Data" is commonly used to describe the rapidly growing mass of information now being created. New conclusions can be drawn and new services can be developed by connecting, processing, and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues, and the current actions toward their solution, are also presented. PMID:26614539

  3. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  4. The BigBOSS Experiment

    OpenAIRE

Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra...

  5. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

The term "Big Data" is commonly used to describe the rapidly growing mass of information now being created. New conclusions can be drawn and new services can be developed by connecting, processing, and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues, and the current actions toward their solution, are also presented.

  6. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed this

  7. Issues of Eco-agricultural Industrialization for Big Qinling Eco-city Cluster

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

Firstly, the necessities of ecological agriculture development in Big Qinling Eco-city Cluster were discussed. Then, the condition endowment of eco-agricultural industrialization in Big Qinling Eco-city Cluster was analyzed from the aspects of basic conditions and differential endowment of eco-agricultural industrialization. Finally, specific forms and functional orientation of eco-agriculture were pointed out. Countermeasures for the eco-agricultural industrialization in Big Qinling Eco-city Cluster were put forward. Firstly, the government guidance and the media publicity should be strengthened. Secondly, financial support for the eco-agricultural industrialization in Big Qinling Eco-city Cluster should be enhanced. Thirdly, branding strategies of eco-agricultural products in Big Qinling Eco-city Cluster should be implemented as soon as possible.

  8. Automated rock mass characterisation using 3-D terrestrial laser scanning

    NARCIS (Netherlands)

    Slob, S.

    2010-01-01

The research investigates the possibility of using point cloud data from 3-D terrestrial laser scanning as a basis to characterise discontinuities in exposed rock masses in an automated way. Examples of discontinuities in rock are bedding planes, joints, fractures and schistosity. The characterisation

  9. Stabilized Spheromak Fusion Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, T

    2007-04-03

    The U.S. fusion energy program is focused on research with the potential for studying plasmas at thermonuclear temperatures, currently epitomized by the tokamak-based International Thermonuclear Experimental Reactor (ITER) but also continuing exploratory work on other plasma confinement concepts. Among the latter is the spheromak pursued on the SSPX facility at LLNL. Experiments in SSPX using electrostatic current drive by coaxial guns have now demonstrated stable spheromaks with good heat confinement, if the plasma is maintained near a Taylor state, but the anticipated high current amplification by gun injection has not yet been achieved. In future experiments and reactors, creating and maintaining a stable spheromak configuration at high magnetic field strength may require auxiliary current drive using neutral beams or RF power. Here we show that neutral beam current drive soon to be explored on SSPX could yield a compact spheromak reactor with current drive efficiency comparable to that of steady state tokamaks. Thus, while more will be learned about electrostatic current drive in coming months, results already achieved in SSPX could point to a productive parallel development path pursuing auxiliary current drive, consistent with plans to install neutral beams on SSPX in the near future. Among possible outcomes, spheromak research could also yield pulsed fusion reactors at lower capital cost than any fusion concept yet proposed.

  10. Characterizing and Subsetting Big Data Workloads

    OpenAIRE

    Jia, Zhen; Zhan, Jianfeng; Wang, Lei; Han, Rui; Mckee, Sally A.; Yang, Qiang; Luo, Chunjie; Li, Jingwei

    2014-01-01

    Big data benchmark suites must include a diversity of data and workloads to be useful in fairly evaluating big data systems and architectures. However, using truly comprehensive benchmarks poses great challenges for the architecture community. First, we need to thoroughly understand the behaviors of a variety of workloads. Second, our usual simulation-based research methods become prohibitively expensive for big data. As big data is an emerging field, more and more software stacks are being p...

  11. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

Full Text Available In recent years, the volume of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. Thus, companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects regarding the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  12. Big data and urban governance

    NARCIS (Netherlands)

    L. Taylor; C. Richter

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which presents challenges as to how the city’s functions are seen and

  13. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is exciting but problematic with respect to causality, atheism, and stereotypes about hunter-gatherers.

  14. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
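The extractor in the paper is far more sophisticated, but the classical von Neumann trick illustrates the basic goal of distilling unbiased bits from a biased source (a textbook sketch, not the authors' method):

```python
import random

def von_neumann_extract(bits):
    """Map bit pairs 01 -> 0, 10 -> 1 and discard 00/11. For i.i.d. input
    bits the two surviving pair types are equally likely, so the output
    is unbiased no matter how biased each individual input bit is."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

rng = random.Random(0)
# A heavily biased source: each bit is 1 with probability 0.8.
biased = [1 if rng.random() < 0.8 else 0 for _ in range(100_000)]
extracted = von_neumann_extract(biased)
mean = sum(extracted) / len(extracted)  # close to 0.5 despite the 0.8 bias
```

Note the cost: with bias p, only a fraction 2p(1-p) of the input pairs survive, which is one reason practical extractors for big sources need to be far more efficient.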

  15. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  16. YOUNG CITY,BIG PARTY

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

Shenzhen Universiade united the world's young people through sports. With none of the usual hoopla, no fireworks and no grand performances by celebrities and superstars, the Shenzhen Summer Universiade lowered the curtain on a big party for youth and college students on August 23.

  17. 1976 Big Thompson flood, Colorado

    Science.gov (United States)

    Jarrett, R. D.; Vandas, S.J.

    2006-01-01

    In the early evening of July 31, 1976, a large stationary thunderstorm released as much as 7.5 inches of rainfall in about an hour (about 12 inches in a few hours) in the upper reaches of the Big Thompson River drainage. This large amount of rainfall in such a short period of time produced a flash flood that caught residents and tourists by surprise. The immense volume of water that churned down the narrow Big Thompson Canyon scoured the river channel and destroyed everything in its path, including 418 homes, 52 businesses, numerous bridges, paved and unpaved roads, power and telephone lines, and many other structures. The tragedy claimed the lives of 144 people. Scores of other people narrowly escaped with their lives. The Big Thompson flood ranks among the deadliest of Colorado's recorded floods. It is one of several destructive floods in the United States that has shown the necessity of conducting research to determine the causes and effects of floods. The U.S. Geological Survey (USGS) conducts research and operates a Nationwide streamgage network to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson Flood. Such research and streamgage information are part of an ongoing USGS effort to reduce flood hazards and to increase public awareness.

  18. The Big European Bubble Chamber

    CERN Document Server

    1977-01-01

    The 3.70 metre Big European Bubble Chamber (BEBC), dismantled on 9 August 1984. During operation it was one of the biggest detectors in the world, producing direct visual recordings of particle tracks. 6.3 million photos of interactions were taken with the chamber in the course of its existence.

  19. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

This article presented the basic concepts of Big Data and the new field to which it gave rise, Data Science. Within Data Science, the notion of reducing the dimensionality of data was discussed and illustrated with examples.

  20. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  1. Space Weathering of Rocks

    Science.gov (United States)

    Noble, Sarah

    2011-01-01

Space weathering discussions have generally centered around soils, but exposed rocks will also incur the effects of weathering. On the Moon, rocks make up only a very small percentage of the exposed surface, and areas where rocks are exposed, like central peaks, are often among the least space weathered regions we find in remote sensing data. However, our studies of weathered Apollo 17 rocks 76015 and 76237 show that significant amounts of weathering products can build up on rock surfaces. Because rocks have much longer surface lifetimes than an individual soil grain, and thus record a longer history of exposure, we can study these products to gain a deeper perspective on the weathering process and better assess the relative importance of various weathering components on the Moon. In contrast to the lunar case, on small asteroids, like Itokawa, rocks make up a large fraction of the exposed surface. Results from the Hayabusa spacecraft at Itokawa suggest that while the low gravity does not allow for the development of a mature regolith, weathering patinas can and do develop on rock surfaces; in fact, the rocky surfaces were seen to be darker and appear spectrally more weathered than regions with finer materials. To explore how weathering of asteroidal rocks may differ from lunar weathering, a set of ordinary chondrite meteorites (H, L, and LL) which had been subjected to artificial space weathering by nanopulse laser were examined by TEM. npFe⁰-bearing glasses were ubiquitous in both the naturally weathered lunar and the artificially weathered meteorite samples.

  2. Small Places, Big Stakes

    DEFF Research Database (Denmark)

    Garsten, Christina; Sörbom, Adrienne

Ethnographic fieldwork in organizations – such as corporations, state agencies, and international organizations – often entails that the ethnographer has to rely to a large extent on meetings as the primary point of access. Oftentimes, this involves doing fieldwork in workshops, at ceremonies, and at other staged, formal events. In addition, such fieldwork tends to be both multilocal, mobile, and discontinuous. It may not provide as much of a flavour of the different local sites and a sense of ‘being there' as one would wish for. The tendency in anthropology to favour the informal, the ‘genuine' or ‘authentic' as well as the spontaneous, may leave one with a lingering feeling of having to make do with second-rate material, i.e. the formal, the superficial, and the organized. To a large extent, the staged character of the social events that are accessible to the ethnographer suggests that s/he has been...

  3. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = {lambda}/{Delta}{lambda} = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k{sub max} = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k{sub max} = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
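The quoted resolution R = λ/Δλ fixes the smallest resolvable wavelength difference at each wavelength; a quick check of what R = 3000-4800 means across the 340-1060 nm range (a sketch of the definition only, not project code):

```python
# Resolving power R = lambda / delta_lambda, so the smallest resolvable
# wavelength difference at a given wavelength is delta_lambda = lambda / R.
def resolvable_dlambda(wavelength_nm, R):
    return wavelength_nm / R

# At the blue end (340 nm, R ~ 3000) and the red end (1060 nm, R ~ 4800):
blue = resolvable_dlambda(340, 3000)   # ~0.11 nm
red = resolvable_dlambda(1060, 4800)   # ~0.22 nm
```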

  4. Tipping Point

    Medline Plus


  5. PEOPLE & POINTS

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

Rocking the 'Cross-Strait' Boat Just as people thought that cross-strait tensions couldn't get any more testy amid Taiwan leader Chen Shui-bian's efforts to hinder the development of cross-strait ties between the mainland and the island, they did when Chen stumbled upon a new secession drive. Chen announced February 27 his decision to terminate the "National Unification Council" and scrap the

  6. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

The entire business and IT world is currently talking about Big Data, a trend that surpassed Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the 'Big Data Revolutio

  7. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  8. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  9. "Big Data" - Grosse Daten, viel Wissen?

    OpenAIRE

    Hothorn, Torsten

    2015-01-01

For some years now, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  10. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  11. The BigBoss Experiment

    International Nuclear Information System (INIS)

BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (kmax = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (kmax = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  12. Experimental Breeder Reactor I Preservation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Julie Braun

    2006-10-01

    Experimental Breeder Reactor I (EBR I) is a National Historic Landmark located at the Idaho National Laboratory, a Department of Energy laboratory in southeastern Idaho. The facility is significant for its association and contributions to the development of nuclear reactor testing and development. This Plan includes a structural assessment of the interior and exterior of the EBR I Reactor Building from a preservation, rather than an engineering stand point and recommendations for maintenance to ensure its continued protection.

  13. T-S fuzzy control of nuclear reactor power based on a point-kinetics model with one delayed neutron group

    Institute of Scientific and Technical Information of China (English)

    赵伟宁; 栾秀春; 樊达宜; 周杰

    2013-01-01

    A T-S fuzzy controller was designed, based on the point-kinetics dynamic model with one delayed neutron group, to control the power of a nuclear reactor. Simulation results showed that the T-S fuzzy controller regulates the reactor power output satisfactorily.
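The one-delayed-group point-kinetics model underlying this controller can be sketched numerically. The code below is illustrative only: the parameter values (β, λ, Λ), the reactivity step, and the explicit-Euler integrator are assumptions, not taken from the paper, which closes the loop with a T-S fuzzy controller rather than a fixed reactivity input.

```python
# One-delayed-group point kinetics (all parameter values assumed, illustrative):
#   dn/dt = ((rho - beta) / Lambda) * n + lam * C
#   dC/dt = (beta / Lambda) * n - lam * C
beta = 0.0065    # delayed neutron fraction (assumed)
lam = 0.08       # precursor decay constant, 1/s (assumed)
Lambda = 1e-4    # neutron generation time, s (assumed)

def step(n, C, rho, dt):
    """Advance relative power n and precursor concentration C by dt (explicit Euler)."""
    dn = ((rho - beta) / Lambda) * n + lam * C
    dC = (beta / Lambda) * n - lam * C
    return n + dn * dt, C + dC * dt

# Start from steady state (rho = 0), then apply a small positive reactivity step.
n, C = 1.0, beta / (lam * Lambda)  # equilibrium precursor level for n = 1
rho, dt = 1e-4, 1e-5               # +10 pcm step (assumed); 10 us time step
for _ in range(int(1.0 / dt)):     # simulate 1 s of transient
    n, C = step(n, C, rho, dt)
print(f"relative power after 1 s: {n:.4f}")
```

Because of the delayed-neutron term, power rises on a period of seconds rather than on the microsecond prompt timescale, which is what makes feedback control of reactor power feasible in the first place.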

  14. Sonochemical Reactors.

    Science.gov (United States)

    Gogate, Parag R; Patil, Pankaj N

    2016-10-01

    Sonochemical reactors are based on the generation of cavitational events using ultrasound and offer immense potential for the intensification of physical and chemical processing applications. This work presents a critical analysis of the underlying mechanisms of intensification, the available reactor configurations, and an overview of the different applications exploited successfully, though mostly at laboratory scale. Guidelines are also presented for optimum selection of the important operating parameters (frequency and intensity of irradiation, temperature, and liquid physicochemical properties) as well as the geometric parameters (type of reactor configuration and the number/position of the transducers) so as to maximize the process intensification benefits. Key areas for future work to transform this successful laboratory/pilot-scale technique into commercial technology are also discussed. Overall, it has been established that sonochemical reactors hold immense potential for process intensification, leading to greener processing and economic benefits. Combined efforts from a wide range of disciplines such as materials science, physics, chemistry, and chemical engineering are required to harness these benefits in commercial-scale operation.

  15. Global Fluctuation Spectra in Big Crunch/Big Bang String Vacua

    OpenAIRE

    Craps, Ben; Ovrut, Burt A.

    2003-01-01

    We study Big Crunch/Big Bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a Big Crunch and a Big Bang cosmology, as well as additional ``whisker'' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the Big Crunch fluctuation spectrum is altered while passing through the bounce...

  16. Rock kinoekraanil / Katrin Rajasaare

    Index Scriptorium Estoniae

    Rajasaare, Katrin

    2008-01-01

    On the films portraying rock musicians shown during the film week "Rock On Screen", held at the Sõprus cinema from 7 to 11 July: "Lou Reed's Berlin", "The Future Is Unwritten: Joe Strummer", "Control: Joy Division", "Hurriganes", "Shlaager"

  17. Pop & rock / Berk Vaher

    Index Scriptorium Estoniae

    Vaher, Berk, 1975-

    2001-01-01

    Brief reviews of the new albums Redman "Malpractice", Brian Eno & Peter Schwalm "Popstars", Clawfinger "A Whole Lot of Nothing", Dario G "In Full Color", and MLTR (Michael Learns To Rock) "Blue Night"

  18. [Keeping of bears and big cats in the zoo and circus].

    Science.gov (United States)

    Rietschel, W

    2002-03-01

    The exhibition of bears and big cats in zoos and circuses regularly draws criticism, justified and unjustified, from people engaged in the prevention of cruelty to animals. The main points of critique are the holding conditions, feeding, and health status of the animals. The official veterinarian involved in supervision often needs the cooperation of a specialised zoo veterinarian. In most cases, the clinical examination of bears and big cats requires immobilisation. This article addresses some of the most common holding problems and diseases of big carnivores in zoos and circuses.

  19. Augmented Borders:Big Data and the Ethics of Immigration Control

    OpenAIRE

    Ajana, Btihaj

    2015-01-01

    Purpose – The aim of this paper is to consider some of the issues raised by the application of Big Data in the domain of border security and immigration management. Investment in border technologies and their securitisation continues to be a focal point for many governments across the globe. This paper is concerned with a particular example of such technologies, namely, “Big Data” analytics. In the past two years, the technology of Big Data has gained a remarkable popularity within a...

  20. Age and gender might influence big five factors of personality: a preliminary report in Indian population.

    Science.gov (United States)

    Magan, Dipti; Mehta, Manju; Sarvottam, Kumar; Yadav, Raj Kumar; Pandey, R M

    2014-01-01

    Age and gender are two important physiological variables which might influence the personality of an individual. The influence of age and gender on big five personality domains in an Indian population was assessed in this cross-sectional study, which included 155 subjects (female = 76, male = 79) aged 16-75 years. Big five personality factors were evaluated using the 60-item NEO-Five Factor Inventory (NEO-FFI) at a single point in time. Among the big five factors of personality, Conscientiousness was positively correlated with age (r = 0.195; P < 0.05), suggesting that personality traits might change with age and are gender-dependent.
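The correlation reported in this study is a plain Pearson coefficient, which can be computed directly from paired observations. The (age, Conscientiousness score) pairs below are made-up illustrative data, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical (age, NEO-FFI Conscientiousness score) pairs for illustration.
ages = [18, 25, 33, 41, 52, 60, 70]
scores = [30, 31, 34, 33, 37, 36, 40]
print(round(pearson_r(ages, scores), 3))
```

A value near +1 indicates scores rising with age; the study's reported r = 0.195 is a much weaker (though statistically significant) association than in this toy data.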

  1. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  2. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Nowadays companies are starting to realize the importance of using more data in order to support decisions about their strategies. Case studies have shown that "more data usually beats better algorithms", and with this companies have begun to realize that they can choose to invest in processing larger sets of data rather than in expensive algorithms. A large quantity of data is better used as a whole because of the possible correlations across the larger amount, correlations that can never be found if the data is analyzed in separate or smaller sets. A larger amount of data gives better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  3. Web Science Big Wins: Information Big Bang & Fundamental Constants

    OpenAIRE

    Carr, Les

    2010-01-01

    We take for granted a Web that provides free and unrestricted information exchange, but the Web is under pressure to change in order to respond to issues of security, commerce, criminality, privacy. Web Science needs to explain how the Web impacts society and predict the outcomes of proposed changes to Web infrastructure on business and society. Using the analogy of the Big Bang, this presentation describes how the Web spread the conditions of its initial creation throughout the whole of soci...

  4. Nástroje pro Big Data Analytics

    OpenAIRE

    Miloš, Marek

    2013-01-01

    The thesis covers the term for specific data analysis called Big Data. The thesis firstly defines the term Big Data and the need for its creation because of the rising need for deeper data processing and analysis tools and methods. The thesis also covers some of the technical aspects of Big Data tools, focusing on Apache Hadoop in detail. The later chapters contain Big Data market analysis and describe the biggest Big Data competitors and tools. The practical part of the thesis presents a way...

  5. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    Directory of Open Access Journals (Sweden)

    Jaseena K.U.

    2014-12-01

    Data has become an indispensable part of every economy, industry, organization, business function, and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage, and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This paper presents a literature review of Big Data mining and its issues and challenges, with emphasis on the distinguishing features of Big Data. It also discusses some methods to deal with Big Data.

  6. Teaching About Nature's Nuclear Reactors

    CERN Document Server

    Herndon, J M

    2005-01-01

    Naturally occurring nuclear reactors existed in uranium deposits on Earth long before Enrico Fermi built the first man-made nuclear reactor beneath Stagg Field in 1942. In the story of their discovery, there are important lessons to be learned about scientific inquiry and scientific discovery. Now, there is evidence to suggest that the Earth's magnetic field and Jupiter's atmospheric turbulence are driven by planetary-scale nuclear reactors. The subject of planetocentric nuclear fission reactors can be a jumping-off point for stimulating classroom discussions about the nature and implications of planetary energy sources and about the geomagnetic field. But more importantly, the subject can help to bring into focus the importance of discussing, debating, and challenging current thinking in a variety of areas.

  7. Determination of chlorine in silicate rocks

    Science.gov (United States)

    Peck, L.C.

    1959-01-01

    In a rapid accurate method for the determination of chlorine in silicate rocks, the rock powder is sintered with a sodium carbonate flux containing zinc oxide and magnesium carbonate. The sinter cake is leached with water, the resulting solution is filtered, and the filtrate is acidified with nitric acid. Chlorine is determined by titrating this solution with mercuric nitrate solution using sodium nitroprusside as the indicator. The titration is made in the dark with a beam of light shining through the solution. The end point of the titration is found by visually comparing the intensity of this beam of light with that of a similar beam of light in a reference solution.
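The end-point arithmetic implied by this titration can be sketched as follows. The stoichiometry (one Hg²⁺ binds two Cl⁻ to form HgCl₂) is standard mercurimetry; the titrant volume, molarity, and sample mass below are hypothetical examples, not values from the report:

```python
# Hypothetical worked example of the mercurimetric end-point calculation.
# In the titration Hg(NO3)2 + 2 Cl- -> HgCl2, each mole of titrant consumes
# two moles of chloride.
CL_MOLAR_MASS = 35.453  # g/mol

def chlorine_percent(titrant_ml, titrant_molarity, sample_g):
    """Weight percent Cl in the rock sample from the Hg(NO3)2 titration."""
    mol_hg = titrant_ml / 1000.0 * titrant_molarity
    mol_cl = 2.0 * mol_hg  # 2 Cl- per Hg2+
    return 100.0 * mol_cl * CL_MOLAR_MASS / sample_g

# Example (assumed values): 3.20 mL of 0.005 M Hg(NO3)2 on a 0.5000 g sample.
print(chlorine_percent(titrant_ml=3.20, titrant_molarity=0.005, sample_g=0.5000))
```

With these assumed inputs the sample works out to roughly 0.23 weight percent chlorine, in the range typical of silicate rocks.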

  8. Preliminary Study on weathering and pedogenesis of carbonate rock

    Institute of Scientific and Technical Information of China (English)

    王世杰; 季宏兵; 欧阳自远; 周德全; 郑乐平; 黎廷宇

    1999-01-01

    South China is the largest continuous distribution area of carbonate rock in the world. The origin of the soils overlying the bedrock carbonate rock has long been a controversial topic. Here further exploration is made by taking as examples five soil profiles developed over dolomitite and limestone bedrock, morphologically located on uplands in karst terrain in central, west and north Guizhou as well as west Hunan, and shown to be weathering profiles of carbonate rock by acid-dissolution extraction experiments on the bedrock, mineralogy, and trace element geochemistry. Field, mineralogical, and trace element geochemical characteristics of the weathering and pedogenesis of carbonate rock are discussed in detail. It is pointed out that weathering and pedogenesis of carbonate rock are important pedogenetic mechanisms for soil resources in karst areas, providing a basis for further research on the origin of the soils widely overlying bedrock carbonate rocks in South China.

  9. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of carbon sources and sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts were underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analyses underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  10. Big data is not a monolith

    CERN Document Server

    Sugimoto, Cassidy R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  11. Possible triggering of solar activity to big earthquakes (Ms ≥ 8) in faults with near west-east strike in China

    Institute of Scientific and Technical Information of China (English)

    HAN; Yanben; GUO; Zengjian; WU; Jinbing; MA; Lihua

    2004-01-01

    This paper studies the relationship between solar activity and big earthquakes (Ms ≥ 8) that occurred in China and western Mongolia. It is found that the occurrence dates of most of the big earthquakes in and near faults with west-east strike are close to the maximum years of sunspot numbers, whereas the dates of some big earthquakes not located in such faults are not close to the maximum years. We consider that this is possibly because many magnetic storms appear in the maximum years of solar activity. The magnetic storms produce anomalies in the geomagnetic field and thereby induce eddy currents in earthquake-gestating faults with near west-east strike. The gestated big earthquakes may then occur more easily, since the eddy currents heat the rocks in the faults and thus decrease the shear strength and the static friction limit of the rocks.

  12. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  13. Big Numbers in String Theory

    CERN Document Server

    Schellekens, A N

    2016-01-01

    This paper contains some personal reflections on several computational contributions to what is now known as the "String Theory Landscape". It consists of two parts. The first part concerns the origin of big numbers, and especially the number $10^{1500}$ that appeared in work on the covariant lattice construction (with W. Lerche and D. Luest). This part contains some new results. I correct a huge but inconsequential error, discuss some more accurate estimates, and compare with the counting for free fermion constructions. In particular I prove that the latter only provide an exponentially small fraction of all even self-dual lattices for large lattice dimensions. The second part of the paper concerns dealing with big numbers, and contains some lessons learned from various vacuum scanning projects.

  14. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  15. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research. PMID:26844660

  17. Beyond the Point One Zero World

    DEFF Research Database (Denmark)

    Søndergaard, Morten

    2009-01-01

    , however intuitive they may be. Art and everything we do in everyday life – and in science, for that matter - is only dealing with a point zero one world. It is as if a silent big bang occurred, alongside the catastrophe that created our universe, and scattered the elements and building stones of any...

  18. Big Bang Nucleosynthesis: An Update

    OpenAIRE

    Olive, Keith A.; Scully, Sean T.

    1995-01-01

    The current status of big bang nucleosynthesis is reviewed with an emphasis on the comparison between the observational determination of the light element abundances of D, 3He, 4He and 7Li and the predictions from theory. In particular, we present new analyses for 4He and 7Li. Implications for physics beyond the standard model are also discussed. Limits on the effective number of neutrino flavors are also updated.

  19. Big data processing with Hadoop

    OpenAIRE

    Wu, Shiqi

    2015-01-01

    Computing technology has changed the way we work, study, and live. The distributed data processing technology is one of the popular topics in the IT field. It provides a simple and centralized computing platform by reducing the cost of the hardware. The characteristics of distributed data processing technology have changed the whole industry. Hadoop, as the open source project of Apache foundation, is the most representative platform of distributed big data processing. The Hadoop distribu...

  20. BIG DATA IN BUSINESS ENVIRONMENT

    OpenAIRE

    Logica BANICA; Alina HAGIU

    2015-01-01

    In recent years, dealing with large volumes of data originating from social media sites and mobile communications, along with data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image ...

  1. BIG Data – A Review.

    OpenAIRE

    Anuradha Bhatia; Gaurav Vaswani

    2013-01-01

    As more data becomes available from an abundance of sources both within and outside, organizations are seeking to use those abundant resources to increase innovation, retain customers, and increase operational efficiency. At the same time, organizations are challenged by their end users, who are demanding greater capability and integration to mine and analyze burgeoning new sources of information. Big Data provides opportunities for business users to ask questions they never were able to ask ...

  2. Pragmatic Interaction between Big Powers

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    It is very difficult to summarize the relationship among big powers in 2004. Looking west, there existed what some media called a "small cold war" between Europe and Russia and between the United States and Russia; with regard to the "orange revolution" in Ukraine at the end of the year, a rival show was displayed between America, Europe and Russia. Looking east, a fresh scent seems to fill the air.

  3. Geochemical and petrographic data for intrusions peripheral to the Big Timber Stock, Crazy Mountains, Montana

    Science.gov (United States)

    du Bray, Edward A.; Van Gosen, Bradley S.

    2015-01-01

    The Paleocene Fort Union Formation hosts a compositionally diverse array of Eocene plugs, dikes, and sills arrayed around the Eocene Big Timber stock in the Crazy Mountains of south-central Montana. The geochemistry and petrography of the sills have not previously been characterized or interpreted. The purpose of this report is (1) to present available geochemical and petrographic data for several dozen samples of these rocks and (2) to provide a basic interpretive synthesis of these data.

  4. Reactor container

    International Nuclear Information System (INIS)

    A reactor container has a suppression chamber partitioned by concrete side walls, a reactor pedestal, and a diaphragm floor. A plurality of partition walls are disposed at intervals in the circumferential direction inside the suppression chamber, so that a plurality of independent chambers are formed within it. The partition walls extend from the bottom of the suppression chamber up to the diaphragm floor, keeping the pool water isolated in a divided state. Operation platforms are formed above the suppression chamber and connected to an access port. During maintenance, inspection, or repair, a pump is placed in an independent chamber to transfer its pool water to one or more of the other independent chambers, emptying it. (I.N.)

  5. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster. PMID:23074865

  6. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting both the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment and the challenges involved. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  7. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Baltay, S Bailey C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yeche, X Yang Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  8. NEUTRONIC REACTORS

    Science.gov (United States)

    Anderson, J.B.

    1960-01-01

    A reactor is described which comprises a tank, a plurality of coaxial steel sleeves in the tank, a mass of water in the tank, and wire grids in abutting relationship within a plurality of elongated parallel channels within the steel sleeves, the wire being provided with a plurality of bends in the same plane forming adjacent parallel sections between bends, and the sections of adjacent grids being normally disposed relative to each other.

  9. Lander and Mini Matterhorn rock

    Science.gov (United States)

    1997-01-01

    One of the two forward cameras aboard the Sojourner rover took this image of the Sagan Memorial Station on Sol 26. The angular resolution of the camera is about three milliradians (.018 degrees) per pixel, which is why the image appears grainy. The field of view of each rover camera is about 127 degrees horizontally and 90 degrees vertically. Features seen on the lander include (from left to right): the Atmospheric Structure Instrument/Meteorology Package (ASI/MET) mast with windsocks; the low-gain antenna mast; the Imager for Mars Pathfinder (IMP) on its mast at center; the disc-shaped high-gain antenna at right; and areas of deflated airbags. The dark circle on the lander body is a filtered vent that allowed air to escape during launch, and allowed the lander to repressurize upon landing. The high-gain antenna is pointed at Earth. The large rock Yogi, which Sojourner has approached and studied, is at the far right of the image. Mini Matterhorn is the large rock situated in front of the lander at left. The horizontal line at the center of the image is due to differences in light-metering for different portions of the image. The shadow of Sojourner and its antenna are visible at the lower section of the image. The antenna's shadow falls across a light-colored rock. Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is an operating division of the California Institute of Technology (Caltech). The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  10. Groundwater in granitic rocks

    International Nuclear Information System (INIS)

    A comparison of published chemical analyses of ground waters found in granitic rocks from a variety of locations shows that their compositions fall into two distinct classes. Ground waters from shallow wells and springs have a high bicarbonate/chloride ratio resulting from the neutralization of carbonic acid (dissolved CO2) by weathering reactions. The sodium, potassium, and silica released by weathering reactions drive the solutions away from equilibrium with the dominant minerals in the granites (i.e., quartz, muscovite, potassium feldspar, and albite). On the other hand, ground waters from deep wells and excavations are rich in chloride relative to bicarbonate. Their Na, K, H, and silica activities indicate that they are nearly equilibrated with the granite minerals suggesting a very long residence time in the host rock. These observations furnish the basis for a powerful tool to aid in selecting sites for radioactive waste disposal in granitic rocks. When water-bearing fractures are encountered in these rocks, a chemical analysis of the solutions contained within the fracture can determine whether the water came from the surface, i.e., is bicarbonate rich and not equilibrated, or whether it is some sort of connate water that has resided in the rock for a long period, i.e., chloride rich and equilibrated. This technique should allow immediate recognition of fracture systems in granitic radioactive waste repositories that would allow radionuclides to escape to the surface
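    The screening rule described above (high bicarbonate/chloride ratio for shallow, recently infiltrated water; chloride-rich, equilibrated compositions for deep water) can be sketched as a toy classifier. The function name, units, and cutoff below are hypothetical illustrations, not values from the article:

```python
def classify_groundwater(bicarbonate_mgL, chloride_mgL, ratio_cutoff=1.0):
    """Classify a fracture-water sample by its HCO3-/Cl- ratio.

    Hypothetical rule of thumb: a high bicarbonate/chloride ratio points to
    shallow water neutralized by weathering reactions; a low ratio points to
    chloride-rich water with a long residence time in the host rock.
    """
    ratio = bicarbonate_mgL / chloride_mgL
    return "shallow/meteoric" if ratio > ratio_cutoff else "deep/equilibrated"

print(classify_groundwater(250.0, 15.0))   # bicarbonate-rich spring sample
print(classify_groundwater(20.0, 400.0))   # chloride-rich deep-well sample
```

    In practice the cutoff would be calibrated against analyses of waters of known origin at the candidate site.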

  11. Big science transformed science, politics and organization in Europe and the United States

    CERN Document Server

    Hallonsten, Olof

    2016-01-01

    This book analyses the emergence of a transformed Big Science in Europe and the United States, using both historical and sociological perspectives. It shows how technology-intensive natural sciences grew to a prominent position in Western societies during the post-World War II era, and how their development cohered with both technological and social developments. At the helm of post-war science are large-scale projects, primarily in physics, which receive substantial funds from the public purse. Big Science Transformed shows how these projects, popularly called 'Big Science', have become symbols of progress. It analyses changes to the political and sociological frameworks surrounding publicly-funded science, and their impact on a number of new accelerator and reactor-based facilities that have come to prominence in materials science and the life sciences. Interdisciplinary in scope, this book will be of great interest to historians, sociologists and philosophers of science.

  12. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    OpenAIRE

    Ms. Rashmi Singh; Dr. H. K. Verma

    2012-01-01

    The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method that relies on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, the Big Bang–Big Crunch algorithm is used for the design of linear phase finite impulse response (FIR) filters. The fitness function used in the experiments is based on the mean squared error between the actual and the ideal filter responses. This paper presents the plot of magnitude response ...
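    The BB–BC loop (a random "Big Bang" burst of candidates, a fitness-weighted "Big Crunch" contraction to a centre of mass, then re-expansion with a shrinking radius) can be sketched for a linear-phase FIR design as follows. This is a minimal illustration under assumed parameters (tap count, population size, band edge, iteration schedule), not the paper's implementation:

```python
import numpy as np

def bb_bc_fir(num_taps=11, pop=40, iters=60, seed=0):
    """Minimal Big Bang-Big Crunch sketch for a linear-phase low-pass FIR filter.

    Illustrative parameters only; the fitness is the mean squared error
    between the filter's magnitude response and an ideal brick-wall target.
    """
    rng = np.random.default_rng(seed)
    half = num_taps // 2 + 1                  # symmetric taps: optimize one half
    w = np.linspace(0.0, np.pi, 128)
    ideal = (w <= 0.4 * np.pi).astype(float)  # ideal low-pass magnitude target

    def magnitude(h_half):
        h = np.concatenate([h_half, h_half[-2::-1]])      # even symmetry -> linear phase
        return np.abs(np.exp(-1j * np.outer(w, np.arange(num_taps))) @ h)

    def mse(h_half):
        return float(np.mean((magnitude(h_half) - ideal) ** 2))

    cands = rng.uniform(-0.5, 0.5, (pop, half))           # Big Bang: random burst
    for k in range(1, iters + 1):
        fit = np.array([mse(c) for c in cands])
        weights = 1.0 / (fit + 1e-12)                     # better fitness -> more mass
        center = weights @ cands / weights.sum()          # Big Crunch: centre of mass
        cands = center + (0.5 / k) * rng.standard_normal((pop, half))
        cands[0] = center                                 # re-evaluate the centre too
    return center, mse(center)

h_half, err = bb_bc_fir()
```

    With these settings the loop contracts toward a filter whose response error is well below that of the trivial all-zero filter (whose MSE equals the passband fraction, about 0.4 here).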

  13. Rock mechanics research awards

    Science.gov (United States)

    Wagner, John E.

    The U.S. National Committee for Rock Mechanics, at its June 1983 annual meeting, adopted three actions to enhance the competition and public awareness of its annual awards program for rock mechanics papers. It will issue a call for nominations of outstanding papers; it will request participating societies to announce the names of award winners and the titles of papers; and it will publish an abstract of the winning papers in the proceedings of the annual U.S. Rock Mechanics Symposium in the year following the awards. The competition is open to papers by U.S. residents, or students in a U.S. school, published in an English-language publication normally available in the United States. The following authors and papers are the 1983 award winners:

  14. Fuel performance improvement program. Semiannual progress report, April-September 1980

    International Nuclear Information System (INIS)

    Progress on the Fuel Performance Improvement Program's fuel test and demonstration irradiations is reported for the period April-September, 1980. Included are results of out-of-reactor experiments with zircaloy cladding on the iodine assisted stress corrosion cracking mechanism. Preliminary results from the first eight ramp tests performed in the Halden Boiling Water Reactor are reported. The status of demonstration fuel irradiations in the Big Rock Point Reactor is described

  15. Digital carbonate rock physics

    Science.gov (United States)

    Saenger, Erik H.; Vialle, Stephanie; Lebedev, Maxim; Uribe, David; Osorno, Maria; Duda, Mandy; Steeb, Holger

    2016-08-01

    Modern estimation of rock properties combines imaging with advanced numerical simulations, an approach known as digital rock physics (DRP). In this paper we suggest a specific segmentation procedure of X-ray micro-computed tomography data with two different resolutions in the µm range for two sets of carbonate rock samples. These carbonates were already characterized in detail in a previous laboratory study which we complement with nanoindentation experiments (for local elastic properties). In a first step a non-local mean filter is applied to the raw image data. We then apply different thresholds to identify pores and solid phases. Because of a non-negligible amount of unresolved microporosity (micritic phase) we also define intermediate threshold values for distinct phases. Based on this segmentation we determine porosity-dependent values for effective P- and S-wave velocities as well as for the intrinsic permeability. For effective velocities we confirm an observed two-phase trend reported in another study using a different carbonate data set. As an upscaling approach we use this two-phase trend as an effective medium approach to estimate the porosity-dependent elastic properties of the micritic phase for the low-resolution images. The porosity measured in the laboratory is then used to predict the effective rock properties from the observed trends for a comparison with experimental data. The two-phase trend can be regarded as an upper bound for elastic properties; the use of the two-phase trend for low-resolution images led to a good estimate for a lower bound of effective elastic properties. Anisotropy is observed for some of the considered subvolumes, but seems to be insignificant for the analysed rocks at the DRP scale. Because of the complexity of carbonates we suggest using DRP as a complementary tool for rock characterization in addition to classical experimental methods.
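    The three-phase thresholding step described above can be illustrated on a toy grey-level volume. The random volume, the two thresholds, and the assumed 50% microporosity of the micritic phase are hypothetical stand-ins for the paper's calibrated segmentation, shown only to make the bookkeeping concrete:

```python
import numpy as np

# Toy stand-in for a filtered micro-CT grey-level volume (0 = dark pore,
# 255 = bright solid); real data would come from the filtered image stack.
rng = np.random.default_rng(0)
volume = rng.integers(0, 256, size=(64, 64, 64))

t1, t2 = 80, 170                          # hypothetical grey-level thresholds
pore = volume < t1                        # resolved macro-pores
micrite = (volume >= t1) & (volume < t2)  # intermediate, microporous phase
solid = volume >= t2                      # dense mineral phase

phi_macro = pore.mean()                   # resolved porosity fraction
phi_total = phi_macro + 0.5 * micrite.mean()   # assume 50% porosity in micrite
```

    Every voxel lands in exactly one phase, so the three fractions sum to one; the assumed intra-micrite porosity is the knob that a lab-measured total porosity would constrain.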

  16. Rock engineering applications, 1991

    International Nuclear Information System (INIS)

    This book demonstrates how to apply the theories and principles of rock engineering to actual engineering and construction tasks. It features insights on geology for mining and tunnelling applications. It is a practical resource that focuses on the latest technological innovations and examines up-to-date procedures used by engineers for coping with complex rock conditions. The authors also discuss questions related to underground space, from design approaches to underground housing and storage. And they cover the monitoring of storage caverns for liquid and gaseous products or toxic and radioactive wastes

  17. Rock Hellsinki, Marketing Research

    OpenAIRE

    Todd, Roosa; Jalkanen, Katariina

    2013-01-01

    This paper presents qualitative research on rock and heavy metal music tourism in Helsinki, the capital city of Finland. As Helsinki can be considered a city of contrasts, a quiet nature city mixed with urban activities, it is important to also use the potential of loud rock and heavy metal music as a counterpoint to that silence. Finland is known abroad for bands such as HIM, Nightwish, Korpiklaani and Children of Bodom, so it would make sense to utilize these in the tourism sector as well. The...

  18. Session: Hard Rock Penetration

    Energy Technology Data Exchange (ETDEWEB)

    Tennyson, George P. Jr.; Dunn, James C.; Drumheller, Douglas S.; Glowka, David A.; Lysne, Peter

    1992-01-01

    This session at the Geothermal Energy Program Review X: Geothermal Energy and the Utility Market consisted of five presentations: ''Hard Rock Penetration - Summary'' by George P. Tennyson, Jr.; ''Overview - Hard Rock Penetration'' by James C. Dunn; ''An Overview of Acoustic Telemetry'' by Douglas S. Drumheller; ''Lost Circulation Technology Development Status'' by David A. Glowka; ''Downhole Memory-Logging Tools'' by Peter Lysne.

  19. Fracturing tests on reservoir rocks: Analysis of AE events and radial strain evolution

    CERN Document Server

    Pradhan, S; Fjær, E; Stenebråten, J; Lund, H K; Sønstebø, E F; Roy, S

    2015-01-01

    Fracturing in reservoir rocks is an important issue for the petroleum industry, as productivity can be enhanced by a controlled fracturing operation. Fracturing also has a big impact on CO2 storage, geothermal installations and gas production at and from reservoir rocks. Therefore, understanding the fracturing behavior of different types of reservoir rocks is a basic need for planning field operations towards these activities. In our study, the fracturing of rock samples is monitored by Acoustic Emission (AE) and post-experiment Computer Tomography (CT) scans. The fracturing experiments have been performed on hollow cylinder cores of different rocks - sandstones and chalks. Our analysis shows that the amplitudes and energies of acoustic events clearly indicate initiation and propagation of the main fractures. The amplitudes of AE events follow an exponential distribution while the energies follow a power law distribution. Time-evolution of the radial strain measured in the fracturing-test will later be comp...
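    The two reported distributions (exponential amplitudes, power-law energies) can be checked with standard maximum-likelihood estimators. The sketch below runs the estimators on synthetic data; the scale, exponent, and catalogue size are invented for illustration and are not values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000                                   # synthetic AE catalogue size

# Amplitudes: exponential distribution, as reported for the AE events.
amps = rng.exponential(scale=2.0, size=n)
lam_hat = 1.0 / amps.mean()                # MLE of the exponential rate

# Energies: power law p(E) ~ E^-alpha for E >= E_min, sampled via inverse CDF.
e_min, alpha_true = 1.0, 2.5
energies = e_min * (1.0 - rng.uniform(size=n)) ** (-1.0 / (alpha_true - 1.0))
alpha_hat = 1.0 + n / np.log(energies / e_min).sum()   # Hill/MLE exponent
```

    On a real catalogue the same two lines recover the rate and the power-law exponent once a lower amplitude/energy cutoff has been chosen.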

  20. A reduced-boron OPR1000 core based on the BigT burnable absorber

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Hwan Yeal; Yahya, Mohd-Syukri; Kim, Yong Hee [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2016-04-15

    Reducing the critical boron concentration in a commercial pressurized water reactor core offers many advantages in view of safety and economics. This paper presents a preliminary investigation of a reduced-boron pressurized water reactor core to achieve a clearly negative moderator temperature coefficient at hot zero power using the newly-proposed 'Burnable absorber-Integrated Guide Thimble' (BigT) absorbers. The reference core is based on a commercial OPR1000 equilibrium configuration. The reduced-boron OPR1000 configuration was determined by simply replacing commercial gadolinia-based burnable absorbers with the optimized BigT-loaded design. The equilibrium cores in this study were directly searched via repetitive Monte Carlo depletion calculations until convergence. The results demonstrate that, with the same fuel management scheme as in the reference core, application of the BigT absorbers can effectively reduce the critical boron concentration at the beginning of cycle by about 65 ppm. More crucially, the analyses indicate promising potential of the reduced-boron OPR1000 core with the BigT absorbers, as its moderator temperature coefficient at the beginning of cycle is clearly more negative and all other vital neutronic parameters are within practical safety limits. All simulations were completed using the Monte Carlo Serpent code with the ENDF/B-VII.0 library.

  1. Turning Point

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Moves from the United States and North Korea give new impetus to nuclear disablement and U.S.-North Korea ties. The tense situation surrounding denuclearization on the Korean Peninsula has reached a turning point. On

  2. Military Simulation Big Data: Background, State of the Art, and Challenges

    Directory of Open Access Journals (Sweden)

    Xiao Song

    2015-01-01

    Full Text Available Big data technology has undergone rapid development and attained great success in the business field. Military simulation (MS is another application domain producing massive datasets created by high-resolution models and large-scale simulations. It is used to study complicated problems such as weapon systems acquisition, combat analysis, and military training. This paper firstly reviewed several large-scale military simulations producing big data (MS big data for a variety of usages and summarized the main characteristics of result data. Then we looked at the technical details involving the generation, collection, processing, and analysis of MS big data. Two frameworks were also surveyed to trace the development of the underlying software platform. Finally, we identified some key challenges and proposed a framework as a basis for future work. This framework considered both the simulation and big data management at the same time based on layered and service oriented architectures. The objective of this review is to help interested researchers learn the key points of MS big data and provide references for tackling the big data problem and performing further research.

  3. Nuclear research reactors

    International Nuclear Information System (INIS)

    Data on nuclear research reactors around the world, retrieved from the Sien (Nuclear and Energetic Information System) data bank, are presented. The information is organized in tables as follows: research reactors by country; research reactors by type; research reactors by fuel; and research reactors by purpose. (E.G.)

  4. Rock burst laws in deep mines based on combined model of membership function and dominance-based rough set

    Institute of Scientific and Technical Information of China (English)

    刘浪; 陈忠强; 王李管

    2015-01-01

    Rock bursts are spontaneous, violent fractures of rock that can occur in deep mines, and the likelihood of a rock burst occurring increases as the depth of the mine increases. Rock bursts are also affected by the compressive strength, tensile strength, tangential strength, elastic energy index, etc. of the rock, and the relationship between these factors and rock bursts in deep mines is difficult to analyze from a quantitative point of view. Typical rock burst instances were collected as a sample set, and a membership function was introduced to process the discrete values of these factors, with the discrete factors as condition attributes and rock burst situations as decision attributes. Dominance-based rough set theory was used to generate preference rules for rock bursts, and rock burst laws in deep mines were then analyzed under the preference relation. The results show that this model for analyzing rock burst laws in deep mines is reasonable and feasible, and that the prediction results are more scientific.
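    The membership-function step described above is commonly realized with a triangular function mapping a raw factor value (e.g. compressive strength) to a degree in [0, 1]. The breakpoints below are hypothetical, not the paper's calibrated values:

```python
def tri_membership(x, a, b, c):
    """Triangular membership: 0 outside (a, c), rising linearly to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Degree to which a 120 MPa compressive-strength reading belongs to a "medium"
# class defined by the (hypothetical) breakpoints 80, 140, 200 MPa.
degree = tri_membership(120.0, 80.0, 140.0, 200.0)
```

    Each rock-burst factor would get its own set of breakpoints, and the resulting degrees become the condition-attribute values fed to the dominance-based rough set analysis.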

  5. Nuclear reactor

    International Nuclear Information System (INIS)

    A nuclear reactor is described in which the core components, including fuel-rod assemblies, control-rod assemblies, fertile rod-assemblies, and removable shielding assemblies, are supported by a plurality of separate inlet modular units. These units are referred to as inlet module units to distinguish them from the modules of the upper internals of the reactor. The modular units are supported, each removable independently of the others, in liners in the supporting structure for the lower internals of the reactor. The core assemblies are removably supported in integral receptacles or sockets of the modular units. The liners, units, sockets and assemblies have inlet openings for entry of the fluid. The modular units are each removably mounted in the liners with fluid seals interposed between the opening in the liner and inlet module into which the fluid enters in the upper and lower portion of the liner. Each assembly is similarly mounted in a corresponding receptacle with fluid seals interposed between the openings where the fluid enters in the lower portion of the receptacle or fitting closely in these regions. As fluid flows along each core assembly a pressure drop is produced along the fluid so that the fluid which emerges from each core assembly is at a lower pressure than the fluid which enters the core assembly. However because of the seals interposed in the mountings of the units and assemblies the pressures above and below the units and assemblies are balanced and the units are held in the liners and the assemblies are held in the receptacles by their weights as they have a higher specific gravity than the fluid. The low-pressure spaces between each module and its liner and between each core assembly and its module is vented to the low-pressure regions of the vessel to assure that fluid which leaks through the seals does not accumulate and destroy the hydraulic balance

  6. Nuclear reactor physics course for reactor operators

    International Nuclear Information System (INIS)

    The education and training of nuclear reactor operators is important to guarantee the safe operation of present and future nuclear reactors. Therefore, a course on basic 'Nuclear reactor physics' in the initial and continuous training of reactor operators has proven to be indispensable. In most countries, such training also results from the direct request of the safety authorities to assure the high level of competence of the staff in nuclear reactors. The aim of the basic course on 'Nuclear Reactor Physics for reactor operators' is to provide the reactor operators with a basic understanding of the main concepts relevant to nuclear reactors. Given the education level of the participants, mathematical derivations are simplified and reduced to a minimum, but not completely eliminated

  7. Quantization of Big Bang in crypto-Hermitian Heisenberg picture

    CERN Document Server

    Znojil, Miloslav

    2015-01-01

    A background-independent quantization of the Universe near its Big Bang singularity is considered using a drastically simplified toy model. Several conceptual issues are addressed. (1) The observable spatial-geometry characteristics of our empty-space expanding Universe is sampled by the time-dependent operator $Q=Q(t)$ of the distance between two space-attached observers (``Alice and Bob''). (2) For any pre-selected guess of the simple, non-covariant time-dependent observable $Q(t)$ one of the Kato's exceptional points (viz., $t=\tau_{(EP)}$) is postulated {\em real-valued}. This enables us to treat it as the time of Big Bang. (3) During our ``Eon'' (i.e., at all $t>\tau_{(EP)}$) the observability status of operator $Q(t)$ is mathematically guaranteed by its self-adjoint nature with respect to an {\em ad hoc} Hilbert-space metric $\Theta(t)$ ...

  8. Simulating Smoke Filling in Big Halls by Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    W. K. Chow

    2011-01-01

    Full Text Available Many tall halls of big space volume have been built, and are to be built, in many construction projects in the Far East, particularly Mainland China, Hong Kong, and Taiwan. Smoke is identified to be the key hazard to handle. Consequently, smoke exhaust systems are specified in the fire codes in those areas. An update on applying Computational Fluid Dynamics (CFD) in smoke exhaust design will be presented in this paper. Key points to note in CFD simulations on smoke filling due to a fire in a big hall will be discussed. Mathematical aspects concerning the discretization of the partial differential equations and the algorithms for solving the velocity-pressure linked equations are briefly outlined. Results predicted by CFD with different free boundary conditions are compared with those of room fire tests. Standards on grid size, relaxation factors, convergence criteria, and false diffusion should be set up for numerical experiments with CFD.

  9. An Overview of Big Data Privacy Issues

    OpenAIRE

    Patrick Hung

    2013-01-01

    Big data is the term for a collection of large and complex datasets from different sources that are difficult to process using traditional data management and processing applications. In these datasets, some information must be kept secret from others. On the other hand, some information has to be released for providing information or big data analytical services. The research challenge is how to protect the private information in the context of big data. Privacy is described by the ability ...

  10. Social Big Data and Privacy Awareness

    OpenAIRE

    Sang, Lin

    2015-01-01

    Based on the rapid development of Big Data, data from online social networks have become a major part of it. Big data has made social networks data-oriented rather than social-oriented. Taking this into account, this dissertation presents a qualitative study researching how the data-oriented social network affects its users' privacy management today. Within this dissertation, an overview of Big Data and privacy issues on the social network is presented as a background study. ...

  11. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  12. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  13. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  14. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  15. Big Crater as Viewed by Pathfinder Lander - Anaglyph

    Science.gov (United States)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft. The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft. This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be. The anaglyph view of Big Crater was produced by

  16. Rocking and Rolling Rattlebacks

    Science.gov (United States)

    Cross, Rod

    2013-01-01

    A rattleback is a well-known physics toy that has a preferred direction of rotation. If it is spun about a vertical axis in the "wrong" direction, it will slow down, start rocking from end to end, and then spin in the opposite (i.e. preferred) direction. Many articles have been written about rattlebacks. Some are highly mathematical and…

  17. Umhlanga Rocks coastal defense

    NARCIS (Netherlands)

    De Jong, L.; De Jong, B.; Ivanova, M.; Gerritse, A.; Rietberg, D.; Dorrepaal, S.

    2014-01-01

    The eThekwini coastline is a vulnerable coastline subject to chronic erosion and damage due to sea level rise. In 2007 a severe storm caused major physical and economic damage along the coastline, proving the need for action. Umhlanga Rocks is a densely populated premium holiday destination on the e

  18. Rock solid energy solution

    International Nuclear Information System (INIS)

    Scientists believe naturally radioactive rocks below the earth's surface could provide an inexhaustible and environmentally friendly power source. And Australia could be a geological hotbed should the concept get off the ground. Despite the scale, the concept itself is simple. The Earth's reserves of heat in naturally radioactive rocks could provide an effectively inexhaustible and environmentally friendly source of power. No greenhouse gas emissions, little water usage and minimal pollution. Natural hot springs are already used to make power in some parts of the world, such as Iceland, but creating artificial hot springs by drilling deep into granite - the hardest of rocks - is a much more ambitious concept. One cubic kilometre of hot granite at 250 deg C has the stored energy equivalent of 40 million barrels of oil. In a nutshell, water is pumped into the hot zone - some 3km to 5km down in Australian conditions - and spreads through a 'reservoir' of hot, cracked rocks. Once superheated, it returns to the surface as steam through a separate production well to spin turbines and generate electricity. The water can then be recaptured and reused, with test sites around the world recovering up to around 90 per cent

  19. Rock-hard coatings

    NARCIS (Netherlands)

    Muller, M.

    2007-01-01

    Aircraft jet engines have to be able to withstand infernal conditions. Extreme heat and bitter cold tax coatings to the limit. Materials expert Dr Ir. Wim Sloof fits atoms together to develop rock-hard coatings. The latest invention in this field is known as ceramic matrix composites. Sloof has sign

  20. Slippery Rock University

    Science.gov (United States)

    Arnhold, Robert W.

    2008-01-01

    Slippery Rock University (SRU), located in western Pennsylvania, is one of 14 state-owned institutions of higher education in Pennsylvania. The university has a rich tradition of providing professional preparation programs in special education, therapeutic recreation, physical education, and physical therapy for individuals with disabilities.…

  1. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users

  2. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background for working with big data, introducing novel optimization algorithms and codes capable of operating in the big data setting, along with applications of big data optimization, for both academics and practitioners, to the benefit of society, industry, academia, and government. Presenting applications across a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  3. Big Data Analytics Using Cloud and Crowd

    OpenAIRE

    Allahbakhsh, Mohammad; Arbabi, Saeed; Motahari-Nezhad, Hamid-Reza; Benatallah, Boualem

    2016-01-01

    The increasing application of social and human-enabled systems in people's daily life, on the one hand, and the fast growth of mobile and smart phone technologies, on the other, have resulted in the generation of tremendous amounts of data, also referred to as big data, and a need for analyzing these data, i.e., big data analytics. Recently a trend has emerged to incorporate human computing power into big data analytics to solve some shortcomings of existing big data analytics such as dealing with ...

  4. The big de Rham–Witt complex

    DEFF Research Database (Denmark)

    Hesselholt, Lars

    2015-01-01

    This paper gives a new and direct construction of the multi-prime big de Rham–Witt complex, which is defined for every commutative and unital ring; the original construction by Madsen and myself relied on the adjoint functor theorem and accordingly was very indirect. The construction given here… It is the existence of these divided Frobenius operators that makes the new construction of the big de Rham–Witt complex possible. It is further shown that the big de Rham–Witt complex behaves well with respect to étale maps, and finally, the big de Rham–Witt complex of the ring of integers is explicitly evaluated…

  5. Urgent Call for Nursing Big Data.

    Science.gov (United States)

    Delaney, Connie W

    2016-01-01

    The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to ensure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative - Nursing Knowledge and Big Data Science - includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference. PMID:27332330

  6. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang' an University School of Information Engineering, Xi' an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi' an (China)

    2014-10-06

    The big data environment creates the data conditions needed to improve the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the characteristics of big data and of traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this architecture helps guarantee safe and efficient traffic operation and enables more intelligent and personalized traffic information services for users.

  7. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have a basic knowledge of R.

  8. Fitting ERGMs on big networks.

    Science.gov (United States)

    An, Weihua

    2016-09-01

    The exponential random graph model (ERGM) has become a valuable tool for modeling social networks. In particular, ERGM provides great flexibility to account for both covariate effects on tie formation and endogenous network formation processes. However, there are both conceptual and computational issues in fitting ERGMs on big networks. This paper describes a framework and a series of methods (based on existing algorithms) to address these issues. It also outlines the advantages and disadvantages of the methods and the conditions to which they are most applicable. Selected methods are illustrated through examples. PMID:27480375

  9. Big Brother Has Bigger Say

    Institute of Scientific and Technical Information of China (English)

    Yang Wei

    2009-01-01

    156 delegates from all walks of life in Guangdong province composed the Guangdong delegation to the NPC this year. The import and export value of Guangdong makes up one-third of the national total and accounts for one-eighth of national economic growth. Guangdong province has maintained its top spot in import and export value among China's many provinces and cities for several years, and is commonly referred to as "Big Brother". At the same time, it is the region of China hit hardest by the global financial crisis.

  10. Associations between the Big Five personality traits and insomnia

    OpenAIRE

    Aronen, Aino

    2015-01-01

    The purpose of the study was to examine whether the Big Five personality traits (neuroticism, extraversion, conscientiousness, openness to experience, and agreeableness) are associated with symptoms of insomnia, namely difficulty falling asleep, waking during the night, difficulty staying asleep, and waking up tired after sleep of normal length. According to theories of insomnia, high neuroticism, low extraversion, low conscientiousness, and low agreeableness can...

  11. Small intrinsically safe reactor implications

    International Nuclear Information System (INIS)

    A review of the history of nuclear power shows that its peaceful uses are children of the war-like atom. The importance of growth in a shielded environment is emphasized so that the advantages of nuclear power can be fully exploited. Nuclear power reactors must be safe for their assimilation into society from the points of view of both technology and social psychology. The ISR/ISER is identified as a missing link in the development of nuclear power reactors from this perspective, and its international development and utilization are advocated, unleashed from concerns over politicization, safety, and proliferation

  12. The geology and tectonic significance of the Big Creek Gneiss, Sierra Madre, southeastern Wyoming

    Science.gov (United States)

    Jones, Daniel S.

    The Big Creek Gneiss, southern Sierra Madre, southeastern Wyoming, is a heterogeneous suite of upper-amphibolite-facies metamorphic rocks intruded by post-metamorphic pegmatitic granite. The metamorphic rocks consist of three individual protolith suites: (1) pre- to syn-1780-Ma supracrustal rocks including clastic metasedimentary rocks, calc-silicate paragneiss, and metavolcanic rocks; (2) a bimodal intrusive suite composed of metagabbro and granodiorite-tonalite gneiss; and (3) a younger bimodal suite composed of garnet-bearing metagabbronorite and coarse-grained granitic gneiss. Zircon U-Pb ages from the Big Creek Gneiss demonstrate that: (1) the average age of detrital zircons in the supracrustal rocks is ~1805 Ma, requiring a significant source of 1805-Ma (or older) detritus during deposition, possibly representing an older phase of arc magmatism; (2) the older bimodal igneous suite crystallized at ~1780 Ma, correlative with arc-derived rocks of the Green Mountain Formation; (3) the younger bimodal igneous suite crystallized at ~1763 Ma, coeval with the extensional(?) Horse Creek anorthosite complex in the Laramie Mountains and the Sierra Madre Granite batholith in the southwestern Sierra Madre; (4) Big Creek Gneiss rocks were tectonically buried, metamorphosed, and partially melted at ~1750 Ma, coeval with the accretion of the Green Mountain arc to the Wyoming province along the Cheyenne belt; (5) the posttectonic granite and pegmatite bodies throughout the Big Creek Gneiss crystallized at ~1630 Ma and are correlative with the 'white quartz monzonite' of the south-central Sierra Madre. Geochemical analysis of the ~1780-Ma bimodal plutonic suite demonstrates a clear arc affinity for the mafic rocks, consistent with a subduction environment origin. The granodioritic rocks of this suite were not derived by fractional crystallization from coeval mafic magmas, but are instead interpreted as melts of lower-crustal mafic material. This combination of mantle…

  13. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  14. Environmental Consequences of Big Nasty Impacts on the Early Earth

    Science.gov (United States)

    Zahnle, Kevin

    2015-01-01

    The geological record of the Archean Earth is spattered with impact spherules from a dozen or so major cosmic collisions involving Earth and asteroids or comets (Lowe, Byerly 1986, 2015). Extrapolation of the documented deposits suggests that most of these impacts were as big as or bigger than the Chicxulub event that famously ended the reign of the thunder lizards. As the Archean impacts were greater, the environmental effects were also greater. The number and magnitude of the impacts is bounded by the lunar record. There are no lunar craters bigger than Chicxulub that date to Earth's mid-to-late Archean. Chance dictates that Earth experienced no more than approximately 10 impacts bigger than Chicxulub between 3.5 and 2.5 billion years ago, the biggest of which were approximately 30-100 times more energetic, comparable to the Orientale impact on the Moon (1x10^26 joules). To quantify the thermal consequences of big impacts on old Earth, we model the global flow of energy from the impact into the environment. The model presumes that a significant fraction of the impact energy goes into ejecta that interact with the atmosphere. Much of this energy is initially in rock vapor, melt, and high speed particles. (i) The upper atmosphere is heated by ejecta as they reenter the atmosphere. The mix of hot air, rock vapor, and hot silicates cools by thermal radiation. Rock raindrops fall out as the upper atmosphere cools. (ii) The energy balance of the lower atmosphere is set by radiative exchange with the upper atmosphere and with the surface, and by evaporation of seawater. Subsequent cooling is governed by condensation of water vapor. (iii) The oceans are heated by thermal radiation and rock rain and cooled by evaporation. Surface waters become hot and salty; if a deep ocean remains it is relatively cool. Subsequently water vapor condenses to replenish the oceans with hot fresh water (how fresh depending on continental weathering, which might be rather rapid)…

  15. Scaling Thomson scattering to big machines

    International Nuclear Information System (INIS)

    Thomson scattering is a widely used diagnostic tool for local measurement of both electron temperature and electron density. It is used for both low and high temperature plasmas and it is a key diagnostic on all fusion devices. The extremely low cross-section of the reaction increases the complexity of the design. Since the early days of fusion, when a simple single point measurement was used, the design moved to a multi-point system with a large number of spatial points, LIDAR system or high repetition Thomson scattering diagnostic which are used nowadays. The initial low electron temperature approximation has been replaced by the full relativistic approach necessary for large devices as well as for ITER with expected higher plasma temperature. Along the way, the different development needs and the issues that exist need to be addressed to ensure that the technique is developed sufficiently to handle challenges of the bigger devices of the future as well as current developments needed for ITER. For large devices, the achievement of the necessary temperature range represents an important task. Both high and low temperatures can be measured, however, a large dynamic range makes the design difficult as size of detector and dynamic range are linked together. Therefore, the requirements of the new devices are extending the boundaries of these parameters. Namely, ITER presents challenges as access is also difficult but big efforts have been made to cope with this. This contribution contains a broad review of Thomson scattering diagnostics used in current devices together with comments on recent progress and speculation regarding future developments needed for future large scale devices

  16. Scaling Thomson scattering to big machines

    Science.gov (United States)

    Bílková, P.; Walsh, M.; Böhm, P.; Bassan, M.; Aftanas, M.; Pánek, R.

    2016-03-01

    Thomson scattering is a widely used diagnostic tool for local measurement of both electron temperature and electron density. It is used for both low and high temperature plasmas and it is a key diagnostic on all fusion devices. The extremely low cross-section of the reaction increases the complexity of the design. Since the early days of fusion, when a simple single point measurement was used, the design moved to a multi-point system with a large number of spatial points, LIDAR system or high repetition Thomson scattering diagnostic which are used nowadays. The initial low electron temperature approximation has been replaced by the full relativistic approach necessary for large devices as well as for ITER with expected higher plasma temperature. Along the way, the different development needs and the issues that exist need to be addressed to ensure that the technique is developed sufficiently to handle challenges of the bigger devices of the future as well as current developments needed for ITER. For large devices, the achievement of the necessary temperature range represents an important task. Both high and low temperatures can be measured, however, a large dynamic range makes the design difficult as size of detector and dynamic range are linked together. Therefore, the requirements of the new devices are extending the boundaries of these parameters. Namely, ITER presents challenges as access is also difficult but big efforts have been made to cope with this. This contribution contains a broad review of Thomson scattering diagnostics used in current devices together with comments on recent progress and speculation regarding future developments needed for future large scale devices.

  17. Hybrid adsorptive membrane reactor

    Science.gov (United States)

    Tsotsis, Theodore T.; Sahimi, Muhammad; Fayyaz-Najafi, Babak; Harale, Aadesh; Park, Byoung-Gi; Liu, Paul K. T.

    2011-03-01

    A hybrid adsorbent-membrane reactor in which the chemical reaction, membrane separation, and product adsorption are coupled. Also disclosed are a dual-reactor apparatus and a process using the reactor or the apparatus.

  18. Hybrid adsorptive membrane reactor

    Science.gov (United States)

    Tsotsis, Theodore T. (Inventor); Sahimi, Muhammad (Inventor); Fayyaz-Najafi, Babak (Inventor); Harale, Aadesh (Inventor); Park, Byoung-Gi (Inventor); Liu, Paul K. T. (Inventor)

    2011-01-01

    A hybrid adsorbent-membrane reactor in which the chemical reaction, membrane separation, and product adsorption are coupled. Also disclosed are a dual-reactor apparatus and a process using the reactor or the apparatus.

  19. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    A.G. Thalmayer; G. Saucier; A. Eigenhuis

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five…

  20. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  1. Calculation system for physical analysis of boiling water reactors; Modelisation des phenomenes physiques specifiques aux reacteurs a eau bouillante, notamment le couplage neutronique-thermohydraulique

    Energy Technology Data Exchange (ETDEWEB)

    Bouveret, F

    2001-07-01

    Although Boiling Water Reactors generate a quarter of worldwide nuclear electricity, they have been only little studied in France, though interest in them is now emerging. The aim of the work presented here is to contribute to defining a core calculation methodology based on CEA (Commissariat a l'Energie Atomique) codes. Vapour production in the reactor core leads to technological options very different from those of pressurised water reactors. We analyse the main physical phenomena of BWRs and offer solutions that take them into account. BWR fuel assembly heterogeneity causes steep thermal flux gradients. The two-dimensional collision probability method with exact boundary conditions makes it possible to calculate the flux in BWR fuel assemblies accurately with the APOLLO-2 lattice code, but it induces a very long calculation time. We therefore define a new methodology based on a two-level flux calculation. Void fraction variations in assemblies cause large spectrum changes that must be taken into account in the core calculation. We suggest using a void history parameter to generate cross-section libraries for the core calculation. The core calculation code also has to compute the depletion of the main isotope concentrations. A core calculation coupling neutronics and thermal-hydraulics codes highlights points that still need to be studied, the most important of which is how to take the control blade into account at the different calculation stages. (author)

  2. Field Geologist: An Android App for Measuring Rock Outcroppings

    Science.gov (United States)

    Baird, J.; Chiu, M. T.; Huang, X.; de Lanerolle, T. R.; Morelli, R.; Gourley, J. R.

    2011-12-01

    Field geologist is a mobile Android app that measures, plots, and exports strike and dip data in the field. When the phone is placed on the steepest part of the rock, it automatically detects dip, strike, latitude and longitude. It includes a drop-down menu to record the type of rock. The app's initial screen displays a compass with an interior dip/strike symbol that always points toward the dip direction. Tapping the compass stores a data point in the phone's database. The points can be displayed on a Google map and uploaded to a server, from where they can be retrieved in CSV format and imported into a spreadsheet.
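    The core measurement the abstract describes, deriving dip from the phone lying flat on the rock, can be sketched from the accelerometer's gravity vector. This is a hypothetical illustration, not the app's actual code: the function name and the coordinate convention (device z-axis normal to the rock face) are assumptions:

```python
import math

def dip_from_accelerometer(gx, gy, gz):
    """Estimate the dip angle (degrees) of a planar rock surface from a
    phone lying flat on it, given the gravity vector in device coordinates.
    Assumes the device z-axis is the surface normal (illustrative only)."""
    g = math.sqrt(gx * gx + gy * gy + gz * gz)
    dip = math.degrees(math.acos(abs(gz) / g))  # tilt of the normal from vertical
    # Direction of steepest descent within the device x-y plane; a real app
    # would rotate this into a compass bearing using the magnetometer.
    downslope = math.degrees(math.atan2(gy, gx))
    return dip, downslope

# A phone tilted 30 degrees about the device y-axis:
dip, _ = dip_from_accelerometer(9.81 * math.sin(math.radians(30)), 0.0,
                                9.81 * math.cos(math.radians(30)))
print(round(dip, 1))
```

    A real implementation would combine this with the magnetometer to turn the downslope direction into a true compass bearing; the strike is then 90 degrees from the dip direction.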

  3. Astronomical surveys and big data

    Science.gov (United States)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-rays, ROSAT, XMM and Chandra in X-rays, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  4. Rock-like oxide fuels for burning excess plutonium in LWRs

    International Nuclear Information System (INIS)

    JAERI has performed research on the plutonium rock-like oxide (ROX) fuels and their once-through burning in light water reactors (the ROX-LWR system) in order to establish an option for utilising or effectively disposing of the excess plutonium. Features of the ROX-LWR system are almost complete burning of plutonium and the direct disposal of spent ROX fuels without reprocessing. The ROX fuel is a sort of inert matrix fuel, and consists of mineral-like compounds such as yttria-stabilised zirconia (YSZ), spinel (MgAl2O4) and corundum (Al2O3). Several difficulties must be overcome to demonstrate the feasibility of the ROX-LWR system from the reactor physics and the materials science points of view. Described are activities concerning development of particle dispersed fuels, the in-pile irradiation behaviour of these fuels, improvement of the ROX fuel core characteristics and fuel behaviour under reactivity-initiated accident condition. (author)

  5. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"…

  6. The big and little of fifty years of Moessbauer spectroscopy at Argonne

    International Nuclear Information System (INIS)

    the $50 million Zero Gradient Synchrotron (ZGS) and the $30 million Experimental Breeder Reactor (EBR) II. Starting in the mid-1990s, Argonne physicists expanded their exploration of the properties of matter by employing a new type of Moessbauer spectroscopy--this time using synchrotron light sources such as Argonne's Advanced Photon Source (APS), which at $1 billion was the most expensive U.S. accelerator project of its time. Traditional Moessbauer spectroscopy looks superficially like prototypical "Little Science" and Moessbauer spectroscopy using synchrotrons looks like prototypical "Big Science". In addition, the growth from small to larger scale research seems to follow the pattern familiar from high energy physics even though the wide range of science performed using Moessbauer spectroscopy did not include high energy physics. But is the story of Moessbauer spectroscopy really like the tale told by high energy physicists and often echoed by historians? What do U.S. national laboratories, the "Home" of Big Science, have to offer small-scale research? And what does the story of the 50-year development of Moessbauer spectroscopy at Argonne tell us about how knowledge is produced at large laboratories? In a recent analysis of the development of relativistic heavy ion science at Lawrence Berkeley Laboratory I questioned whether it was wise for historians to speak in terms of "Big Science", pointing out that Lawrence Berkeley Laboratory hosted large-scale projects at three scales, the grand scale of the Bevatron, the modest scale of the HILAC, and the mezzo scale of the combined machine, the Bevalac. I argue that using the term "Big Science", which was coined by participants, leads to a misleading preoccupation with the largest projects and the tendency to see the history of physics as the history of high energy physics. My aim here is to provide an additional corrective to such views as well as further information about the web of connections that allows…

  7. Musical Structure as Narrative in Rock

    Directory of Open Access Journals (Sweden)

    John Fernando Encarnacao

    2011-09-01

    Full Text Available In an attempt to take a fresh look at the analysis of form in rock music, this paper uses Susan McClary’s (2000) idea of ‘quest narrative’ in Western art music as a starting point. While much pop and rock adheres to the basic structure of the establishment of a home territory, episodes or adventures away, and then a return, my study suggests three categories of rock music form that provide alternatives to common combinations of verses, choruses and bridges through which the quest narrative is delivered. Labyrinth forms present more than the usual number of sections to confound our sense of ‘home’, and consequently of ‘quest’. Single-cell forms use repetition to suggest either a kind of stasis or to disrupt our expectations of beginning, middle and end. Immersive forms blur sectional divisions and invite more sensual and participatory responses to the recorded text. With regard to all of these alternative approaches to structure, Judy Lochhead’s (1992) concept of ‘forming’ is called upon to underline rock music forms that unfold as process, rather than map received formal constructs. Central to the argument are a couple of crucial definitions. Following Theodore Gracyk (1996), it is not songs, as such, but particular recordings that constitute rock music texts. Additionally, narrative is understood not in (direct) relation to the lyrics of a song, nor in terms of artists’ biographies or the trajectories of musical styles, but considered in terms of musical structure. It is hoped that this outline of non-narrative musical structures in rock may have applications not only to other types of music, but to other time-based art forms.

  8. Joint Commission on rock properties

    Science.gov (United States)

    A joint commission on Rock Properties for Petroleum Engineers (RPPE) has been established by the International Society of Rock Mechanics and the Society of Petroleum Engineers to set up data banks on the properties of sedimentary rocks encountered during drilling. Computer-based data banks of complete rock properties will be organized for sandstones (GRESA), shales (ARSHA) and carbonates (CARCA). The commission hopes to access data sources from members of the commission, private companies and the public domain.

  9. Reactor container

    International Nuclear Information System (INIS)

    Purpose: To prevent shocks exerted on a vent head due to pool swell caused within a pressure suppression chamber (disposed in a torus configuration around the dry well) upon a loss of coolant accident in BWR type reactors. Constitution: The following relationship is established between the volume V (m3) of the dry well and the rupture opening area A (m2) expected at the boundary upon a loss of coolant accident: V >= 30340 (m) x A. The volume of the dry well is thus made large relative to the rupture opening area, which lowers the steam flow rate of the leaking coolant and thereby decreases the pressure rise in the dry well at the initial stage of a loss of coolant accident. Accordingly, the pressure of non-compressive gases jetted out from the lower end of the downcomer into the pool water is decreased, suppressing the pool swell. (Ikeda, J.)
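    The sizing rule quoted in the abstract reduces to a one-line check. The volumes and rupture areas in the example below are illustrative only; the patent abstract gives the coefficient but no worked numbers:

```python
# Sizing criterion quoted in the abstract: V >= 30340 (m) * A,
# relating dry well volume V (m^3) to rupture opening area A (m^2).
COEFF = 30340.0  # m, from the abstract

def drywell_volume_ok(volume_m3, rupture_area_m2):
    """True if the dry well volume satisfies V >= 30340 * A."""
    return volume_m3 >= COEFF * rupture_area_m2

# Illustrative numbers: a 0.2 m^2 rupture requires at least 6068 m^3.
print(drywell_volume_ok(7000.0, 0.2))  # large enough
print(drywell_volume_ok(4000.0, 0.2))  # too small
```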

  10. Why Big Data Is a Big Deal (Ⅱ)

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    A new group of data mining technologies promises to change forever the way we sift through our vast stores of data, making it faster and cheaper. Some of the technologies are actively being used by people on the bleeding edge who need the technology now, like those involved in creating Web-based services that are driven by social media. They're also heavily contributing to these projects. In other vertical industries, businesses are realizing that much more of their value proposition is information-based than they had previously thought, which will allow big data technologies to gain traction quickly, Olofson says. Couple that with affordable hardware and software, and enterprises find themselves in a perfect storm of business transformation opportunities.

  11. Big Data – Big Deal for Organization Design?

    Directory of Open Access Journals (Sweden)

    Janne J. Korhonen

    2014-04-01

    Full Text Available Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition of a new stratum in the organization, resulting in greater organizational complexity. Requisite organization could serve as an objective, verifiable criterion for what qualifies as a genuine new strategic emphasis. Such a criterion is necessary for research on the co-evolution of strategy and structure.

  12. On fast reactor kinetics studies

    Energy Technology Data Exchange (ETDEWEB)

    Seleznev, E. F.; Belov, A. A. [Nuclear Safety Inst. of the Russian Academy of Sciences IBRAE (Russian Federation); Matveenko, I. P.; Zhukov, A. M.; Raskach, K. F. [Inst. for Physics and Power Engineering IPPE (Russian Federation)

    2012-07-01

    The results and the program of fast reactor core time and space kinetics experiments performed and planned at the IPPE critical facility are presented. The TIMER code provides computational support for the experimental work; it solves the transient equations in 3-D geometry in the multi-group diffusion approximation, with the number of delayed neutron groups varying from 6 to 8. The code implements the solution of both transient neutron transport problems: the direct one, where the neutron flux density and derived quantities such as reactor power are determined at each time step, and the inverse one, in point kinetics form, where the reactivity is determined from a known time variation of reactor power. (authors)
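    The direct kinetics problem the abstract mentions (computing power from a given reactivity) can be illustrated with the point kinetics equations and six delayed neutron groups. This is a minimal sketch, not the TIMER code: the delayed-neutron parameters are typical thermal-spectrum U-235 textbook values and the generation time is assumed, so the numbers are illustrative only:

```python
# Point kinetics with 6 delayed neutron groups, explicit Euler integration.
# Kinetics parameters are assumed textbook U-235 values, not TIMER's data.
BETA_I = [0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273]
LAMBDA_I = [0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01]  # decay constants, 1/s
BETA = sum(BETA_I)                                      # total delayed fraction
GEN_TIME = 1.0e-5                                       # s, assumed

def power_after_step(rho, t_end, dt=1.0e-5):
    """Relative power n(t_end) after a reactivity step rho at t=0,
    starting from equilibrium at n=1."""
    n = 1.0
    # Equilibrium precursor concentrations: c_i = beta_i / (GEN_TIME * lambda_i)
    c = [b / (GEN_TIME * lam) for b, lam in zip(BETA_I, LAMBDA_I)]
    for _ in range(int(t_end / dt)):
        dn = ((rho - BETA) / GEN_TIME * n
              + sum(lam * ci for lam, ci in zip(LAMBDA_I, c))) * dt
        for i in range(6):
            c[i] += (BETA_I[i] / GEN_TIME * n - LAMBDA_I[i] * c[i]) * dt
        n += dn
    return n

# Power 0.1 s after a +0.1*beta reactivity step:
print(round(power_after_step(0.1 * BETA, 0.1), 3))
```

    For a step of +0.1*beta the power jumps promptly to roughly beta/(beta-rho), about 1.11 times nominal, and then rises slowly on the delayed-neutron timescale. The inverse problem the abstract also mentions runs this relationship the other way, inferring rho(t) from a measured power trace.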

  13. Big Bang Nucleosynthesis: Probing the First 20 Minutes

    CERN Document Server

    Steigman, G

    2003-01-01

    Within the first 20 minutes of the evolution of the hot, dense, early Universe, astrophysically interesting abundances of deuterium, helium-3, helium-4, and lithium-7 were synthesized by the cosmic nuclear reactor. The primordial abundances of these light nuclides produced during Big Bang Nucleosynthesis (BBN) are sensitive to the universal density of baryons and to the early-Universe expansion rate, which at early epochs is governed by the energy density in relativistic particles ("radiation") such as photons and neutrinos. Some 380 kyr later, when the cosmic background radiation (CBR) was freed from the embrace of the ionized plasma of protons and electrons, the spectrum of temperature fluctuations imprinted on the CBR also depended on the baryon and radiation densities. The comparison between the constraints imposed by BBN and those from the CBR reveals a remarkably consistent picture of the Universe at two widely separated epochs in its evolution. Combining these two probes leads to new and tig...

  14. BIG BANG the Revolutionary

    Institute of Scientific and Technical Information of China (English)

    刘岩

    2015-01-01

    In the boom years of Ordos I met one of the city's "opinion leaders". Having returned from the United States and seen the outside world, he had broad knowledge of, and a distinctive taste in, luxury goods. He set the shopping trends for a small circle in that mysteriously wealthy city, and its members bought Big Bang watches one after another. At the time I did not quite understand their fascination with this watch; only by going to Baselworld year after year did I come to appreciate the imagination behind the Big Bang. Yes, the Big Bang is indeed full of charm. The evolution of the Big Bang: in 2005 the Big Bang collection was born; in 2006 came the Big Bang All Black. The "all black" concept made the Big Bang purer and more minimal: from case to dial, a seamless matte finish and layers of black in different materials embody the Zen idea of "the invisible made visible".

  15. Kansen voor Big data – WPA Vertrouwen

    NARCIS (Netherlands)

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for

  16. The ethics of Big data: analytical survey

    OpenAIRE

    GIBER L.; KAZANTSEV N.

    2015-01-01

    The number of recent publications on the matter of ethical challenges of the implementation of Big Data has signified the growing interest to all the aspects of this issue. The proposed study specifically aims at analyzing ethical issues connected with Big Data.

  17. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  18. Big Red: A Development Environment for Bigraphs

    DEFF Research Database (Denmark)

    Faithfull, Alexander John; Perrone, Gian David; Hildebrandt, Thomas

    2013-01-01

    We present Big Red, a visual editor for bigraphs and bigraphical reactive systems, based upon Eclipse. The editor integrates with several existing bigraph tools to permit simulation and model-checking of bigraphical models. We give a brief introduction to the bigraphs formalism, and show how these concepts manifest within the tool using a small motivating example bigraphical model developed in Big Red.

  19. The Big Sleep in the Woods

    Institute of Scientific and Technical Information of China (English)

    王玉峰

    2002-01-01

    Now it's the time of the big sleep for the bees and the bears. Even the buds of the plants whose leaves fall off share in it. But the intensity of this winter sleep, or hibernation, depends on who's doing it. The big sleep of the bears, for instance, would probably be thought of as a

  20. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  1. Big Science and Long-tail Science

    CERN Multimedia

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the Atlas detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  2. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  3. Big system: Interactive graphics for the engineer

    Science.gov (United States)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device independent code can be developed to assure maximum graphic terminal transferability.

  4. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined. PMID:9728415
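    The minimax criterion described in this abstract (minimum multiple correlation with the Big Five factors, maximum reliability) can be sketched, for the correlation half, on simulated data. Everything below is illustrative: the factor scores, the cluster's loading of 0.3, and the sample size are assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subj = 500
big5 = rng.standard_normal((n_subj, 5))          # simulated Big Five factor scores
# A candidate cluster that loads 0.3 on the first factor plus unique variance
cluster = 0.3 * big5[:, 0] + rng.standard_normal(n_subj)

# Multiple correlation R of the cluster with the Big Five: the correlation
# between the cluster and its least-squares prediction from the five factors.
X = np.column_stack([np.ones(n_subj), big5])
coef, *_ = np.linalg.lstsq(X, cluster, rcond=None)
R = np.corrcoef(cluster, X @ coef)[0, 1]
```

    A cluster would count as "beyond the Big Five" under the criterion when this R stays low while the cluster's internal reliability stays high.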

  5. Big Food, Food Systems, and Global Health

    OpenAIRE

    Stuckler, David; Nestle, Marion

    2012-01-01

    In an article that forms part of the PLoS Medicine series on Big Food, guest editors David Stuckler and Marion Nestle lay out why more examination of the food industry is necessary, and offer three competing views on how public health professionals might engage with Big Food.

  6. In Search of the Big Bubble

    Science.gov (United States)

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  7. Rock and mineral magnetism

    CERN Document Server

    O’Reilly, W

    1984-01-01

    The past two decades have witnessed a revolution in the earth sciences. The quantitative, instrument-based measurements and physical models of geophysics, together with advances in technology, have radically transformed the way in which the Earth, and especially its crust, is described. The study of the magnetism of the rocks of the Earth's crust has played a major part in this transformation. Rocks, or more specifically their constituent magnetic minerals, can be regarded as a measuring instrument provided by nature, which can be employed in the service of the earth sciences. Thus magnetic minerals are a recording magnetometer; a goniometer or protractor, recording the directions of flows, fields and forces; a clock; a recording thermometer; a position recorder; a strain gauge; an instrument for geological surveying; a tracer in climatology and hydrology; a tool in petrology. No instrument is linear, or free from noise and systematic errors, and the performance of nature's instrument must be assessed and ...

  8. Uranium in alkaline rocks

    International Nuclear Information System (INIS)

    Geologic and geochemical criteria were developed for the occurrence of economic uranium deposits in alkaline igneous rocks. A literature search, a limited chemical analytical program, and visits to three prominent alkaline-rock localities (Ilimaussaq, Greenland; Pocos de Caldas, Brazil; and Powderhorn, Colorado) were made to establish criteria to determine if a site had some uranium resource potential. From the literature, four alkaline-intrusive occurrences of differing character were identified as type-localities for uranium mineralization, and the important aspects of these localities were described. These characteristics were used to categorize and evaluate U.S. occurrences. The literature search disclosed 69 U.S. sites, encompassing nepheline syenite, alkaline granite, and carbonatite. It was possible to compare two-thirds of these sites to the type localities. A ranking system identified ten of the sites as most likely to have uranium resource potential

  9. Uranium in alkaline rocks

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, M.; Wollenberg, H.; Strisower, B.; Bowman, H.; Flexser, S.; Carmichael, I.

    1978-04-01

    Geologic and geochemical criteria were developed for the occurrence of economic uranium deposits in alkaline igneous rocks. A literature search, a limited chemical analytical program, and visits to three prominent alkaline-rock localities (Ilimaussaq, Greenland; Pocos de Caldas, Brazil; and Powderhorn, Colorado) were made to establish criteria to determine if a site had some uranium resource potential. From the literature, four alkaline-intrusive occurrences of differing character were identified as type-localities for uranium mineralization, and the important aspects of these localities were described. These characteristics were used to categorize and evaluate U.S. occurrences. The literature search disclosed 69 U.S. sites, encompassing nepheline syenite, alkaline granite, and carbonatite. It was possible to compare two-thirds of these sites to the type localities. A ranking system identified ten of the sites as most likely to have uranium resource potential.

  10. Limados : Rock peruano

    OpenAIRE

    García Morete, Ramiro

    2013-01-01

    Encouraged by the new-wave current arriving from Mexico, he was singled out by specialists as a pioneer of punk, although the plan was simply to play with whatever was at hand. A tiny corner of a brief but surprisingly powerful period: the 1960s in a country that made rock an expression of its own culture.

  11. Deformations of fractured rock

    International Nuclear Information System (INIS)

    Results of the DBM and FEM analyses in this study indicate that a suitable rock mass for a repository of radioactive waste should be moderately jointed (about 1 joint/m2) and surrounded by shear zones of the first order. This allows for gentle and flexible deformation under tectonic stresses and prevents the development of large cross-cutting failures in the repository area. (author)

  12. Site Investigation for Detection of KIJANG Reactor Core Center

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Tae-Hyun; Kim, Jun Yeon; Kim, Jeeyoung [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Completion was planned for the end of March 2017 and was extended to April 2018 in line with government budget adjustments. The KJRR project is intended to meet self-sufficiency in RI demand, including Mo-99, to increase NTD capacity, and to develop technologies related to the research reactor. In the project, site investigation is the first activity, defining the seismologic and related geologic aspects of the site. The site investigation was carried out from Oct. 2012 to Jan. 2014, and this study describes the detailed procedure for locating the reactor core center. The location of the reactor core center was determined by collectively reviewing not only geological information but also information from architectural engineering. EL 50m was selected as the ground level after weighing construction cost. Four recommended locations (R-1a - R-1d) were identified for the reactor core center. R-1a was found optimal in consideration of the medium-rock contour, the portion of medium rock covering the reactor buildings, construction cost, physical protection, and electrical resistivity. The engineering properties of the medium rock are TCR/RQD 100/53, elastic modulus 7,710 - 8,720MPa, permeability coefficient 2.92E-06cm/s, and S-wave velocity 1,380m/s, sound for foundations of reactor buildings.

  13. Research reactor job analysis - A project description

    International Nuclear Information System (INIS)

    Addressing the need for improved training in the nuclear industry, nuclear utilities established training program guidelines based on Performance-Based Training (PBT) concepts. A comparison of commercial nuclear power facilities with the research and test reactors owned by the U.S. Department of Energy (DOE), made in an independent review of personnel selection, training, and qualification requirements for DOE-owned reactors, pointed out that the complexity of the most critical tasks in research reactors is less than that in power reactors. DOE started a project by commissioning Oak Ridge Associated Universities (ORAU) to conduct a job analysis survey of representative research reactor facilities. The output of the project consists of two publications: Volume 1 - Research Reactor Job Analysis: Overview, which contains an Introduction, Project Description, Project Methodology, and An Overview of Performance-Based Training (PBT); and Volume 2 - Research Reactor Job Analysis: Implementation, which contains Guidelines for Application of Preliminary Task Lists and Preliminary Task Lists for Reactor Operators and Supervisory Reactor Operators.

  14. BEBC, the Big European Bubble Chamber

    CERN Multimedia

    1971-01-01

    The vessel of the Big European Bubble Chamber, BEBC, was installed at the beginning of the 1970s. The large stainless-steel vessel, measuring 3.7 metres in diameter and 4 metres in height, was filled with 35 cubic metres of liquid (hydrogen, deuterium or a neon-hydrogen mixture), whose sensitivity was regulated by means of a huge piston weighing 2 tonnes. During each expansion, the trajectories of the charged particles were marked by a trail of bubbles, created where the liquid reached boiling point as the particles passed through it. The first images were recorded in 1973 when BEBC, equipped with the largest superconducting magnet in service at the time, first received beam from the PS. In 1977, the bubble chamber was exposed to neutrino and hadron beams at higher energies of up to 450 GeV after the SPS came into operation. By the end of its active life in 1984, BEBC had delivered a total of 6.3 million photographs to 22 experiments devoted to neutrino or hadron physics. Around 600 scientists from some fifty laboratories through...

  15. The Big Occulting Steerable Satellite (BOSS)

    CERN Document Server

    Copi, C J; Copi, Craig J.; Starkman, Glenn D.

    1999-01-01

    Natural (such as lunar) occultations have long been used to study sources on small angular scales, while coronagraphs have been used to study high contrast sources. We propose launching the Big Occulting Steerable Satellite (BOSS), a large steerable occulting satellite to combine both of these techniques. BOSS will have several advantages over standard occulting bodies. BOSS would block all but about 4e-5 of the light at 1 micron in the region of interest around the star for planet detections. Because the occultation occurs outside the telescope, scattering inside the telescope does not degrade this performance. BOSS could be combined with a space telescope at the Earth-Sun L2 point to yield very long integration times, in excess of 3000 seconds. If placed in Earth orbit, integration times of 160--1600 seconds can be achieved from most major telescope sites for objects in over 90% of the sky. Applications for BOSS include direct imaging of planets around nearby stars. Planets separated by as little as 0.1--0....

  16. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  17. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  18. Numerical Homogenization of Jointed Rock Masses Using Wave Propagation Simulation

    Science.gov (United States)

    Gasmi, Hatem; Hamdi, Essaïeb; Bouden Romdhane, Nejla

    2014-07-01

    Homogenization in fractured rock analyses is essentially based on the calculation of equivalent elastic parameters. In this paper, a new numerical homogenization method that was programmed by means of a MATLAB code, called HLA-Dissim, is presented. The developed approach simulates a discontinuity network of real rock masses based on the International Society of Rock Mechanics (ISRM) scanline field mapping methodology. Then, it evaluates a series of classic joint parameters to characterize density (RQD, specific length of discontinuities). A pulse wave, characterized by its amplitude, central frequency, and duration, is propagated from a source point to a receiver point of the simulated jointed rock mass using a complex recursive method for evaluating the transmission and reflection coefficient for each simulated discontinuity. The seismic parameters, such as delay, velocity, and attenuation, are then calculated. Finally, the equivalent medium model parameters of the rock mass are computed numerically while taking into account the natural discontinuity distribution. This methodology was applied to 17 bench fronts from six aggregate quarries located in Tunisia, Spain, Austria, and Sweden. It allowed characterizing the rock mass discontinuity network, the resulting seismic performance, and the equivalent medium stiffness. The relationship between the equivalent Young's modulus and rock discontinuity parameters was also analyzed. For these different bench fronts, the proposed numerical approach was also compared to several empirical formulas, based on RQD and fracture density values, published in previous research studies, showing its usefulness and efficiency in estimating rapidly the Young's modulus of equivalent medium for wave propagation analysis.
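    The transmission and reflection evaluation at each discontinuity, on which the method above rests, can be sketched with the classical linear-slip (displacement-discontinuity) result for a single dry joint at normal incidence. This is a hedged simplification, not the paper's recursive HLA-Dissim scheme, and the rock properties and joint stiffness below are assumed illustrative values.

```python
import numpy as np

# Assumed intact-rock properties (illustrative, granite-like)
rho = 2600.0          # density, kg/m^3
vp = 5000.0           # intact P-wave velocity, m/s
Z = rho * vp          # seismic impedance, kg/(m^2 s)

def joint_T_R(freq_hz, kappa):
    """|T| and |R| for a wave at normal incidence crossing one dry joint of
    specific stiffness kappa (Pa/m) in the linear-slip model: the joint acts
    as a low-pass filter on the transmitted wave."""
    x = 2.0 * np.pi * freq_hz * Z / (2.0 * kappa)
    T = 1.0 / np.sqrt(1.0 + x * x)
    return T, x * T

kappa = 5.0e9                     # joint normal specific stiffness, Pa/m (assumed)
T, R = joint_T_R(100.0, kappa)
# Crude multi-joint estimate: transmitted amplitude after 10 joints in series,
# neglecting the multiple reflections that a recursive scheme accounts for.
T10 = T ** 10
```

    The frequency dependence of |T| is what makes the equivalent-medium seismic parameters (delay, velocity, attenuation) depend on the pulse's central frequency and on joint density.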

  19. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Cees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  20. Rock pushing and sampling under rocks on Mars

    Science.gov (United States)

    Moore, H.J.; Liebes, S.; Crouch, D.S.; Clark, L.V.

    1978-01-01

    Viking Lander 2 acquired samples on Mars from beneath two rocks, where living organisms and organic molecules would be protected from ultraviolet radiation. Selection of rocks to be moved was based on scientific and engineering considerations, including rock size, rock shape, burial depth, and location in a sample field. Rock locations and topography were established using the computerized interactive video-stereophotogrammetric system and plotted on vertical profiles and in plan view. Sampler commands were developed and tested on Earth using a full-size lander and surface mock-up. The use of power by the sampler motor correlates with rock movements, which were by plowing, skidding, and rolling. Provenance of the samples was determined by measurements and interpretation of pictures and positions of the sampler arm. Analytical results demonstrate that the samples were, in fact, from beneath the rocks. Results from the Gas Chromatograph-Mass Spectrometer of the Molecular Analysis experiment and the Gas Exchange instrument of the Biology experiment indicate that more adsorbed(?) water occurs in samples under rocks than in samples exposed to the sun. This is consistent with terrestrial arid environments, where more moisture occurs in near-surface soil under rocks than in surrounding soil because the net heat flow is toward the soil beneath the rock and the rock cap inhibits evaporation. Inorganic analyses show that samples of soil from under the rocks have significantly less iron than soil exposed to the sun. The scientific significance of analyses of samples under the rocks is only partly evaluated, but some facts are clear. Detectable quantities of martian organic molecules were not found in the sample from under a rock by the Molecular Analysis experiment. The Biology experiments did not find definitive evidence for Earth-like living organisms in their sample. Significant amounts of adsorbed water may be present in the martian regolith. The response of the soil