WorldWideScience

Sample records for big rock point reactor

  1. 78 FR 58570 - Environmental Assessment; Entergy Nuclear Operations, Inc., Big Rock Point

    Science.gov (United States)

    2013-09-24

    ... COMMISSION Environmental Assessment; Entergy Nuclear Operations, Inc., Big Rock Point AGENCY: Nuclear... Nuclear Operations, Inc. (ENO) (the applicant or the licensee), for the Big Rock Point (BRP) Independent... for Production and Utilization Facilities,'' for the Big Rock Point (BRP) Independent Spent...

  2. Big Bang Day : Physics Rocks

    CERN Multimedia

    Brian Cox; John Barrowman; Eddie Izzard

    2008-01-01

    Is particle physics the new rock 'n' roll? The fundamental questions about the nature of the universe that particle physics hopes to answer have attracted the attention of some high-profile and unusual fans. Alan Alda, Ben Miller, Eddie Izzard, Dara O'Briain and John Barrowman all have interests in this branch of physics. Brian Cox - CERN physicist and former member of the 90s band D:Ream - tracks down some very well-known celebrity enthusiasts and takes a light-hearted look at why this subject can appeal to all of us.

  3. Turning points in reactor design

    Energy Technology Data Exchange (ETDEWEB)

    Beckjord, E.S.

    1995-09-01

    This article provides some historical perspective on nuclear reactor design, beginning with PWR development for naval propulsion and the first commercial application at Yankee Rowe. Five turning points in reactor design, and some safety problems associated with them, are reviewed: (1) stability of Dresden-1, (2) ECCS, (3) PRA, (4) TMI-2, and (5) advanced passive LWR designs. While the emphasis is on the thermal-hydraulic aspects, the discussion also covers reactor systems.

  4. Classification of Big Point Cloud Data Using Cloud Computing

    Science.gov (United States)

    Liu, K.; Boehm, J.

    2015-08-01

    Point cloud data plays a significant role in various geospatial applications, as it conveys plentiful information that can be used for different types of analysis. Semantic analysis, an important example, aims to label points as different categories; in machine learning, this problem is called classification. In addition, processing point data is becoming more and more challenging due to the growing data volume. In this paper, we address point data classification in a big data context. The popular cluster computing framework Apache Spark is used throughout the experiments, and the promising results suggest a great potential of Apache Spark for large-scale point data processing.
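
    The authors' code is not reproduced in this listing; as a rough illustration of supervised point classification on Spark, the Python sketch below trains a classifier with Spark MLlib. The file name, the column schema, and the choice of a random forest are assumptions for illustration, not details from the paper.

        # Hypothetical sketch: classifying labelled points with Spark MLlib.
        from pyspark.sql import SparkSession
        from pyspark.ml.feature import VectorAssembler
        from pyspark.ml.classification import RandomForestClassifier

        spark = SparkSession.builder.appName("PointClassification").getOrCreate()

        # Assumed input: one point per row with columns x, y, z, intensity, label.
        df = spark.read.csv("points.csv", header=True, inferSchema=True)
        assembler = VectorAssembler(inputCols=["x", "y", "z", "intensity"],
                                    outputCol="features")
        data = assembler.transform(df)

        train, test = data.randomSplit([0.8, 0.2], seed=42)
        model = RandomForestClassifier(labelCol="label",
                                       featuresCol="features").fit(train)
        predictions = model.transform(test)
        accuracy = predictions.filter("label = prediction").count() / test.count()
        print("test accuracy:", accuracy)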

  5. High-Temperature Gas-Cooled Test Reactor Point Design

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Laboratory; Bayless, Paul David [Idaho National Laboratory; Nelson, Lee Orville [Idaho National Laboratory; Gougar, Hans David [Idaho National Laboratory; Kinsey, James Carl [Idaho National Laboratory; Strydom, Gerhard [Idaho National Laboratory; Kumar, Akansha [Idaho National Laboratory

    2016-04-01

    A point design has been developed for a 200 MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched UCO fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technological readiness level, licensing approach and costs.

  6. Application Study of Self-balanced Testing Method on Big Diameter Rock-socketed Piles

    Directory of Open Access Journals (Sweden)

    Qing-biao WANG

    2013-07-01

    Through a field trial of the self-balanced testing method on big-diameter rock-socketed piles at the broadcasting centre building in Tai'an, this paper studies and analyzes each link of the testing process: selection of the balance position, production and installation of the load cell, selection and installation of the displacement sensors, loading steps, stability conditions, and determination of the bearing capacity. It summarizes the key technology and engineering experience of the self-balanced testing method for big-diameter rock-socketed piles, and it also analyzes the difficult technical problems that urgently need to be resolved. The conclusions of the study are significant for popularizing and applying the self-balanced testing method in similar projects.

  7. Solution of the reactor point kinetics equations by MATLAB computing

    Directory of Open Access Journals (Sweden)

    Singh Sudhansu S.

    2015-01-01

    The numerical solution of the point kinetics equations in the presence of Newtonian temperature feedback has been a challenging issue for analyzing reactor transients. The reactor point kinetics equations are a system of stiff ordinary differential equations which need special numerical treatment. Although a plethora of numerical schemes has been introduced to solve the point kinetics equations over the years, some simple and straightforward methods still work very efficiently with extraordinary accuracy. As an example, it has been shown recently that the fundamental backward Euler finite difference algorithm, with its simplicity, has proven to be one of the most effective legacy methods. Complementing the backward Euler finite difference scheme, the present work demonstrates the application of the ordinary differential equation suite available in the MATLAB software package to solve the stiff reactor point kinetics equations with Newtonian temperature feedback effects very effectively by analyzing various classic benchmark cases. The fair accuracy of the results implies that the MATLAB ordinary differential equation suite is an efficient alternate method for solving the reactor point kinetics equations in future applications.
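
    For readers who want to reproduce the idea outside MATLAB, the sketch below solves the stiff six-group point kinetics equations (without the temperature feedback term) using SciPy's BDF integrator, a close analogue of MATLAB's ode15s. The six-group constants are common textbook values, not the paper's benchmark data.

        # Point kinetics: dn/dt = ((rho - beta)/Lambda) n + sum(lambda_i C_i),
        #                 dC_i/dt = (beta_i/Lambda) n - lambda_i C_i.
        import numpy as np
        from scipy.integrate import solve_ivp

        beta_i = np.array([0.000215, 0.001424, 0.001274,
                           0.002568, 0.000748, 0.000273])  # group fractions
        lam = np.array([0.0124, 0.0305, 0.111,
                        0.301, 1.14, 3.01])                # decay constants (1/s)
        beta, Lam, rho = beta_i.sum(), 5.0e-5, 0.003       # step reactivity

        def kinetics(t, y):
            n, C = y[0], y[1:]
            dn = (rho - beta) / Lam * n + lam @ C
            dC = beta_i / Lam * n - lam * C
            return np.concatenate(([dn], dC))

        # Start from steady state: C_i = beta_i * n / (Lambda * lambda_i).
        y0 = np.concatenate(([1.0], beta_i / (Lam * lam)))
        sol = solve_ivp(kinetics, (0.0, 10.0), y0, method="BDF",
                        rtol=1e-8, atol=1e-10)
        print("n(10 s) =", sol.y[0, -1])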

  8. Fractional neutron point kinetics equations for nuclear reactor dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa-Paredes, Gilberto, E-mail: gepe@xanum.uam.mx [Area de Ingenieria en Recursos Energeticos, Universidad Autonoma Metropolitana-Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, Mexico, D.F. 09340 (Mexico); Polo-Labarrios, Marco-A. [Area de Ingenieria en Recursos Energeticos, Universidad Autonoma Metropolitana-Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, Mexico, D.F. 09340 (Mexico); Espinosa-Martinez, Erick-G. [Retorno Quebec 6, Col. Burgos de Cuernavaca 62580, Temixco, Mor. (Mexico); Valle-Gallegos, Edmundo del [Escuela Superior de Fisica y Matematicas, Instituto Politecnico Nacional, Av. Instituto Politecnico Nacional s/n, Col. San Pedro Zacatenco, Mexico, D.F. 07738 (Mexico)

    2011-02-15

    The fractional point-neutron kinetics model for the dynamic behavior of a nuclear reactor is derived and analyzed in this paper. The fractional model retains the main dynamic characteristics of the neutron motion, in which the relaxation time associated with a rapid variation in the neutron flux carries a fractional order, acting as an exponent of the relaxation time, to obtain the best representation of the reactor dynamics. The physical interpretation of the fractional order is related to non-Fickian effects from the point of view of the neutron diffusion equation. The numerical approximation to the solution of the fractional neutron point kinetics model, which can be represented as a multi-term high-order linear fractional differential equation, is calculated by reducing the problem to a system of ordinary and fractional differential equations. The numerical stability of the fractional scheme is investigated in this work. Results for the neutron dynamic behavior, for both positive and negative reactivity and for different values of the fractional order, are shown and compared with the classic neutron point kinetics equations. Additionally, a review of the literature on the neutron point kinetics equations is presented, encompassing papers written in English about this research topic (as well as some books and technical reports) published from 1940 to 2010.

  9. Rock falls from Glacier Point above Camp Curry, Yosemite National Park, California

    Science.gov (United States)

    Wieczorek, Gerald F.; Snyder, James B.

    1999-01-01

    A series of rock falls from the north face of Glacier Point above Camp Curry, Yosemite National Park, California, has prompted reexamination of the rock-fall hazard because, beginning in June 1999, a system of cracks propagated through a nearby rock mass, outlining a future potential rock fall. If the potential rock fall, at its estimated volume, fails as a single piece, there could be a risk to cabins in Camp Curry from rock-fall impact and airborne rock debris. The role of joint-plane orientation and groundwater pressure in the fractured rock mass is discussed in light of the pattern of developing cracks and the potential modes of failure.

  10. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    Science.gov (United States)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where the data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally attractive to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not natively supported by existing big data frameworks; instead, such file formats are supported by software libraries that are restricted to single-CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster, and we discuss the implications for scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
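
    The listing does not include the authors' implementation; below is a minimal Python sketch of the map-based ingestion idea, assuming the laspy library as the single-CPU LAS reader and a file layout visible to every worker node. The tile names are invented.

        # Each worker parses whole LAS tiles locally; Spark only distributes
        # the list of file names, so the binary reader itself needs no change.
        import laspy
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("PointCloudIngest").getOrCreate()
        sc = spark.sparkContext

        def read_las(path):
            """Parse one LAS file and return its points as (x, y, z) tuples."""
            las = laspy.read(path)
            return [(float(x), float(y), float(z))
                    for x, y, z in zip(las.x, las.y, las.z)]

        files = ["tile_%03d.las" % i for i in range(100)]  # assumed tile names
        points = sc.parallelize(files, numSlices=len(files)).flatMap(read_las)
        print("points ingested:", points.count())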

  11. Isotopic data for Late Cretaceous intrusions and associated altered and mineralized rocks in the Big Belt Mountains, Montana

    Science.gov (United States)

    du Bray, Edward A.; Unruh, Daniel M.; Hofstra, Albert H.

    2017-03-07

    The quartz monzodiorite of Mount Edith and the concentrically zoned intrusive suite of Boulder Baldy constitute the principal Late Cretaceous igneous intrusions hosted by Mesoproterozoic sedimentary rocks of the Newland Formation in the Big Belt Mountains, Montana. These calc-alkaline plutonic masses are manifestations of subduction-related magmatism that prevailed along the western edge of North America during the Cretaceous. Radiogenic isotope data for neodymium, strontium, and lead indicate that the petrogenesis of the associated magmas involved a combination of (1) sources that were compositionally heterogeneous at the scale of the geographically restricted intrusive rocks in the Big Belt Mountains and (2) variable contamination by crustal assimilants also having diverse isotopic compositions. Altered and mineralized rocks temporally, spatially, and genetically related to these intrusions manifest at least two isotopically distinct mineralizing events, both of which involve major inputs from spatially associated Late Cretaceous igneous rocks. Alteration and mineralization of rock associated with the intrusive suite of Boulder Baldy requires a component characterized by significantly more radiogenic strontium than that characteristic of the associated igneous rocks. However, the source of such a component was not identified in the Big Belt Mountains. Similarly, altered and mineralized rocks associated with the quartz monzodiorite of Mount Edith include a component characterized by significantly more radiogenic strontium and lead, particularly as defined by 207Pb/204Pb values. The source of this component appears to be fluids that equilibrated with proximal Newland Formation rocks. Oxygen isotope data for rocks of the intrusive suite of Boulder Baldy are similar to those of subduction-related magmatism that include mantle-derived components; oxygen isotope data for altered and mineralized equivalents are slightly lighter.

  12. End point control of an actinide precipitation reactor

    Energy Technology Data Exchange (ETDEWEB)

    Muske, K.R. [Villanova Univ., PA (United States). Dept. of Chemical Engineering; Palmer, M.J. [Los Alamos National Lab., NM (United States)

    1997-10-01

    The actinide precipitation reactors in the nuclear materials processing facility at Los Alamos National Laboratory are used to remove actinides and other heavy metals from the effluent streams generated during the purification of plutonium. These effluent streams consist of hydrochloric acid solutions, ranging from one to five molar in concentration, in which actinides and other metals are dissolved. The actinides present are plutonium and americium; typical actinide loadings range from one to five grams per liter. The most prevalent heavy metals are iron, chromium, and nickel, which derive from stainless steel. Removal of these metals from solution is accomplished by hydroxide precipitation during neutralization of the effluent. An end-point control algorithm for the semi-batch actinide precipitation reactors at Los Alamos National Laboratory is described. The algorithm is based on an equilibrium solubility model of the chemical species in solution, which is used to predict the amount of base hydroxide necessary to reach the end point of the actinide precipitation reaction. The model parameters are updated by on-line pH measurements.
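
    As a toy illustration of the end-point idea (not the Laboratory's actual equilibrium solubility model), the sketch below estimates the base demand from simple hydroxide stoichiometry: neutralize the acid, then precipitate each metal. All species and concentrations are invented.

        # Base demand = acid moles + sum over metals of (molarity * OH- per ion).
        acid_molarity = 2.0      # HCl (mol/L), illustrative
        volume_l = 100.0         # batch volume (L), illustrative
        metals = {               # metal molarity and hydroxide stoichiometry
            "Pu(IV)":  (0.004, 4),   # Pu(OH)4
            "Am(III)": (0.001, 3),   # Am(OH)3
            "Fe(III)": (0.010, 3),   # Fe(OH)3
        }
        moles_oh = acid_molarity * volume_l
        moles_oh += sum(m * volume_l * n for m, n in metals.values())
        print("estimated base demand: %.1f mol OH-" % moles_oh)
        # In practice the prediction would be corrected with on-line pH data.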

  13. Automated Rock Detection and Shape Analysis from Mars Rover Imagery and 3D Point Cloud Data

    Institute of Scientific and Technical Information of China (English)

    Kaichang Di; Zongyu Yue; Zhaoqin Liu; Shuliang Wang

    2013-01-01

    A new object-oriented method has been developed for the extraction of Mars rocks from Mars rover data. It is based on a combination of Mars rover imagery and 3D point cloud data. First, Navcam or Pancam images taken by the Mars rovers are segmented into homogeneous objects with a mean-shift algorithm. Then, the objects in the segmented images are classified into small rock candidates, rock shadows, and large objects. Rock shadows and large objects are considered as the regions within which large rocks may exist. In these regions, large rock candidates are extracted through ground-plane fitting with the 3D point cloud data. Small and large rock candidates are combined and postprocessed to obtain the final rock extraction results. The shape properties of the rocks (angularity, circularity, width, height, and width-height ratio) have been calculated for subsequent geological studies.
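
    The authors' pipeline is not reproduced here; as a rough stand-in for the first step, the Python sketch below groups pixels into homogeneous regions with scikit-learn's MeanShift on (row, column, intensity) features. The bandwidth and the synthetic image are placeholders only.

        # Mean-shift grouping of pixels into homogeneous segments (illustrative).
        import numpy as np
        from sklearn.cluster import MeanShift

        img = np.random.rand(64, 64)            # placeholder grayscale image
        rows, cols = np.indices(img.shape)
        features = np.column_stack([rows.ravel(), cols.ravel(),
                                    255.0 * img.ravel()])
        labels = MeanShift(bandwidth=30.0).fit_predict(features)
        segments = labels.reshape(img.shape)    # per-pixel segment ids
        print("segments found:", segments.max() + 1)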

  14. Reactor physics and safety aspects of various design options of a Russian light water reactor with rock-like fuels

    Science.gov (United States)

    Bondarenko, A. V.; Komissarov, O. V.; Kozmenkov, Ya. K.; Matveev, Yu. V.; Orekhov, Yu. I.; Pivovarov, V. A.; Sharapov, V. N.

    2003-06-01

    This paper presents results of analytical studies on weapons-grade plutonium incineration in the VVER-640 medium-size light water reactor using a special composition of rock-like fuel (ROX fuel) to assure long-term storage of the spent fuel without reprocessing. The main goal is to achieve a high degree of plutonium incineration in a once-through cycle. Two fuel compositions are considered, both using weapons-grade plutonium as the fissile material. Spinel (MgAl2O4) is used as the 'preserving' material assuring safe storage of the spent fuel. Besides the inert matrix, an option of rock-like fuel with thorium dioxide was studied. One principal problem in realizing the proposed approach is the substantial change in the properties of the light water reactor core when passing to the ROX fuel, in particular: (i) in the absence of 238U, the Doppler effect, which plays a crucial role in the reactor's self-regulation and limits the consequences of reactivity accidents, decreases significantly; (ii) the lack of fuel breeding on one hand, and the quest to attain the maximum plutonium burnup on the other, result in a drastic change of fuel assembly power during the lifetime and, as a consequence, a rise in the irregularity of the power density of the fuel assemblies; (iii) both the control rod worth and the dissolved boron worth decrease because of neutron spectrum hardening brought on by the larger absorption cross-section of plutonium compared to uranium; (iv) βeff is markedly reduced. All these distinctive features are potentially detrimental to reactor nuclear safety. The principal objective of this work is to identify a variant of the fuel composition and reactor layout that would neutralize the negative effects of the above-mentioned distinctive features.

  15. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner in the SMR marketplace, in terms of technological capability, trade, and political commitment. Industry observers are openly speculating about whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  16. High Temperature Gas-Cooled Test Reactor Point Design: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bayless, Paul David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nelson, Lee Orville [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gougar, Hans David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-01

    A point design has been developed for a 200-MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched uranium oxycarbide fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technology readiness level, licensing approach, and costs of the test reactor point design.

  17. High Temperature Gas-Cooled Test Reactor Point Design: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bayless, Paul David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nelson, Lee Orville [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gougar, Hans David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinsey, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    A point design has been developed for a 200-MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched uranium oxycarbide fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technology readiness level, licensing approach, and costs of the test reactor point design.

  18. Influence of Uncertainty of Rock Properties on Seismic Responses of Reactor Buildings

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The influence of dispersion and uncertainty in the dynamic shear-wave velocity and Poisson's ratio of the soil at a hard rock site on the seismic response of the reactor building structure was investigated. The analysis considers soil-structure interaction effects and is based on a model of the reactor building in a typical pressurized water reactor nuclear power plant (NPP). The numerical results show that, for the typical floor selected, while the relative increment of the dynamic shear-wave velocity varies from -30% to 30% about the base value of 1 930 m/s, the relative variation of the horizontal response-spectra peak value for the internal structure lies within ±10%, and the relative variation of the frequency corresponding to the spectral peak is 0.0% in most cases. The relative variation of the vertical response-spectra peak value lies between -10% and 22%, and the relative variation of the frequency corresponding to the spectral peak lies between -22% and 4%. The analysis indicates that the dynamic shear-wave velocity and Poisson's ratio of the rock affect the seismic response of the structure, and that soil-structure interaction effects should be considered in the seismic analysis and design of NPPs even for a hard rock site.

  19. Rock massif observation from underground coal gasification point of view

    Directory of Open Access Journals (Sweden)

    T. Sasvári

    2009-04-01

    Underground coal gasification (UCG) of coal seams requires a suitable geological structure in the area. Qualitative changes of the rock massif can also be assessed by applying geophysical methods (electrical resistivity methods and geoelectric tomography). This article shows an example of evaluating the possibilities for underground coal gasification in the Upper Nitra Coal Basin at the Cígeľ and Nováky deposits, and recommends the need for cooperation among geological, geotechnical and geophysical researchers.

  20. Field Plot and Accuracy Assessment Points for Pictured Rocks National Lakeshore Vegetation Mapping Project

    Data.gov (United States)

    National Park Service, Department of the Interior — The vegetation point data for Pictured Rocks National Lakeshore (PIRO) was developed to support two projects associated with the 2004 vegetation map, the collection...

  1. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    Science.gov (United States)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

    As laser scanning technology improves and costs come down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not limited to point cloud data: voluminous amounts of high-dimensional and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity-grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node, hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because the different ways to partition data may each require data transfer. We propose a partitioning based on Z-order, which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid and then interleaving the binary representations of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail, with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k-nearest-neighbour algorithm
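
    The interleaving described above is easy to state in code; this minimal Python sketch (not the authors' Spark implementation) reproduces the worked example from the abstract.

        def morton2d(x, y, bits):
            """Interleave the low `bits` of x and y into one Z-order code
            (x on even bit positions, y on odd)."""
            code = 0
            for i in range(bits):
                code |= ((x >> i) & 1) << (2 * i)
                code |= ((y >> i) & 1) << (2 * i + 1)
            return code

        # (x = 1 = 01b, y = 3 = 11b) -> 1011b = 11, as in the abstract.
        assert morton2d(1, 3, bits=2) == 11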

  2. Downstream-migrating fluvial point bars in the rock record

    Science.gov (United States)

    Ghinassi, Massimiliano; Ielpi, Alessandro; Aldinucci, Mauro; Fustic, Milovan

    2016-04-01

    Classical models developed for ancient fluvial point bars are based on the assumption that meander bends invariably increase their radius as meander-bend apices migrate in a direction transverse to the channel-belt axis (i.e., meander bend expansion). However, many modern meandering rivers are also characterized by down-valley migration of the bend apex, a mechanism that takes place without a significant change in meander radius and wavelength. Downstream-migrating fluvial point bars (DMFPB) are the dominant architectural element of these types of meander belts. Yet they are poorly known from ancient fluvial-channel belts, since their disambiguation from expansional point bars often requires fully-3D perspectives. This study aims to review DMFPB deposits spanning in age from Devonian to Holocene, and to discuss their main architectural and sedimentological features from published outcrop, borehole and 3D-seismic datasets. Fluvial successions hosting DMFPB mainly accumulated in low accommodation conditions, where channel belts were affected by different degrees of morphological (e.g., valleys) or tectonic (e.g., axial drainage of shortening basins) confinement. In confined settings, bends migrate downstream along the erosion-resistant valley flanks and little or no floodplain deposits are preserved. Progressive floor aggradation (e.g., valley filling) allow meander belts with DMFPB to decrease their degree of confinement. In less confined settings, meander bends migrate downstream mainly after impinging against older, erosion-resistant channel fill mud. By contrast, tectonic confinement is commonly associated with uplifted alluvial plains that prevented meander-bend expansion, in turn triggering downstream translation. At the scale of individual point bars, translational morphodynamics promote the preservation of downstream-bar deposits, whereas the coarser-grained upstream and central beds are less frequently preserved. However, enhanced preservation of upstream

  3. Reactors

    CERN Document Server

    International Electrotechnical Commission. Geneva

    1988-01-01

    This standard applies to the following types of reactors: shunt reactors, current-limiting reactors including neutral-earthing reactors, damping reactors, tuning (filter) reactors, earthing transformers (neutral couplers), arc-suppression reactors, smoothing reactors, with the exception of the following reactors: small reactors with a rating generally less than 2 kvar single-phase and 10 kvar three-phase, reactors for special purposes such as high-frequency line traps or reactors mounted on rolling stock.

  4. Automatic extraction of discontinuity orientation from rock mass surface 3D point cloud

    Science.gov (United States)

    Chen, Jianqin; Zhu, Hehua; Li, Xiaojun

    2016-10-01

    This paper presents a new method for automatically extracting discontinuity orientations from a rock mass surface 3D point cloud. The proposed method consists of four steps: (1) automatic grouping of discontinuity sets using an improved K-means clustering method, (2) discontinuity segmentation and optimization, (3) discontinuity plane fitting using the Random Sample Consensus (RANSAC) method, and (4) coordinate transformation of the discontinuity planes. The method is first validated on the point cloud of a small piece of a rock slope acquired by photogrammetry, with the extracted discontinuity orientations compared against orientations measured in the field. It is then applied to publicly available LiDAR data of a road-cut rock slope from the Rockbench repository, and the extracted orientations are compared with those from the method proposed by Riquelme et al. (2014). The results show that the presented method is reliable, of high accuracy, and meets engineering needs.
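
    Step (3) names RANSAC plane fitting; the generic NumPy sketch below shows that step in isolation (the iteration count and tolerance are illustrative, not the paper's settings).

        import numpy as np

        def ransac_plane(points, n_iter=500, tol=0.01, seed=0):
            """Fit a plane to an (N, 3) array; returns (unit normal n, offset d)
            with n . p + d = 0 for points p on the plane."""
            rng = np.random.default_rng(seed)
            best_count, best_model = 0, None
            for _ in range(n_iter):
                p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
                normal = np.cross(p1 - p0, p2 - p0)
                norm = np.linalg.norm(normal)
                if norm < 1e-12:          # skip degenerate (collinear) samples
                    continue
                normal /= norm
                d = -normal @ p0
                count = int((np.abs(points @ normal + d) < tol).sum())
                if count > best_count:
                    best_count, best_model = count, (normal, d)
            return best_model

    The discontinuity orientation (dip direction and dip angle) then follows directly from the fitted normal after the coordinate transformation of step (4).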

  5. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinide, fourteen fission product, and one lumped absorber nuclide densities are followed over a reactor history. Successive feed batches are accounted for, with provision for one to twenty resident batches. The effect of exposing each of the batches to the same neutron flux is determined.
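
    As an illustration of what a point exposure model tracks, the sketch below depletes a short capture chain under a constant flux with a matrix exponential. The cross sections and flux are invented, and the real code's actinide and fission product chains are far larger.

        import numpy as np
        from scipy.linalg import expm

        phi = 3.0e13                                       # flux (n/cm^2-s)
        sigma = np.array([400.0, 100.0, 50.0]) * 1.0e-24   # capture xs (cm^2)

        # dN/dt = A N: each nuclide is destroyed by capture, fed by its parent.
        A = -np.diag(sigma * phi) + np.diag(sigma[:-1] * phi, k=-1)
        N0 = np.array([1.0, 0.0, 0.0])                     # initial densities
        N = expm(A * 30 * 86400.0) @ N0                    # 30 days of exposure
        print(N)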

  6. Preliminary Demonstration Reactor Point Design for the Fluoride Salt-Cooled High-Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Betzler, Benjamin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carbajo, Juan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Greenwood, Michael Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Jeffrey J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Robb, Kevin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Terrell, Jerry W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    Development of the Fluoride Salt-Cooled High-Temperature Reactor (FHR) Demonstration Reactor (DR) is a necessary intermediate step to enable commercial FHR deployment through disruptive and rapid technology development and demonstration. The FHR DR will utilize known, mature technology to close remaining gaps to commercial viability. Lower risk technologies are included in the initial FHR DR design to ensure that the reactor can be built, licensed, and operated within an acceptable budget and schedule. These technologies include tristructural-isotropic (TRISO) particle fuel, replaceable core structural material, the use of that same material for the primary and intermediate loops, and tube-and-shell heat exchangers. This report provides an update on the development of the FHR DR. At this writing, the core neutronics and thermal hydraulics have been developed and analyzed. The mechanical design details are still under development and are described to their current level of fidelity. It is anticipated that the FHR DR can be operational within 10 years because of the use of low-risk, near-term technology options.

  7. Small Stress Change Triggering a Big Earthquake: a Test of the Critical Point Hypothesis for Earthquakes

    Institute of Scientific and Technical Information of China (English)

    万永革; 吴忠良; 周公威

    2003-01-01

    Whether or not a small stress change can trigger a big earthquake is one of the most important problems related to the critical point hypothesis for earthquakes. We investigate global earthquakes with different focal mechanisms, which have different levels of ambient shear stress - the stress level required for the earthquakes to occur. Earthquake pairs are studied to see whether the occurrence of the preceding event encourages the occurrence of the succeeding one in terms of Coulomb stress triggering. It is observed that the triggering effect produced by changes of Coulomb failure stress of the same order of magnitude, about 10⁻² MPa, is distinctly different for different focal mechanisms, and thus for different ambient stress levels. For non-strike-slip earthquakes with a relatively low ambient stress level, the triggering effect is more evident, while for strike-slip earthquakes with a relatively high ambient stress level, there is no evident triggering effect. This water-level test provides observational support for the critical point hypothesis for earthquakes.
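
    For reference, the Coulomb failure stress change invoked here is conventionally defined as (a textbook form; sign conventions vary between studies):

        \Delta \mathrm{CFS} = \Delta\tau + \mu' \, \Delta\sigma_n

    where $\Delta\tau$ is the shear stress change in the slip direction, $\Delta\sigma_n$ the normal stress change (positive for unclamping), and $\mu'$ the effective friction coefficient.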

  8. Measured Sections of Upper Paleozoic to Early Tertiary Rocks, Demarcation Point Quadrangle, Alaska

    Science.gov (United States)

    Detterman, Robert L.

    1984-01-01

    Introduction: Twelve sections of upper Paleozoic to early Tertiary rocks from the Demarcation Point quadrangle and the northern edge of the Table Mountain quadrangle are presented. These measured sections include the type sections for the Joe Creek Member of the Echooka Formation (Section 11), the Bathtub Graywacke and Kongakut Formation (Section 9), and the unnamed early Tertiary rocks (Section 1). The early Tertiary rocks correlate closely with the Moose Channel Formation in the MacKenzie Delta, Canada (Detterman and Spicer, 1981). The sections were measured with a Jacob's staff during the geologic investigations of the Demarcation Point quadrangle in 1969 to 1971. The geologic map is published in generalized form (Detterman, 1974, 1976; Detterman and others, 1975). The sections are at a scale of 1 in to 100 ft, except for section 1, which is at 1 in to 200 ft. The location map shows the year and station number for each station. Fossils collected from these rocks and marked by an asterisk (*) are included in Detterman and others, 1975 (p. 42-45). A double asterisk (**) indicates they are included in the list below. All other fossil indicators mean fossils are present but were not collected.

  9. 78 FR 61401 - Entergy Nuclear Operations, Inc.; Big Rock Point; Independent Spent Fuel Storage Installation

    Science.gov (United States)

    2013-10-03

    ... Nuclear Material Safety and Safeguards, U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001..., and 10 CFR part 50, allows ENO to possess and store spent nuclear fuel at the permanently shutdown and... Director, Division of Spent Fuel Storage and Transportation, Office of Nuclear Material Safety...

  10. Geology of Precambrian rocks and isotope geochemistry of shear zones in the Big Narrows area, northern Front Range, Colorado

    Science.gov (United States)

    Abbott, Jeffrey T.

    1970-01-01

    Rocks within the Big Narrows and Poudre Park quadrangles located in the northern Front Range of Colorado are Precambrian metasedimentary and metaigneous schists and gneisses and plutonic igneous rocks. These are locally mantled by extensive late Tertiary and Quaternary fluvial gravels. The southern boundary of the Log Cabin batholith lies within the area studied. A detailed chronology of polyphase deformation, metamorphism and plutonism has been established. Early isoclinal folding (F1) was followed by a major period of plastic deformation (F2), sillimanite-microcline grade regional metamorphism, migmatization and synkinematic Boulder Creek granodiorite plutonism (1.7 b.y.). Macroscopic doubly plunging antiformal and synformal structures were developed. P-T conditions at the peak of metamorphism were probably about 670 °C and 4.5 kb. Water pressures may locally have differed from load pressures. The 1.4 b.y. Silver Plume granite plutonism was post-kinematic and on the basis of petrographic and field criteria can be divided into three facies. Emplacement was by forcible injection and assimilation. Microscopic and mesoscopic folds which postdate the formation of the characteristic mineral phases during the 1.7 b.y. metamorphism are correlated with the emplacement of the Silver Plume Log Cabin batholith. Extensive retrograde metamorphism was associated with this event. A major period of mylonitization postdates Silver Plume plutonism and produced large E-W and NE trending shear zones. A detailed study of the Rb/Sr isotope geochemistry of the layered mylonites demonstrated that the mylonitization and associated recrystallization homogenized the Rb87/Sr86 ratios. Whole-rock dating techniques applied to the layered mylonites indicate a probable age of 1.2 b.y. Petrographic studies suggest that the mylonitization-recrystallization process produced hornfels facies assemblages in the adjacent metasediments. Minor Laramide faulting, mineralization and igneous activity

  11. Sensitivity Analysis for Reactor Period Induced by Positive Reactivity Using One-point Adjoint Kinetic Equation

    Science.gov (United States)

    Chiba, G.; Tsuji, M.; Narabayashi, T.

    2014-04-01

    In order to better predict the kinetic behavior of a nuclear fission reactor, an improvement of the delayed neutron parameters is essential. The present paper identifies nuclear data important for reactor kinetics: fission yield and decay constant data of 86Ge, some bromine isotopes, 94Rb, 98mY and some iodine isotopes. Their importance is quantified as sensitivities with the help of the adjoint kinetic equation, and it is found that the sensitivities depend on the inserted reactivity (or the reactor period). Moreover, the dependence of the sensitivities on nuclear data files is quantified using the latest files. Even with the currently evaluated data, there are large differences among data files from the viewpoint of the delayed neutrons.
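
    The sensitivities referred to are, in the usual relative form (a generic definition; the paper's exact functional is not reproduced in this listing):

        S_p = \frac{p}{T}\,\frac{\partial T}{\partial p}

    where $T$ is the reactor period and $p$ a fission yield or decay constant of one of the precursors listed above.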

  12. Aging management program of the reactor building concrete at Point Lepreau Generating Station

    Science.gov (United States)

    Aldea, C.-M.; Shenton, B.; Demerchant, M. M.; Gendron, T.

    2011-04-01

    In order for New Brunswick Power Nuclear (NBPN) to control the risks of degradation of the concrete reactor building at the Point Lepreau Generating Station (PLGS), the development of an aging management plan (AMP) was initiated. The intention of this plan was to determine the requirements for specific structural components of concrete of the reactor building that require regular inspection and maintenance to ensure the safe and reliable operation of the plant. The document is currently in draft form and presents an integrated methodology for the application of an AMP for the concrete of the reactor building. The current AMP addresses the reactor building structure and various components, such as joint sealant and liners, that are integral to the structure. It does not include internal components housed within the structure. This paper provides background information regarding the document developed and the strategy developed to manage potential degradation of the concrete of the reactor building, as well as specific programs and preventive and corrective maintenance activities initiated.

  13. Aging management program of the reactor building concrete at Point Lepreau Generating Station

    Directory of Open Access Journals (Sweden)

    Gendron T.

    2011-04-01

    In order for New Brunswick Power Nuclear (NBPN) to control the risks of degradation of the concrete reactor building at the Point Lepreau Generating Station (PLGS), the development of an aging management plan (AMP) was initiated. The intention of this plan was to determine the requirements for specific structural components of concrete of the reactor building that require regular inspection and maintenance to ensure the safe and reliable operation of the plant. The document is currently in draft form and presents an integrated methodology for the application of an AMP for the concrete of the reactor building. The current AMP addresses the reactor building structure and various components, such as joint sealant and liners, that are integral to the structure. It does not include internal components housed within the structure. This paper provides background information regarding the document developed and the strategy developed to manage potential degradation of the concrete of the reactor building, as well as specific programs and preventive and corrective maintenance activities initiated.

  14. Balancing on the Edge: An Approach to Leadership and Resiliency that Combines Rock Climbing with Four Key Touch Points

    Science.gov (United States)

    Winkler, Harold E.

    2005-01-01

    In this article, the author compares leadership and resiliency to rock climbing, describing his personal experience on a rock climbing adventure with his family and how it required elements similar to those of leadership and resiliency. The article contains the following sections: (1) Being Resilient; (2) Points of…

  15. Revisiting the Rosenbrock numerical solutions of the reactor point kinetics equation with numerous examples

    Directory of Open Access Journals (Sweden)

    Yang Xue

    2009-01-01

    The fourth-order Rosenbrock method with an automatic step size control feature is described and applied to solve the reactor point kinetics equations. A FORTRAN 90 program was developed to test the computational speed and algorithm accuracy. The results of various benchmark tests with different types of reactivity insertions show that the Rosenbrock method yields high accuracy and high efficiency with stable solutions.
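
    The abstract does not reproduce the scheme itself; in the standard textbook form, an s-stage Rosenbrock method for y' = f(y), with Jacobian J evaluated at y_n, computes stages

        (I - \gamma h J)\, k_i = h\, f\!\left(y_n + \sum_{j=1}^{i-1} \alpha_{ij} k_j\right) + h J \sum_{j=1}^{i-1} \gamma_{ij} k_j, \qquad y_{n+1} = y_n + \sum_{i=1}^{s} b_i k_i

    so that only linear systems, not nonlinear Newton iterations, are solved at each step; an embedded lower-order solution provides the error estimate that drives the automatic step size control.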

  16. Potential of acoustic emissions from three point bending tests as rock failure precursors

    Institute of Scientific and Technical Information of China (English)

    Agioutantis Z.; Kaklis K.; Mavrigiannakis S.; Verigakis M.; Vallianatos F.; Saltas V.

    2016-01-01

    Development of failure in brittle materials is associated with microcracks, which release energy in the form of elastic waves called acoustic emissions. This paper presents results from acoustic emission measurements obtained during three-point bending tests on Nestos marble under laboratory conditions. Acoustic emission activity was monitored using piezoelectric acoustic emission sensors, and the potential for accurate prediction of rock damage based on acoustic emission data was investigated. Damage localization was determined based on acoustic emissions generated from the critically stressed region as scattered events at stresses below and close to the strength of the material.

  17. Unioned layer for the Point of Rocks-Black Butte coal assessment area, Green River Basin, Wyoming (porbbfing.shp)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This ArcView shapefile contains a polygon representation of the spatial query layer for the Point of Rocks-Black Butte coalfield, Greater Green River Basin, Wyoming....

  18. Development and analysis of some versions of the fractional-order point reactor kinetics model for a nuclear reactor with slab geometry

    Science.gov (United States)

    Vyawahare, Vishwesh A.; Nataraj, P. S. V.

    2013-07-01

    In this paper, we report the development and analysis of some novel versions and approximations of the fractional-order (FO) point reactor kinetics model for a nuclear reactor with slab geometry. A systematic development of the FO Inhour equation, the inverse FO point reactor kinetics model, and FO versions of the constant delayed neutron rate approximation model and the prompt jump approximation model is presented for the first time (for both one delayed group and six delayed groups). These models evolve from the FO point reactor kinetics model, which has been derived from the FO neutron telegraph equation and considers subdiffusive neutron transport. Various observations and analysis results are reported, and the corresponding justifications are given within the subdiffusive framework for neutron transport. The FO Inhour equation is found to be a pseudo-polynomial whose degree depends on the order of the fractional derivative in the FO model. The inverse FO point reactor kinetics model is derived and used to find the reactivity variation required to achieve exponential and sinusoidal power variation in the core. The situation of a sudden insertion of negative reactivity is analyzed using the FO constant delayed neutron rate approximation. Use of the FO model for representing the prompt jump in reactor power is advocated on the basis of subdiffusion. Comparison with the respective integer-order models is carried out for practical data. It is also shown analytically that the integer-order models are a special case of the FO models when the order of the time-derivative is one. The development of these FO models plays a crucial role in reactor theory and operation, as it is the first step towards achieving an FO control-oriented model for a nuclear reactor. The results presented here form an important step in the efforts to establish a systematic, step-by-step theory for the FO modeling of a nuclear reactor.

  19. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and

  20. Forests and Forest Cover - TREES_BIG2005_IN: Champion Tree Locations for 2005 in Indiana (Bernardin-Lochmueller and Associates, Point Shapefile)

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — TREES_BIG2005_IN is a point shapefile showing the locations of state champion trees in Indiana. The register is updated every 5 years. Each location represents a...

  1. Causality and entropic arguments pointing to a null Big Bang hypersurface

    Energy Technology Data Exchange (ETDEWEB)

    Minguzzi, E, E-mail: ettore.minguzzi@unifi.it [Dipartimento di Matematica Applicata, Universita degli Studi di Firenze, Via S. Marta 3, I-50139 Firenze (Italy)

    2011-09-22

    I propose a causality argument in order to solve the homogeneity (horizon) problem and the entropy problem of cosmology. The solution is based on the replacement of the spacelike Big Bang boundary with a null boundary behind which lies a chronology-violating region. This solution requires a tilting of the light cones near the null boundary, and thus it is based more on the behavior of the light cones, and hence on causality, than on the behavior of the scale factor (expansion). The connection of this picture with Augustine of Hippo's famous philosophical discussion of time and creation is mentioned.

  2. Numerical Solution of Fractional Neutron Point Kinetics Model in Nuclear Reactor

    Directory of Open Access Journals (Sweden)

    Nowak Tomasz Karol

    2014-06-01

    This paper presents results concerning solutions of the fractional neutron point kinetics model for a nuclear reactor. The proposed model consists of a bilinear system of fractional and ordinary differential equations. Three methods to solve the model are presented and compared. The first applies the discrete Grünwald-Letnikov definition of the fractional derivative in the model. The second involves building an analog scheme in the FOMCON Toolbox in the MATLAB environment. The third is the method proposed by Edwards. The impact of selected parameters on the model's response was examined. The results for typical inputs are discussed and compared.
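
    Of the three methods, the Grünwald-Letnikov discretization is the easiest to sketch; the Python fragment below (with an illustrative order and test signal, not the paper's reactor model) builds the GL weights recursively and applies them on a uniform grid.

        import numpy as np

        def gl_weights(alpha, n):
            """w_k = (-1)^k C(alpha, k), built by the standard recursion."""
            w = np.empty(n)
            w[0] = 1.0
            for k in range(1, n):
                w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
            return w

        def gl_derivative(f, alpha, h):
            """Approximate the alpha-order derivative of samples f (spacing h)."""
            w = gl_weights(alpha, len(f))
            return np.array([w[:j + 1] @ f[j::-1]
                             for j in range(len(f))]) / h**alpha

        t = np.linspace(0.0, 1.0, 101)
        d = gl_derivative(t, alpha=0.5, h=t[1] - t[0])
        print(d[-1])   # D^0.5 of f(t) = t at t = 1; exact value is about 1.128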

  3. Stability principle of big and small structures of rock surrounding roadway driven along goaf in fully mechanized top coal caving face

    Energy Technology Data Exchange (ETDEWEB)

    Hou, C.; Li, X. [China University of Mining and Technology, Xuzhou (China)

    2001-02-01

    Based on the characteristics of the rock surrounding a roadway driven along the goaf in a fully mechanized top coal caving face, the stability principle of big and small structures is put forward, which provides the theoretical basis for the application of bolting. The mechanical characteristics of the arc-triangle key block in the main roof, the stability of the roadway during drivage and extraction, and the effect on the roadway driven along the next goaf are analysed. The main factors affecting the stability of the small structure of the surrounding rock are discussed. The bolting strength reinforcement theory for the surrounding rock is applied to show the importance of increasing bolt pre-tension and support strength. 4 refs., 7 figs., 2 tabs.

  4. Brit Crit: Turning Points in British Rock Criticism 1960-1990

    DEFF Research Database (Denmark)

    Gudmundsson, Gestur; Lindberg, U.; Michelsen, M.

    2002-01-01

    The article examines the development of rock criticism in the United Kingdom from the perspective of a Bourdieuan field analysis. Early British rock critics, like Nik Cohn, were international pioneers; a few years later there was a strong American influence, but British rock criticism has always...

  5. Approximate Solution of the Point Reactor Kinetic Equations of Average One-Group of Delayed Neutrons for Step Reactivity Insertion

    Directory of Open Access Journals (Sweden)

    S. Yamoah

    2012-04-01

    Understanding the time-dependent behaviour of the neutron population in a nuclear reactor in response to either a planned or unplanned change in the reactor conditions is of great importance to the safe and reliable operation of the reactor. In this study, two analytical methods are presented to solve the point kinetics equations for an average one group of delayed neutrons. These methods, which are both approximate solutions of the point reactor kinetics equations, are compared with a numerical solution using Euler's first-order method. To obtain an accurate solution with the Euler method, a relatively small time step was chosen for the numerical solution. The methods are applied to different types of reactivity to check the validity of the analytical approach by comparing the analytical results with the numerical results. From the results, it is observed that the analytical solution agrees well with the numerical solution.
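
    A minimal version of the numerical reference solution is easy to reproduce; the Python sketch below integrates the one-group point kinetics equations with explicit Euler and a small time step (the constants are typical one-group values, not necessarily those of the paper).

        # One-group point kinetics, explicit Euler.
        beta, lam, Lam, rho = 0.0065, 0.0785, 1.0e-4, 0.002  # illustrative data
        h, T = 1.0e-5, 1.0                                   # step size, end time

        n, C = 1.0, beta / (Lam * lam)     # equilibrium initial condition
        for _ in range(int(T / h)):
            dn = (rho - beta) / Lam * n + lam * C
            dC = beta / Lam * n - lam * C
            n, C = n + h * dn, C + h * dC
        print("n(1 s) =", n)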

  6. The tipping point how little things can make a big difference

    CERN Document Server

    Gladwell, Malcolm

    2002-01-01

    The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire. Just as a single sick person can start an epidemic of the flu, so too can a small but precisely targeted push cause a fashion trend, the popularity of a new product, or a drop in the crime rate. This widely acclaimed bestseller, in which Malcolm Gladwell explores and brilliantly illuminates the tipping point phenomenon, is already changing the way people throughout the world think about selling products and disseminating ideas.

  7. Azo-Dyes mixture degradation in a fixed bed biofilm reactor packed with volcanic porous rock

    Energy Technology Data Exchange (ETDEWEB)

    Contreras-Blancas, E.; Cobos-Vasconcelos, D. de los; Juarez-Ramirez, C.; Poggi-Varaldo, H. M.; Ruiz-Ordaz, N.; Galindez-Mayer, J.

    2009-07-01

    Textile industries discharge large amounts of dyes and dyeing-process auxiliaries, which pollute streams and water bodies. Several dyes, especially those containing the azo group, can cause harmful effects to different organisms, including humans. In bacterial and mammalian tests, azo dyes or their derived aromatic amines have shown cell genotoxicity. The purpose of this work was to evaluate the effect of air flow rate on the biodegradation of an azo-dye mixture by a microbial community immobilized in a packed bed reactor. (Author)

  8. Geologic Field Notes, Geochemical Analyses, and Field Photographs of Outcrops and Rock Samples from the Big Delta B-1 Quadrangle, East-Central Alaska

    Science.gov (United States)

    Day, Warren C.; O'Neill, J. Michael

    2008-01-01

    The U.S. Geological Survey, in cooperation with the Alaska Department of Natural Resources Division of Mining, Land, and Water, has released a geologic map of the Big Delta B-1 quadrangle of east-central Alaska (Day and others, 2007). This companion report presents the major element oxide and trace element geochemical analyses, including those for gold, silver, and base metals, for representative rock units and for grab samples from quartz veins and mineralized zones within the quadrangle. Also included are field station locations, field notes, structural data, and field photographs based primarily on observations by W.C. Day with additions by J.M. O'Neill and B.M. Gamble, all of the U.S. Geological Survey. The data are provided in both Microsoft Excel spread sheet format and as a Microsoft Access database.

  9. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  10. Core burnup calculation and accidents analyses of a pressurized water reactor partially loaded with rock-like oxide fuel

    Science.gov (United States)

    Akie, H.; Sugo, Y.; Okawa, R.

    2003-06-01

    A rock-like oxide (ROX) fuel - light water reactor (LWR) burning system has been studied for efficient plutonium transmutation. To improve the small negative reactivity coefficients and severe transient behaviors of ROX-fueled LWRs, a core partially loaded with ROX fuel assemblies alongside conventional UO2 assemblies was considered. Although the reactivity coefficients could be improved, the power peaking tends to be large in this heterogeneous core configuration, and the reactivity-initiated accident (RIA) and loss-of-coolant accident (LOCA) behaviors were not sufficiently improved. In order to reduce the power peaking, the fuel composition and the assembly design of the ROX fuel were modified. First, erbium burnable poison was added as Er2O3 in the ROX fuel to reduce the burnup reactivity swing. Then pin-by-pin Pu enrichment and Er content distributions within the ROX fuel assembly were considered. In addition, an Er content distribution was also considered in the axial direction of the ROX fuel pin. With these modifications, a power peaking factor even lower than that of a conventional UO2-fueled core can be obtained. The RIA and LOCA analyses of the modified core have also shown transient behaviors of the ROX partial-loading core comparable to those of the UO2 core.

  11. Reactor

    Science.gov (United States)

    Evans, Robert M.

    1976-10-05

    1. A neutronic reactor having a moderator, coolant tubes traversing the moderator from an inlet end to an outlet end, bodies of material fissionable by neutrons of thermal energy disposed within the coolant tubes, and means for circulating water through said coolant tubes characterized by the improved construction wherein the coolant tubes are constructed of aluminum having an outer diameter of 1.729 inches and a wall thickness of 0.059 inch, and the means for circulating a liquid coolant through the tubes includes a source of water at a pressure of approximately 350 pounds per square inch connected to the inlet end of the tubes, and said construction including a pressure reducing orifice disposed at the inlet ends of the tubes reducing the pressure of the water by approximately 150 pounds per square inch.

  12. Fluoride Salt-Cooled High-Temperature Demonstration Reactor Point Design

    Energy Technology Data Exchange (ETDEWEB)

    Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Betzler, Benjamin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carbajo, Juan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Jeffrey J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Robb, Kevin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Terrell, Jerry W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wysocki, Aaron J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-02-01

    The fluoride salt-cooled high-temperature reactor (FHR) demonstration reactor (DR) is a concept for a salt-cooled reactor with 100 megawatts of thermal output (MWt). It would use tristructural-isotropic (TRISO) particle fuel within prismatic graphite blocks. FLiBe (2LiF-BeF2) is the reference primary coolant. The FHR DR is designed to be small, simple, and affordable. Development of the FHR DR is a necessary intermediate step to enable near-term commercial FHRs. Lower risk technologies are purposely included in the initial FHR DR design to ensure that the reactor can be built, licensed, and operated within an acceptable budget and schedule. These technologies include TRISO particle fuel, replaceable core structural material, the use of that same material for the primary and intermediate loops, and tube-and-shell primary-to-intermediate heat exchangers. Several preconceptual and conceptual design efforts that have been conducted on FHR concepts bear a significant influence on the FHR DR design. Specific designs include the Oak Ridge National Laboratory (ORNL) advanced high-temperature reactor (AHTR) with 3400 MWt / 1500 megawatts electric (MWe) output, as well as a 125 MWt small modular AHTR (SmAHTR) from ORNL. Other important examples are the Mk1 pebble bed FHR (PB-FHR) concept from the University of California, Berkeley (UCB), and an FHR test reactor design developed at the Massachusetts Institute of Technology (MIT). The MIT FHR test reactor is based on a prismatic fuel platform and is directly relevant to the present FHR DR design effort. These FHR concepts are based on reasonable assumptions for credible commercial prototypes. The FHR DR concept also directly benefits from the operating experience of the Molten Salt Reactor Experiment (MSRE), as well as the detailed design efforts for a large molten salt reactor concept and its breeder variant, the Molten Salt Breeder Reactor. The FHR DR technology is most representative of the 3400 MWt AHTR

  13. Compliance Monitoring of Underwater Blasting for Rock Removal at Warrior Point, Columbia River Channel Improvement Project, 2009/2010

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, Thomas J.; Johnson, Gary E.; Woodley, Christa M.; Skalski, J. R.; Seaburg, Adam

    2011-05-10

    The U.S. Army Corps of Engineers, Portland District (USACE) conducted the 20-year Columbia River Channel Improvement Project (CRCIP) to deepen the navigation channel between Portland, Oregon, and the Pacific Ocean to allow transit of fully loaded Panamax ships (100 ft wide, 600 to 700 ft long, and draft 45 to 50 ft). In the vicinity of Warrior Point, between river miles (RM) 87 and 88 near St. Helens, Oregon, the USACE conducted underwater blasting and dredging to remove 300,000 yd3 of a basalt rock formation to reach a depth of 44 ft in the Columbia River navigation channel. The purpose of this report is to document methods and results of the compliance monitoring study for the blasting project at Warrior Point in the Columbia River.

  14. A 3D clustering approach for point clouds to detect and quantify changes at a rock glacier front

    Science.gov (United States)

    Micheletti, Natan; Tonini, Marj; Lane, Stuart N.

    2016-04-01

    Terrestrial Laser Scanners (TLS) are extensively used in geomorphology to remotely sense landforms and surfaces of any type and to derive digital elevation models (DEMs). Modern devices are able to collect many millions of points, so that working on the resulting dataset is often troublesome in terms of computational effort. Indeed, it is not unusual that raw point clouds are filtered prior to DEM creation, so that only a subset of points is retained and the interpolation process becomes less of a burden. Whilst this procedure is in many cases necessary, it entails a considerable loss of valuable information. First, and even without eliminating points, the common interpolation of points to a regular grid causes a loss of potentially useful detail. Second, it inevitably causes the transition from 3D information to only 2.5D data, where each (x,y) pair must have a unique z-value. Vector-based DEMs (e.g. triangulated irregular networks) partially mitigate these issues, but still require a set of parameters to be set and impose a considerable burden in terms of calculation and storage. For these reasons, being able to perform geomorphological research directly on point clouds would be profitable. Here, we propose an approach to identify erosion and deposition patterns on a very active rock glacier front in the Swiss Alps to monitor sediment dynamics. The general aim is to set up a semiautomatic method to isolate mass movements using 3D-feature identification directly from LiDAR data. An ultra-long-range LiDAR RIEGL VZ-6000 scanner was employed to acquire point clouds during three consecutive summers. In order to isolate single clusters of erosion and deposition we applied the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm, previously successfully employed by Tonini and Abellan (2014) in a similar case for rockfall detection. DBSCAN requires two input parameters, strongly influencing the number, shape and size of the detected clusters: the minimum number of
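
    As a rough illustration of this clustering step, the sketch below applies DBSCAN to a 3D change point cloud with scikit-learn. The two input parameters (the neighbourhood radius eps and the minimum cluster size min_samples) and the input file are illustrative assumptions, not the values used in the study.

```python
# Minimal sketch: isolating candidate erosion/deposition clusters in a
# 3D change point cloud with DBSCAN. Parameter values and the input
# file are illustrative assumptions, not the study's settings.
import numpy as np
from sklearn.cluster import DBSCAN

# points: N x 3 array of (x, y, z) locations flagged as significant change
points = np.loadtxt("change_points.xyz")  # hypothetical input file

# eps: neighbourhood radius [m]; min_samples: minimum points per cluster
labels = DBSCAN(eps=0.5, min_samples=20).fit_predict(points)

# label -1 marks noise; every other label is one candidate cluster
for k in set(labels) - {-1}:
    cluster = points[labels == k]
    print(f"cluster {k}: {len(cluster)} points, "
          f"mean elevation {cluster[:, 2].mean():.2f} m")
```

    Points labelled -1 are treated as noise; each remaining label corresponds to one candidate erosion or deposition feature.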

  15. Heuristic optimization of a continuous flow point-of-use UV-LED disinfection reactor using computational fluid dynamics.

    Science.gov (United States)

    Jenny, Richard M; Jasper, Micah N; Simmons, Otto D; Shatalov, Max; Ducoste, Joel J

    2015-10-15

    Alternative disinfection sources such as ultraviolet light (UV) are being pursued to inactivate pathogenic microorganisms such as Cryptosporidium and Giardia, while simultaneously reducing the risk of exposure to carcinogenic disinfection by-products (DBPs) in drinking water. UV-LEDs offer a UV disinfection source that contains no mercury, has the potential for a long lifetime, is robust, and allows a high degree of design flexibility. However, the increased flexibility in design options adds a substantial level of complexity when developing a UV-LED reactor, particularly with regard to reactor shape, size, spatial orientation of light, and germicidal emission wavelength. Anticipating that LEDs are the future of UV disinfection, new methods are needed for designing such reactors. In this research study, the evaluation of a new design paradigm using a point-of-use UV-LED disinfection reactor has been performed. ModeFrontier, a numerical optimization platform, was coupled with COMSOL Multiphysics, a computational fluid dynamics (CFD) software package, to generate an optimized UV-LED continuous-flow reactor. Three optimality conditions were considered: a single-objective analysis minimizing input supply power while achieving at least 2.0-log10 inactivation of Escherichia coli ATCC 11229, and two multi-objective analyses (one of which maximized the log10 inactivation of E. coli ATCC 11229 while minimizing the supply power). All tests were completed at a flow rate of 109 mL/min and 92% UVT (measured at 254 nm). The numerical solution for the first objective was validated experimentally using biodosimetry. The optimal design predictions displayed good agreement with the experimental data and contained several non-intuitive features, particularly in the UV-LED spatial arrangement, where the lights were unevenly populated throughout the reactor. The optimal designs may not have been developed by experienced designers due to the increased degrees of

  16. A golden point rule in rock-paper-scissors-lizard-spock game

    Science.gov (United States)

    Kang, Yibin; Pan, Qiuhui; Wang, Xueting; He, Mingfeng

    2013-06-01

    We study a novel five-species system on two-dimensional lattices in which each species has two superior and two inferior partners. Here we simplify the huge parameter space of predation probabilities to only two parameters. Both Monte Carlo simulation and mean-field theory reveal that two of the strategies may die out when the ratio of the two parameters is close to the golden point 0.618, while the remaining three strategies form a cyclic dominance system.
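
    A minimal Monte Carlo sketch of such a five-species lattice game is given below. The specific interaction rule (species s preys on s+1 with probability p and on s+2 with probability q, indices mod 5) and all parameter values are illustrative assumptions, not the paper's exact model.

```python
# Monte Carlo sketch of a five-species cyclic game on a square lattice.
# Each species preys on its two "successors" (mod 5) with rates p and q;
# the rule and all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
L = 100                      # lattice side length
p, q = 0.618, 1.0            # ratio p/q near the golden point 0.618
grid = rng.integers(0, 5, size=(L, L))
neigh = [(0, 1), (0, -1), (1, 0), (-1, 0)]

for _ in range(1_000_000):   # elementary Monte Carlo steps
    x, y = rng.integers(0, L, size=2)
    dx, dy = neigh[rng.integers(4)]
    nx, ny = (x + dx) % L, (y + dy) % L
    a, b = grid[x, y], grid[nx, ny]
    if (b - a) % 5 == 1 and rng.random() < p:    # a preys on b at rate p
        grid[nx, ny] = a
    elif (b - a) % 5 == 2 and rng.random() < q:  # a preys on b at rate q
        grid[nx, ny] = a

print(np.bincount(grid.ravel(), minlength=5) / grid.size)  # species densities
```

    Scanning the ratio p/q around 0.618 and recording which species survive would reproduce the kind of extinction analysis described in the abstract.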

  17. Toward a Learning Health-care System - Knowledge Delivery at the Point of Care Empowered by Big Data and NLP.

    Science.gov (United States)

    Kaggal, Vinod C; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P; Ross, Jason L; Chaudhry, Rajeev; Buntrock, James D; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, ie, the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the advantages of big data over two other environments. Big data infrastructure significantly outperformed other infrastructure in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future.

  18. Big Data: Big Confusion? Big Challenges?

    Science.gov (United States)

    2015-05-01

    Presentation from the 12th Annual Acquisition Research Symposium: Big Data: Big Confusion? Big Challenges?, by Mary Maureen... A surviving slide fragment notes that 90% of the data in the world today was created in the last two years.

  19. Fundamental study on long-term stability of rock from the macroscopic point of view

    Energy Technology Data Exchange (ETDEWEB)

    Okubo, Seisuke [Tokyo Univ. (Japan). Faculty of Engineering

    1998-03-01

    In 1994, this project was started. A pneumatic creep testing machine was modified, Inada granite was purchased, and preliminary tests were carried out. In 1995, a specimen of Tage tuff under water-saturated conditions was loaded uniaxially in the pneumatic creep testing machine. Uniaxial compression and tension tests, and a short-term creep test of Inada granite, were also carried out in the servo-controlled testing machines to obtain the complete stress-strain curves. In 1996, creep, compression and tension tests were carried out, and two types of pressure maintenance equipment (hydraulic and pneumatic) were developed. In 1997, creep, compression and tension tests etc. were again carried out on the basis of the earlier results. The experimental results of long-term creep testing of Tage tuff and of middle-term creep testing of Inada granite are described. Two types of pressure maintenance equipment were developed and examined in 1996: one hydraulic and the other pneumatic. The hydraulic equipment, modified for long-term creep testing especially in the measurement system to ensure durability and stability, was found to be precise and reliable. The results of the triaxial compression test are described. In 1997, a specimen was unloaded and re-loaded through a uniaxial tension test to obtain its behaviour more precisely. A constitutive equation of variable compliance type was discussed based on the experimental results. Though the equation has a relatively simple form, it can be applied beyond the strength failure point up to the post-failure region. The constitutive equation was implemented in two- and three-dimensional FEM programs. A preliminary evaluation of equipment and materials for pore-pressure controlled testing is also described. (J.P.N.)

  20. Solution of Point Reactor Neutron Kinetics Equations with Temperature Feedback by Singularly Perturbed Method

    Directory of Open Access Journals (Sweden)

    Wenzhen Chen

    2013-01-01

    Full Text Available The singularly perturbed method (SPM) is proposed to obtain the analytical solution for the delayed supercritical process of a nuclear reactor with temperature feedback and small step reactivity inserted. The relation between the reactivity and time is derived. Also, the neutron density (or power) and the average density of delayed neutron precursors as functions of reactivity are presented. The variations of neutron density (or power) and temperature with time are calculated, plotted, and compared with those given by the accurate solution and other analytical methods. It is shown that the results by the SPM are valid and accurate over a large range and that the SPM is simpler than those in the previous literature.
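
    For reference, the delayed supercritical model with temperature feedback that the SPM addresses is typically written as the standard point kinetics system below; the adiabatic feedback law shown is one common textbook choice, not a quotation from the paper.

```latex
% Standard point reactor kinetics with temperature feedback (one common
% formulation; the feedback law is an illustrative assumption).
\begin{aligned}
\frac{dn}{dt}   &= \frac{\rho(t)-\beta}{\Lambda}\,n(t) + \sum_{i=1}^{6}\lambda_i C_i(t),\\
\frac{dC_i}{dt} &= \frac{\beta_i}{\Lambda}\,n(t) - \lambda_i C_i(t), \qquad i=1,\dots,6,\\
\frac{dT}{dt}   &= K_c\,n(t), \qquad \rho(t)=\rho_0-\alpha\,\bigl[T(t)-T_0\bigr],
\end{aligned}
```

    where n is the neutron density, C_i the delayed-neutron precursor concentrations, beta_i and lambda_i the delayed fractions (beta is their sum) and decay constants, Lambda the neutron generation time, alpha the temperature feedback coefficient, and K_c the reciprocal of the thermal capacity.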

  1. A refined way of solving reactor point kinetics equations for imposed reactivity insertions

    Directory of Open Access Journals (Sweden)

    Ganapol Barry D.

    2009-01-01

    Full Text Available We apply the concept of convergence acceleration, also known as extrapolation, to find the solution of the reactor kinetics equations (RKEs. The method features simplicity in that an approximate finite difference formulation is constructed and converged to high accuracy from knowledge of the error term. Through the Romberg extrapolation, we demonstrate its high accuracy for a variety of imposed reactivity insertions found in the literature. The unique feature of the proposed algorithm, called RKE/R(omberg, is that no special attention is given to the stiffness of the RKEs. Finally, because of its simplicity and accuracy, the RKE/R algorithm is arguably the most efficient numerical solution of the RKEs developed to date.
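
    To make the acceleration idea concrete, the sketch below converges a first-order finite-difference (explicit Euler) solution of the point kinetics equations by building a Romberg/Richardson table over halved step sizes. The one-delayed-group parameters and the step reactivity are illustrative assumptions, not values from the paper.

```python
# Convergence acceleration sketch: explicit Euler solutions of one-group
# point kinetics at steps h, h/2, h/4, ... combined in a Romberg table.
# All parameter values are illustrative assumptions.
beta, lam, Lam, rho = 0.0065, 0.08, 1e-4, 0.003   # step reactivity insertion

def euler(t_end, h):
    n, c = 1.0, beta / (lam * Lam)        # equilibrium initial conditions
    for _ in range(int(round(t_end / h))):
        dn = ((rho - beta) / Lam) * n + lam * c
        dc = (beta / Lam) * n - lam * c
        n, c = n + h * dn, c + h * dc
    return n

def romberg(t_end, h0, levels=5):
    # R[k][0]: Euler end-point value with step h0 / 2**k
    R = [[euler(t_end, h0 / 2**k)] for k in range(levels)]
    for j in range(1, levels):
        for k in range(j, levels):
            # Euler's error expands in powers of h, so level j cancels h**j
            R[k].append(R[k][j-1] + (R[k][j-1] - R[k-1][j-1]) / (2**j - 1))
    return R[-1][-1]

print(romberg(t_end=1.0, h0=0.01))        # accelerated neutron density n(1 s)
```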

  2. Permafrost and snow monitoring at Rothera Point (Adelaide Island, Maritime Antarctica): Implications for rock weathering in cryotic conditions

    Science.gov (United States)

    Guglielmin, Mauro; Worland, M. Roger; Baio, Fabio; Convey, Peter

    2014-11-01

    In February 2009 a new permafrost borehole was installed close to the British Antarctic Survey station at Rothera Point, Adelaide Island (67.57195°S, 68.12068°W). The borehole is situated at 31 m asl on a granodiorite knob with scattered lichen cover. The spatial variability of snow cover and of ground surface temperature (GST) is characterised through the monitoring of snow depth on 5 stakes positioned around the borehole and with thermistors placed at three different rock surfaces (A, B and C). The borehole temperature is measured by 18 thermistors placed at different depths between 0.3 and 30 m. Snow persistence is very variable both spatially and temporally, with snow-free days per year ranging from 13 to more than 300, and maximum snow depths varying between 0.03 and 1.42 m. This variability is the main cause of the high variability in GST, which ranged between -3.7 and -1.5 °C. The net effect of the snow cover is a cooling of the surface. Mean annual GST, mean summer GST, the degree days of thawing and the n-factor of thawing were always much lower at sensor A, where snow persistence and depth were greater than at the other sensor locations. At sensor A the potential freeze-thaw events were negligible (0-3) and the thermal stress was at least 40% less than at the other sensor locations. The zero curtain effect at the rock surface occurred only at surface A, favouring chemical weathering over mechanical action. The active layer thickness (ALT) ranged between 0.76 and 1.40 m. ALT was directly proportional to the mean air temperature in summer, and inversely proportional to the maximum snow depth in autumn. ALT temporal variability was greater than reported at other sites at similar latitude in the Northern Hemisphere, or with similar mean annual air temperature in Maritime Antarctica, because vegetation and a soil organic horizon are absent at the study site. Zero annual amplitude in temperature was observed at about 16 m depth, where the mean annual

  3. Discovery Of A Major Contradiction In Big Bang Cosmology Points To The New Cosmic Center Universe Model

    CERN Document Server

    Gentry, R V

    2003-01-01

    The BAL z=3.91 quasar's high Fe/O ratio has led to a reexamination of the big bang's spacetime expansion postulate and the discovery that it predicts a CBR redshift of z>36000 instead of the widely accepted z~1000. This result leads to an expansion-predicted CBR temperature of only T = 0.08K, which is contradicted by the experimental T = 2.73K. Contrary to long-held belief, these results strongly suggest that the F-L expanding spacetime paradigm, with its expansion redshifts, is not the correct relativistic description of the universe. This conclusion agrees with the earlier finding (gr-qc/9806061) that the universe is relativistically governed by the Einstein static spacetime solution of the field equations, not the F-L solution. Disproof of expansion redshifts removes the only support for the Cosmological Principle, thus showing that the spherical symmetry of the cosmos demanded by the Hubble redshift relation can no longer be attributed to the universe being the same everywhere. The Cosmological Principle is flaw...

  4. Trigonometric Fourier-series solutions of the point reactor kinetics equations

    Energy Technology Data Exchange (ETDEWEB)

    Hamada, Yasser Mohamed, E-mail: yaser_abdelsatar@ci.suez.edu.eg

    2015-01-15

    Highlights: • A new method based on Fourier series expansion is introduced. • The method provides accurate approximations to the point kinetics equations. • A Vandermonde matrix is used to determine the coefficients of the Fourier series. • A new formula is introduced to determine the inverse of the Vandermonde matrix. • The obtained results agree well with those obtained with other conventional codes. - Abstract: In this paper, a new method based on the Fourier series is introduced to obtain approximate solutions to the systems of the point kinetics equations. These systems are stiff, involving equations with both slowly and rapidly varying components. They are solved numerically using Fourier series expansion over a partition of the total time interval. The approximate solution requires determining the series coefficients over each time step in that partition. These coefficients are determined using the high-order derivatives of the dependent variables at the beginning of the time step, introducing a system of linear algebraic equations to be solved at each step. The obtained algebraic system is similar to the Vandermonde system. Evaluation of the inverse of the Vandermonde matrix is required to determine the coefficients of the Fourier series. Because the obtained Vandermonde matrix has a special structure, due to the properties of the sine and cosine functions, a new formula is introduced to determine its inverse using standard computations. The new method solves the general linear and non-linear kinetics problems with six groups of delayed neutrons. The validity of the algorithm is tested with five different types of reactivities including step reactivity insertion, ramp input, oscillatory reactivity changes, a reactivity as a function of the neutron density, and finally temperature feedback reactivity. Comparisons are made with analytical and conventional numerical methods used to solve the point kinetics equations. The results confirm the theoretical analysis.
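
    The Vandermonde step can be pictured with a generic example: given sampled derivative data at a set of nodes, the series coefficients follow from a Vandermonde-type linear solve. The nodes and right-hand side below are placeholders, not the paper's actual construction (which exploits the sine/cosine structure to invert the matrix in closed form).

```python
# Generic illustration of solving a Vandermonde-type system for series
# coefficients; the nodes and right-hand side are mocked placeholders.
import numpy as np

nodes = np.linspace(0.1, 1.0, 6)        # hypothetical collocation nodes
V = np.vander(nodes, increasing=True)   # V[i, j] = nodes[i] ** j
rhs = np.cos(nodes)                     # mocked derivative data
coeffs = np.linalg.solve(V, rhs)        # coefficients of the expansion
print(coeffs)
```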

  5. Big data, big governance

    NARCIS (Netherlands)

    Reep, Frans van der

    2016-01-01

    "Of course it is nice that my refrigerator orders milk by itself on the basis of data-related patterns. Deep learning based on big data holds great promises," says Frans van der Reep of Inholland. No wonder that this will be a main theme during ScienceGuide's Wissenstag at the Hannover Messe.

  6. Solution for the nuclear reactor point-kinetics problem via decomposition method; Solucao via metodo da decomposicao do problema de cinetica puntual de um reator nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, Rubem Mario Figueiro [Pontificia Univ. Catolica do Rio Grande do Sul, Porto Alegre, RS (Brazil). Faculdade de Engenharia. Dept. de Engenharia Quimica]. E-mail: rvargas@pucrs.br; Vilhena, Marco Tullio de [Rio Grande do Sul Univ., Porto Alegre, RS (Brazil). Inst. de Matematica]. E-mail: vilhena@mat.ufrgs.br; Cardona, Augusto Vieira [Pontificia Univ. Catolica do Rio Grande do Sul, Porto Alegre, RS (Brazil). Faculdade de Matematica]. E-mail: acardona@pucrs.br

    2005-07-01

    The decomposition method is a mathematical technique usually applied to solve nonlinear problems, but it can also be an effective procedure for the analytical solution of linear problems, presenting advantages when compared with other techniques. In this work, an analytical solution for the nuclear reactor point-kinetics equations is developed using the decomposition method. (author)
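
    As a sketch of the underlying idea (the standard decomposition scheme for a linear system, not quoted from the paper): writing the point kinetics system compactly as du/dt = Au, the solution is expanded as a series whose terms are generated recursively,

```latex
% Decomposition scheme for a linear system u' = A u; for constant A the
% series reproduces the matrix-exponential solution.
\mathbf{u}(t)=\sum_{k=0}^{\infty}\mathbf{u}_k(t),\qquad
\mathbf{u}_0(t)=\mathbf{u}(0),\qquad
\mathbf{u}_{k+1}(t)=\int_0^{t} A\,\mathbf{u}_k(s)\,ds,
```

    so that u_k(t) = (At)^k u(0) / k! and the partial sums converge to exp(At) u(0); in practice the series is truncated after a few terms.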

  7. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  8. Kimberley rock art dating project

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, G.L. [Takarakka Rock Art Research Centre, NT, (Australia); Morwood, M. [New England University, Armidale, NSW, (Australia). Dept of Archaeology and Palaeoanthropology

    1997-12-31

    The art's additional value, unequalled by traditionally recognised artefacts, is its permanent pictorial documentation presenting a 'window' into the otherwise intangible elements of perception, vision and mind of prehistoric cultures. Unfortunately its potential in establishing the Kimberley archaeological 'big picture' still remains largely unrecognised. Some findings of the Kimberley Rock Art Dating Project, using AMS and optically stimulated luminescence (OSL) dating techniques, are outlined. It is anticipated that these findings will encourage involvement by a greater diversity of specialist disciplines to tie findings into levels of this art sequence as a primary reference point. The sequence represents a sound basis for selecting specific defined images for targeting detailed studies by a range of dating techniques. This effectively removes the undesirable ad hoc sampling of 'apparently old paintings', a process which must unavoidably remain the case with researchers working on most global bodies of rock art.

  9. Research of the Rock Art from the point of view of geography: the neolithic painting of the Mediterranean area of the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    Cruz Berrocal, María

    2004-12-01

    Full Text Available The rock art of the Mediterranean Arch (which includes what are conventionally called Levantine Rock Art, Schematic Rock Art and Macroschematic Rock Art, among other styles), inscribed on the World Heritage List in 1998, is studied from the point of view of the Archaeology of Landscape. The information sources used were field work, cartographic analysis and analysis in GIS, besides two rock art archives: the UNESCO Document and the Corpus of Levantine Cave Painting (Corpus de Pintura Rupestre Levantina). The initial hypothesis was that this rock art was involved in the process of neolithisation of the eastern part of Iberia, of which it is a symptom and a result, and that it must be understood as an element of landscape construction. If this is true, it should have a concrete distribution in the form of locational patterns. Through statistical procedures and heuristic approaches, it is demonstrated that there is a structure to the neolithic landscape, defined by rock art, which can be interpreted functionally and economically.

  10. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night and real-time analysis is often desired. Thus, modern astronomy requires big data know-how; in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results, discuss main challenges...

  11. Analysis on problem of the point reactor transfer function model

    Institute of Scientific and Technical Information of China (English)

    王远隆

    2017-01-01

    By the linearization method, the mathematical model of a point reactor can be simplified. The simplified point reactor model can be further converted to a transfer function with the Laplace transform. Until now, the point reactor transfer function has been used as a basic means for researching nuclear reactor control systems and related engineering items. However, analysis shows that the linearized point reactor model, based on the hypothesis that reactivity is zero when the reactor is at the stable state, has its own problems. This paper gives a concise analysis of these problems. The analysis method is a combination of theory and experiment. Theoretically, the focus is to compare the time-domain results and the frequency-domain results by means of system dynamics principles. Experimentally, based on engineering parameters, the focus is to compare the time-domain computer simulation results and the frequency-domain computer simulation results. Through the comparison analysis, the problems are exposed clearly. The paper accordingly indicates how the model can be modified.

  12. Accurate 3D point cloud comparison and volumetric change analysis of Terrestrial Laser Scan data in a hard rock coastal cliff environment

    Science.gov (United States)

    Earlie, C. S.; Masselink, G.; Russell, P.; Shail, R.; Kingston, K.

    2013-12-01

    Our understanding of the evolution of hard rock coastlines is limited due to the episodic nature and 'slow' rate at which changes occur. High-resolution surveying techniques, such as Terrestrial Laser Scanning (TLS), have just begun to be adopted as a method of obtaining detailed point cloud data to monitor topographical changes over short periods of time (weeks to months). However, the difficulties involved in comparing consecutive point cloud data sets in a complex three-dimensional plane, such as occlusion due to surface roughness and the positioning of the data capture point in a consistently changing environment (a beach profile), mean that comparing data sets can lead to errors in the region of 10-20 cm. Meshing techniques are often used for point cloud data analysis of simple surfaces, but for surfaces such as rocky cliff faces this technique has been found to be ineffective. Recession rates of hard rock coastlines in the UK are typically determined using aerial photography or airborne LiDAR data, yet the detail of the important changes occurring to the cliff face and toe are missed by such techniques. In this study we apply an algorithm (M3C2 - Multiscale Model to Model Cloud Comparison), initially developed for analysing fluvial morphological change, that directly compares point cloud data using surface normals that are consistent with surface roughness, and measures the change that occurs along the normal direction (Lague et al., 2013). The surface changes are analysed using a set of user-defined scales based on surface roughness and registration error. Once the correct parameters are defined, the volumetric cliff face changes are calculated by integrating the mean distance between the point clouds. The analysis has been undertaken at two hard rock sites identified for their active erosion, located on the UK's south west peninsula at Porthleven in south west Cornwall and Godrevy in north Cornwall. Alongside TLS point cloud data, in
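
    A much-simplified sketch of the core M3C2 idea is shown below: at each core point, a surface normal is estimated from the first cloud's neighbourhood at the normal scale D, and the change is measured as the along-normal offset between the two epochs' local neighbourhoods at the projection scale d. The scales, the neighbourhood-mean projection, and the minimum point count are our simplifying assumptions; the published algorithm uses a cylindrical projection and a confidence interval.

```python
# Simplified sketch of the M3C2 idea (after Lague et al., 2013): normals
# at scale D from cloud 1, along-normal offsets at projection scale d.
# The scales and the mean-offset simplification are assumptions.
import numpy as np
from scipy.spatial import cKDTree

def m3c2_distance(core, cloud1, cloud2, D=1.0, d=0.5):
    t1, t2 = cKDTree(cloud1), cKDTree(cloud2)
    dists = np.full(len(core), np.nan)
    for i, p in enumerate(core):
        # normal at scale D: smallest-eigenvalue direction of the local PCA
        nb = cloud1[t1.query_ball_point(p, D)]
        if len(nb) < 10:
            continue
        _, v = np.linalg.eigh(np.cov((nb - nb.mean(axis=0)).T))
        normal = v[:, 0]
        # along-normal offset of the two neighbourhood means at scale d
        n1 = cloud1[t1.query_ball_point(p, d)]
        n2 = cloud2[t2.query_ball_point(p, d)]
        if len(n1) and len(n2):
            dists[i] = (n2.mean(axis=0) - n1.mean(axis=0)) @ normal
    return dists
```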

  13. Heart tissue of harlequin (hq)/Big Blue mice has elevated reactive oxygen species without significant impact on the frequency and nature of point mutations in nuclear DNA

    Energy Technology Data Exchange (ETDEWEB)

    Crabbe, Rory A. [Department of Biology, University of Western Ontario, London, Ontario, N6A 5B7 (Canada); Hill, Kathleen A., E-mail: khill22@uwo.ca [Department of Biology, University of Western Ontario, London, Ontario, N6A 5B7 (Canada)

    2010-09-10

    Age is a major risk factor for heart disease, and cardiac aging is characterized by elevated mitochondrial reactive oxygen species (ROS) with compromised mitochondrial and nuclear DNA integrity. To assess links between increased ROS levels and mutations, we examined in situ levels of ROS and cII mutation frequency, pattern and spectrum in the heart of harlequin (hq)/Big Blue mice. The hq mouse is a model of premature aging with mitochondrial dysfunction and increased risk of oxidative stress-induced heart disease, with the means for in vivo mutation detection. The hq mutation produces a significant downregulation in the X-linked apoptosis-inducing factor gene (Aif), impairing both the antioxidant and oxidative phosphorylation functions of AIF. Brain and skin of hq disease mice have elevated frequencies of point mutations in nuclear DNA and histopathology characterized by cell loss. Reports of associated elevations in ROS in brain and skin have mixed results. Herein, heart in situ ROS levels were elevated in hq disease mice compared to AIF-proficient mice (p < 0.0001); yet mutation frequency and pattern were similar in hq disease, hq carrier and AIF-proficient mice. Heart cII mutations were also assessed 15 days following an acute exposure to an exogenous ROS inducer (10 mg paraquat/kg). Acute paraquat exposure with a short mutant manifestation period was insufficient to elevate mutation frequency or alter mutation pattern in the post-mitotic heart tissue of AIF-proficient mice. Paraquat induction of ROS requires mitochondrial complex I and thus is likely compromised in hq mice. Results of this preliminary survey and the context of recent literature suggest that determining causal links between AIF deficiency and the premature aging phenotypes of specific tissues is better addressed with assay of mitochondrial ROS and large-scale changes in mitochondrial DNA in specific cell types.

  14. Analytical Solution of the Point Reactor Kinetics Equations for One-Group of Delayed Neutrons for a Discontinuous Linear Reactivity Insertion

    Directory of Open Access Journals (Sweden)

    S. Yamoah

    2012-11-01

    Full Text Available The understanding of the time-dependent behaviour of the neutron population in a nuclear reactor in response to either a planned or unplanned change in the reactor conditions is of great importance to the safe and reliable operation of the reactor. It is therefore important to understand the response of the neutron density and how it relates to the speed of lifting control rods. In this study, an analytical solution of the point reactor kinetics equations for one group of delayed neutrons is developed to calculate the change in neutron density when reactivity is introduced linearly but discontinuously. The formulation presented in this study is validated against a numerical solution using the Euler method. It is observed that for the higher speed, r = 0.0005, the Euler method predicted higher values than the method presented in this study. However, with r = 0.0001, the Euler method predicted lower values than the method presented in this study, except for t = 1.0 s and 5.0 s. The results obtained have been shown to be compatible with the numerical method.
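
    The Euler comparison described above is straightforward to reproduce in outline; the sketch below integrates one-group point kinetics under a ramp reactivity rho(t) = r*t. The kinetics parameters are illustrative assumptions, not the paper's values.

```python
# Explicit Euler integration of one-group point kinetics with a linear
# ramp reactivity rho(t) = r*t; the parameter values are assumptions.
beta, lam, Lam = 0.0065, 0.0769, 1e-3
r, h, t_end = 0.0001, 1e-4, 5.0    # ramp rate [1/s], step [s], end time [s]

n, c, t = 1.0, beta / (lam * Lam), 0.0   # equilibrium initial conditions
while t < t_end:
    rho = r * t
    dn = ((rho - beta) / Lam) * n + lam * c
    dc = (beta / Lam) * n - lam * c
    n, c, t = n + h * dn, c + h * dc, t + h
print(f"n({t_end:.1f} s) = {n:.4f}")
```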

  15. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. The information value is defined not only by the value extracted from huge data sets, as fast and optimally as possible, but also by the value extracted from uncertain and inaccurate data, in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on processes worldwide.

  16. Experimental Study of Big Row Spacing Cultivation of Tomato Using Straw Biological Reactor Technology

    Institute of Scientific and Technical Information of China (English)

    王继涛; 张翔; 温学萍; 赵玮; 俞风娟; 汪金山

    2015-01-01

    The application of the straw biological reactor technology can effectively improve the environmental factors within the facility, slow down the occurrence of disease and improve the yield and benefit. But with this technology, the ditching process requires much work and effort. In order to reduce the labor and production inputs involved in using the technology, an experimental study on the big row spacing cultivation of tomato using the straw biological reactor technology was conducted. The results showed that, compared with the control, in the steps of ditching, straw burying, ridging, laying of drippers and planting alone, 35.7% of the labor per hectare and 16,810.5 yuan/hm2 of the cost could be saved, the marketing time could be advanced by 5 days, the yield could be increased by 26.68%, and the incidence of pests and diseases could be lowered significantly. Considering the comprehensive growth potential in the field and the indoor test data, it is suggested that the big row spacing cultivation of tomato using the straw biological reactor technology should be extended and applied over large areas in Ningxia.

  17. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

    Full Text Available The demand for and spurt in the collection and accumulation of data has coined the new term "Big Data". Accidentally, incidentally and through the interaction of people, information, so-called data, is massively generated. This BIG DATA is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists and a wide variety of intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia and every space where large groups of people leave digital traces and deposit data. Given the rise of Big Data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions and its biases. Big data, which refers to data sets that are too big to be handled using existing database management tools, is emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology. Big data presents a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as the solutions that can be applied to them. Big Data differs from other data in five characteristics: volume, variety, value, velocity and complexity. The article will focus on some current and future cases and causes for BIG DATA.

  18. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper...... is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations...... into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  19. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper...... is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations...... into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of sub-themes that each deserve dedicated scrutiny when studying the interplay between big...

  20. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo;

    and locations, having a diverse knowledge set and capable of tackling more and more complex problems. This poses the question of whether Big Egos continue to dominate in this rising paradigm of big science. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize...

  1. Online stress corrosion crack and fatigue usages factor monitoring and prognostics in light water reactor components: Probabilistic modeling, system identification and data fusion based big data analytics approach

    Energy Technology Data Exchange (ETDEWEB)

    Mohanty, Subhasish M. [Argonne National Lab. (ANL), Argonne, IL (United States); Jagielo, Bryan J. [Argonne National Lab. (ANL), Argonne, IL (United States); Oakland Univ., Rochester, MI (United States); Iverson, William I. [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Illinois at Urbana-Champaign, Champaign, IL (United States); Bhan, Chi Bum [Argonne National Lab. (ANL), Argonne, IL (United States); Pusan National Univ., Busan (Korea, Republic of); Soppet, William S. [Argonne National Lab. (ANL), Argonne, IL (United States); Majumdar, Saurin M. [Argonne National Lab. (ANL), Argonne, IL (United States); Natesan, Ken N. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-10

    Nuclear reactors in the United States account for roughly 20% of the nation's total electric energy generation, and maintaining their safety with regard to the structural integrity of key components is critical not only for the long-term use of such plants but also for the safety of personnel and the public living around the plant. Early detection of damage signatures, such as stress corrosion cracking and material degradation related to thermal-mechanical loading in safety-critical components, is a necessary requirement for long-term and safe operation of nuclear power plant systems.

  2. Thermal-maturity trends within Franciscan rocks near Big Sur, California: Implications for offset along the San Gregorio-San Simeon-Hosgri fault zone

    Science.gov (United States)

    Underwood, Michael B.; Laughland, Matthew M.; Shelton, Kevin L.; Sedlock, Richard L.

    1995-09-01

    Conventional neotectonic interpretations place the Lucia and Point Sur subterranes of the Franciscan subduction complex on opposite sides of the San Gregorio-San Simeon-Hosgri dextral fault system and connect that system through the Sur fault zone. Our reconstructed paleotemperature contours, however, are not offset across the San Simeon segment, so differential displacement between the subterranes after peak heating appears to have been negligible. One explanation is that dextral slip on the faults has totaled only 5-10 km. A second possibility is that a discrete Hosgri-San Simeon segment extends offshore of the amalgamated Point Sur and Lucia subterranes and that an en echelon stepover transfers dextral slip eastward to the San Gregorio-Palo Colorado segment. In either case, the Sur fault zone appears to play a relatively insignificant role in the late Cenozoic tectonic evolution of central California.

  3. Big Data

    OpenAIRE

    2013-01-01

    The demand for and spurt in the collection and accumulation of data has coined the new term "Big Data". Accidentally, incidentally and through the interaction of people, information, so-called data, is massively generated. This BIG DATA is to be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists and a wide variety of intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google,...

  4. Comparing Two Photo-Reconstruction Methods to Produce High Density Point Clouds and DEMs in the Corral del Veleta Rock Glacier (Sierra Nevada, Spain

    Directory of Open Access Journals (Sweden)

    Álvaro Gómez-Gutiérrez

    2014-06-01

    Full Text Available In this paper, two methods based on computer vision are presented in order to produce dense point clouds and high resolution DEMs (digital elevation models of the Corral del Veleta rock glacier in Sierra Nevada (Spain. The first one is a semi-automatic 3D photo-reconstruction method (SA-3D-PR based on the Scale-Invariant Feature Transform algorithm and the epipolar geometry theory that uses oblique photographs and camera calibration parameters as input. The second method is fully automatic (FA-3D-PR and is based on the recently released software 123D-Catch that uses the Structure from Motion and MultiView Stereo algorithms and needs as input oblique photographs and some measurements in order to scale and geo-reference the resulting model. The accuracy of the models was tested using as benchmark a 3D model registered by means of a Terrestrial Laser Scanner (TLS. The results indicate that both methods can be applied to micro-scale study of rock glacier morphologies and processes with average distances to the TLS point cloud of 0.28 m and 0.21 m, for the SA-3D-PR and the FA-3D-PR methods, respectively. The performance of the models was also tested by means of the dimensionless relative precision ratio parameter resulting in figures of 1:1071 and 1:1429 for the SA-3D-PR and the FA-3D-PR methods, respectively. Finally, Digital Elevation Models (DEMs of the study area were produced and compared with the TLS-derived DEM. The results showed average absolute differences with the TLS-derived DEM of 0.52 m and 0.51 m for the SA-3D-PR and the FA-3D-PR methods, respectively.
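
    The accuracy figures above come from comparing each photo-reconstructed cloud against the TLS benchmark; a minimal stand-in for that benchmarking step is a mean nearest-neighbour (cloud-to-cloud) distance, sketched below with placeholder file names.

```python
# Mean nearest-neighbour distance between a photo-reconstructed cloud and
# a TLS benchmark; a simplified stand-in for the accuracy assessment.
import numpy as np
from scipy.spatial import cKDTree

tls = np.loadtxt("tls_benchmark.xyz")          # placeholder input files
sfm = np.loadtxt("photo_reconstruction.xyz")

dist, _ = cKDTree(tls).query(sfm)   # distance of each SfM point to the TLS cloud
print(f"average distance to TLS cloud: {dist.mean():.3f} m")
```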

  5. Toward a Learning Health-care System – Knowledge Delivery at the Point of Care Empowered by Big Data and NLP

    Science.gov (United States)

    Kaggal, Vinod C.; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J.; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P.; Ross, Jason L.; Chaudhry, Rajeev; Buntrock, James D.; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, ie, the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the advantages of big data over two other environments. Big data infrastructure significantly outperformed other infrastructure in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future. PMID:27385912

  6. Study on Automatic Regulating of Reactor Axial Power Distribution based on Double-Point Reactor Model

    Institute of Scientific and Technical Information of China (English)

    刘玉燕; 李玉红; 王恒

    2014-01-01

    In second-generation pressurized water reactor nuclear power plants, the axial power distribution is mostly adjusted manually by the operator. To address this, an automatic control strategy for reactor power and axial power distribution is put forward which combines rod position adjustment with boric acid concentration regulation. A double-point reactor model that can describe the temperature feedback, the effects of xenon and the axial power distribution, together with the designed control system model, is established in SIMULINK, and simulation experiments are carried out for both slow and fast ramp power decreases and increases. The simulation results show that the proposed control strategy has good control quality: the load-tracking overshoot is less than 1%, and the time for the axial power difference to enter the reference band is less than 5000 seconds.
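
    As a toy illustration of the rod-based part of such a load-following loop (greatly simplified relative to the paper's SIMULINK model), the sketch below lets a PI controller adjust reactivity, standing in for rod motion, so that one-group point-kinetics power follows a ramp demand. The model structure and all gain values are our assumptions.

```python
# Toy load-tracking sketch: a PI controller adjusts reactivity (a stand-in
# for rod motion) so that point-kinetics power follows a ramp demand.
# Model structure and gain values are illustrative assumptions.
beta, lam, Lam = 0.0065, 0.0769, 1e-3
kp, ki = 0.001, 0.0005             # PI gains on the relative power error
h, t_end = 1e-3, 200.0

n, c = 1.0, beta / (lam * Lam)     # start at equilibrium full power (n = 1)
integ, t = 0.0, 0.0
while t < t_end:
    demand = max(0.5, 1.0 - 0.005 * t)   # ramp down to 50% power
    err = demand - n
    integ += err * h
    rho = kp * err + ki * integ          # PI reactivity command
    dn = ((rho - beta) / Lam) * n + lam * c
    dc = (beta / Lam) * n - lam * c
    n, c, t = n + h * dn, c + h * dc, t + h
print(f"final power {n:.3f} vs demand 0.50")
```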

  7. Nuclear Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hogerton, John

    1964-01-01

    This pamphlet describes how reactors work; discusses reactor design; and describes research, teaching, and materials testing reactors; production reactors; reactors for electric power generation; reactors for supplying heat; reactors for propulsion; reactors for space; reactor safety; and reactors of tomorrow. The appendix discusses characteristics of U.S. civilian power reactor concepts and lists some of the U.S. reactor power projects, with location, type, capacity, owner, and startup date.

  8. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  9. Assessment of RELAP5 point kinetic model against reactivity insertion transient in the IAEA 10 MW MTR research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Hamidouche, T., E-mail: t.hamidouche@crna.d [Division de l' Environnement, de la Surete et des Dechets Radioactifs, Centre de Recherche Nucleaire d' Alger, 02 Boulevard Frantz Fanon, BP 399 Alger RP (Algeria); Bousbia-Salah, A. [DIMNP - University of Pisa, Via Diotisalvi 02, 56126 Pisa (Italy)

    2010-03-15

    The current study emphasizes an aspect related to the assessment of a model embedded in a computer code. The study concerns more particularly the point neutron kinetics model of the RELAP5/Mod3 code, which is used worldwide. The model is assessed against positive reactivity insertion transients, taking into account calculations involving thermal-hydraulic feedback as well as transients with no feedback effects. It was concluded that the RELAP5 point kinetics model provides unphysical power evolution trends, most probably due to a bug introduced during programming.

  10. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  11. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  12. Rock History and Culture

    OpenAIRE

    Gonzalez, Éric

    2013-01-01

    Two ambitious works written by French-speaking scholars tackle rock music as a research object, from different but complementary perspectives. Both are a definite must-read for anyone interested in the contextualisation of rock music in western popular culture. In Une histoire musicale du rock (i.e. A Musical History of Rock), rock music is approached from the point of view of the people – musicians and industry – behind the music. Christophe Pirenne endeavours to examine that field from a m...

  13. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  15. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unattainable. Big data is generally characterized by three factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  16. CRITERIA FOR ROCK ENGINEERING FAILURE

    Institute of Scientific and Technical Information of China (English)

    ZHUDeren; ZHANGYuzhuo

    1995-01-01

    A great number of underground rock projects are maintained in rock masses that are subject to damage and progressive failure. In many cases, the rock engineering works remain under normal working conditions even though the rock has already failed to some extent. This paper introduces two different concepts: rock failure and rock engineering failure. Rock failure is defined as a mechanical state under which an applicable characteristic is changed or lost. Rock engineering failure, by contrast, is an engineering state under which an applicable function is changed or lost. Failure of the surrounding rock is the major cause of rock engineering failure. The criterion for rock engineering failure depends on the limits of the applicable functions. Each rock engineering failure state corresponds to a point in the rock failure state. In this paper, a description of the rock engineering failure criterion is given using a simple mechanical equation or expression. It is expected that the study of rock engineering failure criteria will be an optimal approach that combines research on rock mechanics with rock engineering problems.

  17. Intensification and forecasting of low-pour-point diesel fuel production via modelling reactor and stabilizer column at industrial unit

    Science.gov (United States)

    Belinskaya, N. S.; Frantsina, E. V.; Ivanchina, E. D.; Popova, N. V.; Zyryanova, I. V.; Averyanova, E. V.

    2016-09-01

    In this work, a forecast calculation of the stabilizer column in the low-pour-point diesel fuel production technology was modelled. The results of the forecast calculation were verified by a full-scale experiment at a diesel fuel catalytic dewaxing unit. The forecast calculation and full-scale experiment made it possible to determine ways to intensify mass transfer, as well as to increase the degree of hydrogen sulphide removal in the column, and thereby to decrease the corrosiveness of the product stream. It was found that maintaining the reflux rate in the range of 80-90 m3/h and injecting additional vapourizing streams, such as stable naphtha from the distillation unit (10-22 m3/h) and hydrogen-containing gas (100-300 m3/h), ensures complete elimination of corrosive hydrogen sulphide from the product stream. The reduction of the stream's corrosive activity due to the suggested solutions extends the service life of equipment and pipelines at the industrial catalytic dewaxing unit.
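
As a minimal illustration of the reported operating envelope (the variable names and structure below are my own assumptions, not part of the unit's control system), the recommended flow ranges can be encoded as a simple check:

```python
# Recommended ranges from the study: reflux 80-90 m3/h, stable naphtha
# 10-22 m3/h, hydrogen-containing gas 100-300 m3/h.
RECOMMENDED_RANGES = {
    "reflux_m3h": (80.0, 90.0),
    "stable_naphtha_m3h": (10.0, 22.0),
    "hydrogen_gas_m3h": (100.0, 300.0),
}

def check_operating_point(flows):
    """Return warnings for any flow outside its recommended range."""
    warnings = []
    for name, (lo, hi) in RECOMMENDED_RANGES.items():
        value = flows.get(name)
        if value is None or not lo <= value <= hi:
            warnings.append(f"{name}={value} outside [{lo}, {hi}] m3/h")
    return warnings

print(check_operating_point(
    {"reflux_m3h": 85.0, "stable_naphtha_m3h": 8.0, "hydrogen_gas_m3h": 200.0}
))  # -> ['stable_naphtha_m3h=8.0 outside [10.0, 22.0] m3/h']
```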

  18. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  19. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing volume of data while taking public values into account. The article builds on a case study of the use of large volumes of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off not only between economic rationales and quality considerations, but also involving control over sensitive personal data and ethical implications for the citizen. In the DAMD case, data is on the one hand used "in the service of a good cause" to...

  20. Big Man

    Institute of Scientific and Technical Information of China (English)

    郑秀文

    2012-01-01

    Edmond Leung says that after his concert he will go travelling with his wife. No matter where on Earth the plane lands, having a companion at your side is happiness. His concert is called Big Man; at first I misread it as Big Mac and wondered why anyone would name a concert after a giant hamburger. Ha! Only later did I realize my mistake. But thinking about it, on the road to growing up, who hasn't lived like a silly lump of bread: a ball of dough exposed to this wide world, with time and life's experiences as the yeast, so that over the years you and I ferment and grow. Friendship, too, is a yeast that spurs each other's growth. Seeing that he has long since turned from a boy into a man, I realize I can no longer call myself a "girl" either. In my eyes, he has changed a great deal; his playful, outgoing nature has mellowed, and the people we are now,

  1. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  2. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

    In this paper we investigate the micro-mechanisms governing the structural evolution of a scientific collaboration. Empirical evidence indicates that we have transitioned into a new paradigm with a new modus operandi, where scientific discoveries are not led by so-called lone "stars", or big egos, but instead by groups of people from a multitude of institutions, having a diverse knowledge set and capable of operating more and more complex instrumentation. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize a stochastic actor oriented model

  3. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data brings new opportunities to modern society and challenges to data scientists. On the one hand, Big Data holds great promise for discovering subtle population patterns and heterogeneities that are not detectable with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
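
The spurious-correlation phenomenon the authors describe is easy to reproduce: with the sample size held fixed, the largest correlation between a response and an ever-larger set of pure-noise predictors keeps climbing. A small self-contained sketch (mine, not the authors'):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50  # fixed sample size

for p in (10, 100, 1000, 10000):
    y = rng.standard_normal(n)
    X = rng.standard_normal((n, p))  # predictors independent of y
    yc = (y - y.mean()) / y.std()
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    corrs = yc @ Xc / n  # Pearson correlation with each column
    print(f"p={p:6d}  max |corr| = {np.abs(corrs).max():.3f}")
# The maximum correlation grows with p even though every predictor is
# pure noise: spurious correlation in miniature.
```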

  4. Collecting Rocks

    Institute of Scientific and Technical Information of China (English)

    孙铮

    2007-01-01

    My hobby is collecting rocks. It is very special, isn't it? I began to collect rocks about four years ago. I usually go hiking in the mountains, or near the river, to look for rocks. When I find a rock, I pick it up and clean it with a brush and water. Then I put it into my bag. Most of the rocks I have collected are quartzite. They are really

  5. STUDY OF FACTORS AFFECTING CUSTOMER BEHAVIOUR USING BIG DATA TECHNOLOGY

    OpenAIRE

    Prabin Sahoo; Dr. Nilay Yajnik

    2014-01-01

    Big data technology has been gaining momentum recently. There are several articles, books, blogs and discussions addressing various facets of big data technology. The study in this paper focuses on big data as a concept, offers insights into the 3 Vs (volume, velocity and variety), and demonstrates their significance with respect to factors that can be processed using big data for studying the behaviour of online customers.

  6. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  7. KREEP Rocks

    Institute of Scientific and Technical Information of China (English)

    邹永廖; 徐琳; 欧阳自远

    2004-01-01

    KREEP rocks, with high contents of K, REE and P, were first recognized in Apollo-12 samples, and it was later confirmed that KREEP rock fragments occur in all of the Apollo samples, particularly those of Apollo 12 and 14. The KREEP rocks distributed on the lunar surface are very important objects for studying the evolution of the Moon, as well as for evaluating the utilization prospects of REE in KREEP rocks. Based on previous studies and lunar exploration data, the authors analyzed the chemical and mineral characteristics of KREEP rocks, the abundance of Th in lunar surface materials, and the correlation between Th and REE abundances in KREEP rocks, studied the distribution regions of KREEP rocks on the lunar surface, and further evaluated the utilization prospects of REE in KREEP rocks.

  8. Rock Stars

    Institute of Scientific and Technical Information of China (English)

    张国平

    2000-01-01

    Around the world young people are spending unbelievable sums of money to listen to rock music. Forbes Magazine reports that at least fifty rock stars have incomes between two million and six million dollars per year.

  9. Big data=Big marketing?!

    Institute of Scientific and Technical Information of China (English)

    肖明超

    2012-01-01

    When the Internet was just emerging, a saying was popular: "On the Internet, nobody knows you're a dog." But today, more than 20 years later, that saying has long since been consigned to the dustbin of history. Driven by technology, and with the rapid development of mobile Internet, social networks and e-commerce, consumers' "tracks" have become easier and easier to capture: their attention, behavioural trails, conversations, preferences and shopping histories on the Internet can all be recorded. Consumers have entered an almost transparent existence in the "Age of Big Data". Not only is data becoming more available, but artificial intelligence (AI) technologies, including natural language processing, pattern recognition and machine learning, are making data ever easier for computers to understand,

  10. Rock Finding

    Science.gov (United States)

    Rommel-Esham, Katie; Constable, Susan D.

    2006-01-01

    In this article, the authors discuss a literature-based activity that helps students discover the importance of making detailed observations. In an inspiring children's classic book, "Everybody Needs a Rock" by Byrd Baylor (1974), the author invites readers to go "rock finding," laying out 10 rules for finding a "perfect" rock. In this way, the…

  11. Rock Art

    Science.gov (United States)

    Henn, Cynthia A.

    2004-01-01

    There are many interpretations for the symbols that are seen in rock art, but no decoding key has ever been discovered. This article describes one classroom's experiences with a lesson on rock art--making their rock art and developing their own personal symbols. This lesson allowed for creativity, while giving an opportunity for integration…

  12. Considerations on Geospatial Big Data

    Science.gov (United States)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  13. Reactor Period Algorithm and Parameter Set-point Optimization Study

    Institute of Scientific and Technical Information of China (English)

    付学峰

    2013-01-01

    A point reactor model with six delayed neutron groups was adopted to simulate typical 20 pcm and 60 pcm step reactivity insertions during initial criticality and zero-power physics testing. Both Kalman filter and dynamic filter algorithms were studied. The results show that the Kalman filter approach can attenuate the impact of the prompt neutron jump and is self-adaptive; however, it is less accurate in the early transient period, and the protection response time is long when a large reactivity is inserted. The dynamic filter approach achieves high accuracy and responds quickly through optimization of the relative power change set-point LAMMA and the gain coefficient λ; however, frequent adjustment of λ takes time and increases the probability of operator error. The static filter, with an optimized constant gain coefficient, is accurate, time-saving and safe.
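
The point-kinetics equations with six delayed-neutron groups are compact enough to sketch. The integrator below uses typical U-235 textbook group constants and an assumed generation time (the paper's own parameters and its Kalman/dynamic filters are not reproduced here); filtering algorithms such as those in the study would operate on noisy samples of n(t) from this model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Typical six-group delayed-neutron constants for U-235 (textbook values,
# NOT the constants used in the paper).
beta_i = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])
lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])  # decay, 1/s
beta = beta_i.sum()
LAMBDA = 1.0e-4  # prompt-neutron generation time in s (assumed)

def point_kinetics(t, y, rho):
    """Right-hand side of point kinetics: y = [n, C1..C6]."""
    n, C = y[0], y[1:]
    dn = (rho - beta) / LAMBDA * n + lam_i @ C
    dC = beta_i / LAMBDA * n - lam_i * C
    return np.concatenate(([dn], dC))

# Steady state at n = 1, then a 20 pcm step insertion (1 pcm = 1e-5).
n0, rho = 1.0, 20e-5
C0 = beta_i * n0 / (lam_i * LAMBDA)
sol = solve_ivp(point_kinetics, (0.0, 400.0), np.concatenate(([n0], C0)),
                args=(rho,), method="LSODA", dense_output=True)

# Approximate asymptotic period from the late-time slope of ln n(t).
t1, t2 = 300.0, 400.0
period = (t2 - t1) / np.log(sol.sol(t2)[0] / sol.sol(t1)[0])
print(f"asymptotic period ~ {period:.0f} s for a 20 pcm step")
```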

  14. H Reactor

    Data.gov (United States)

    Federal Laboratory Consortium — The H Reactor was the first reactor to be built at Hanford after World War II.It became operational in October of 1949, and represented the fourth nuclear reactor on...

  15. Adobe photoshop quantification (PSQ) rather than point-counting: A rapid and precise method for quantifying rock textural data and porosities

    Science.gov (United States)

    Zhang, Xuefeng; Liu, Bo; Wang, Jieqiong; Zhang, Zhe; Shi, Kaibo; Wu, Shuanglin

    2014-08-01

    Commonly used petrological quantification methods are visual estimation, point counting, and image analysis. In this article, however, an Adobe Photoshop-based analysis method (PSQ) is recommended for quantifying rock textural data and porosities. Adobe Photoshop provides versatile tools for selecting an area of interest, and the pixel count of a selection can be read and used to calculate its area percentage. Adobe Photoshop can therefore be used to rapidly quantify textural components, such as the content of grains, cements, and porosities, including total porosity and porosities of different genetic types. This method is named Adobe Photoshop Quantification (PSQ). The workflow of the PSQ method is introduced using oolitic dolomite samples from the Triassic Feixianguan Formation, northeastern Sichuan Basin, China, as an example. The method was tested by comparison with Folk's and Shvetsov's "standard" diagrams. In both cases there is close agreement between the "standard" percentages and those determined by the PSQ method, with very small counting and operator errors, small standard deviations, and high confidence levels. The porosities quantified by PSQ were evaluated against those determined by the whole-rock helium gas expansion method to test for specimen errors. The results show that the porosities quantified by PSQ correlate well with those determined by the conventional helium gas expansion method. The generally small discrepancies (mostly ranging from -3% to 3%) are caused by microporosity, which leads to a systematic underestimation of about 2%, and/or by macroporosity, which causes underestimation or overestimation in different cases. Adobe Photoshop can thus be used to quantify rock textural components and porosities. The method has been shown to be precise and accurate, and it is time-saving compared with the usual methods.
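
The arithmetic behind PSQ is a pixel ratio: the pixel count of a selection divided by the pixel count of the whole image gives an area percentage. A minimal reproduction of that calculation outside Photoshop, assuming the pore selection has already been exported as a binary mask, might look like this:

```python
import numpy as np

# Synthetic stand-in for a segmented thin-section image: pore pixels are
# True, everything else False (PSQ performs this selection interactively
# in Photoshop).
rng = np.random.default_rng(42)
pore_mask = rng.random((1024, 1024)) < 0.08  # roughly 8% porosity

def area_percentage(mask: np.ndarray) -> float:
    """Selected-pixel count over total pixel count, as a percentage."""
    return 100.0 * mask.sum() / mask.size

print(f"estimated porosity: {area_percentage(pore_mask):.2f}%")
```

The same ratio applied to grain or cement selections yields the other textural percentages the PSQ workflow reports.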

  16. Pancam Multispectral and APXS Chemical Examination of Rocks and Soils in Marathon Valley and Points South Along the Rim of Endeavour Crater

    Science.gov (United States)

    Farrand, W. H.; Johnson, J. R.; Bell, J. F., III; Mittlefehldt, D. W.; Gellert, R.; VanBommel, S.; Arvidson, R. E.; Schroder, C.

    2017-01-01

    The Mars Exploration Rover Opportunity has concluded its exploration of Marathon Valley, a 100-meter-wide valley in the western rim of the 22-kilometer-diameter Endeavour crater. Orbital observations from CRISM (Compact Reconnaissance Imaging Spectrometer for Mars) indicated the presence of Fe smectites in Marathon Valley. Since leaving the valley, Opportunity has been traversing along the inner rim of the crater, and currently toward the outer rim. This presentation describes the Pancam 430 to 1009 nanometer (VNIR, Visible and Near-Infrared) multispectral reflectance and APXS (Alpha Particle X-ray Spectrometer) chemical compositions of rock and soil units observed during the latter portions of the Marathon Valley campaign in the Knudson Ridge area, as well as observations of those materials along the traverse to the south. Full Pancam spectral coverage of rock targets consists of 13-filter (13f) data collections with 11 spectrally unique channels after data processing. Data were examined using spectral parameters, decorrelation stretch composites, and spectral mixture analysis. Note that color terms used here refer to colors in various false-color renditions, not true colors. The APXS determines major and select trace element compositions of targets.

  17. 'Escher' Rock

    Science.gov (United States)

    2004-01-01

    [Figure 1, "Chemical Changes in 'Endurance' Rocks", removed for brevity; see original site] This false-color image taken by NASA's Mars Exploration Rover Opportunity shows a rock dubbed "Escher" on the southwestern slopes of "Endurance Crater." Scientists believe the rock's fractures, which divide the surface into polygons, may have been formed by one of several processes. They may have been caused by the impact that created Endurance Crater, or they might have arisen when water leftover from the rock's formation dried up. A third possibility is that much later, after the rock was formed, and after the crater was created, the rock became wet once again, then dried up and developed cracks. Opportunity has spent the last 14 sols investigating Escher, specifically the target dubbed "Kirchner," and other similar rocks with its scientific instruments. This image was taken on sol 208 (Aug. 24, 2004) by the rover's panoramic camera, using the 750-, 530- and 430-nanometer filters. The graph (Figure 1) shows that rocks located deeper in "Endurance Crater" are chemically altered to a greater degree than rocks located higher up. This chemical alteration is believed to result from exposure to water. Specifically, the graph compares ratios of chemicals between the deep rock dubbed "Escher" and the shallower rock called "Virginia," before (red and blue lines) and after (green line) the Mars Exploration Rover Opportunity drilled into the rocks. As the red and blue lines indicate, Escher's levels of chlorine relative to Virginia's went up, and sulfur went down, before the rover dug a hole into the rocks. This implies that the surface of Escher has been chemically altered to a greater extent than the surface of Virginia. Scientists are still investigating the role water played in this trend. These data were taken by the rover's alpha particle X-ray spectrometer.

  18. Analyzing and studying factors for determining neutral point position of fully grouted rock bolt

    Institute of Scientific and Technical Information of China (English)

    朱训国; 杨庆

    2009-01-01

    The neutral point theory is an important theory in underground engineering reinforcement. At present, the formula for determining the neutral point position contains some unreasonable elements. In this paper the neutral point theory is further refined and improved on the foundation of previous research. Based on the analytical model developed here and on the condition that the frictional resistance is zero at the neutral point, the factors affecting the neutral point position are analyzed in detail, and the correlations affecting the neutral point position are obtained. The analysis reveals that the hydrostatic primary stress, bolt length and bolt spacing have no influence on the neutral point position, while the tunnel radius, the Young's moduli of the rock and the bolt, and the bolt diameter have a remarkable influence on it. The relation of the tunnel radius and the bolt diameter to the neutral point position is linear, whereas the relation of the Young's moduli of the rock and the bolt to the neutral point position follows exponential functions: the neutral point position decreases exponentially with the Young's modulus of the rock mass and increases exponentially with the Young's modulus of the bolt. From the factors analyzed, a general functional relation between the neutral point position and the relevant parameters is obtained, providing a useful reference for further study of the neutral point theory.
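
The abstract reports only qualitative shapes: linear dependence on tunnel radius and bolt diameter, exponential decrease with the rock modulus, and exponential increase with the bolt modulus. A hypothetical composite function consistent with those statements, with made-up coefficients purely to illustrate the claimed shape (not a fitted result from the paper), could look like:

```python
import math

def neutral_point_position(radius_m, bolt_diam_m, e_rock_gpa, e_bolt_gpa,
                           a=0.4, b=0.15, c=8.0, p=0.02, q=0.004):
    """Hypothetical form consistent with the reported trends only:
    linear in tunnel radius and bolt diameter, exponentially decreasing
    in rock modulus, exponentially increasing in bolt modulus.
    All coefficients are illustrative placeholders."""
    return ((a + b * radius_m + c * bolt_diam_m)
            * math.exp(-p * e_rock_gpa)
            * math.exp(q * e_bolt_gpa))

# A stiffer rock mass moves the neutral point toward the collar; a stiffer
# bolt moves it deeper, matching the exponential trends in the abstract.
for e_rock in (5.0, 20.0, 50.0):
    print(e_rock, round(neutral_point_position(3.0, 0.025, e_rock, 200.0), 3))
```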

  19. Nuclear reactor PBMR and cogeneration; Reactor nuclear PBMR y cogeneracion

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez S, J. R.; Alonso V, G., E-mail: ramon.ramirez@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    In recent years the costs of nuclear reactor designs for electricity generation have increased, currently running around 5,000 USD per installed kW, so a big nuclear plant requires an investment on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big reactor by dividing the power into parts and dividing the components into modules to lower production costs; in this way one module can be built and completed before construction of the next begins, deferring the long-term investment and therefore reducing the investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, since the thermal energy of the reactor can be used to feed other processes such as water desalination or steam generation for the process industries, such as petrochemicals, or even the possible production of hydrogen for use as fuel. This work describes the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor. (Author)

  20. Comparison of lactate sampling sites for rock climbing.

    Science.gov (United States)

    Fryer, S; Draper, N; Dickson, T; Blackwell, G; Winter, D; Ellis, G

    2011-06-01

    Comparisons of capillary blood lactate concentrations pre- and post-climb have featured in the protocols of many rock climbing studies, with most researchers obtaining samples from the fingertip. The nature of rock climbing, however, places a comparatively high physiological load on the forearms and fingertips. Indeed, the fingertips are continually required for gripping, and this makes pre-climb sampling at this site problematic. The purpose of our study was to examine differences in capillary blood lactate concentrations between samples taken at the fingertip and the first (big) toe in a rock climbing context. 10 participants (9 males and 1 female) completed climbing bouts at 3 different angles (91°, 100° and 110°). Capillary blood samples were taken simultaneously from the fingertip and first toe pre- and post-climb. A limits of agreement plot revealed all data points to be well within the upper and lower bounds of the 95% population confidence interval. Subsequent regression analysis revealed a strong relationship (R² = 0.94, y = 0.940x + 0.208) between fingertip and first toe capillary blood lactate concentrations. Findings from our study suggest that the toe offers a valid alternative site for capillary blood lactate concentration analysis in a rock climbing context.
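
For readers wanting to replicate the agreement analysis, a minimal sketch computing the regression and Bland-Altman style 95% limits of agreement from paired fingertip/toe samples (synthetic data here, not the study's measurements) could be:

```python
import numpy as np

# Synthetic paired lactate samples in mmol/L; substitute real data.
rng = np.random.default_rng(7)
finger = rng.uniform(1.0, 8.0, 30)
toe = 0.94 * finger + 0.21 + rng.normal(0.0, 0.3, 30)

# Ordinary least squares: toe = slope * finger + intercept
slope, intercept = np.polyfit(finger, toe, 1)
r2 = np.corrcoef(finger, toe)[0, 1] ** 2

# 95% limits of agreement on the paired differences
diff = toe - finger
mean_diff = diff.mean()
sd_diff = diff.std(ddof=1)
loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)

print(f"toe = {slope:.3f} * finger + {intercept:.3f}, R^2 = {r2:.2f}")
print(f"95% limits of agreement: [{loa[0]:.2f}, {loa[1]:.2f}] mmol/L")
```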

  1. 'Earhart' Rock

    Science.gov (United States)

    2004-01-01

    This false-color image taken by NASA's Mars Exploration Rover Opportunity shows a rock informally named 'Earhart' on the lower slopes of 'Endurance Crater.' The rock was named after the pilot Amelia Earhart. Like 'Escher' and other rocks dotting the bottom of Endurance, scientists believe fractures in Earhart could have been formed by one of several processes. They may have been caused by the impact that created Endurance Crater, or they might have arisen when water leftover from the rock's formation dried up. A third possibility is that much later, after the rock was formed, and after the crater was created, the rock became wet once again, then dried up and developed cracks. Rover team members do not have plans to investigate Earhart in detail because it is located across potentially hazardous sandy terrain. This image was taken on sol 219 (Sept. 4) by the rover's panoramic camera, using its 750-, 530- and 430-nanometer filters.

  2. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  3. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  4. Rock Art

    OpenAIRE

    Huyge, Dirk

    2009-01-01

    Rock art, basically being non-utilitarian, non-textual anthropic markings on natural rock surfaces, was an extremely widespread graphical practice in ancient Egypt. While the apogee of the tradition was definitely the Predynastic Period (mainly fourth millennium BCE), examples date from the late Palaeolithic (c. 15,000 BCE) until the Islamic era. Geographically speaking, “Egyptian” rock art is known from many hundreds of sites along the margins of the Upper Egyptian and Nubian Nile Valley and...

  5. Rock blocks

    OpenAIRE

    Turner, W.

    2007-01-01

    Consider representation theory associated to symmetric groups, or to Hecke algebras in type A, or to q-Schur algebras, or to finite general linear groups in non-describing characteristic. Rock blocks are certain combinatorially defined blocks appearing in such a representation theory, first observed by R. Rouquier. Rock blocks are much more symmetric than general blocks, and every block is derived equivalent to a Rock block. Motivated by a theorem of J. Chuang and R. Kessar in the case of sym...

  6. CERN Rocks

    CERN Multimedia

    2004-01-01

    The 15th CERN Hardronic Festival took place on 17 July on the terrace of Rest 3 (Prévessin). Over 1000 people, from CERN and other International Organizations, came to enjoy the warm summer night, and to watch the best of the World's High Energy music. Jazz, rock, pop, country, metal, blues, funk and punk blasted out from 9 bands from the CERN Musiclub and Jazz club, alternating on two stages in a non-stop show.  The night reached its hottest point when The Canettes Blues Band got everybody dancing to sixties R&B tunes (pictured). Meanwhile, the bars and food vans were working at full capacity, under the expert management of the CERN Softball club, who were at the same time running a Softball tournament in the adjacent "Higgs Field". The Hardronic Festival is the main yearly CERN music event, and it is organized with the support of the Staff Association and the CERN Administration.

  7. The Big Group of People Looking at How to Control Putting the Parts of the Air That Are the Same as What You Breathe Out Into Small Spaces in Rocks

    Energy Technology Data Exchange (ETDEWEB)

    Stack, Andrew

    2013-07-18

    Representing the Nanoscale Control of Geologic CO2 (NCGC), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE energy. The mission of NCGC is to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to the injection and storage of carbon dioxide (CO2) in subsurface reservoirs.

  8. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  9. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  10. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension that involves in a technological and business approach the aspects of big data management.

  11. Reactor Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ait Abderrahim, A

    2001-04-01

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised.

  12. Reactor safeguards

    CERN Document Server

    Russell, Charles R

    1962-01-01

    Reactor Safeguards provides information for all who are interested in the subject of reactor safeguards. Much of the material is descriptive although some sections are written for the engineer or physicist directly concerned with hazards analysis or site selection problems. The book opens with an introductory chapter on radiation hazards, the construction of nuclear reactors, safety issues, and the operation of nuclear reactors. This is followed by separate chapters that discuss radioactive materials, reactor kinetics, control and safety systems, containment, safety features for water reactor

  13. Reactor operation

    CERN Document Server

    Shaw, J

    2013-01-01

    Reactor Operation covers the theoretical aspects and design information of nuclear reactors. This book is composed of nine chapters that also consider reactor control, calibration, and experimentation. The opening chapters present the general problems of reactor operation and the principles of reactor control and operation. The succeeding chapters deal with the instrumentation, start-up, pre-commissioning, and physical experiments of nuclear reactors. The remaining chapters are devoted to control rod calibrations and temperature coefficient measurements in the reactor. These chapters also exp

  14. Art Rocks with Rock Art!

    Science.gov (United States)

    Bickett, Marianne

    2011-01-01

    This article discusses rock art which was the very first "art." Rock art, such as the images created on the stone surfaces of the caves of Lascaux and Altimira, is the true origin of the canvas, paintbrush, and painting media. For there, within caverns deep in the earth, the first artists mixed animal fat, urine, and saliva with powdered minerals…

  15. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  18. STUDY ON USING THE FIRST INFLECTION POINT METHOD IN DETERMINING THE LONG-TERM STRENGTH OF THE ROCK MASS DISCONTINUITIES

    Institute of Scientific and Technical Information of China (English)

    马君; 沈明荣; 谌洪菊

    2012-01-01

    Uniaxial compressive strength tests on standard cube specimens, direct shear tests, and shear creep tests under different normal stress levels were conducted on a rock biaxial rheological testing machine. On the basis of the test data, the first inflection point method is used to determine the long-term strength of rock mass discontinuities with different roughness under the corresponding normal stresses, and the long-term strength is compared with the instantaneous strength.
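
One natural numerical reading of the "first inflection point" of a creep curve is the first sign change of the second derivative of displacement with respect to time; the abstract does not spell out the paper's exact procedure, so the sketch below is only that reading applied to a synthetic curve:

```python
import numpy as np

def first_inflection_index(t, u):
    """Index of the first inflection point of a creep curve u(t), taken as
    the first sign change of the numerical second derivative. Assumes t is
    increasing and u has been lightly smoothed beforehand."""
    d2u = np.gradient(np.gradient(u, t), t)
    signs = np.sign(d2u)
    for i in range(1, len(signs)):
        if signs[i] != 0 and signs[i] != signs[i - 1]:
            return i
    return None  # e.g. purely decelerating (primary-stage-only) creep

# Synthetic creep curve: decelerating primary stage, then acceleration.
t = np.linspace(0.0, 10.0, 200)
u = 1.0 - np.exp(-t) + 0.002 * t**3
i = first_inflection_index(t, u)
print(f"first inflection near t = {t[i]:.2f}" if i is not None else "none")
```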

  19. Reactor Neutrinos

    OpenAIRE

    Soo-Bong Kim; Thierry Lasserre; Yifang Wang

    2013-01-01

    We review the status and results of reactor neutrino experiments. Short-baseline experiments have provided measurements of the reactor neutrino spectrum, and interest in them has recently been revived by the discovery of the reactor antineutrino anomaly, a discrepancy between the state-of-the-art prediction of the reactor neutrino flux and the measurements at baselines shorter than one kilometer. Middle- and long-baseline oscillation experiments at Daya Bay, Double Chooz, and RENO provided very ...

  20. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply nonexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  1. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on identifying patterns in the data rather than trying to understand the underlying causes in more detail. It highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  2. Damage Model of Brittle Coal-Rock and Damage Energy Index of Rock Burst

    Institute of Scientific and Technical Information of China (English)

    尹光志; 张东明; 魏作安; 李东伟

    2003-01-01

    Based on mechanical experiments on brittle coal-rock and damage mechanics theory, a damage model was established and the damage mechanics characteristics of coal-rock were investigated. Furthermore, the internal energy transformation mechanism of rock was analyzed from the point of view of damage mechanics, and the damage energy release rate of brittle coal-rock was derived. By analyzing the energy transformation of rock burst, a new concept, the damage energy index of rock burst, is put forward, and the condition for rock burst is established.

  3. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of systematic registration, and IT-competent employees and customers, which make a leading position possible, but only if companies make themselves ready for the next big data wave.

  4. Big Boss Interval Games

    NARCIS (Netherlands)

    Alparslan-Gok, S.Z.; Brânzei, R.; Tijs, S.H.

    2008-01-01

    In this paper big boss interval games are introduced and various characterizations are given. The structure of the core of a big boss interval game is explicitly described and plays an important role relative to interval-type bi-monotonic allocation schemes for such games. Specifically, each element

  5. Big Ideas in Art

    Science.gov (United States)

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  6. Progress of Research on Demonstration Fast Reactor Main Pipe Material

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The main characteristics of the sodium piping system in a demonstration fast reactor are high temperature, thin walls and large diameter, in contrast to the high-pressure, thick-walled piping of a pressurized water reactor system, and the system is long-term

  7. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  8. 'Wopmay' Rock

    Science.gov (United States)

    2004-01-01

    This approximate true-color image taken by NASA's Mars Exploration Rover Opportunity shows an unusual, lumpy rock informally named 'Wopmay' on the lower slopes of 'Endurance Crater.' The rock was named after the Canadian bush pilot Wilfrid Reid 'Wop' May. Like 'Escher' and other rocks dotting the bottom of Endurance, scientists believe the lumps in Wopmay may be related to cracking and alteration processes, possibly caused by exposure to water. The area between intersecting sets of cracks eroded in a way that created the lumpy appearance. Rover team members plan to drive Opportunity over to Wopmay for a closer look in coming sols. This image was taken by the rover's panoramic camera on sol 248 (Oct. 4, 2004), using its 750-, 530- and 480-nanometer filters.

  9. Crisis analytics : big data-driven crisis response

    NARCIS (Netherlands)

    Qadir, Junaid; ur Rasool, Raihan; Zwitter, Andrej; Sathiaseelan, Arjuna; Crowcroft, Jon

    2016-01-01

    Disasters have long been a scourge for humanity. With the advances in technology (in terms of computing, communications, and the ability to process and analyze big data), our ability to respond to disasters is at an inflection point. There is great optimism that big data tools can be leveraged to p

  10. Source rock

    OpenAIRE

    Abubakr F. Makky; Mohamed I. El Sayed; Ahmed S. Abu El-Ata; Ibrahim M. Abd El-Gaied; Mohamed I. Abdel-Fattah; Zakaria M. Abd-Allah

    2014-01-01

    West Beni Suef Concession is located in the western part of the Beni Suef Basin, a relatively under-explored basin lying about 150 km south of Cairo. The major goal of this study is to evaluate the source rock using different techniques, such as Rock-Eval pyrolysis, vitrinite reflectance (%Ro), and well log data of some Cretaceous sequences including the Abu Roash (E, F and G members), Kharita and Betty formations. The BasinMod 1D program is used in this study to construct the burial history ...

  11. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds manageable bounds. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed over optical fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  12. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  13. Research present situation and analysis on classification of rock drillability

    Institute of Scientific and Technical Information of China (English)

    Zhang Zhi-hong; MA Qin-yong

    2001-01-01

    Rock drillability reflects how easily or with what difficulty a drill bit fragments rock. At present, rock drillability classification indexes include uniaxial compressive strength, point load strength, fracture stress during chiseling, drilling speed, chiseling specific work, acoustic parameters, cutting magnitude, and so on. Each index reflects rock drillability, but none is comprehensive on its own. It is therefore feasible to combine multiple indexes, for example by means of fuzzy mathematics, to evaluate rock drillability.
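
The closing suggestion, combining several indexes through fuzzy mathematics, can be sketched as a weighted fuzzy comprehensive evaluation. The membership functions, index ranges and weights below are illustrative placeholders, not values from the drillability literature:

```python
import numpy as np

def memberships(value, lo, hi):
    """Triangular memberships of `value` in [easy, medium, hard] over the
    index range [lo, hi]. Purely illustrative shapes."""
    x = float(np.clip((value - lo) / (hi - lo), 0.0, 1.0))
    easy = max(0.0, 1.0 - 2.0 * x)
    hard = max(0.0, 2.0 * x - 1.0)
    return np.array([easy, 1.0 - easy - hard, hard])

# (value, lo, hi) for three indexes: uniaxial compressive strength (MPa),
# point load strength (MPa), chiseling specific work (J/cm^3).
indexes = [(120.0, 0.0, 250.0), (6.0, 0.0, 12.0), (300.0, 0.0, 600.0)]
weights = np.array([0.5, 0.3, 0.2])  # assumed relative importance

R = np.vstack([memberships(v, lo, hi) for v, lo, hi in indexes])
grade = weights @ R  # fuzzy comprehensive evaluation B = W . R
labels = ["easy", "medium", "hard"]
print(dict(zip(labels, np.round(grade, 3))))
print("overall drillability:", labels[int(np.argmax(grade))])
```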

  14. Ayers Rock

    Institute of Scientific and Technical Information of China (English)

    王慧茹

    2002-01-01

    Ayers Rock is right in the centre of Australia. It's nearly two thousand kilometres______Sydney. So we flew most of the way. It was rather cloudy______But after we left the mountains behind us, there was hardly a cloud in the sky.

  15. Intellektuaalne rock

    Index Scriptorium Estoniae

    2007-01-01

    British singer-songwriter and actress Toyah Willcox, together with Bill Rieflin of R.E.M. and Pat Mastelotto of King Crimson, will perform with the bands The Humans and Tuner on 25 October at Rock Café in Tallinn and on 27 October at St. John's Church (Jaani kirik) in Tartu.

  16. Rock Paintings.

    Science.gov (United States)

    Jones, Julienne Edwards

    1998-01-01

    Discusses the integration of art and academics in a fifth-grade instructional unit on Native American culture. Describes how students studied Native American pictographs, designed their own pictographs, made their own tools, and created rock paintings of their pictographs using these tools. Provides a list of references on Native American…

  17. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  18. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. This growth creates a situation in which classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously, much of it created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  19. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

    Big data is a phenomenon that our society can no longer do without. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's mean that are so often mentioned in relation to big data? By way of introduction to this

  20. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  1. Inhomogeneous Big Bang Cosmology

    CERN Document Server

    Wagh, S M

    2002-01-01

    In this letter, we outline an inhomogeneous model of the Big Bang cosmology. For the inhomogeneous spacetime used here, the universe originates in the infinite past as the one dominated by vacuum energy and ends in the infinite future as the one consisting of "hot and relativistic" matter. The spatial distribution of matter in the considered inhomogeneous spacetime is arbitrary. Hence, observed structures can arise in this cosmology from suitable "initial" density contrast. Different problems of the standard model of Big Bang cosmology are also resolved in the present inhomogeneous model. This inhomogeneous model of the Big Bang Cosmology predicts "hot death" for the universe.

  2. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  3. Moon base reactor system

    Science.gov (United States)

    Chavez, H.; Flores, J.; Nguyen, M.; Carsen, K.

    1989-01-01

    The objective of our reactor design is to supply a lunar-based research facility with 20 MW(e). The fundamental layout of this lunar-based system includes the reactor, power conversion devices, and a radiator. The additional aim of this reactor is a longevity of 12 to 15 years. The reactor is a liquid metal fast breeder that has a breeding ratio very close to 1.0. The geometry of the core is cylindrical. The metallic fuel rods are of beryllium oxide enriched with varying degrees of uranium, with a beryllium core reflector. The liquid metal coolant chosen was natural lithium. After the liquid metal coolant leaves the reactor, it goes directly into the power conversion devices. The power conversion devices are Stirling engines. The heated coolant acts as a hot reservoir to the device. It then enters the radiator to be cooled and reenters the Stirling engine acting as a cold reservoir. The engines' operating fluid is helium, a highly conductive gas. These Stirling engines are hermetically sealed. Although natural lithium produces a lower breeding ratio, it does have a larger temperature range than sodium. It is also corrosive to steel. This is why the container material must be carefully chosen. One option is to use an expensive alloy of cerbium and zirconium. The radiator must be made of a highly conductive material whose melting point temperature is not exceeded in the reactor and whose structural strength can withstand meteor showers.

  4. Application and challenges of big data in quality monitoring of highway engineering

    Science.gov (United States)

    Xiao, Xianglin; Zhou, Chunrong

    2017-03-01

    The generation of big data brings both opportunities and challenges to quality monitoring technologies in highway engineering. Big data from highway engineering quality monitoring exhibits the typical "4V" characteristics. In order to analyze in depth the application of big data in quality monitoring of highway engineering, the paper discusses the generation and processing workflows, key technologies, and other aspects of highway engineering quality monitoring big data. The paper analyzes the storage structure, computing processes, and visualization workflows of these data, and points out the problems and challenges encountered in applying big data to quality monitoring of highway engineering.

  5. Helias reactor studies

    Energy Technology Data Exchange (ETDEWEB)

    Beidler, C.D. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Grieger, G. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Harmeyer, E. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Kisslinger, J. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Karulin, N. [Nuclear Fusion Institute, Moscow (Russian Federation); Maurer, W. [Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany); Nuehrenberg, J. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Rau, F. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Sapper, J. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Wobig, H. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany)

    1995-10-01

    The present status of Helias reactor studies is characterised by the identification and investigation of specific issues which result from the particular properties of this type of stellarator. On the technical side these are issues related to the coil system, while physics studies have concentrated on confinement, alpha-particle behaviour and ignition conditions. The usual assumptions have been made in those fields which are common to all toroidal fusion reactors: blanket and shield, refuelling and exhaust, safety and economic aspects. For the blanket and shield, sufficient space has been provided; a detailed concept will be developed in future. To date more emphasis has been placed on scoping and parameter studies as opposed to fixing a specific set of parameters and providing a detailed point study. One result of the Helias reactor studies is that physical dimensions are on the same order as those of tokamak reactors. However, it should be noted that this comparison is difficult in view of the large spectrum of tokamak reactors, ranging from a small reactor like Aries to a large device such as SEAFP. The notion that the large aspect ratio of 10 or more in Helias configurations also leads to large reactors is misleading, since the large major radius of 22 m is compensated by the average plasma radius of 1.8 m and the average coil radius of 5 m. The plasma volume of 1400 m³ is about the same as that of the ITER reactor, and the magnetic energy of the coil system is about the same or even slightly smaller than envisaged in ITER. (orig.)

  6. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  7. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  8. Multifunctional reactors

    NARCIS (Netherlands)

    Westerterp, K.R.

    1992-01-01

    Multifunctional reactors are single pieces of equipment in which, besides the reaction, other functions are carried out simultaneously. The other functions can be a heat, mass or momentum transfer operation and even another reaction. Multifunctional reactors are not new, but they have received much

  9. "Big Data": Big Knowledge Gaps in the Field of Internet Science

    Directory of Open Access Journals (Sweden)

    Ulf-Dietrich Reips

    2012-01-01

    Full Text Available Research on so-called ‘Big Data’ has received a considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as ‘small world’ properties. Much less is known about underlying micro-processes leading to these properties. The models used by Big Data researchers usually are inspired by mathematical ease of exposition. We propose to follow in addition a different strategy that leads to knowledge about micro-processes that match with actual online behavior. This knowledge can then be used for the selection of mathematically-tractable models of online network formation and evolution. Insight from social and behavioral research is needed for pursuing this strategy of knowledge generation about micro-processes. Accordingly, our proposal points to a unique role that social scientists could play in Big Data research. ...

  10. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two

  11. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  12. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects … shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  13. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore we compare and contrast the two geometries throughout.

  14. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, consisting mostly of unstructured data such as emails, blogs, Twitter and Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available on the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what big data is, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.
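
    As a toy illustration of the cluster-style analysis described above, the following sketch counts word frequencies over shards of text in parallel, mimicking on a single machine the map/reduce pattern that clusters of inexpensive computers apply at scale; the shard contents and pool size are placeholders, not data from the article.

    ```python
    from collections import Counter
    from multiprocessing import Pool

    def map_count(shard: str) -> Counter:
        # "Map" step: count words within one shard of the collection.
        return Counter(shard.lower().split())

    def reduce_counts(partials) -> Counter:
        # "Reduce" step: merge per-shard counts into one frequency table.
        total = Counter()
        for c in partials:
            total.update(c)
        return total

    if __name__ == "__main__":
        # Placeholder shards standing in for data spread across a cluster.
        shards = [
            "big data needs big theory",
            "big data analytics at scale",
            "patterns hide in big data",
        ]
        with Pool(processes=2) as pool:
            partials = pool.map(map_count, shards)
        print(reduce_counts(partials).most_common(3))
    ```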

  15. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  16. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; and finally, it sought to identify the most relevant characteristics in the management of Big Data, so as to cover everything concerning the central topic of the research. The methodology included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; presenting some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and showing data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the field of Big Data.

  17. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank," was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  18. ANALYTICS OF BIG DATA

    OpenAIRE

    Asst. Prof. Shubhada Talegaon

    2014-01-01

    Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, senti...

  19. Big data need big theory too

    OpenAIRE

    Coveney, Peter V.; Dougherty, Edward R; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  20. Big data need big theory too.

    OpenAIRE

    Coveney, P. V.; Dougherty, E. R.; Highfield, R. R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  1. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell.

  2. Reactor vessel

    OpenAIRE

    Makkee, M.; Kapteijn, F.; Moulijn, J.A

    1999-01-01

    A reactor vessel (1) comprises a reactor body (2) through which channels (3) are provided whose surface comprises longitudinal inwardly directed parts (4) and is provided with a catalyst (6), as well as buffer bodies (8, 12) connected to the channels (3) on both sides of the reactor body (2) and comprising connections for supplying (9, 10, 11) and discharging (13, 14, 15) via the channels (3) gases and/or liquids entering into a reaction with each other and substances formed upon this reactio...

  3. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework; (referred to below as the Advanced Concepts component of the Phase I efforts) and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volume of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal productions as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. Overall every sedimentary formation investigated

  4. NUCLEAR REACTOR

    Science.gov (United States)

    Miller, H.I.; Smith, R.C.

    1958-01-21

    This patent relates to nuclear reactors of the type which use a liquid fuel, such as a solution of uranyl sulfate in ordinary water which acts as the moderator. The reactor is comprised of a spherical vessel having a diameter of about 12 inches substantially surrounded by a reflector of beryllium oxide. Conventional control rods and safety rods are operated in slots in the reflector outside the vessel to control the operation of the reactor. An additional means for increasing the safety factor of the reactor by raising the ratio of delayed neutrons to prompt neutrons is provided, and consists of a soluble sulfate salt of beryllium dissolved in the liquid fuel in the proper proportion to obtain the result desired.

  5. Reactor Neutrinos

    Directory of Open Access Journals (Sweden)

    Soo-Bong Kim

    2013-01-01

    Full Text Available We review the status and the results of reactor neutrino experiments. Short-baseline experiments have provided the measurement of the reactor neutrino spectrum, and interest in them has recently been revived by the discovery of the reactor antineutrino anomaly, a discrepancy between the state-of-the-art prediction of the reactor neutrino flux and the measurements at baselines shorter than one kilometer. Middle- and long-baseline oscillation experiments at Daya Bay, Double Chooz, and RENO very recently provided the most precise determination of the neutrino mixing angle θ13. This paper provides an overview of the upcoming experiments and of the projects under development, including the determination of the neutrino mass hierarchy and the possible use of neutrinos for society, for nonproliferation of nuclear materials, and for geophysics.

  6. Chemical Reactors.

    Science.gov (United States)

    Kenney, C. N.

    1980-01-01

    Describes a course, including content, reading list, and presentation on chemical reactors at Cambridge University, England. A brief comparison of chemical engineering education between the United States and England is also given. (JN)

  7. Reactor Neutrinos

    OpenAIRE

    Lasserre, T.; Sobel, H.W.

    2005-01-01

    We review the status and the results of reactor neutrino experiments, that toe the cutting edge of neutrino research. Short baseline experiments have provided the measurement of the reactor neutrino spectrum, and are still searching for important phenomena such as the neutrino magnetic moment. They could open the door to the measurement of coherent neutrino scattering in a near future. Middle and long baseline oscillation experiments at Chooz and KamLAND have played a relevant role in neutrin...

  8. Favorability for uranium in tertiary sedimentary rocks, southwestern Montana

    Energy Technology Data Exchange (ETDEWEB)

    Wopat, M A; Curry, W E; Robins, J W; Marjaniemi, D K

    1977-10-01

    Tertiary sedimentary rocks in the basins of southwestern Montana were studied to determine their favorability for potential uranium resources. Uranium in the Tertiary sedimentary rocks was probably derived from the Boulder batholith and from silicic volcanic material. The batholith contains numerous uranium occurrences and is the most favorable plutonic source for uranium in the study area. Subjective favorability categories of good, moderate, and poor, based on the number and type of favorable criteria present, were used to classify the rock sequences studied. Rocks judged to have good favorability for uranium deposits are (1) Eocene and Oligocene strata and undifferentiated Tertiary rocks in the western Three Forks basin and (2) Oligocene rocks in the Helena basin. Rocks having moderate favorability consist of (1) Eocene and Oligocene strata in the Jefferson River, Beaverhead River, and lower Ruby River basins, (2) Oligocene rocks in the Townsend and Clarkston basins, (3) Miocene and Pliocene rocks in the Upper Ruby River basin, and (4) all Tertiary sedimentary formations in the eastern Three Forks basin, and in the Grasshopper Creek, Horse Prairie, Medicine Lodge Creek, Big Sheep Creek, Deer Lodge, Big Hole River, and Bull Creek basins. The following have poor favorability: (1) the Beaverhead Conglomerate in the Red Rock and Centennial basins, (2) Eocene and Oligocene rocks in the Upper Ruby River basin, (3) Miocene and Pliocene rocks in the Townsend, Clarkston, Smith River, and Divide Creek basins, (4) Miocene through Pleistocene rocks in the Jefferson River, Beaverhead River, and Lower Ruby River basins, and (5) all Tertiary sedimentary rocks in the Boulder River, Sage Creek, Muddy Creek, Madison River, Flint Creek, Gold Creek, and Bitterroot basins.

  9. Focus : big data, little questions?

    OpenAIRE

    Uprichard, Emma

    2013-01-01

    Big data. Little data. Deep data. Surface data. Noisy, unstructured data. Big. The world of data has gone from being analogue and digital, qualitative and quantitative, transactional and a by-product, to, simply, BIG. It is as if we couldn’t quite deal with its omnipotence and just ran out of adjectives. BIG. With all the data power it is supposedly meant to entail, one might have thought that a slightly better descriptive term might have been latched onto. But, no. BIG. Just BIG.

  10. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only … framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  11. Big and Small

    CERN Document Server

    Ekers, R D

    2010-01-01

    Technology leads discovery in astronomy, as in all other areas of science, so growth in technology leads to the continual stream of new discoveries which makes our field so fascinating. Derek de Solla Price had analysed the discovery process in science in the 1960s and he introduced the terms 'Little Science' and 'Big Science' as part of his discussion of the role of exponential growth in science. I will show how the development of astronomical facilities has followed this same trend from 'Little Science' to 'Big Science' as a field matures. We can see this in the discoveries resulting in Nobel Prizes in astronomy. A more detailed analysis of discoveries in radio astronomy shows the same effect. I include a digression to look at how science progresses, comparing the roles of prediction, serendipity, measurement and explanation. Finally I comment on the differences between the 'Big Science' culture in Physics and in Astronomy.

  12. Water resources in the Big Lost River Basin, south-central Idaho

    Science.gov (United States)

    Crosthwaite, E.G.; Thomas, C.A.; Dyer, K.L.

    1970-01-01

    The Big Lost River basin occupies about 1,400 square miles in south-central Idaho and drains to the Snake River Plain. The economy in the area is based on irrigation agriculture and stockraising. The basin is underlain by a diverse assemblage of rocks which range in age from Precambrian to Holocene. The assemblage is divided into five groups on the basis of their hydrologic characteristics: carbonate rocks, noncarbonate rocks, cemented alluvial deposits, unconsolidated alluvial deposits, and basalt. The principal aquifer is unconsolidated alluvial fill that is several thousand feet thick in the main valley. The carbonate rocks are the major bedrock aquifer. They absorb a significant amount of precipitation and, in places, are very permeable as evidenced by large springs discharging from or near exposures of carbonate rocks. Only the alluvium, carbonate rock and locally the basalt yield significant amounts of water. A total of about 67,000 acres is irrigated with water diverted from the Big Lost River. The annual flow of the river is highly variable and water-supply deficiencies are common. About 1 out of every 2 years is considered a drought year. In the period 1955-68, about 175 irrigation wells were drilled to provide a supplemental water supply to land irrigated from the canal system and to irrigate an additional 8,500 acres of new land. Average annual precipitation ranged from 8 inches on the valley floor to about 50 inches at some higher elevations during the base period 1944-68. The estimated water yield of the Big Lost River basin averaged 650 cfs (cubic feet per second) for the base period. Of this amount, 150 cfs was transpired by crops, 75 cfs left the basin as streamflow, and 425 cfs left as ground-water flow. A map of precipitation and estimated values of evapotranspiration were used to construct a water-yield map. A distinctive feature of the Big Lost River basin is the large interchange of water from surface streams into the ground and from the
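
    The abstract's water budget can be checked with simple bookkeeping. The short sketch below restates the reported base-period averages and confirms that the three outflow components sum to the estimated yield; it adds no information beyond the numbers quoted above.

    ```python
    # Reported base-period (1944-68) averages for the Big Lost River basin,
    # in cubic feet per second (cfs), as quoted in the abstract.
    water_yield = 650   # estimated average water yield of the basin
    transpired = 150    # transpired by crops
    streamflow = 75     # left the basin as streamflow
    groundwater = 425   # left the basin as ground-water flow

    # The components should account for the full yield: 150 + 75 + 425 = 650.
    assert transpired + streamflow + groundwater == water_yield
    print("water budget balances at", water_yield, "cfs")
    ```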

  13. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  14. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their super computer, and to the Large Hadron Collider created by Éric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is being hatched. Worse, all of scientific research is in peril! Swept into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A gripping plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and today's greatest scientists.

  15. [Utilization of Big Data in Medicine and Future Outlook].

    Science.gov (United States)

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.

  16. Small Punch Test on Before and Post Irradiated Domestic Reactor Pressure Steel

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Problems may be caused when standard specimens are applied to study the properties of irradiated reactor materials, because of their large dimensions, e.g.: the inner temperature gradient of the specimen is high when irradiated, the radiation

  17. Solving point reactor neutron kinetic equations by using the neutron generation time method

    Institute of Scientific and Technical Information of China (English)

    蔡光明; 阮良成

    2012-01-01

    As the point reactor neutron kinetic equations are stiff equations, it is difficult to solve them rapidly with adequate accuracy and stability. Owing to advances in modern computer technology, the equations can be solved directly by means of the neutron generation time method, and a calculation code was written in C++. Benchmark problems and kinetic versus inverse-kinetic comparison calculations verified that the model and the code are accurate and stable, and the computing time is acceptable.
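
    In standard notation, the point reactor kinetics equations with one effective delayed-neutron group read dn/dt = ((ρ − β)/Λ)n + λC and dC/dt = (β/Λ)n − λC, where Λ is the neutron generation time. The record's C++ code is not reproduced here; the Python sketch below is only a minimal explicit time-stepping of these stiff equations with a step on the order of Λ, using illustrative constants rather than the paper's values.

    ```python
    # One-delayed-group point kinetics, stepped explicitly with dt ~ Lambda.
    beta = 0.0065      # delayed-neutron fraction (illustrative)
    lam = 0.08         # delayed-neutron decay constant, 1/s (illustrative)
    Lambda = 1.0e-4    # neutron generation time, s (illustrative)
    rho = 0.001        # step reactivity insertion (below prompt critical)

    n = 1.0                        # relative power, steady state
    C = beta * n / (lam * Lambda)  # precursor density with dn/dt = 0

    dt = Lambda        # stepping on the generation-time scale keeps it stable
    t = 0.0
    while t < 1.0:
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dt * dn
        C += dt * dC
        t += dt

    print(f"relative power after {t:.2f} s: {n:.4f}")
    ```

    For this sub-prompt-critical insertion the power first jumps to roughly β/(β − ρ) ≈ 1.18 and then rises slowly, which is a convenient sanity check on any such solver.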

  18. NEW THEORY IN TUNNEL STABILITY CONTROL OF SOFT ROCK -- MECHANICS OF SOFT ROCK ENGINEERING

    Institute of Scientific and Technical Information of China (English)

    何满朝

    1996-01-01

    Tunnel stability control is a difficult problem worldwide. In order to solve it, the new theory of soft rock engineering mechanics has been established. Some key points, such as the definition and classification of soft rock, the mechanical deformation mechanism of a soft rock tunnel, the critical support technique for soft rock tunnels, and the new theory of soft rock tunnel stability control, are proposed in this paper.

  19. Measuring Public Acceptance of Nuclear Technology with Big data

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seugkook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results as representing the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying the cheap electricity they produce, and of a person living in the proximity of a nuclear power plant, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., saving the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which enables targeted marketing when policy is executed.

  1. Big Data and Cycling

    NARCIS (Netherlands)

    Romanillos, Gustavo; Zaltz Austwick, Martin; Ettema, Dick; De Kruijf, Joost

    2016-01-01

    Big Data has begun to create significant impacts in urban and transport planning. This paper covers the explosion in data-driven research on cycling, most of which has occurred in the last ten years. We review the techniques, objectives and findings of a growing number of studies we have classified

  2. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  3. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  4. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  5. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  6. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  7. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  8. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  9. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our Combined resource includes all necessary areas of Space for grades five to eight. Get the big picture about the Solar System, Galaxies and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, Comets, Asteroids Meteoroids, Stars and Constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  10. The BIG Data Center: from deposition to integration to translation.

    Science.gov (United States)

    2017-01-04

    Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn.

  11. The BIG Data Center: from deposition to integration to translation

    Science.gov (United States)

    2017-01-01

    Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn. PMID:27899658

  12. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, the wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent the diversity of big data analytics workloads? Big data dwarfs are abstractions of frequently appearing operations in big data computing. One dwarf represen...

  13. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  14. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  15. Globalisation, big business and the Blair government

    OpenAIRE

    Grant, Wyn

    2000-01-01

    After reviewing definitions of globalisation, this paper suggests that the 'company state' model is becoming increasingly important in business-government relations. It is argued that Prime Minister Blair has a particular construction of globalisation which fits in well with the agenda of big international business. However, increasing tensions have arisen in the relationship between New Labour and business, reaching crisis point in May 2000. The paper concludes by suggesting that Burnham's de...

  16. Model "Big Five" personality and criminal behavior

    OpenAIRE

    Sánchez-Teruel, David; Profesor, Departamento de Psicología-Área de Psicología Social, Facultad de Humanidades y Ciencias de la Educación, España.; Robles-Bello, Mª Auxiliadora; Profesor, Departamento de Psicología-Área de Psicología Social, Facultad de Humanidades y Ciencias de la Educación, España.

    2013-01-01

    It reflects on the theoretical issues currently debated in personality psychology in general and in antisocial or criminal behavior in particular. It discusses how the "Big Five" personality model can be applied to the field of crime, and shows the variables that the literature presents as most predictive, through one of the most widely used assessment instruments at present. It currently advises finding meeting points between the various existing theories, so that personality does not be...

  17. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  18. Sonochemical Reactors.

    Science.gov (United States)

    Gogate, Parag R; Patil, Pankaj N

    2016-10-01

    Sonochemical reactors are based on the generation of cavitational events using ultrasound and offer immense potential for the intensification of physical and chemical processing applications. The present work presents a critical analysis of the underlying mechanisms for intensification, available reactor configurations and overview of the different applications exploited successfully, though mostly at laboratory scales. Guidelines have also been presented for optimum selection of the important operating parameters (frequency and intensity of irradiation, temperature and liquid physicochemical properties) as well as the geometric parameters (type of reactor configuration and the number/position of the transducers) so as to maximize the process intensification benefits. The key areas for future work so as to transform the successful technique at laboratory/pilot scale into commercial technology have also been discussed. Overall, it has been established that there is immense potential for sonochemical reactors for process intensification leading to greener processing and economic benefits. Combined efforts from a wide range of disciplines such as material science, physics, chemistry and chemical engineers are required to harness the benefits at commercial scale operation.

  19. Institute for Rock Magnetism established

    Science.gov (United States)

    Banerjee, Subir K.

    There is a new focal point for cooperative research in advanced rock magnetism. The University of Minnesota in Minneapolis has established an Institute for Rock Magnetism (IRM) that will provide free access to modern equipment and encourage visiting fellows to focus on important topics in rock magnetism and related interdisciplinary research. Funding for the first three years has been secured from the National Science Foundation, the W.M. Keck Foundation, and the University of Minnesota.In the fall of 1986, the Geomagnetism and Paleomagnetism (GP) section of the AGU held a workshop at Asilomar, Calif., to pinpoint important and emerging research areas in paleomagnetism and rock magnetism, and the means by which to achieve them. In a report of this workshop published by the AGU in September 1987, two urgent needs were set forth. The first was for interdisciplinary research involving rock magnetism, and mineralogy, petrology, sedimentology, and the like. The second need was to ease the access of rock magnetists and paleomagnetists around the country to the latest equipment in modern magnetics technology, such as magneto-optics or electronoptics. Three years after the publication of the report, we announced the opening of these facilities at the GP section of the AGU Fall 1990 Meeting. A classified advertisement inviting applications for visiting fellowships was published in the January 22, 1991, issue of Eos.

  20. Urban Big Data and the Development of City Intelligence

    Directory of Open Access Journals (Sweden)

    Yunhe Pan

    2016-06-01

    Full Text Available This study provides a definition for urban big data while exploring its features and its applications in China's city intelligence. The differences between city intelligence in China and the “smart city” concept in other countries are compared to highlight and contrast the unique definition and model of China's city intelligence in this paper. Furthermore, this paper examines the role of urban big data in city intelligence by showing that it not only serves as the cornerstone of this trend but also plays a core role in the diffusion of city intelligence technology and serves as an inexhaustible resource for the sustained development of city intelligence. This study also points out the challenges of shaping and developing China's urban big data. Considering the supporting and core role that urban big data plays in city intelligence, the study then expounds on the key points of urban big data, including infrastructure support, urban governance, public services, and economic and industrial development. Finally, this study points out the utility of city intelligence as an ideal policy tool for advancing the goals of China's urban development. In conclusion, it is imperative that China make full use of its unique advantages—including the nation's current state of development and resources, geographical advantages, and good human relations—to promote the development of city intelligence through the proper application of urban big data.

  1. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  2. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using traditional data processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, prevent diseases, combat crime, and so on, we require larger data sets than before. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we give an overview of the Hadoop architecture, the different tools used for big data, and its security issues.
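
    The "massively parallel software" the survey points to is classically expressed as MapReduce on Hadoop. The sketch below simulates the two phases in a single process purely for illustration; it is not taken from the paper, and a real Hadoop Streaming job would run the phases on separate worker nodes:

        # Word count in MapReduce style (in-process simulation of the Hadoop pattern).
        from collections import defaultdict

        def map_phase(documents):
            # Emit (word, 1) pairs, as a mapper would.
            for doc in documents:
                for word in doc.split():
                    yield word.lower(), 1

        def reduce_phase(pairs):
            # Group by key and sum, as a reducer would after the shuffle.
            counts = defaultdict(int)
            for word, n in pairs:
                counts[word] += n
            return dict(counts)

        docs = ["big data needs parallel processing", "big data is big"]
        print(reduce_phase(map_phase(docs)))  # {'big': 3, 'data': 2, ...}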

  3. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of the international development agenda to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development policies, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  4. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  5. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Prof. Shubhada Talegaon

    2015-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from big amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis, and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to survey various techniques for analyzing big data.

  6. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  7. DARPA's Big Mechanism program.

    Science.gov (United States)

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  8. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  9. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions before moving from the basics to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday and narrative. Volume 8 gives an accessible account of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  10. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions before moving from the basics to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  11. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions before moving from the basics to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday and narrative. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  12. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions before moving from the basics to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday and narrative. Volume 7 offers, besides an introduction, many current aspects of quantum mechanics (e.g. quantum teleportation) and electrodynamics (e.g. electrosmog), as well as the climate problem and chaos theory.

  13. Experimental Breeder Reactor I Preservation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Julie Braun

    2006-10-01

    Experimental Breeder Reactor I (EBR I) is a National Historic Landmark located at the Idaho National Laboratory, a Department of Energy laboratory in southeastern Idaho. The facility is significant for its association with and contributions to the development of nuclear reactor testing and development. This Plan includes a structural assessment of the interior and exterior of the EBR I Reactor Building from a preservation, rather than an engineering, standpoint, and recommendations for maintenance to ensure its continued protection.

  14. Big Data Knowledge Mining

    Directory of Open Access Journals (Sweden)

    Huda Umar Banuqitah

    2016-11-01

    Full Text Available The Big Data (BD) era has arrived. Big data applications have risen as information accumulation grows beyond the ability of present software tools to capture, manage, and process it within a tolerably short time. Volume is not the only characteristic that defines big data; so are velocity, variety, and value. Many resources contain BD that should be processed. The biomedical research literature is one among many domains that hides rich knowledge. MEDLINE is a huge biomedical research database which remains a significantly underutilized source of biological information. Discovering useful knowledge from such a huge corpus leads to many problems related to the type of information, such as the related concepts of the domain of the texts and the semantic relationships associated with them. In this paper, a two-level agent-based system for self-supervised relation extraction from MEDLINE using the Unified Medical Language System (UMLS) Knowledgebase is proposed. The model uses a self-supervised approach for relation extraction (RE) by constructing enhanced training examples using information from UMLS with hybrid text features. The model incorporates the Apache Spark and HBase big data technologies together with multiple data mining and machine learning techniques and a multi-agent system (MAS). The system shows a better result in comparison with the current state of the art and the naïve approach in terms of accuracy, precision, recall and F-score.

  15. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma.

  16. Teaching About Nature's Nuclear Reactors

    CERN Document Server

    Herndon, J M

    2005-01-01

    Naturally occurring nuclear reactors existed in uranium deposits on Earth long before Enrico Fermi built the first man-made nuclear reactor beneath Stagg Field in 1942. In the story of their discovery, there are important lessons to be learned about scientific inquiry and scientific discovery. Now, there is evidence to suggest that the Earth's magnetic field and Jupiter's atmospheric turbulence are driven by planetary-scale nuclear reactors. The subject of planetocentric nuclear fission reactors can be a jumping-off point for stimulating classroom discussions about the nature and implications of planetary energy sources and about the geomagnetic field. But more importantly, the subject can help to bring into focus the importance of discussing, debating, and challenging current thinking in a variety of areas.

  17. ESR dating of fault rocks

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Kwon [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2003-02-15

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from the surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs. grain size shows a plateau for grains below a critical size; these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected near the Gori nuclear reactor. Most of the ESR signals of fault rocks collected from the basement are saturated. This indicates that the last movement of the faults occurred before the Quaternary period. However, ESR dates from the Oyong fault zone range from 370 to 310 ka. The results of this research suggest that long-term cyclic fault activity of the Oyong fault zone continued into the Pleistocene.
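
    The dating relation described above reduces to a simple quotient of the equivalent dose over the dose rate. A minimal sketch with hypothetical values (the study's actual doses are not given in this abstract):

        # ESR age = equivalent dose / environmental dose rate (hypothetical values).
        def esr_age_ka(equivalent_dose_gy: float, dose_rate_gy_per_ka: float) -> float:
            """Return an ESR age in thousands of years (ka)."""
            return equivalent_dose_gy / dose_rate_gy_per_ka

        # Illustrative only: D_E = 1110 Gy at 3.0 Gy/ka gives 370 ka, the upper
        # end of the Oyong fault zone range quoted above.
        print(esr_age_ka(1110.0, 3.0))  # 370.0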

  18. Dynamic experimental study on rock meso-cracks growth by digital image processing technique

    Institute of Scientific and Technical Information of China (English)

    朱珍德; 倪骁慧; 王伟; 李双蓓; 赵杰; 武沂泉

    2008-01-01

    A new meso-mechanical testing scheme based on SEM was developed to observe the microfracturing process of rocks. Using this scheme, the microfracturing process under uniaxial compression of a pre-cracked marble sample from the surrounding rock of the submerged Long-big tunnel at the Jinping Cascade II Hydropower Station was recorded. Based on stereology theory, the propagation and coalescence of cracks at the meso-scale were quantitatively investigated with digital image technology, so that the basic geometric information of rock microcracks, such as area, angle, length, width, and perimeter, was obtained from binary images after segmentation. With this quantitative information, the failure mechanism of the specimen under uniaxial compression was studied from both macroscopic and microscopic points of view. The results show that the microfracturing process of the specimen can be observed and recorded digitally. During damage of the specimen, the distribution of microcracks remains exponential, with some microcracks concentrated in certain regions. Finally, the variation of the fractal dimension of local elements in the marble sample under different external load conditions is obtained by statistical calculation of the fractal dimension.
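
    Fractal dimensions of segmented crack patterns are commonly estimated from binary images by box counting. The following numpy sketch illustrates that standard method; it is not the authors' code, and the random test image is only a stand-in for a real segmented micrograph:

        # Box-counting estimate of the fractal dimension of a binary crack image.
        import numpy as np

        def box_count(img: np.ndarray, k: int) -> int:
            # Count k-by-k boxes containing at least one crack (nonzero) pixel.
            h, w = img.shape
            boxes = img[:h - h % k, :w - w % k].reshape(h // k, k, w // k, k)
            return int(np.count_nonzero(boxes.sum(axis=(1, 3))))

        def fractal_dimension(img: np.ndarray) -> float:
            sizes = np.array([2, 4, 8, 16, 32])
            counts = [box_count(img, int(k)) for k in sizes]
            # Dimension = slope of log(count) against log(1 / box size).
            slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
            return float(slope)

        rng = np.random.default_rng(0)
        demo = (rng.random((128, 128)) > 0.95).astype(int)  # stand-in image
        print(fractal_dimension(demo))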

  19. Big Crunch-based omnidirectional light concentrators

    CERN Document Server

    Smolyaninov, Igor I

    2014-01-01

    Omnidirectional light concentration remains an unsolved problem despite such important practical applications as design of efficient mobile photovoltaic cells. Optical black hole designs developed recently offer partial solution to this problem. However, even these solutions are not truly omnidirectional since they do not exhibit a horizon, and at large enough incidence angles light may be trapped into quasi-stationary orbits around such imperfect optical black holes. Here we propose and realize experimentally another gravity-inspired design of a broadband omnidirectional light concentrator based on the cosmological Big Crunch solutions. By mimicking the Big Crunch spacetime via corresponding effective optical metric we make sure that every photon world line terminates in a single point.

  20. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through the online learning environment of the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm their ideas and solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
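
    Eratosthenes' method needs only one proportion: the noon shadow angle is to 360 degrees as the distance between the two measurement sites is to Earth's circumference. A minimal sketch with the classic illustrative numbers (not data from the project):

        # angle / 360 = distance / circumference  =>  circumference = 360 / angle * distance
        def circumference_km(shadow_angle_deg: float, north_south_distance_km: float) -> float:
            return 360.0 / shadow_angle_deg * north_south_distance_km

        # Roughly Eratosthenes' values for Alexandria-Syene: 7.2 degrees, ~800 km apart.
        print(circumference_km(7.2, 800.0))  # 40000.0 km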

  1. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  2. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, the healthcare system, and others all use piles of data which are further used for creating reports in order to ensure the continuity of the services that they have to offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  3. Distributed snow and rock temperature modelling in steep rock walls using Alpine3D

    Science.gov (United States)

    Haberkorn, Anna; Wever, Nander; Hoelzle, Martin; Phillips, Marcia; Kenner, Robert; Bavay, Mathias; Lehning, Michael

    2017-02-01

    In this study we modelled the influence of the spatially and temporally heterogeneous snow cover on the surface energy balance and thus on rock temperatures in two rugged, steep rock walls on the Gemsstock ridge in the central Swiss Alps. The heterogeneous snow depth distribution in the rock walls was introduced to the distributed, process-based energy balance model Alpine3D with a precipitation scaling method based on snow depth data measured by terrestrial laser scanning. The influence of the snow cover on rock temperatures was investigated by comparing a snow-covered model scenario (precipitation input provided by precipitation scaling) with a snow-free (zero precipitation input) one. Model uncertainties are discussed and evaluated at both the point and spatial scales against 22 near-surface rock temperature measurements and high-resolution snow depth data from winter terrestrial laser scans. In the rough rock walls, the heterogeneously distributed snow cover was moderately well reproduced by Alpine3D with mean absolute errors ranging between 0.31 and 0.81 m. However, snow cover duration was reproduced well and, consequently, near-surface rock temperatures were modelled convincingly. Uncertainties in rock temperature modelling were found to be around 1.6 °C. Errors in snow cover modelling and hence in rock temperature simulations are explained by inadequate snow settlement due to linear precipitation scaling, missing lateral heat fluxes in the rock, and by errors caused by interpolation of shortwave radiation, wind and air temperature into the rock walls. Mean annual near-surface rock temperature increases were both measured and modelled in the steep rock walls as a consequence of a thick, long-lasting snow cover. Rock temperatures were 1.3-2.5 °C higher in the shaded and sunny rock walls when comparing snow-covered to snow-free simulations. This helps to assess the potential error made in ground temperature modelling when neglecting snow in steep bedrock.

  4. CERN’s Summer of Rock

    CERN Multimedia

    Katarina Anthony

    2015-01-01

    When a rock star visits CERN, they don’t just bring their entourage with them. Along for the ride are legions of fans across the world – many of whom may not be the typical CERN audience. In July alone, four big acts paid CERN a visit, sharing their experience with the world: Scorpions, The Script, Kings of Leon and Patti Smith.   @TheScript tweeted: #paleofestival we had the best time! Big love. #CERN (Image: Twitter).   It all started with the Scorpions, the classic rock band whose “Wind of Change” became an anthem in the early 1990s. On 19 July, the band braved the 35-degree heat to tour the CERN site on foot – visiting the Synchrocyclotron and the new Microcosm exhibition. The rockers were very enthusiastic about the research carried out at CERN, and talked about returning in the autumn during their next tour stop. The Scorpions visit Microcosm. Two days later, The Script rolled in. This Irish pop-rock band has been hittin...

  5. Can Pleasant Goat and Big Big Wolf Save China's Animation Industry?

    Institute of Scientific and Technical Information of China (English)

    Guo Liqin

    2009-01-01

    "My dreamed husband is big big wolf," claimed Miss Fang, a young lady who works in KPMG Beijing Office. This big big wolf is a lovely cartoon wolf appeared in a Pleasant Goat and Big Big Wolf produced independently by Chinese.

  6. Asteroids Were Born Big

    CERN Document Server

    Morbidelli, Alessandro; Nesvorny, David; Levison, Harold F

    2009-01-01

    How big were the first planetesimals? We attempt to answer this question by conducting coagulation simulations in which the planetesimals grow by mutual collisions and form larger bodies and planetary embryos. The size frequency distribution (SFD) of the initial planetesimals is considered a free parameter in these simulations, and we search for the one that produces at the end objects with a SFD that is consistent with asteroid belt constraints. We find that, if the initial planetesimals were small (e.g. km-sized), the final SFD fails to fulfill these constraints. In particular, reproducing the bump observed at diameter D~100km in the current SFD of the asteroids requires that the minimal size of the initial planetesimals was also ~100km. This supports the idea that planetesimals formed big, namely that the size of solids in the proto-planetary disk ``jumped'' from sub-meter scale to multi-kilometer scale, without passing through intermediate values. Moreover, we find evidence that the initial planetesimals ...

  7. ATLAS: civil engineering Point 1

    CERN Multimedia

    2000-01-01

    The ATLAS experimental area is located at Point 1, just across from the main CERN entrance, in the commune of Meyrin. There, people are busy finishing the different infrastructures for ATLAS. Real underground video. Nice view from the surface down to the cavern from the pit side - all the big machines look very small. The film has original working sound.

  8. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launched the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, at a major public event. Poster and programme.

  9. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external environment and competitive conditions are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialised and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  10. Hybrid adsorptive membrane reactor

    Science.gov (United States)

    Tsotsis, Theodore T. (Inventor); Sahimi, Muhammad (Inventor); Fayyaz-Najafi, Babak (Inventor); Harale, Aadesh (Inventor); Park, Byoung-Gi (Inventor); Liu, Paul K. T. (Inventor)

    2011-01-01

    A hybrid adsorbent-membrane reactor in which the chemical reaction, membrane separation, and product adsorption are coupled. Also disclosed are a dual-reactor apparatus and a process using the reactor or the apparatus.

  11. D and DR Reactors

    Data.gov (United States)

    Federal Laboratory Consortium — The world's second full-scale nuclear reactor was the D Reactor at Hanford, which was built in the early 1940s and went operational in December of 1944. D Reactor ran...

  12. Hybrid adsorptive membrane reactor

    Science.gov (United States)

    Tsotsis, Theodore T.; Sahimi, Muhammad; Fayyaz-Najafi, Babak; Harale, Aadesh; Park, Byoung-Gi; Liu, Paul K. T.

    2011-03-01

    A hybrid adsorbent-membrane reactor in which the chemical reaction, membrane separation, and product adsorption are coupled. Also disclosed are a dual-reactor apparatus and a process using the reactor or the apparatus.

  13. Geotechnical Descriptions of Rock and Rock Masses.

    Science.gov (United States)

    1985-04-01

    weathering is presented by Dornbusch (1982). Mechanical, or physical, weathering of rock occurs primarily by (a) freeze expansion (or frost wedging)... "Engineering Classification of In-Situ Rock," Technical Report No. AFWL-TR-67-144, Air Force Weapons Laboratory, Kirtland Air Force Base, N. Mex. Dornbusch, W...

  14. Structure and geomorphology of the "big bend" in the Hosgri-San Gregorio fault system, offshore of Big Sur, central California

    Science.gov (United States)

    Johnson, S. Y.; Watt, J. T.; Hartwell, S. R.; Kluesner, J. W.; Dartnell, P.

    2015-12-01

    The right-lateral Hosgri-San Gregorio fault system extends mainly offshore for about 400 km along the central California coast and is a major structure in the distributed transform margin of western North America. We recently mapped a poorly known 64-km-long section of the Hosgri fault offshore Big Sur between Ragged Point and Pfeiffer Point using high-resolution bathymetry, tightly spaced single-channel seismic-reflection and coincident marine magnetic profiles, and reprocessed industry multichannel seismic-reflection data. Regionally, this part of the Hosgri-San Gregorio fault system has a markedly more westerly trend (by 10° to 15°) than parts farther north and south, and thus represents a transpressional "big bend." Through this "big bend," the fault zone is never more than 6 km from the shoreline and is a primary control on the dramatic coastal geomorphology that includes high coastal cliffs, a narrow (2- to 8-km-wide) continental shelf, a sharp shelfbreak, and a steep (as much as 17°) continental slope incised by submarine canyons and gullies. Depth-converted industry seismic data suggest that the Hosgri fault dips steeply to the northeast and forms the eastern boundary of the asymmetric (deeper to the east) Sur Basin. Structural relief on Franciscan basement across the Hosgri fault is about 2.8 km. Locally, we recognize five discrete "sections" of the Hosgri fault based on fault trend, shallow structure (e.g., disruption of young sediments), seafloor geomorphology, and coincidence with high-amplitude magnetic anomalies sourced by ultramafic rocks in the Franciscan Complex. From south to north, section lengths and trends are as follows: (1) 17 km, 312°; (2) 10 km, 322°; (3) 13 km, 317°; (4) 3 km, 329°; (5) 21 km, 318°. Through these sections, the Hosgri surface trace includes several right steps that vary from a few hundred meters to about 1 km wide, none wide enough to provide a barrier to continuous earthquake rupture.

  15. The Rise of Big Data in Neurorehabilitation.

    Science.gov (United States)

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  16. Big bang nucleosynthesis: Present status

    Science.gov (United States)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and provides a 2σ upper limit on Nν; any significant deviation from the standard value would point to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.

  17. To What Extent Can the Big Five and Learning Styles Predict Academic Achievement

    Science.gov (United States)

    Köseoglu, Yaman

    2016-01-01

    Personality traits and learning styles play defining roles in shaping academic achievement. 202 university students completed the Big Five personality traits questionnaire and the Inventory of Learning Processes Scale and self-reported their grade point averages. Conscientiousness and agreeableness, two of the Big Five personality traits, related…

  18. BIG DATA AND STATISTICS

    Science.gov (United States)

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies.

  19. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility to use dark matter mass and its interaction cross section as a smoking gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new venue for achieving the observed relic abundance in which a significantly smaller amount of dark matter--compared to the standard cosmology--is produced and survives until today, diluted only by the cosmic expansion since the radiation dominated era. Once DM mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  20. Big Hero 6

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    See how Big Hero 6 lets ordinary people become superheroes and save the city! Hiro Hamada, 14, lives in the future city of San Fransokyo. He has a robot friend, Baymax. Baymax is big and soft. His job is to nurse sick people. One day, a bad man wants to take control of San Fransokyo. Hiro hopes to save the city with Baymax. But Baymax is just a nursing robot. This is not a problem for Hiro, however. He knows a lot about robots. He makes a suit of armor for Baymax and turns him into a super robot!

  1. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  2. Avoiding a Big Catastrophe

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Before last October, the South China tiger had almost slipped into mythical status as it had been absent for so long from the public eye. In the previous 20-plus years, these tigers could not be found in the wild in China and the number of those in captivity numbered only around 60. The species—a direct descendant of the earliest tigers thought to have originated in China 2 million years ago—is functionally extinct, according to experts. The big cat's return to the media spotlight was completely unexpected. On October 12, 2007, a digital picture, showing a wild South China tiger

T-S fuzzy control of nuclear reactor power based on a point kinetics model with one delayed neutron group

    Institute of Scientific and Technical Information of China (English)

    赵伟宁; 栾秀春; 樊达宜; 周杰

    2013-01-01

    A T-S fuzzy controller was designed based on the point kinetics model with one delayed neutron group to control the power of a nuclear reactor. The simulation results show that the designed T-S fuzzy controller controls the nuclear reactor power output satisfactorily.
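
    The plant model named in the title, point kinetics with one delayed neutron group, is the standard pair of coupled ODEs for the neutron density n and the precursor concentration C. A minimal forward-Euler sketch with illustrative parameter values (not those used in the paper):

        # dn/dt = ((rho - beta) / Lambda) * n + lam * C
        # dC/dt = (beta / Lambda) * n - lam * C
        beta, Lambda, lam = 0.0065, 1.0e-4, 0.08  # delayed fraction, generation time (s), decay const (1/s)

        def step(n, C, rho, dt=1.0e-4):
            dn = ((rho - beta) / Lambda) * n + lam * C
            dC = (beta / Lambda) * n - lam * C
            return n + dt * dn, C + dt * dC

        # Start at equilibrium (dC/dt = 0 gives C = beta * n / (Lambda * lam)),
        # then insert a small positive reactivity step of 0.1 dollars.
        n, C = 1.0, beta / (Lambda * lam)
        for _ in range(10000):  # 1 s of simulated time
            n, C = step(n, C, rho=0.1 * beta)
        print(n)  # prompt jump to ~1.11, then a slow rise on the stable period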

  4. My Pet Rock

    Science.gov (United States)

    Lark, Adam; Kramp, Robyne; Nurnberger-Haag, Julie

    2008-01-01

    Many teachers and students have experienced the classic pet rock experiment in conjunction with a geology unit. A teacher has students bring in a "pet" rock found outside of school, and the students run geologic tests on the rock. The tests include determining relative hardness using Mohs scale, checking for magnetization, and assessing luster.…

  5. Exploring Data in Human Resources Big Data

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2016-01-01

    Full Text Available Nowadays, social networks and information technologies and infrastructures are constantly developing and affecting each other. In this context, the HR recruitment process has become complex and many multinational organizations have encountered selection issues. The objective of this paper is to develop a prototype system for assisting the selection of candidates for intelligent management of human resources. Such a system can be a starting point for the efficient organization of semi-structured and unstructured data on recruitment activities. The article extends the research presented at the 14th International Conference on Informatics in Economy (IE 2015) in the scientific paper "Big Data challenges for human resources management".

  6. Reactor and method of operation

    Science.gov (United States)

    Wheeler, John A.

    1976-08-10

    A nuclear reactor having a flattened reactor activity curve across the reactor includes fuel extending over a lesser portion of the fuel channels in the central portion of the reactor than in the remainder of the reactor.

  7. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    soil C in the partnership region, and to design a risk/cost effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed this quarter--a literature review/database to assess the soil carbon on rangelands, and the draft protocols, contracting options for soil carbon trading. To date, there has been little research on soil carbon on rangelands, and since rangeland constitutes a major land use in the Big Sky region, this is important in achieving a better understanding of terrestrial sinks. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO2 concentrations. Progress on other deliverables is noted in the PowerPoint presentations. A series of meetings held during the second quarter have laid the foundations for assessing the issues surrounding the implementation of a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. Finally, the education and outreach efforts have resulted in a comprehensive plan and process which serves as a guide for implementing the outreach activities under Phase I. While we are still working on the public website, we have made many presentations to stakeholders and policy makers, connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, we have laid plans for integration of our outreach efforts with the students, especially at the tribal colleges and at the universities involved in our partnership

  8. Mechanic behavior of unloading fractured rock mass

    Institute of Scientific and Technical Information of China (English)

    YIN Ke; ZHANG Yongxing; WU Hanhui

    2003-01-01

    Under tension and shear conditions related to the unloading of a rock mass, a linear elastic fracture mechanics model of a jointed rock mass is established. From the model, the equations of stresses, strains and displacements are derived for the region that is influenced by the crack but relatively far away from it (where the distance between the point of interest and the center of the crack exceeds the crack length). These equations are important for evaluating the deformation of cracked rock. Comparison between the computational results of these theoretical equations and observed data from unloading tests demonstrates that they are applicable to actual engineering.
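
    The paper's far-field expressions are not reproduced in this abstract; as textbook background for the same linear elastic fracture mechanics setting, the mode-I stress intensity factor and near-tip stress are sketched below with hypothetical numbers:

        # Textbook mode-I relations (background only, not the paper's equations):
        #   K_I = sigma * sqrt(pi * a)           for remote tension sigma, half-length a
        #   sigma_yy(r) ~ K_I / sqrt(2*pi*r)     asymptotically valid for r << a
        import math

        def k_one(sigma_pa: float, half_length_m: float) -> float:
            return sigma_pa * math.sqrt(math.pi * half_length_m)

        def sigma_yy_near_tip(sigma_pa: float, half_length_m: float, r_m: float) -> float:
            assert r_m < 0.1 * half_length_m, "asymptotic form holds only close to the tip"
            return k_one(sigma_pa, half_length_m) / math.sqrt(2.0 * math.pi * r_m)

        # Hypothetical: 1 MPa remote tension, 0.1 m half-crack, point 5 mm ahead of the tip.
        print(sigma_yy_near_tip(1.0e6, 0.1, 0.005))  # ~3.2e6 Pa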

  9. A view on big data and its relation to Informetrics

    Institute of Scientific and Technical Information of China (English)

    Ronald; ROUSSEAU

    2012-01-01

    Purpose: Big data offer a huge challenge. Their very existence leads to the contradiction that the more data we have, the less accessible they become, as the particular piece of information one is searching for may be buried among terabytes of other data. In this contribution we discuss the origin of big data and point to three challenges that arise with big data: data storage, data processing and generating insights. Design/methodology/approach: Computer-related challenges can be expressed by the CAP theorem, which states that it is only possible to simultaneously provide any two of the three following properties in distributed applications: consistency (C), availability (A) and partition tolerance (P). As an aside we mention Amdahl's law and its application to scientific collaboration. We further discuss data mining in large databases and knowledge representation for handling the results of data mining exercises. We also offer a short informetric study of the field of big data, and point to the ethical dimension of the big data phenomenon. Findings: There are still serious problems to overcome before the field of big data can deliver on its promises. Implications and limitations: This contribution offers a personal view, focusing on the information science aspects, but much more can be said about software aspects. Originality/value: We express the hope that information scientists, including librarians, will be able to play their full role within the knowledge discovery, data mining and big data communities, leading to exciting developments, the reduction of scientific bottlenecks and really innovative applications.
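
    Amdahl's law, mentioned above as an aside, bounds the speedup of any task in which only a fraction p of the work can be parallelized; a one-function sketch of the standard form:

        # Amdahl's law: speedup on n processors with parallelizable fraction p.
        def amdahl_speedup(p: float, n: int) -> float:
            return 1.0 / ((1.0 - p) + p / n)

        # Even with 95% of the work parallelizable, 1000 processors yield < 20x.
        print(amdahl_speedup(0.95, 1000))  # ~19.6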

  10. Big Bang of Massenergy and Negative Big Bang of Spacetime

    Science.gov (United States)

    Cao, Dayong

    2017-01-01

    There is a balance between the Big Bang of massenergy and a negative Big Bang of spacetime in the universe. Some scientists have also considered an anti-Big Bang that could produce antimatter. This paper supposes that there is a structural balance between the Einstein field equation and a negative Einstein field equation, a balance between massenergy structure and spacetime structure, a balance between the energy of the nucleus of stellar matter and the dark energy of the nucleus of dark matter-dark energy, and a balance between the particle and the wave - a balance system between massenergy (particle) and spacetime (wave). This should explain the problems of the Big Bang. http://meetings.aps.org/Meeting/APR16/Session/M13.8

  11. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets. Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute Engine, App Engine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
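
    For orientation, a minimal query of the kind the book covers might look like the following sketch with the google-cloud-bigquery Python client; the public dataset is only an example, and credentials are assumed to be configured in the environment:

        # Minimal BigQuery query (sketch; assumes application-default credentials).
        from google.cloud import bigquery

        client = bigquery.Client()  # project and credentials come from the environment
        sql = """
            SELECT name, SUM(number) AS total
            FROM `bigquery-public-data.usa_names.usa_1910_2013`
            GROUP BY name ORDER BY total DESC LIMIT 5
        """
        for row in client.query(sql).result():
            print(row.name, row.total)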

  12. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    Full Text Available The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current level of big data development, what it can do today, and what it may be able to do in the near future. The paper focuses on explaining to non-technical and non-database-oriented technical specialists what big data basically is, presents the three most important V's as well as the newer ones, the most important solutions used by companies like Google or Amazon, and some interesting observations based on this subject.

  13. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest.

  14. Big Data: present and future

    OpenAIRE

    Mircea Raducu TRIFU; Mihaela Laura IVAN

    2014-01-01

    The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current level of big data development, what it can do today, and what it may be able to do in the near future. The paper focuses on explaining to non-technical and non-database-oriented technical specialists what big data basically is, presents the three most important V's, as well as the new ...

  15. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    Full Text Available We are now in the Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from complex data which can't be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining large datasets using distributed algorithms.
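
    As a concrete taste of the kind of distributed mining the survey refers to (the paper does not prescribe this code), a minimal PySpark aggregation over a cluster-resident dataset:

        # Distributed frequency count with PySpark (generic illustrative sketch).
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("big-data-mining-sketch").getOrCreate()
        rdd = spark.sparkContext.parallelize(["a", "b", "a", "c", "a", "b"])
        counts = rdd.map(lambda x: (x, 1)).reduceByKey(lambda u, v: u + v).collect()
        print(dict(counts))  # {'a': 3, 'b': 2, 'c': 1}
        spark.stop()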

  16. The challenges of big data

    Science.gov (United States)

    2016-01-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  17. Big Data is invading big places as CERN

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  18. The Big Chills

    Science.gov (United States)

    Bond, G. C.; Dwyer, G. S.; Bauch, H. A.

    2002-12-01

    At the end of the last glacial, the Earth's climate system abruptly shifted into the Younger Dryas, a 1500-year long cold snap known in the popular media as the Big Chill. Following an abrupt warming ending the Younger Dryas about 11,600 years ago, the climate system has remained in an interglacial state, thought to have been relatively stable and devoid, with possibly one or two exceptions, of abrupt climate change. A growing amount of evidence suggests that this benign view of interglacial climate is incorrect. High resolution records of North Atlantic ice rafted sediment, now regarded as evidence of extreme multiyear sea ice drift, reveal abrupt shifts on centennial and millennial time scales. These have been traced from the end of the Younger Dryas to the present, revealing evidence of significant climate variability through all of the last two millennia. Correlatives of these events have been found in drift ice records from the Arctic's Laptev Sea, in the isotopic composition of North Grip ice, and in dissolved K from the GISP2 ice core, attesting to their regional extent and imprint in proxies of very different origins. Measurements of Mg/Ca ratios in planktic foraminifera over the last two millennia in the eastern North Atlantic demonstrate that increases in drifting multiyear sea ice were accompanied by abrupt decreases in sea surface temperatures, especially during the Little Ice Age. Estimated rates of temperature change are on the order of two degrees centigrade, more than thirty percent of the regional glacial to interglacial change, within a few decades. When compared at the same resolution, these interglacial variations are as abrupt as the last glacial's Dansgaard-Oeschger cycles. The interglacial abrupt changes are especially striking because they occurred within the core of the warm North Atlantic Current. The changes may have been triggered by variations in solar irradiance, but if so their large magnitude and regional extent requires amplifying

  19. Tipping points? Ethnic composition change in Dutch big city neighbourhoods

    NARCIS (Netherlands)

    Ong, C.

    2014-01-01

    Micro-level studies using individual and household data have shown that residential location choices are influenced by neighbourhood ethnic composition. Using three conurbation samples in the Netherlands - Amsterdam metropolitan area, Rotterdam-The Hague metropolitan area, and the country's largest

  20. Big Data and Perioperative Nursing.

    Science.gov (United States)

    Westra, Bonnie L; Peterson, Jessica J

    2016-10-01

    Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data has the potential to affect care far beyond the original patient.

  1. Big, Fat World of Lipids

    Science.gov (United States)

    By Emily Carlson, posted August 9, 2012. Web-page excerpt: covers cholesterol, ways to diagnose and treat lipid-related conditions, and points to a Lipid Encyclopedia.

  2. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: why is the universe accelerating; what is dark matter; and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  3. Big Bang Nucleosynthesis: 2015

    CERN Document Server

    Cyburt, Richard H; Olive, Keith A; Yeh, Tsung-Han

    2015-01-01

    Big-bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. We briefly overview the essentials of this physics, and present new calculations of light element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. We provide fits to these results as a function of baryon density and of the number of neutrino flavors, N_nu. We review recent developments in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom, N_eff. These measurements allow for a tight test of BBN and of cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. We include a ...

  4. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  5. Big Data Comes to School

    OpenAIRE

    Bill Cope; Mary Kalantzis

    2016-01-01

    The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-me...

  6. Big Data for Precision Medicine

    OpenAIRE

    Daniel Richard Leff; Guang-Zhong Yang

    2015-01-01

    This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of onl...

  7. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  8. Principles of rock mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Turchaninov, I.A.; Iofis, M.A.; Kasparyan, E.V.

    1979-01-01

    This book presents the principles of rock mechanics in a systematic way, reflecting both the historic development and the contemporary status of theoretical and experimental techniques used for the determination of the properties and stress state of rock masses, calculation of elements of systems for exploitation of useful mineral deposits and the design of mine openings. The subject of rock mechanics is discussed and methods and basic approaches are analyzed. The most widely used methods for determining the properties of rock in specimens and in situ are described. Problems of determining the stress strain state of the rock around mine openings by both experimental and analytic methods are discussed. The primary results of the study of the stress state of rock around main, development and production openings are presented. Problems of the movement of rock due to extraction of minerals are analyzed in detail, as are the conditions and causes of the development of rock bursts and sudden release of rock and gas in both surface and underground mines. Procedures for preventing or localizing rock bursts or sudden outbursts are described. (313 refs.)

  9. A comparative study of kinetics of nuclear reactors

    Directory of Open Access Journals (Sweden)

    Obaidurrahman Khalilurrahman

    2009-01-01

    Full Text Available The paper deals with the study of reactivity-initiated transients to investigate major differences in the kinetics behavior of various reactor systems under different operating conditions. The article also states guidelines to determine the safety limits on reactivity insertion rates. Three systems are considered for the analysis: light water reactors (pressurized water reactors), heavy water reactors (pressurized heavy water reactors), and fast breeder reactors. The upper safe limits for the reactivity insertion rate in these reactor systems are determined. The analyses of transients are performed by a point kinetics computer code, PKOK. A simple but accurate method for accounting for total reactivity feedback in kinetics calculations is suggested and used. Parameters governing the kinetics behavior of the core are studied under different core states. A few guidelines are discussed to project the possible kinetics trends in next-generation reactors.
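
    As a concrete illustration of the point-kinetics model such a code integrates, below is a minimal Python sketch with a single effective delayed-neutron group and a lumped reactivity feedback term. All parameter values and the feedback form are illustrative assumptions, not data from the paper.

```python
# Minimal point-kinetics sketch: one effective delayed-neutron group plus a
# lumped reactivity feedback term. Parameter values are illustrative
# assumptions (typical LWR magnitudes), not data from the paper.
BETA = 0.0065       # delayed-neutron fraction
GEN_TIME = 1.0e-4   # prompt-neutron generation time [s]
LAM = 0.08          # effective precursor decay constant [1/s]
ALPHA_FB = -1.0e-3  # lumped feedback coefficient [dk/k per unit relative power]

def step(n, c, rho_ext, dt):
    """Advance relative power n and precursor density c by dt (explicit Euler)."""
    rho = rho_ext + ALPHA_FB * (n - 1.0)          # total reactivity incl. feedback
    dn = ((rho - BETA) / GEN_TIME) * n + LAM * c  # point-kinetics equations
    dc = (BETA / GEN_TIME) * n - LAM * c
    return n + dt * dn, c + dt * dc

n, c = 1.0, BETA / (GEN_TIME * LAM)  # equilibrium initial conditions
dt = 1.0e-5
for i in range(int(10.0 / dt)):      # simulate 10 s of transient
    t = i * dt
    rho_ext = 1.0e-4 * min(t, 5.0)   # external ramp: 10 pcm/s for 5 s, then hold
    n, c = step(n, c, rho_ext, dt)
print(f"relative power after 10 s: {n:.2f}")
```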

  10. Genericness of Big Bounce in isotropic loop quantum cosmology

    OpenAIRE

    Date, Ghanashyam; Hossain, Golam Mortuza

    2004-01-01

    The absence of isotropic singularity in loop quantum cosmology can be understood in an effective classical description as the universe exhibiting a Big Bounce. We show that with scalar matter field, the big bounce is generic in the sense that it is independent of quantization ambiguities and details of scalar field dynamics. The volume of the universe at the bounce point is parametrized by a single parameter. It provides a minimum length scale which serves as a cut-off for computations of den...

  11. ESR dating of the fault rocks

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Kwon [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2004-01-15

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from the surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs. grain size shows a plateau for grains below a critical size: these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected near the Ulzin nuclear reactor. ESR signals of quartz grains separated from fault rocks collected from the E-W trend fault are saturated. This indicates that the last movement of these faults occurred before the Quaternary period. ESR dates from the NW trend faults range from 300 ka to 700 ka. On the other hand, the ESR date of the NS trend fault is about 50 ka. The results of this research suggest that long-term cyclic fault activity near the Ulzin nuclear reactor continued into the Pleistocene.
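
    The age relation quoted above (age = equivalent dose / dose rate) can be stated directly in code. A minimal sketch; the numbers are invented for illustration and only loosely comparable to the reported ranges.

```python
# The ESR age is the equivalent dose (laboratory dose needed to reproduce
# the natural signal) divided by the environmental dose rate. The sample
# values below are invented for illustration.
def esr_age_ka(equivalent_dose_gy, dose_rate_gy_per_ka):
    """ESR age [ka] from equivalent dose [Gy] and dose rate [Gy/ka]."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

# An equivalent dose of 1500 Gy at 3 Gy/ka gives 500 ka, inside the
# 300-700 ka range reported here for the NW-trending faults.
print(esr_age_ka(1500.0, 3.0))  # -> 500.0
```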

  12. Powering Big Data for Nursing Through Partnership.

    Science.gov (United States)

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  13. Point cloud data management (extended abstract)

    NARCIS (Netherlands)

    Van Oosterom, P.J.M.; Ravada, S.; Horhammer, M.; Martinez Rubi, O.; Ivanova, M.; Kodde, M.; Tijssen, T.P.M.

    2014-01-01

    Point cloud data are important sources for 3D geo-information. The point cloud data sets are growing in popularity and in size. Modern Big Data acquisition and processing technologies, such as laser scanning from airborne, mobile, or static platforms, dense image matching from photos, multi-beam ech

  14. Results of new petrologic and remote sensing studies in the Big Bend region

    Science.gov (United States)

    Benker, Stevan Christian

    The initial section of this manuscript involves the South Rim Formation, a series of 32.2-32 Ma comenditic quartz trachytic-rhyolitic volcanics and associated intrusives erupted and emplaced in Big Bend National Park, Texas. Magmatic parameters have only been interpreted for one of the two diverse petrogenetic suites comprising this formation. Here, new mineralogic data for the South Rim Formation rocks are presented. Magmatic parameters interpreted from these data assist in deciphering lithospheric characteristics during the mid-Tertiary. Results indicate low temperatures (debated timing of tectonic transition (Laramide compression to Basin and Range extension) and onset of the southern Rio Grande Rift during the mid-Tertiary. The A-type and peralkaline characteristics of the South Rim Formation and other pre-31 Ma magmatism in Trans-Pecos Texas, in addition to evidence implying earlier Rio Grande Rift onset in Colorado and New Mexico, promote a near-neutral to transtensional setting in Trans-Pecos Texas by 32 Ma. This idea sharply contrasts with interpretations of tectonic compression and arc-related magmatism until 31 Ma as suggested by some authors. However, the evidence discussed cannot preclude the pre-36 Ma onset proposed by other authors. The later section of this manuscript involves research in the Big Bend area using Google Earth. At present there is high interest in using Google Earth in a variety of scientific investigations. However, program developers have disclosed limited information concerning the program and its accuracy. While some authors have attempted to independently constrain the accuracy of Google Earth, their results have potentially lost validity through time due to technological advances and updates to imagery archives. For this reason we attempt to constrain more current horizontal and vertical position accuracies for the Big Bend region of West Texas. In Google Earth a series of 268 data points were virtually traced along various early

  15. Site Investigation for Detection of KIJANG Reactor Core Center

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Tae-Hyun; Kim, Jun Yeon; Kim, Jeeyoung [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Completion was planned for the end of March 2017 and extended to April 2018 according to the government budget adjustment. The KJRR project is intended to achieve self-sufficiency in RI supply, including Mo-99, to increase NTD capacity, and to develop technologies related to the research reactor. In the project, site investigation is the first activity that defines the seismologic and related geologic aspects of the site. The site investigation was carried out from Oct. 2012 to Jan. 2014, and this study describes the detailed procedures for locating the reactor core center. The location of the reactor core center was determined by collectively reviewing not only geological information but also information from architect engineering. EL 50 m was selected as the ground level in consideration of construction cost. Four recommended locations (R-1a to R-1d) are displayed for the reactor core center. R-1a was found optimal in consideration of the medium-rock contour, the portion of medium rock covering the reactor buildings, construction cost, physical protection and electrical resistivity. The engineering properties of the medium rock are TCR/RQD 100/53, an elastic modulus of 7,710-8,720 MPa, a permeability coefficient of 2.92E-06 cm/s, and an S-wave velocity of 1,380 m/s, sound for the foundations of reactor buildings.

  16. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    soil C in the partnership region, and to design a risk/cost effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed this quarter: a literature review/database to assess soil carbon on rangelands, and the draft protocols and contracting options for soil carbon trading. To date, there has been little research on soil carbon on rangelands, and since rangeland constitutes a major land use in the Big Sky region, this is important in achieving a better understanding of terrestrial sinks. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO2 concentrations. Progress on other deliverables is noted in the PowerPoint presentations. A series of meetings held during the second quarter have laid the foundations for assessing the issues surrounding the implementation of a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. Finally, the education and outreach efforts have resulted in a comprehensive plan and process which serves as a guide for implementing the outreach activities under Phase I. While we are still working on the public website, we have made many presentations to stakeholders and policy makers and established connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, we have laid plans for integration of our outreach efforts with the students, especially at the tribal colleges and at the universities involved in our partnership

  17. Geoneutrinos and reactor antineutrinos at SNO+

    CERN Document Server

    Baldoncini, M; Wipperfurth, S A; Fiorentini, G; Mantovani, F; McDonough, W F; Ricci, B

    2016-01-01

    In the heart of the Creighton Mine near Sudbury (Canada), the SNO+ detector is foreseen to observe almost in equal proportion electron antineutrinos produced by U and Th in the Earth and by nuclear reactors. SNO+ will be the first long baseline experiment to measure a reactor signal dominated by CANDU cores (~55% of the total reactor signal), which generally burn natural uranium. Approximately 18% of the total geoneutrino signal is generated by the U and Th present in the rocks of the Huronian Supergroup-Sudbury Basin: the 60% uncertainty on the signal produced by this lithologic unit plays a crucial role on the discrimination power on the mantle signal as well as on the geoneutrino spectral shape reconstruction, which can in principle provide a direct measurement of the Th/U ratio in the Earth.

  18. Determination of average molecular weights on organic reactor coolants. I.- Freezing-point depression method for benzene solutions; Determinacion de masas moleculares medias en refrigerantes nucleares organicos. I.- Crioscopia de disoluciones bencenicas

    Energy Technology Data Exchange (ETDEWEB)

    Carreira, M.

    1965-07-01

    As a working method for the determination of changes in molecular mass that may occur by irradiation (pyrolytic-radiolytic decomposition) of polyphenyl reactor coolants, a cryoscopic technique has been developed which combines the basic simplicity of Beckman's method with some experimental refinements taken from the equilibrium methods. A total of 18 runs were made on samples of naphthalene, biphenyl, and the commercial mixtures OM-2 (Progil) and Santowax-R (Monsanto), with an average deviation from the theoretical molecular mass of 0.6%. (Author) 7 refs.
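
    For reference, the cryoscopic relation behind the method is ΔT = Kf·m, with m the molality of the solute, so the average molar mass follows directly from the measured freezing-point depression. A minimal sketch with invented sample numbers (the paper's raw data are not reproduced here):

```python
# Freezing-point depression: dT = Kf * molality, hence
# M [g/mol] = Kf * mass_solute [g] / (dT [K] * mass_solvent [kg]).
# Kf for benzene is about 5.12 K*kg/mol; the sample numbers are invented.
KF_BENZENE = 5.12  # cryoscopic constant of benzene [K*kg/mol]

def molar_mass(mass_solute_g, mass_solvent_kg, delta_t_k):
    """Average molar mass [g/mol] from a measured freezing-point depression."""
    return KF_BENZENE * mass_solute_g / (delta_t_k * mass_solvent_kg)

# 0.500 g of biphenyl (M = 154.2 g/mol) in 0.050 kg of benzene depresses
# the freezing point by about 0.332 K, so the inversion recovers ~154 g/mol:
print(molar_mass(0.500, 0.050, 0.332))
```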

  19. Calculation system for physical analysis of boiling water reactors; Modelisation des phenomenes physiques specifiques aux reacteurs a eau bouillante, notamment le couplage neutronique-thermohydraulique

    Energy Technology Data Exchange (ETDEWEB)

    Bouveret, F

    2001-07-01

    Although Boiling Water Reactors generate a quarter of worldwide nuclear electricity, they have received little study in France, where interest in these reactors is now emerging. The aim of the work presented here is therefore to contribute to determining a core calculation methodology with CEA (Commissariat a l'Energie Atomique) codes. Vapour production in the reactor core leads to technological options very different from those of pressurized water reactors. We analyse the main physical phenomena of BWRs and offer solutions that take them into account. BWR fuel assembly heterogeneity causes steep thermal flux gradients. The two-dimensional collision probability method with exact boundary conditions makes it possible to calculate the flux accurately in BWR fuel assemblies using the APOLLO-2 lattice code, but induces a very long calculation time. We therefore determine a new methodology based on a two-level flux calculation. Void fraction variations in assemblies involve large spectrum changes that must be considered in the core calculation. We suggest using a void history parameter to generate cross-section libraries for the core calculation. The core calculation code also has to compute the depletion of the main isotope concentrations. A core calculation associating neutronics and thermal-hydraulics codes highlights points that still need to be studied. The most important of them is taking the control blade into account in the different calculation stages. (author)
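
    The void-history idea lends itself to a small illustration: homogenized cross sections are tabulated against both the instantaneous void fraction and a history parameter, then interpolated during the core calculation. The sketch below uses invented table values and a generic two-parameter lookup; it is not the APOLLO-2 methodology itself.

```python
# Hypothetical two-parameter cross-section lookup: a one-group macroscopic
# absorption cross section tabulated against instantaneous void fraction
# and a void-history parameter, interpolated at run time. All values are
# invented for illustration.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

inst_void = np.array([0.0, 0.4, 0.8])  # instantaneous void fraction grid
hist_void = np.array([0.0, 0.4, 0.8])  # void-history parameter grid
sigma_a = np.array([[0.0120, 0.0118, 0.0116],   # [1/cm], rows: inst_void
                    [0.0112, 0.0110, 0.0108],
                    [0.0104, 0.0102, 0.0100]])

lookup = RegularGridInterpolator((inst_void, hist_void), sigma_a)
print(lookup([[0.55, 0.30]]))  # cross section at 55% void, 30% void history
```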

  20. Low grade metamorphism of mafic rocks

    Science.gov (United States)

    Schiffman, Peter

    1995-07-01

    Through most of this past century, metamorphic petrologists in the United States have paid their greatest attention to high grade rocks, especially those which constitute the core zones of exhumed mountain belts. The pioneering studies of the 50's through the 80's, those which applied the principles of thermodynamics to metamorphic rocks, focused almost exclusively on high temperature systems, for which equilibrium processes could be demonstrated. By the 1980's, metamorphic petrologists had developed the methodologies for deciphering the thermal and baric histories of mountain belts through the study of high grade rocks. Of course, low grade metamorphic rocks - here defined as those which form at pressures and temperatures up to and including the greenschist facies - had been well known and described as well, initially through the efforts of Alpine and Circum-Pacific geologists who recognized that they constituted an integral and contiguous portion of mountain belts, and that they underlay large portions of accreted terranes, many of oceanic origins. But until the mid 80's, much of the effort in studying low grade rocks - for a comprehensive review of the literature to that point see Frey (1987) - had been concentrated on mudstones, volcanoclastic rocks, and associated lithologies common to continental mountain belts and arcs. In the mid 80's, results of the Deep Sea Drilling Project (DSDP) rather dramatically precipitated a shift in the study of low grade metamorphic rocks.

  1. A Review of Rock Bolt Monitoring Using Smart Sensors.

    Science.gov (United States)

    Song, Gangbing; Li, Weijie; Wang, Bo; Ho, Siu Chun Michael

    2017-04-05

    Rock bolts have been widely used as rock reinforcing members in underground coal mine roadways and tunnels. Failures of rock bolts occur as a result of overloading, corrosion, seismic bursts and bad grouting, leading to catastrophic economic and personnel losses. Monitoring the health condition of rock bolts plays an important role in ensuring the safe operation of underground mines. This work presents a brief introduction to the types of rock bolts, followed by a comprehensive review of rock bolt monitoring using smart sensors. Smart sensors that are used to assess rock bolt integrity are reviewed to provide a firm perception of their application for enhanced performance and reliability of rock bolts. The most widely used smart sensors for rock bolt monitoring are piezoelectric sensors and fiber optic sensors. The methodologies and principles of these smart sensors are reviewed from the point of view of rock bolt integrity monitoring. The applications of smart sensors in monitoring the critical status of rock bolts, such as the axial force, corrosion occurrence, grout quality and resin delamination, are highlighted. In addition, several prototypes or commercially available smart rock bolt devices are also introduced.

  2. Oxidation performance of graphite material in reactors

    Institute of Scientific and Technical Information of China (English)

    Xiaowei LUO; Xinli YU; Suyuan YU

    2008-01-01

    Graphite is used as a structural material and moderator for high temperature gas-cooled reactors (HTGR). When a reactor is in operation, graphite oxidation influences the safety and operation of the reactor because of the impurities in the coolant and/or accident conditions, such as water ingress and air ingress. In this paper, the graphite oxidation process is introduced, factors influencing graphite oxidation are analyzed and discussed, and some new directions for further study are pointed out.

  3. Macro mechanical parameters' size effect of surrounding rock of Shuibuya project's underground power station

    Institute of Scientific and Technical Information of China (English)

    GUO Zhi-hua; ZHOU Chuang-bing; ZHOU Huo-ming; SHENG Qian; LENG Xian-lun

    2005-01-01

    Scale effect is one of the important aspects in research on the macro-mechanical parameters of rock mass. From a new point of view, by means of laboratory and field rock mechanics tests, establishment of an E~Vp relation, classification of the engineering rock mass, numerical simulation tests, and back analysis based on the displacement monitoring results for the surrounding rock of Shuibuya Project's underground power station, the size effect of the rock mass deformation modulus of the surrounding rock was studied. It is shown that this scale effect is obvious: the stabilized rock mass deformation modulus is about 20% of that of the intact rock. Finally, the relation between the rock mass deformation modulus and the scale of investigation was established.
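
    As an illustration of establishing an E~Vp relation of the kind mentioned above, one can fit a power law E = a·Vp^b to paired modulus/velocity measurements; the data points below are invented.

```python
# Fit a power law E = a * Vp**b to paired deformation-modulus / P-wave
# velocity measurements (linear least squares in log-log space).
# The data points are invented for illustration.
import numpy as np

vp = np.array([2000.0, 3000.0, 4000.0, 5000.0])  # P-wave velocity [m/s]
e = np.array([2.0, 6.5, 15.0, 28.0])             # deformation modulus [GPa]

b, log_a = np.polyfit(np.log(vp), np.log(e), 1)  # slope, intercept
a = np.exp(log_a)
print(f"E ~ {a:.3e} * Vp^{b:.2f}")
```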

  4. Small Places, Big Stakes

    DEFF Research Database (Denmark)

    Garsten, Christina; Sörbom, Adrienne

    Ethnographic fieldwork in organizations – such as corporations, state agencies, and international organizations – often entails that the ethnographer has to rely to a large extent on meetings as the primary point of access. Oftentimes, this involves doing fieldwork in workshops, at ceremonies, an...

  5. Study on Splitting Failure Criterion of Surrounding Rock under Crack-Tip Yield Conditions

    Institute of Scientific and Technical Information of China (English)

    刘宁; 张春生

    2011-01-01

    A new fracture criterion is set up because linear elastic fracture mechanics is not applicable when a plastic zone develops at the crack tip. Considering the interaction among cracks, a sliding multi-crack model is adopted to simulate the splitting failure of rock under axial pressure. The plastic zone radius is calculated by the Mises yield criterion, and the energy dissipated by plastic deformation over the whole volume is also computed. The splitting failure criterion is then established from the principle of energy balance under small-scale yielding conditions. The criterion is applied to the excavation of the Langyashan pumped storage power station; the computed distribution of the splitting failure zone coincides with results obtained by the Lajtai empirical formula. This shows that the method is effective and feasible, and it provides a prediction criterion for splitting failure induced by underground excavation under high geostress.
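
    For orientation, a first-order estimate of the crack-tip plastic zone radius under small-scale yielding is the textbook Irwin expression; this is standard fracture mechanics background, not the paper's exact derivation, and the values below are invented.

```python
# Irwin's first-order plastic zone radius at a mode-I crack tip under
# plane stress: r_p = (1 / (2*pi)) * (K_I / sigma_y)**2.
# The numerical values are invented for illustration.
import math

def plastic_zone_radius(k_i_mpa_sqrt_m, sigma_y_mpa):
    """Plane-stress Irwin plastic zone radius [m]."""
    return (1.0 / (2.0 * math.pi)) * (k_i_mpa_sqrt_m / sigma_y_mpa) ** 2

# e.g. K_I = 2 MPa*sqrt(m) and sigma_y = 100 MPa for a hard rock:
print(plastic_zone_radius(2.0, 100.0))  # -> ~6.4e-5 m
```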

  6. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), complementing the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  7. Soft rocks in Argentina

    Institute of Scientific and Technical Information of China (English)

    Giambastiani; Mauricio

    2014-01-01

    Soft rocks are a still fairly unexplored chapter in rock mechanics. Within this category are clastic sedimentary rocks and pyroclastic volcanic rocks of low to moderate lithification (consolidation, cementation, newly formed minerals), and chemical sedimentary rocks and metamorphic rocks formed by minerals with Mohs hardness less than 3.5: limestone, gypsum, halite and sylvite among the former, and phyllites, graphitic schist, chloritic shale, talc, etc., among the latter. They also include any type of rock that has suffered alteration processes (hydrothermal or weathering). In Argentina the study of low-strength rocks has not received much attention despite their extensive outcrops in the Andes and their great impact on design criteria. Correlation of geomechanical properties (UCS, deformability) with physical indices (porosity, density, etc.) has shown promising results that deserve further study. There are many studies and engineering projects in Argentina in soft rock geological environments, some cited in the text (Chihuído dam, N. Kirchner dam, J. Cepernic Dam, etc.) and others, such as the International Tunnel in the Province of Mendoza (Corredor Bioceánico), which will require a valuable contribution from rock mechanics. The lack of consistency between some of the physical and mechanical parameters explored in studies in the country may be due to an insufficient amount of information and/or non-standardization of criteria for testing materials. It is understood that more and better academic and professional efforts in improving techniques will benefit the understanding of the geomechanics of weak rocks.

  8. Research of dynamic mechanical performance of cement rock

    Institute of Scientific and Technical Information of China (English)

    WANG Qiang; WANG Tong; WANG Xiang-lin

    2007-01-01

    As Daqing Oilfield is developing oil layers with big potential, the requirement for the quality of well cementation is higher than ever before. Cement rock is a brittle material containing a great number of microcracks and defects. In order to reduce the damage to the cement ring and improve the sealing property at the interface, it is necessary to conduct research on the modification of the cement rock available. According to the principle of super-mixed composite materials, various fillers are added to the ingredients of the cement rock; the dynamic fracture toughness of the cement rock changes under the influence of the filler. In order to study the damage mechanism of the cement ring during perforation and carry out comprehensive experiments on preventing and resisting channeling, comprehensive experimental equipment used to simulate perforation and multifunctional equipment for testing the dynamic properties of the material were designed. An experimental study of the dynamic mechanical performance of the original and some improved cement rocks, together with experiments simulating well cementation and perforation, was carried out. A standard for the dynamic mechanical performance of cement rock with good impact resistance, and the mechanical properties of some improved cement rocks, are also given.

  9. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  10. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  11. Reactor Physics Programme

    Energy Technology Data Exchange (ETDEWEB)

    De Raedt, C

    2000-07-01

    The Reactor Physics and MYRRHA Research Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control, and non-destructive analysis of reactor fuel. This expertise is applied within the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department, such as the reactor pressure vessel steel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments. Progress and achievements in 1999 in the following areas are reported on: (1) investigations on the use of military plutonium in commercial power reactors; (2) neutron and gamma calculations performed for BR-2 and for other reactors; (3) the updating of neutron and gamma cross-section libraries; (4) the implementation of reactor codes; (5) the management of the UNIX workstations; and (6) fuel cycle studies.

  12. CLOUD COMPUTING WITH BIG DATA: A REVIEW

    OpenAIRE

    Anjali; Er. Amandeep Kaur; Mrs. Shakshi

    2016-01-01

    Big data is a collection of huge quantities of data, and big data analytics is the process of examining such large amounts of data. Big data and cloud computing are hot issues in information technology, and big data is one of the main problems today. Researchers are focusing on how to handle huge amounts of data with cloud computing and how to achieve strong security for big data in the cloud. To handle the big data problem, the Hadoop framework is used, in which data is fragmented and processed in parallel....

  13. Big Data: Astronomical or Genomical?

    Science.gov (United States)

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  14. Big Data: Astronomical or Genomical?

    Directory of Open Access Journals (Sweden)

    Zachary D Stephens

    2015-07-01

    Full Text Available Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  15. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  16. Starch Big Bang!

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Big Bang, also known as the "Great Explosion", refers to the continuous expansion of the universe from a primordial state of extremely high density and temperature at the time of its birth. In other words, starting from the Big Bang, our present universe gradually took shape. OK, starting from this issue, "少电" will set off a Big Bang on Weibo: a "starch" (fan) explosion! How exactly will it explode? Having seen the layout of this page, I think you already understand most of it.

  17. Multiwavelength astronomy and big data

    Science.gov (United States)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage and analysis). Present astronomical databases and archives contain billions of objects observed at various wavelengths, both galactic and extragalactic, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for discovery of astronomical objects and accumulation of observational data for further analysis, interpretation, and achieving scientific results. We review the main characteristics of astronomical surveys, compare the photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss Big Data in astronomy and the related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to gain an overall understanding of the Universe, cosmic numbers, and their relationship to modern computational facilities.

  18. Big Data Analytics in Healthcare

    Directory of Open Access Journals (Sweden)

    Ashwin Belle

    2015-01-01

    Full Text Available The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  19. [Algorithms, machine intelligence, big data : general considerations].

    Science.gov (United States)

    Radermacher, F J

    2015-08-01

    We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, the performance and efficiency in the area of elementary arithmetic operations increases a thousand-fold every 20 years. Although we have not reached the point where, in the singularity sense, machines have become as "intelligent" as people, machines are becoming increasingly better. The Internet of Things has again helped to massively increase the efficiency of machines. Big data and suitable analytics do the same. If we let these processes simply continue, our civilization may be endangered in many instances. If the "containment" of these processes succeeds in the context of a reasonable political global governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point in time, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges.

  20. Analyzing Big Data with Dynamic Quantum Clustering

    CERN Document Server

    Weinstein, M; Hume, A; Sciau, Ph; Shaked, G; Hofstetter, R; Persi, E; Mehta, A; Horn, D

    2013-01-01

    How does one search for a needle in a multi-dimensional haystack without knowing what a needle is and without knowing if there is one in the haystack? This kind of problem requires a paradigm shift - away from hypothesis-driven searches of the data - towards a methodology that lets the data speak for itself. Dynamic Quantum Clustering (DQC) is such a methodology. DQC is a powerful visual method that works with big, high-dimensional data. It exploits variations of the density of the data (in feature space) and unearths subsets of the data that exhibit correlations among all the measured variables. The outcome of a DQC analysis is a movie that shows how and why sets of data points are eventually classified as members of simple clusters or as members of what we call extended structures. This allows DQC to be successfully used in a non-conventional exploratory mode where one searches data for unexpected information without the need to model the data. We show how this works for big, complex, real-world dataset...
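
    As background to the density-based idea, the sketch below shows the unnormalized Gaussian (Parzen-window) density estimate that quantum-clustering methods build on; it is not the authors' DQC algorithm itself.

```python
# Unnormalized Gaussian (Parzen-window) density estimate in feature space,
# the kind of density landscape that quantum-clustering methods analyze.
import numpy as np

def parzen_density(points, x, sigma=1.0):
    """Sum of Gaussian kernels centered on each data point, evaluated at x."""
    sq = np.sum((points - x) ** 2, axis=1)
    return float(np.sum(np.exp(-sq / (2.0 * sigma ** 2))))

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.5, (50, 2)),   # cluster around (0, 0)
                  rng.normal(3.0, 0.5, (50, 2))])  # cluster around (3, 3)
print(parzen_density(data, np.array([0.0, 0.0])))  # high: inside a cluster
print(parzen_density(data, np.array([1.5, 1.5])))  # lower: between clusters
```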

  1. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  2. Was There A Big Bang?

    CERN Document Server

    Soberman, Robert K

    2008-01-01

    The big bang hypothesis is widely accepted despite numerous physics conflicts. It rests upon two experimental supports, galactic red shift and the cosmic microwave background. Both are produced by dark matter, shown here to be hydrogen-dominated aggregates with a few percent of helium nodules. Scattering from these non-radiating intergalactic masses produces a red shift that normally correlates with distance. Warmed by our galaxy to an Eigenvalue of 2.735 K, drawn near the Earth, these bodies, kept cold by ablation, resonance-radiate the Planckian microwave signal. Several tests are proposed that will distinguish between this model and the big bang.

  3. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of these information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions of the effective use of the opportunities: proper infrastructure, well defined regulatory environment with particular emphasis on data protection and privacy. These issues and the current actions for solution are also presented.

  4. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is stimulating but problematic with regard to causality, atheism, and stereotypes about hunter-gatherers.

  5. Issues of Eco-agricultural Industrialization for Big Qinling Eco-city Cluster

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Firstly, the necessity of ecological agriculture development in Big Qinling Eco-city Cluster was discussed. Then, the condition endowment of eco-agricultural industrialization in Big Qinling Eco-city Cluster was analyzed from the aspects of basic conditions and differential endowment. Finally, specific forms and functional orientations of eco-agriculture were pointed out, and countermeasures for eco-agricultural industrialization in Big Qinling Eco-city Cluster were put forward: first, government guidance and media publicity should be strengthened; second, financial support for eco-agricultural industrialization should be enhanced; third, branding strategies for eco-agricultural products should be implemented as soon as possible.

  6. LMFBR type reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, Hiroto

    1995-02-07

    A reactor container of the present invention has a structure in which the entire container is at the same temperature as the reactor inlet; a hot pool is incorporated therein, and the container as a whole exhibits substantially uniform transient temperature follow-up behavior. Namely, if the temperature at the inlet of the reactor core changes, the temperature of the entire reactor container changes following this change, but no great temperature gradient arises in the axial direction and no great thermal stresses due to axial temperature distribution are caused. Suppressing the thermal stresses caused by the axial temperature distribution improves the reliability of the reactor container. In addition, since the laying of the reactor inlet pipelines over the inside of the reactor is eliminated, the reactor container is made compact, and the heat shielding structures above the reactor and the protection structure of the container walls are simplified. Further, secondary coolants are filled outside the reactor container to simplify the shieldings. The combined effects described above can improve economy and reliability. (N.H.).

  7. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

    During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed thi

  8. Little Science to Big Science: Big Scientists to Little Scientists?

    Science.gov (United States)

    Simonton, Dean Keith

    2010-01-01

    This article presents the author's response to Hisham B. Ghassib's essay entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Professor Ghassib's (2010) essay presents a provocative portrait of how the little science of the Babylonians, Greeks, and Arabs became the Big Science of the modern industrial…

  9. Automated rock mass characterisation using 3-D terrestrial laser scanning

    NARCIS (Netherlands)

    Slob, S.

    2010-01-01

    The research investigates the possibility of using point cloud data from 3-D terrestrial laser scanning as a basis to characterise discontinuities in exposed rock masses in an automated way. Examples of discontinuities in rock are bedding planes, joints, fractures and schistocity. The characterisati

  10. Light water reactor safety

    CERN Document Server

    Pershagen, B

    2013-01-01

    This book describes the principles and practices of reactor safety as applied to the design, regulation and operation of light water reactors, combining a historical approach with an up-to-date account of the safety, technology and operating experience of both pressurized water reactors and boiling water reactors. The introductory chapters set out the basic facts upon which the safety of light water reactors depend. The central section is devoted to the methods and results of safety analysis. The accidents at Three Mile Island and Chernobyl are reviewed and their implications for light wate

  11. Nuclear reactor physics

    CERN Document Server

    Stacey, Weston M

    2010-01-01

    Nuclear reactor physics is the core discipline of nuclear engineering. Nuclear reactors now account for a significant portion of the electrical power generated worldwide, and new power reactors with improved fuel cycles are being developed. At the same time, the past few decades have seen an ever-increasing number of industrial, medical, military, and research applications for nuclear reactors. The second edition of this successful comprehensive textbook and reference on basic and advanced nuclear reactor physics has been completely updated, revised and enlarged to include the latest developme

  12. Data Partitioning View of Mining Big Data

    OpenAIRE

    Zhang, Shichao

    2016-01-01

    There are two main approaches to mining big data in memory. One is to partition a big dataset into several subsets, so as to mine each subset in memory. In this way, global patterns can be obtained by synthesizing all the local patterns discovered from these subsets. The other is the statistical sampling method. This indicates that data partitioning should be an important strategy for mining big data. This paper recalls our work on mining big data with data partitioning and shows some interesti...
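
    A minimal sketch of the partition-then-synthesize strategy described above, with frequent items standing in for "patterns"; the threshold handling is deliberately simplified.

```python
# Partition-then-synthesize: mine each subset in memory, then merge the
# local patterns into global ones. Frequent single items stand in for
# "patterns"; note that an item frequent globally but nowhere locally
# frequent would be missed, which real systems counter with lower local
# thresholds.
from collections import Counter
from itertools import chain

def mine_local(subset, min_count):
    """Items reaching min_count within one partition (local patterns)."""
    counts = Counter(chain.from_iterable(subset))
    return {item: c for item, c in counts.items() if c >= min_count}

def synthesize(local_patterns):
    """Merge local counts into global pattern counts."""
    total = Counter()
    for patterns in local_patterns:
        total.update(patterns)
    return total

partitions = [
    [["a", "b"], ["a", "c"], ["a", "b"]],
    [["b", "c"], ["a", "b"], ["b"]],
]
local = [mine_local(p, min_count=2) for p in partitions]
print(synthesize(local))  # -> Counter({'b': 5, 'a': 3})
```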

  13. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... comprehensive conservation plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge...: r3planning@fws.gov . Include ``Big Stone Draft CCP/ EA'' in the subject line of the message. Fax:...

  14. Research on the usage of a deep sea fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Otsubo, Akira; Kowata, Yasuki [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1997-09-01

    Many new types of fast reactors have been studied at PNC. A deep sea fast reactor has the highest probability of realization among the reactors studied, because its development is desired by many specialists in oceanography, meteorology, deep-sea-bottom oil fields, seismology and so on, and because its development does not require a big budget and few technical problems remain to be solved. This report explains the outline and the usage of 40 kWe and 200 to 400 kWe versions of the reactor. The reactor can be used as a power source at an unmanned base for long-term climate prediction and earth science, or at an oil production base in a deep sea region. It can also supply heat and electric power to a laboratory in the polar regions, and in the future it could be used in space. At present, the large FBR development plan is not proceeding successfully and the target date for realization of FBR has slipped later and later. We think that it is most important to develop this reactor as fast as possible and to root fast reactor technology in our present society. (author)

  15. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with the large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, and to provide a real image of supply and demand, thereby generating market advantages. Companies that turn to Big Data thus have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  16. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  17. The Big European Bubble Chamber

    CERN Multimedia

    1977-01-01

    The 3.70 metre Big European Bubble Chamber (BEBC) was dismantled on 9 August 1984. During operation it was one of the biggest detectors in the world, producing direct visual recordings of particle tracks. 6.3 million photos of interactions were taken with the chamber in the course of its existence.

  18. YOUNG CITY,BIG PARTY

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The Shenzhen Universiade united the world's young people through sports. With none of the usual hoopla, no fireworks, and no grand performances by celebrities and superstars, the Shenzhen Summer Universiade lowered the curtain on a big party for youth and college students on August 23.

  19. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  1. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  2. Spinning fluids reactor

    Science.gov (United States)

    Miller, Jan D; Hupka, Jan; Aranowski, Robert

    2012-11-20

    A spinning fluids reactor includes a reactor body (24) having a circular cross-section and a fluid contactor screen (26) within the reactor body (24). The fluid contactor screen (26) has a plurality of apertures and a circular cross-section concentric with the reactor body (24) for a length, thus forming an inner volume (28) bound by the fluid contactor screen (26) and an outer volume (30) bound by the reactor body (24) and the fluid contactor screen (26). A primary inlet (20) can be operatively connected to the reactor body (24) and can be configured to produce a flow-through first spinning flow of a first fluid within the inner volume (28). A secondary inlet (22) can similarly be operatively connected to the reactor body (24) and can be configured to produce a second flow of a second fluid within the outer volume (30), which is optionally spinning.

  3. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  4. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  5. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the 'Big Data Revolution…

  6. Big sagebrush seed bank densities following wildfires

    Science.gov (United States)

    Big sagebrush (Artemisia spp.) is a critical shrub for many wildlife species including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires, and big sagebrush seed is generally short-lived and does not s...

  7. A survey of big data research

    OpenAIRE

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions.

  8. How do we identify big rivers? And how big is big?

    Science.gov (United States)

    Miall, Andrew D.

    2006-04-01

    "Big rivers" are the trunk rivers that carry the water and sediment load from major orogens, or that drain large areas of a continent. Identifying such rivers in the ancient record is a challenge. Some guidance may be provided by tectonic setting and sedimentological evidence, including the scale of architectural elements, and clues from provenance studies, but such data are not infallible guides to river magnitude. The scale of depositional elements is the most obvious clue to channel size, but evidence is typically sparse and inadequate, and may be misleading. For example, thick fining-upward successions may be tectonic cyclothems. Two examples of the analysis of large ancient river systems are discussed here in order to highlight problems of methodology and interpretation. The Hawkesbury Sandstone (Triassic) of the Sydney Basin, Australia, is commonly cited as the deposit of a large river, on the basis of abundant very large-scale crossbedding. An examination of very large outcrops of this unit, including a coastal cliff section 6 km long near Sydney, showed that even with 100% exposure there are ambiguities in the determination of channel scale. It was concluded in this case that the channel dimensions of the Hawkesbury rivers were about half the size of the modern Brahmaputra River. The tectonic setting of a major ancient fluvial system is commonly not a useful clue to river scale. The Hawkesbury Sandstone is a system draining transversely from a cratonic source into a foreland basin, whereas most large rivers in foreland basins flow axially and are derived mainly from the orogenic uplifts (e.g., the large tidally influenced rivers of the Athabasca Oil Sands, Alberta). Epeirogenic tilting of a continent by the dynamic topography process may generate drainages in unexpected directions. For example, analyses of detrital zircons in Upper Paleozoic-Mesozoic nonmarine successions in the SW United States suggests significant derivation from the Appalachian orogen

  9. The Rock in Life

    Institute of Scientific and Technical Information of China (English)

    卢爱龙

    2016-01-01

    One day, a teacher was speaking to a group of students. He told them something that they would never forget. As he stood in front of the group of students, he said, "OK, time for a quiz." He took out a big jar and put it on the table in front of him.

  10. PEOPLE & POINTS

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Rocking the 'Cross-Strait' Boat Just as people thought that cross-strait tensions couldn't get any more testy amid Taiwan leader Chen Shui-bian's efforts to hinder the development of cross-strait ties between the mainland and the island, they did when Chen stumbled upon a new secession drive. Chen announced February 27 his decision to terminate the "National Unification Council" and scrap the…

  11. Digital Rock Studies of Tight Porous Media

    Energy Technology Data Exchange (ETDEWEB)

    Silin, Dmitriy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-08-07

    This technical report summarizes some recently developed approaches to studying rock properties at the pore scale. The digital rock approach is complementary to laboratory and field studies. It can be especially helpful in situations where experimental data are uncertain, or are difficult or impossible to obtain. Digitized binary images of the pore geometries of natural rocks obtained by different imaging techniques are the input data. Computer-generated models of natural rocks can be used instead of images in cases where microtomography data are unavailable, or where the resolution of the tools is insufficient to adequately characterize the features of interest. Simulations of creeping viscous flow in pores produce estimates of Darcy permeability. Maximal Inscribed Spheres calculations estimate the two-phase fluid distribution in capillary equilibrium. A combination of both produces relative permeability curves. Computer-generated rock models were employed to study the two-phase properties of fractured rocks, or tight sands with slit-like pores too narrow to be characterized with micro-tomography. Various scenarios can simulate different fluid displacement mechanisms, from piston-like drainage to liquid dropout at the dew point. A finite-difference discretization of the Stokes equation was developed to simulate flow in the pore space of natural rocks. The numerical schemes are capable of handling both no-slip and slippage flows. An upscaling procedure estimates the permeability by subsampling a large data set. Capillary equilibrium and capillary pressure curves are efficiently estimated with the method of maximal inscribed spheres for an arbitrary contact angle. The algorithms can handle gigabytes of data on a desktop workstation. Customized QuickHull algorithms are used to model natural rocks. Capillary pressure curves evaluated from computer-generated images mimic those obtained from microtomography data.

  12. Heat pipe reactors for space power applications

    Science.gov (United States)

    Koenig, D. R.; Ranken, W. A.; Salmi, E. W.

    1977-01-01

    A family of heat-pipe reactor design concepts has been developed to provide heat to a variety of electrical conversion systems. Three power plants are described that span the power range 1-500 kWe and operate in the temperature range 1200-1700 K. The reactors are fast, compact, heat-pipe-cooled, high-temperature nuclear reactors fueled with fully enriched refractory fuels, UC-ZrC or UO2. Each fuel element is cooled by an axially located molybdenum heat pipe containing either sodium or lithium vapor. Virtues of the reactor designs are the avoidance of single-point failure mechanisms, the relatively high operating temperature, and the expected long lifetimes of the fuel element components.

  13. µ-reactors for Heterogeneous Catalysis

    DEFF Research Database (Denmark)

    Jensen, Robert

    This thesis is the summary of my work on the µ-reactor platform. The concept of µ-reactors is presented and some of the experimental challenges are outlined. The various experimental issues regarding the platform are discussed and the actual implementation of three generations of the setup… The catalyst surface area is measured by reacting off an adsorbed layer of oxygen with CO; this procedure can be performed at temperatures low enough that sintering of Pt nanoparticles is not an issue. Some results from the reactors are presented, in particular an unexpected oscillation phenomenon of CO oxidation on Pt nanoparticles is presented in detail. The sensitivity of the reactors is currently being investigated with CO oxidation on Pt thin films as a test reaction, and the results so far are presented. We have at this point shown that we are able to reach full conversion with a catalyst area of 38 µm² with a turn…

  14. Big science transformed science, politics and organization in Europe and the United States

    CERN Document Server

    Hallonsten, Olof

    2016-01-01

    This book analyses the emergence of a transformed Big Science in Europe and the United States, using both historical and sociological perspectives. It shows how technology-intensive natural sciences grew to a prominent position in Western societies during the post-World War II era, and how their development cohered with both technological and social developments. At the helm of post-war science are large-scale projects, primarily in physics, which receive substantial funds from the public purse. Big Science Transformed shows how these projects, popularly called 'Big Science', have become symbols of progress. It analyses changes to the political and sociological frameworks surrounding publicly-funded science, and their impact on a number of new accelerator and reactor-based facilities that have come to prominence in materials science and the life sciences. Interdisciplinary in scope, this book will be of great interest to historians, sociologists and philosophers of science.

  15. Pop & rock / Berk Vaher

    Index Scriptorium Estoniae

    Vaher, Berk, 1975-

    2001-01-01

    Brief introductions to the new albums: Redman "Malpractice", Brian Eno & Peter Schwalm "Popstars", Clawfinger "A Whole Lot of Nothing", Dario G "In Full Color", and MLTR (Michael Learns To Rock) "Blue Night".

  16. Art on Rock

    Institute of Scientific and Technical Information of China (English)

    HU YUE

    2010-01-01

    With sprawling deserts and serene lakes, the natural wonders of Ningxia Hui Autonomous Region have never failed to take the breath away from visitors. The area has another major attraction, though: the Helan Mountain rock engravings.

  18. Rock kinoekraanil / Katrin Rajasaare

    Index Scriptorium Estoniae

    Rajasaare, Katrin

    2008-01-01

    On the films portraying rock musicians screened during the film week "Rock On Screen", held at the Sõprus cinema from July 7 to 11: "Lou Reed's Berlin", "The Future Is Unwritten: Joe Strummer", "Control: Joy Division", "Hurriganes" and "Shlaager".

  19. Days of Rock

    Institute of Scientific and Technical Information of China (English)

    YUAN

    2004-01-01

    From last October 1st to 3rd, at the foot of Fragrant Hill, a suburban Beijing resort famous for its flaming maple leaves in autumn, more than 20,000 rock fans indulged themselves in music for three days.

  20. Writing Rock Music Reviews.

    Science.gov (United States)

    Brown, Donal

    1980-01-01

    Suggests ways student reviewers of rock music groups can write better reviews. Among the suggestions made are that reviewers occasionally discuss the audience or what makes a particular group unique, support general comment with detail, and avoid ecstatic adjectives. (TJ)

  1. Age and gender might influence big five factors of personality: a preliminary report in Indian population.

    Science.gov (United States)

    Magan, Dipti; Mehta, Manju; Sarvottam, Kumar; Yadav, Raj Kumar; Pandey, R M

    2014-01-01

    Age and gender are two important physiological variables which might influence the personality of an individual. The influence of age and gender on big five personality domains in an Indian population was assessed in this cross-sectional study, which included 155 subjects (female = 76, male = 79) aged 16-75 years. Big five personality factors were evaluated using the 60-item NEO-Five Factor Inventory (NEO-FFI) at a single point in time. Among the big five factors of personality, Conscientiousness was positively correlated with age (r = 0.195; P …). The findings suggest that personality traits might change with age and are gender-dependent.

  2. [Keeping of bears and big cats in the zoo and circus].

    Science.gov (United States)

    Rietschel, W

    2002-03-01

    The exhibition of bears and big cats in zoos and circuses regularly attracts criticism, justified and unjustified, from people engaged in the prevention of cruelty to animals. The main points of critique are the holding conditions, feeding and health status of the animals. The official veterinarian involved in the supervision often needs the cooperation of a specialised zoo veterinarian. In most cases the clinical examination of bears and big cats requires immobilisation. This article addresses some of the most common holding problems and diseases of big carnivores in zoos and circuses.

  3. Rock avalanches on glaciers

    OpenAIRE

    Shugar, Daniel

    2011-01-01

    This thesis examines relations between rock avalanches and the glaciers on which they are deposited. I have attempted to understand a geophysical phenomenon from two viewpoints: sedimentology and glaciology. The contributions are both methodological and practical. I have used a GIS to quantify debris sheet geomorphology. A thorough characterization of rock avalanche debris is a necessary step in understanding the flow mechanics of large landslides. I have also developed a technique for solvin...

  4. Reactor Vessel Surveillance Program for Advanced Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kyeong-Hoon; Kim, Tae-Wan; Lee, Gyu-Mahn; Kim, Jong-Wook; Park, Keun-Bae; Kim, Keung-Koo

    2008-10-15

    This report provides the design requirements of a reactor vessel surveillance program for an integral type reactor in accordance with the requirements of Korean MEST (Ministry of Education, Science and Technology) Notice 2008-18. This report covers the requirements for the design of surveillance capsule assemblies including their test specimens, test block materials, handling tools, and monitors of the surveillance capsule neutron fluence and temperature. In addition, this report provides design requirements for the program for irradiation surveillance of reactor vessel materials, a layout of specimens and monitors in the surveillance capsule, procedures for installation and retrieval of the surveillance capsule assemblies, and the layout of the surveillance capsule assemblies in the reactor.

  5. Weathering of rock 'Ginger'

    Science.gov (United States)

    1997-01-01

    One of the more unusual rocks at the site is Ginger, located southeast of the lander. Parts of it have the reddest color of any material in view, whereas its rounded lobes are gray and relatively unweathered. These color differences are brought out in the inset, enhanced at the upper right. In the false color image at the lower right, the shape of the visible-wavelength spectrum (related to the abundance of weathered ferric iron minerals) is indicated by the hue of the rocks. Blue indicates relatively unweathered rocks. Typical soils and drift, which are heavily weathered, are shown in green and flesh tones. The very red color in the creases in the rock surface correspond to a crust of ferric minerals. The origin of the rock is uncertain; the ferric crust may have grown underneath the rock, or it may cement pebbles together into a conglomerate. Ginger will be a target of future super-resolution studies to better constrain its origin.Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator. JPL is an operating division of the California Institute of Technology (Caltech).

  6. Determination of chlorine in silicate rocks

    Science.gov (United States)

    Peck, L.C.

    1959-01-01

    In a rapid, accurate method for the determination of chlorine in silicate rocks, the rock powder is sintered with a sodium carbonate flux containing zinc oxide and magnesium carbonate. The sinter cake is leached with water, the resulting solution is filtered, and the filtrate is acidified with nitric acid. Chlorine is determined by titrating this solution with mercuric nitrate solution using sodium nitroprusside as the indicator. The titration is made in the dark with a beam of light shining through the solution. The end point of the titration is found by visually comparing the intensity of this beam with that of a similar beam in a reference solution.

  7. A reduced-boron OPR1000 core based on the BigT burnable absorber

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Hwan Yeal; Yahya, Mohd-Syukri; Kim, Yong Hee [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2016-04-15

    Reducing the critical boron concentration in a commercial pressurized water reactor core offers many advantages in view of safety and economics. This paper presents a preliminary investigation of a reduced-boron pressurized water reactor core that achieves a clearly negative moderator temperature coefficient at hot zero power using the newly-proposed 'Burnable absorber-Integrated Guide Thimble' (BigT) absorbers. The reference core is based on a commercial OPR1000 equilibrium configuration. The reduced-boron OPR1000 configuration was determined by simply replacing commercial gadolinia-based burnable absorbers with the optimized BigT-loaded design. The equilibrium cores in this study were searched directly via repetitive Monte Carlo depletion calculations until convergence. The results demonstrate that, with the same fuel management scheme as in the reference core, application of the BigT absorbers can effectively reduce the critical boron concentration at the beginning of cycle by about 65 ppm. More crucially, the analyses indicate the promising potential of the reduced-boron OPR1000 core with the BigT absorbers, as its moderator temperature coefficient at the beginning of cycle is clearly more negative and all other vital neutronic parameters are within practical safety limits. All simulations were completed using the Monte Carlo Serpent code with the ENDF/B-VII.0 library.

  8. Power distribution control of CANDU reactors based on modal representation of reactor kinetics

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Lingzhi, E-mail: lxia4@uwo.ca [Department of Electrical and Computer Engineering, The University of Western Ontario, London, Ontario N6A 5B9 (Canada); Jiang, Jin, E-mail: jjiang@eng.uwo.ca [Department of Electrical and Computer Engineering, The University of Western Ontario, London, Ontario N6A 5B9 (Canada); Luxat, John C., E-mail: luxatj@mcmaster.ca [Department of Engineering Physics, McMaster University, Hamilton, Ontario L8S 4L7 (Canada)

    2014-10-15

    Highlights: • Linearization of the modal synthesis model of the neutronic kinetics equations for CANDU reactors. • Validation of the linearized dynamic model through closed-loop simulations using the reactor regulating system. • Design of an LQR state feedback controller for CANDU core power distribution control. • Comparison of the results of this new controller against those of the conventional reactor regulating system. - Abstract: A modal synthesis representation of a neutronic kinetics model for a CANDU reactor core has been utilized in the analysis and synthesis of reactor control systems. Among all the mode shapes, the fundamental mode of the power distribution, which also coincides with the desired reactor power distribution during operation, is used in the control system design. The nonlinear modal models are linearized around desired operating points. Based on the linearized model, a linear quadratic regulator (LQR) approach is used to synthesize a state feedback controller. The performance of this controller has been evaluated using the original nonlinear models under load-following conditions. It has been demonstrated that the proposed reactor control system can produce a more uniform power distribution than the traditional reactor regulating system (RRS); in particular, it is more effective in compensating for xenon-induced transients.

  9. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  10. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions about their strategies. It has been said, and proved by case studies, that "more data usually beats better algorithms". With this in mind, companies have started to realize that they can choose to invest in processing larger sets of data rather than in expensive algorithms. A large quantity of data is better used as a whole because of the possible correlations across a larger amount, correlations that can never be found if the data is analyzed in separate sets or in a smaller set. A larger amount of data gives better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  11. Solution of a Braneworld Big Crunch/Big Bang Cosmology

    CERN Document Server

    McFadden, P; Turok, N G; Fadden, Paul Mc; Steinhardt, Paul J.; Turok, Neil

    2005-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly-separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  12. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  13. Preliminary Study on weathering and pedogenesis of carbonate rock

    Institute of Scientific and Technical Information of China (English)

    王世杰; 季宏兵; 欧阳自远; 周德全; 郑乐平; 黎廷宇

    1999-01-01

    South China is the largest area of continuously distributed carbonate rock in the world. The origin of the soils over the carbonate bedrock has long been a controversial topic. Here further exploration is made by taking as examples five soil profiles developed over dolomitite and limestone bedrock, morphologically located on uplands in karst terrain in central, west and north Guizhou as well as west Hunan, and proved to be weathering profiles of carbonate rock by the results of acid-dissolution extraction experiments on the bedrock, mineralogy and trace element geochemistry. Field, mineralogical and trace element geochemical characteristics of the weathering and pedogenesis of carbonate rock are discussed in detail. It is pointed out that weathering and pedogenesis of carbonate rock are important pedogenetic mechanisms for soil resources in karst areas, providing a basis for further research on the origin of the soils widely overlying carbonate bedrock in South China.

  14. The Obstacles in Big Data Process

    Directory of Open Access Journals (Sweden)

    Rasim M. Alguliyev

    2017-04-01

    Full Text Available The increasing amount of data, and the need to analyze it in a timely manner for multiple purposes, has created a serious barrier in the big data analysis process. This article describes the challenges that big data creates at each step of the big data analysis process. These include typical analytical problems as well as less common challenges that are specific to big data. The article breaks down the problems for each step of the big data analysis process and discusses them separately at each stage. It also offers some simple ways to solve these problems.

  15. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    Directory of Open Access Journals (Sweden)

    Jaseena K.U.

    2014-12-01

    Full Text Available Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This paper presents a literature review of Big Data mining and of the associated issues and challenges, with emphasis on the distinguishing features of Big Data. It also discusses some methods for dealing with big data.

  16. Possible triggering of solar activity to big earthquakes (Ms ≥ 8) in faults with near west-east strike in China

    Institute of Scientific and Technical Information of China (English)

    HAN; Yanben; GUO; Zengjian; WU; Jinbing; MA; Lihua

    2004-01-01

    This paper studies the relationship between solar activity and big earthquakes (Ms ≥ 8) that occurred in China and western Mongolia. It is found that the occurrence dates of most of the big earthquakes in and near faults with a west-east strike are close to the maximum years of sunspot numbers, whereas the dates of some big earthquakes not located in such faults are not close to the maximum years. We consider that this is possibly because many magnetic storms appear in the maximum years of solar activity. The magnetic storms produce anomalies of the geomagnetic field and then induce eddy currents in earthquake-gestating faults with a near west-east strike. The gestated big earthquakes perhaps occur more easily because the eddy currents heat the rocks in the faults and therefore decrease the shear strength and the static friction limit of the rocks.

  17. Kinetic study of treatment of wastewater contains food preservative agent by anaerobic baffled reactor : An overview

    Science.gov (United States)

    Sumantri, Indro; Purwanto, Budiyono

    2015-12-01

    The characteristic wastewater of food industries using preservative substances has a high content of degradable organic matter and high total suspended solids. The high organic content of this waste means the treatment must be biological, and points toward anaerobic treatment. Anaerobic processes show better degradation performance than aerobic ones for high organic loads and also for toxic materials. To date, food-industry wastewater has typically been treated aerobically, which requires a high energy consumption and produces a high volume of sludge. The advantages of anaerobic treatment are large energy savings, less sludge production, a smaller nutrient requirement for the microorganisms, and a high reduction efficiency for the organic load. A high reduction efficiency lowers the load on further treatment, so that the threshold limits set by regulation are easier to achieve. This research on the treatment of food-industry wastewater is relevant to both big and small industries that use preservative substances. The reactor type for the anaerobic process is an anaerobic baffled reactor, which gives better contact between the wastewater and the microorganisms in the sludge. The variables studied in this research are the baffle configuration, sludge height, preservative agent content, hydraulic retention time and the influence of micronutrients. The responses are the effluent COD, the remaining preservative agent, pH, the formation of volatile fatty acids and total suspended solids. The results of this research are a kinetic model of the anaerobic baffled reactor, the reaction kinetics of preservative agent degradation, and a treatment technology for wastewater containing preservative agents. The benefit of this research is to solve the treatment of food-industry wastewater containing preservative substances, in order to meet wastewater regulations and prevent environmental deterioration.

  19. Borehole radar response characteristics of point unfavorable geo-bodies: forward simulation of surrounding rock and filling effects

    Institute of Scientific and Technical Information of China (English)

    钟声; 王川婴; 吴立新; 唐新建; 王清远

    2012-01-01

    Borehole radar is a borehole geophysical method for obtaining high-resolution information about the deep underground environment. For the point unfavorable geo-bodies commonly encountered in borehole radar exploration, such as cavities, karst caves and buried objects, the response of the tool to point unfavorable geo-bodies under different surrounding rock and filling conditions is investigated by forward simulation with the finite difference time domain (FDTD) method, and the influence of the surrounding rock and filling conditions on the response characteristics of borehole radar is analyzed. The results indicate that the relative values of the dielectric constants of the surrounding rock and the filling substance determine the contrast of the radar reflection profile; when this relative value increases, it is easier to ascertain the subsurface media distribution. A high-conductivity surrounding rock, however, attenuates most of the radar signal, making it almost impossible to detect cavities with the single-borehole reflection method. From the different borehole radar cross-sections and their response characteristics, the filling condition of the cavities can be qualitatively distinguished.

  20. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  1. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO₂ utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts were underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification…

  2. Geochemical and petrographic data for intrusions peripheral to the Big Timber Stock, Crazy Mountains, Montana

    Science.gov (United States)

    du Bray, Edward A.; Van Gosen, Bradley S.

    2015-01-01

    The Paleocene Fort Union Formation hosts a compositionally diverse array of Eocene plugs, dikes, and sills arrayed around the Eocene Big Timber stock in the Crazy Mountains of south-central Montana. The geochemistry and petrography of the sills have not previously been characterized or interpreted. The purpose of this report is (1) to present available geochemical and petrographic data for several dozen samples of these rocks and (2) to provide a basic interpretive synthesis of these data.

  3. SNTP program reactor design

    Science.gov (United States)

    Walton, Lewis A.; Sapyta, Joseph J.

    1993-06-01

    The Space Nuclear Thermal Propulsion (SNTP) program is evaluating the feasibility of a particle bed reactor for a high-performance nuclear thermal rocket engine. Reactors operating between 500 MW and 2,000 MW will produce engine thrusts ranging from 20,000 pounds to 80,000 pounds. The optimum reactor arrangement depends on the power level desired and the intended application. The key components of the reactor have been developed and are being tested. Flow-to-power matching considerations dominate the thermal-hydraulic design of the reactor. Optimal propellant management during decay heat cooling requires a three-pronged approach. Adequate computational methods exist to perform the neutronics analysis of the reactor core. These methods have been benchmarked to critical experiment data.

  4. Fast Spectrum Reactors

    CERN Document Server

    Todd, Donald; Tsvetkov, Pavel

    2012-01-01

    Fast Spectrum Reactors presents a detailed overview of world-wide technology contributing to the development of fast spectrum reactors. With a unique focus on the capabilities of fast spectrum reactors to address nuclear waste transmutation issues, in addition to the well-known capabilities of breeding new fuel, this volume describes how fast spectrum reactors contribute to the wide application of nuclear power systems to serve the global nuclear renaissance while minimizing nuclear proliferation concerns. Readers will find an introduction to the sustainable development of nuclear energy and the role of fast reactors, in addition to an economic analysis of nuclear reactors. A section devoted to neutronics offers the current trends in nuclear design, such as performance parameters and the optimization of advanced power systems. The latest findings on fuel management, partitioning and transmutation include the physics, efficiency and strategies of transmutation, homogeneous and heterogeneous recycling, in addit...

  5. Hybrid reactors. [Fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    Moir, R.W.

    1980-09-09

    The rationale for hybrid fusion-fission reactors is the production of fissile fuel for fission reactors. A new class of reactor, the fission-suppressed hybrid, promises unusually good safety features as well as the ability to support 25 light-water reactors of the same nuclear power rating, or even more high-conversion-ratio reactors such as the heavy-water type. One 4000-MW nuclear hybrid can produce 7200 kg of ²³³U per year. To obtain good economics, injector efficiency times plasma gain (η_i Q) should be greater than 2, the wall load should be greater than 1 MW·m⁻², and the hybrid should cost less than 6 times the cost of a light-water reactor. Introduction rates for the fission-suppressed hybrid are usually rapid.

  6. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  7. Big Numbers in String Theory

    CERN Document Server

    Schellekens, A N

    2016-01-01

    This paper contains some personal reflections on several computational contributions to what is now known as the "String Theory Landscape". It consists of two parts. The first part concerns the origin of big numbers, and especially the number $10^{1500}$ that appeared in work on the covariant lattice construction (with W. Lerche and D. Luest). This part contains some new results. I correct a huge but inconsequential error, discuss some more accurate estimates, and compare with the counting for free fermion constructions. In particular I prove that the latter only provide an exponentially small fraction of all even self-dual lattices for large lattice dimensions. The second part of the paper concerns dealing with big numbers, and contains some lessons learned from various vacuum scanning projects.

  8. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  9. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research.

  10. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  11. New reactor concepts; Nieuwe rectorconcepten - nouveaux reacteurs nucleaires

    Energy Technology Data Exchange (ETDEWEB)

    Meskens, G.; Govaerts, P.; Baugnet, J.-M.; Delbrassine, A

    1998-11-01

    The document gives a summary of new nuclear reactor concepts from a technological point of view. Belgium supports the development of the European Pressurized-Water Reactor, which is an evolutionary concept based on the European experience in Pressurized-Water Reactors. A reorientation of the Belgian choice for this evolutionary concept may be required in case that a decision is taken to burn plutonium, when the need for flexible nuclear power plants arises or when new reactor concepts can demonstrate proved benefits in terms of safety and cost.

  12. Statistical Inference: The Big Picture.

    Science.gov (United States)

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  13. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  14. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Baltay, S Bailey C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yeche, X Yang Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  15. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  16. Multi purpose research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Raina, V.K. [Research Reactor Design and Projects Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)]. E-mail: vkrain@magnum.barc.ernet.in; Sasidharan, K. [Research Reactor Design and Projects Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Sengupta, Samiran [Research Reactor Design and Projects Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Singh, Tej [Research Reactor Services Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)

    2006-04-15

    At present Dhruva and Cirus reactors provide the majority of research reactor based facilities to cater to the various needs of a vast pool of researchers in the field of material sciences, physics, chemistry, bio sciences, research and development work for nuclear power plants and production of radio isotopes. With a view to further consolidate and expand the scope of research and development in nuclear and allied sciences, a new 20 MWt multi purpose research reactor is being designed. This paper describes some of the design features and safety aspects of this reactor.

  17. INVAP's Research Reactor Designs

    Directory of Open Access Journals (Sweden)

    Eduardo Villarino

    2011-01-01

    INVAP, an Argentine company founded more than three decades ago, is today recognized as one of the leaders within the research reactor industry. INVAP has participated in several projects covering a wide range of facilities, designed in accordance with the requirements of our different clients. For complying with these requirements, INVAP developed special skills and capabilities to deal with different fuel assemblies, different core cooling systems, and different reactor layouts. This paper summarizes the general features and utilization of several INVAP research reactor designs, from subcritical and critical assemblies to high-power reactors.

  18. LMFBR type reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kanbe, Mitsuru

    1997-04-04

    An LMFBR type reactor comprises a plurality of reactor cores in a reactor container. A plurality of pot-containing vessels are disposed in the reactor vessel, and a plurality of reactor cores are formed by inserting an integrated-type fuel assembly into each pot; a coolant pipeline is connected to each pot-containing vessel to cool each reactor core. Coolants are supplied to each pot-containing vessel connected with the coolant pipeline and circulate while cooling the integrated-type fuel assembly in every pot. When fuels are exchanged, the integrated-type fuel assembly is taken out of the reactor vessel together with the pot, remaining immersed in the coolant in the pot, by lifting the pot from the pot-containing vessel. Neutron economy is thereby improved, which improves reactor power and the breeding ratio. (N.H.)

  19. Assessing Pretreatment Reactor Scaling Through Empirical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lischeske, James J.; Crawford, Nathan C.; Kuhn, Erik; Nagle, Nicholas J.; Schell, Daniel J.; Tucker, Melvin P.; McMillan, James D.; Wolfrum, Edward J.

    2016-12-01

    within the near-optimal space for total sugar yield for the LHR. This indicates that the ASE is a good tool for cost effectively finding near-optimal conditions for operating pilot-scale systems, which may be used as starting points for further optimization. Additionally, using a severity-factor approach to optimization was found to be inadequate compared to a multivariate optimization method. Finally, the ASE and the LHR were able to enable significantly higher total sugar yields after enzymatic hydrolysis relative to the ZCR, despite having similar optimal conditions and total xylose yields. This underscores the importance of incorporating mechanical disruption into pretreatment reactor designs to achieve high enzymatic digestibilities.

  20. Numerical simulations of subcritical reactor kinetics in thermal hydraulic transient phases

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, J.; Park, W. S. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A subcritical reactor driven by a linear proton accelerator has been considered as a nuclear waste incinerator at Korea Atomic Energy Research Institute (KAERI). Since the multiplication factor of a subcritical reactor is less than unity, external neutrons from spallation reactions are essential for operating the reactor in its steady state, to compensate for the exponentially decreasing fission neutrons. Furthermore, the profile of accelerator beam currents is very important in controlling a subcritical reactor, because the reactor power varies in accordance with the profile of external neutrons. We have developed a code system to find numerical solutions of the reactor kinetics equations, which are the simplest dynamic model for controlling reactors. In the course of our previous numerical study of point kinetics equations for critical reactors, however, we learned that the same code system can be used to study the dynamic behavior of the subcritical reactor. The major motivation of this paper is to investigate the responses of subcritical reactors to small changes in thermal hydraulic parameters. Building a thermal hydraulic model for the subcritical reactor dynamics, we performed numerical simulations of the dynamic responses of the reactor based on point kinetics equations with a source term. Linearizing the set of coupled differential equations for reactor responses, we focus our research interest on dynamic responses of the reactor to variations of the thermal hydraulic parameters in transient phases. 5 refs., 8 figs. (Author)
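
    The source-driven point kinetics model this record describes can be sketched in a few lines. Below is a minimal illustration with one effective delayed-neutron group; the parameter values (beta, lam, Lam, reactivity rho, and source strength S) are placeholders chosen for demonstration, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative one-group point kinetics constants (placeholders, not the paper's values)
beta, lam, Lam = 0.0065, 0.08, 1.0e-5   # delayed fraction, decay const (1/s), generation time (s)
rho = -0.01                              # subcritical reactivity (dk/k)
S = 1.0e5                                # external spallation-source strength (arbitrary units/s)

def kinetics(t, y):
    n, c = y                             # neutron density, delayed-precursor density
    dn = (rho - beta) / Lam * n + lam * c + S
    dc = beta / Lam * n - lam * c
    return [dn, dc]

# Source-driven steady state of the one-group model: n* = -S * Lam / rho
n0 = -S * Lam / rho
sol = solve_ivp(kinetics, (0.0, 50.0), [n0, beta * n0 / (lam * Lam)], max_step=0.1)
print(f"neutron density at t = 50 s: {sol.y[0, -1]:.3e} (steady state {n0:.3e})")
```

    Perturbing rho or S in this toy model reproduces the qualitative behavior the paper studies: the power settles to a new source-driven equilibrium rather than diverging, as long as the core stays subcritical.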

  1. Light Water Reactor Sustainability Constellation Pilot Project FY11 Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Johansen

    2011-09-01

    Summary report for Fiscal Year 2011 activities associated with the Constellation Pilot Project. The project is a joint effort between Constellation Energy Nuclear Group (CENG), EPRI, and the DOE Light Water Reactor Sustainability Program. The project utilizes two CENG reactor stations: R.E. Ginna and Nine Mile Point Unit 1. Included in the report are activities associated with reactor internals and concrete containments.

  2. Modular stellarator reactor: a fusion power plant

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.L.; Bathke, C.G.; Krakowski, R.A.; Heck, F.M.; Green, L.; Karbowski, J.S.; Murphy, J.H.; Tupper, R.B.; DeLuca, R.A.; Moazed, A.

    1983-07-01

    A comparative analysis of the modular stellarator and the torsatron concepts is made based upon a steady-state, ignited, DT-fueled reactor embodiment of each concept for use as a central electric-power station. Parametric tradeoff calculations lead to the selection of four design points for an approx. 4-GWt plant based upon Alcator transport scaling in l = 2 systems of moderate aspect ratio. The four design points represent high- (0.08) and low- (0.04) beta versions of the modular stellarator and torsatron concepts. The physics basis of each design point is described together with supporting engineering and economic analyses. The primary intent of this study is the elucidation of key physics and engineering tradeoffs, constraints, and uncertainties with respect to the ultimate power reactor embodiment.

  3. Reactor antineutrino fluxes - status and challenges

    CERN Document Server

    Huber, Patrick

    2016-01-01

    In this contribution we describe the current understanding of reactor antineutrino fluxes and point out some recent developments. This is not intended to be a complete review of this vast topic but merely a selection of observations and remarks, which despite their incompleteness, will highlight the status and the challenges of this field.

  4. Reactor antineutrino fluxes – Status and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Huber, Patrick, E-mail: pahuber@vt.edu

    2016-07-15

    In this contribution we describe the current understanding of reactor antineutrino fluxes and point out some recent developments. This is not intended to be a complete review of this vast topic but merely a selection of observations and remarks, which despite their incompleteness, will highlight the status and the challenges of this field.

  5. Fracturing tests on reservoir rocks: Analysis of AE events and radial strain evolution

    CERN Document Server

    Pradhan, S; Fjær, E; Stenebråten, J; Lund, H K; Sønstebø, E F; Roy, S

    2015-01-01

    Fracturing in reservoir rocks is an important issue for the petroleum industry, as productivity can be enhanced by a controlled fracturing operation. Fracturing also has a big impact on CO2 storage, geothermal installations, and gas production from reservoir rocks. Therefore, understanding the fracturing behavior of different types of reservoir rocks is a basic need for planning field operations towards these activities. In our study, the fracturing of a rock sample is monitored by Acoustic Emission (AE) and post-experiment Computer Tomography (CT) scans. The fracturing experiments have been performed on hollow cylinder cores of different rocks - sandstones and chalks. Our analysis shows that the amplitudes and energies of acoustic events clearly indicate the initiation and propagation of the main fractures. The amplitudes of AE events follow an exponential distribution while the energies follow a power law distribution. Time-evolution of the radial strain measured in the fracturing-test will later be comp...
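
    The reported exponential (amplitude) and power-law (energy) distributions can be checked with standard maximum-likelihood estimators. The sketch below uses synthetic stand-ins for the AE catalog, since the paper's data are not reproduced here; the threshold choice and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for a measured AE catalog (illustrative only)
amplitudes = rng.exponential(scale=2.0, size=5000)
energies = (rng.pareto(a=1.5, size=5000) + 1.0) * 1e-3   # power law above a minimum energy

# Exponential amplitudes: the maximum-likelihood rate is 1 / mean
rate = 1.0 / amplitudes.mean()

# Power-law energies: Hill/MLE estimator of the exponent above a threshold e_min
e_min = np.quantile(energies, 0.5)
tail = energies[energies >= e_min]
alpha = 1.0 + len(tail) / np.sum(np.log(tail / e_min))

print(f"exponential rate ~ {rate:.3f}, power-law exponent ~ {alpha:.2f}")
```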

  6. Digital carbonate rock physics

    Science.gov (United States)

    Saenger, Erik H.; Vialle, Stephanie; Lebedev, Maxim; Uribe, David; Osorno, Maria; Duda, Mandy; Steeb, Holger

    2016-08-01

    Modern estimation of rock properties combines imaging with advanced numerical simulations, an approach known as digital rock physics (DRP). In this paper we suggest a specific segmentation procedure of X-ray micro-computed tomography data with two different resolutions in the µm range for two sets of carbonate rock samples. These carbonates were already characterized in detail in a previous laboratory study which we complement with nanoindentation experiments (for local elastic properties). In a first step a non-local mean filter is applied to the raw image data. We then apply different thresholds to identify pores and solid phases. Because of a non-negligible amount of unresolved microporosity (micritic phase) we also define intermediate threshold values for distinct phases. Based on this segmentation we determine porosity-dependent values for effective P- and S-wave velocities as well as for the intrinsic permeability. For effective velocities we confirm an observed two-phase trend reported in another study using a different carbonate data set. As an upscaling approach we use this two-phase trend as an effective medium approach to estimate the porosity-dependent elastic properties of the micritic phase for the low-resolution images. The porosity measured in the laboratory is then used to predict the effective rock properties from the observed trends for a comparison with experimental data. The two-phase trend can be regarded as an upper bound for elastic properties; the use of the two-phase trend for low-resolution images led to a good estimate for a lower bound of effective elastic properties. Anisotropy is observed for some of the considered subvolumes, but seems to be insignificant for the analysed rocks at the DRP scale. Because of the complexity of carbonates we suggest using DRP as a complementary tool for rock characterization in addition to classical experimental methods.
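
    The two-step workflow described here (non-local means filtering, then multi-threshold segmentation into pore, micritic, and solid phases) can be illustrated with scikit-image. The image, the thresholds, and the 50% microporosity figure in the sketch below are placeholders, not the study's calibrated values.

```python
import numpy as np
from skimage.restoration import denoise_nl_means, estimate_sigma

# Synthetic micro-CT slice as a stand-in for the real data (grey values in [0, 1])
rng = np.random.default_rng(1)
img = rng.random((128, 128))

# Step 1: non-local means filter applied to the raw image data
sigma = np.mean(estimate_sigma(img))
den = denoise_nl_means(img, h=1.15 * sigma, fast_mode=True, patch_size=5, patch_distance=6)

# Step 2: two thresholds separate pore, microporous (micritic), and solid phases
t_pore, t_solid = 0.33, 0.66                  # placeholder thresholds; chosen per histogram in practice
labels = np.digitize(den, [t_pore, t_solid])  # 0 = pore, 1 = micritic, 2 = solid

# Total porosity if the micritic phase is assumed 50% microporous (illustrative assumption)
porosity = (labels == 0).mean() + 0.5 * (labels == 1).mean()
print(f"estimated total porosity: {porosity:.2f}")
```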

  7. Dynamic parameters test of Haiyang Nuclear Power Engineering in reactor areas, China

    Science.gov (United States)

    Zhou, N.; Zhao, S.; Sun, L.

    2012-12-01

    Haiyang Nuclear Power Project is located in Haiyang city, China. It consists of 6×1000 MW AP1000 nuclear power generator sets. The dynamic parameters of the rock mass are essential for the design of the nuclear power plant. The No. 1 and No. 2 reactor areas are taken as the research targets in this paper. Sonic logging, single-hole, and cross-hole wave velocity tests were carried out on the site. There are four rock lithologies within the measured depth: siltstone, fine sandstone, shale, and allgovite. The total depth of sonic logging is 409.8 m over 2049 test points. The sound wave velocities of the rocks are 5521 m/s, 5576 m/s, 5318 m/s, and 5576 m/s, respectively. According to the statistical data, medium-weathered fine sandstone is mostly fairly broken, secondarily broken to relatively intact, and partly intact. Medium-weathered siltstone is mostly relatively intact, secondarily fairly broken. Medium-weathered shale is mostly fairly broken, secondarily broken to relatively intact, and partly intact. Slightly weathered fine sandstone, siltstone, shale, and allgovite are mostly intact, secondarily relatively intact, and partly fairly broken. The single-hole wave velocity tests were set in two boreholes in each of the No. 1 and No. 2 reactor areas. The test depths of two holes are 2-24 m, and the others are 2-40 m. The wave velocity data and dynamic parameters are calculated at different depths in each hole. According to the test statistics, the wave velocity and the dynamic parameter values of the rock mass are distinctly influenced by the weathering degree. The test results are listed in Table 1. Three groups of cross-hole wave velocity tests were set for the No. 1 and No. 2 reactor areas. No. 1 reactor area: B16, B16-1, B20 (direction: 175°, depth: 100 m); B10, B10-1, B11 (269°, 40 m); B21, B21-1, B17 (154°, 40 m); with HB16, HB10, HB21 as trigger holes. No. 2 reactor area: B47, B47-1, HB51 (176°, 100 m); B40, B40-1, B41 (272°, 40 m); B42, B42-1, B

  8. Session: Hard Rock Penetration

    Energy Technology Data Exchange (ETDEWEB)

    Tennyson, George P. Jr.; Dunn, James C.; Drumheller, Douglas S.; Glowka, David A.; Lysne, Peter

    1992-01-01

    This session at the Geothermal Energy Program Review X: Geothermal Energy and the Utility Market consisted of five presentations: "Hard Rock Penetration - Summary" by George P. Tennyson, Jr.; "Overview - Hard Rock Penetration" by James C. Dunn; "An Overview of Acoustic Telemetry" by Douglas S. Drumheller; "Lost Circulation Technology Development Status" by David A. Glowka; "Downhole Memory-Logging Tools" by Peter Lysne.

  9. Rock burst laws in deep mines based on combined model of membership function and dominance-based rough set

    Institute of Scientific and Technical Information of China (English)

    刘浪; 陈忠强; 王李管

    2015-01-01

    Rock bursts are spontaneous, violent fractures of rock that can occur in deep mines, and the likelihood of rock bursts occurring increases as the depth of the mine increases. Rock bursts are also affected by the compressive strength, tensile strength, tangential strength, elastic energy index, etc. of the rock, and the relationship between these factors and rock bursts in deep mines is difficult to analyze from a quantitative point of view. Typical rock burst instances were collected as a sample set, and membership functions were introduced to process the discrete values of these factors, with the discrete factors as condition attributes and rock burst situations as decision attributes. Dominance-based rough set theory was used to generate preference rules for rock bursts, and a rock burst law analysis in deep mines with preference relations was then carried out. The results show that this model for rock burst law analysis in deep mines is reasonable and feasible, and that the prediction results are scientifically sound.
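
    The fuzzification step mentioned above can be made concrete with a triangular membership function. The sketch below grades a single rock-burst factor (the elastic energy index); the grade boundaries are illustrative assumptions, not the paper's calibrated values.

```python
def tri_membership(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative fuzzification of one factor, the elastic energy index (Wet);
# the grade boundaries below are placeholders, not the paper's values.
wet = 4.2
grades = {
    "none":   tri_membership(wet, -1.0, 0.0, 2.0),
    "weak":   tri_membership(wet,  1.0, 2.5, 4.0),
    "medium": tri_membership(wet,  3.0, 4.5, 6.0),
    "strong": tri_membership(wet,  5.0, 7.0, 9.0),
}
print(max(grades, key=grades.get), grades)
```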

  10. Big Bang Nucleosynthesis: Probing the First 20 Minutes

    CERN Document Server

    Steigman, G

    2003-01-01

    Within the first 20 minutes of the evolution of the hot, dense, early Universe, astrophysically interesting abundances of deuterium, helium-3, helium-4, and lithium-7 were synthesized by the cosmic nuclear reactor. The primordial abundances of these light nuclides produced during Big Bang Nucleosynthesis (BBN) are sensitive to the universal density of baryons and to the early-Universe expansion rate which at early epochs is governed by the energy density in relativistic particles (``radiation'') such as photons and neutrinos. Some 380 kyr later, when the cosmic background radiation (CBR) was freed from the embrace of the ionized plasma of protons and electrons, the spectrum of temperature fluctuations imprinted on the CBR also depended on the baryon and radiation densities. The comparison between the constraints imposed by BBN and those from the CBR reveals a remarkably consistent picture of the Universe at two widely separated epochs in its evolution. Combining these two probes leads to new and tig...

  11. Light water reactor program

    Energy Technology Data Exchange (ETDEWEB)

    Franks, S.M.

    1994-12-31

    The US Department of Energy's Light Water Reactor Program is outlined. The scope of the program consists of: design certification of evolutionary plants; design, development, and design certification of simplified passive plants; first-of-a-kind engineering to achieve commercial standardization; plant lifetime improvement; and advanced reactor severe accident program. These program activities of the Office of Nuclear Energy are discussed.

  12. Space Nuclear Reactor Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Poston, David Irvin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-06

    We needed to find a space reactor concept that could be attractive to NASA for flight and proven with a rapid-turnaround, low-cost nuclear test. Heat-pipe-cooled reactors coupled to Stirling engines have long been identified as the easiest path to a near-term, low-cost concept.

  13. Reactor Materials Research

    Energy Technology Data Exchange (ETDEWEB)

    Van Walle, E

    2001-04-01

    The activities of the Reactor Materials Research Department of the Belgian Nuclear Research Centre SCK-CEN in 2000 are summarised. The programmes within the department are focussed on studies concerning (1) fusion, in particular mechanical testing; (2) Irradiation Assisted Stress Corrosion Cracking (IASCC); (3) nuclear fuel; and (4) Reactor Pressure Vessel Steel (RPVS)

  14. Experiment of Doppler Heating Starting Point Measurement for Zero Power Physical Test in the Nuclear Power Reactor

    Institute of Scientific and Technical Information of China (English)

    黄礼渊; 付国恩

    2015-01-01

    In order to determine the upper limit of power, calibrate the range of power, and assure testing accuracy, the Doppler heating starting point is measured using a digital reactivity meter during the zero power physical experiment in the reactor. The paper describes the experimental theory, instruments, methods, results, and data processing methods. The experimental results show that the Doppler heating starting point can be determined from the corrected reactivity values measured by the digital reactivity meter. The experience can be used as a reference in subsequent physical experiments.
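
    A digital reactivity meter of the kind used here typically evaluates inverse point kinetics on the measured flux trace. A minimal one-group sketch follows; a real meter uses six delayed groups and calibrated constants, so the values and the synthetic count-rate trace below are placeholders.

```python
import numpy as np

# One delayed-neutron group for clarity (a real meter uses six groups);
# all constants and the count-rate trace are illustrative placeholders.
beta, lam, Lam = 0.0065, 0.08, 1.0e-5   # delayed fraction, decay const (1/s), generation time (s)
dt = 0.01
t = np.arange(0.0, 20.0, dt)
n = 1.0 + 0.02 * t                      # stand-in for the measured count-rate trace
dndt = np.gradient(n, dt)

c = beta * n[0] / (lam * Lam)           # precursors initialized at steady state
rho = np.empty_like(n)
for k in range(len(n)):
    # Inverse point kinetics: rho = beta + Lam*n'/n - Lam*lam*c/n
    rho[k] = beta + Lam * dndt[k] / n[k] - Lam * lam * c / n[k]
    c += (beta / Lam * n[k] - lam * c) * dt   # advance precursors with the measured flux

print(f"inferred reactivity at t = 20 s: {rho[-1] * 1e5:.1f} pcm")
```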

  15. Rock Point 1:100000 Quad Hydrography DLGs

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — Digital line graph (DLG) data are digital representations of cartographic information. DLG's of map features are converted to digital form from maps and related...

  16. Rock Point 1:100000 Quad Transportation DLGs

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — Digital line graph (DLG) data are digital representations of cartographic information. DLG's of map features are converted to digital form from maps and related...

  17. Nuclear reactor design

    CERN Document Server

    2014-01-01

    This book focuses on core design and methods for design and analysis. It is based on advances made in nuclear power utilization and computational methods over the past 40 years, covering core design of boiling water reactors and pressurized water reactors, as well as fast reactors and high-temperature gas-cooled reactors. The objectives of this book are to help graduate and advanced undergraduate students to understand core design and analysis, and to serve as a background reference for engineers actively working in light water reactors. Methodologies for core design and analysis, together with physical descriptions, are emphasized. The book also covers coupled thermal hydraulic core calculations, plant dynamics, and safety analysis, allowing readers to understand core design in relation to plant control and safety.

  18. Status of French reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ballagny, A. [Commissariat a l'Energie Atomique, Saclay (France)

    1997-08-01

    The status of French reactors is reviewed. The ORPHEE and RHF reactors cannot be operated with a LEU fuel, which would be limited to 4.8 g U/cm{sup 3}. The OSIRIS reactor has already been converted to LEU. It will use U{sub 3}Si{sub 2} as soon as its present stock of UO{sub 2} fuel is used up, at the end of 1994. The decision to close down the SILOE reactor in the near future is not propitious for the start of a conversion process. The REX 2000 reactor, which is expected to be commissioned in 2005, will use LEU (except if the fast neutron core option is selected). Concerning the end of the HEU fuel cycle, the best option is reprocessing followed by conversion of the reprocessed uranium to LEU.

  19. Migration and retention of elements at the Oklo natural reactor

    Science.gov (United States)

    Brookins, Douglas G.

    1982-09-01

    The Oklo natural reactor, Gabon, permits study of fission-produced elemental behavior in a natural geologic environment. The uranium ore that sustained fission reactions formed about 2 billion years before present (BYBP), and the reactor was operative for about 5 × 10⁵ yr between about 1.95 and 2 BYBP. The many tons of fission products can, for the most part, be studied for their abundance and distribution today. Since reactor shutdown, many fissiogenic elements have not migrated from the host pitchblende, and several others have migrated only a few tens of meters from the reactor ore. Only Xe and Kr have apparently been largely removed from the reactor zones. An element-by-element assessment of the Oklo rocks' ability to retain the fission products, and the actinides and radiogenic Pb and Bi as well, leads to the conclusion that no widespread migration of the elements occurred. This suggests that rocks with more favorable geologic characteristics are indeed well suited for consideration for the storage of radioactive waste.

  20. Quantization of Big Bang in crypto-Hermitian Heisenberg picture

    CERN Document Server

    Znojil, Miloslav

    2015-01-01

    A background-independent quantization of the Universe near its Big Bang singularity is considered using a drastically simplified toy model. Several conceptual issues are addressed. (1) The observable spatial-geometry characteristics of our empty-space expanding Universe are sampled by the time-dependent operator $Q=Q(t)$ of the distance between two space-attached observers (``Alice and Bob''). (2) For any pre-selected guess of the simple, non-covariant time-dependent observable $Q(t)$ one of the Kato's exceptional points (viz., $t=\tau_{(EP)}$) is postulated {\em real-valued}. This enables us to treat it as the time of Big Bang. (3) During our ``Eon'' (i.e., at all $t>\tau_{(EP)}$) the observability status of operator $Q(t)$ is mathematically guaranteed by its self-adjoint nature with respect to an {\em ad hoc} Hilbert-space metric $\Theta(t) \

  1. The Big Bang as the Ultimate Traffic Jam

    CERN Document Server

    Jejjala, Vishnu; Minic, Djordje; Tze, Chia-Hsiung

    2009-01-01

    We present a novel solution to the nature and formation of the initial state of the Universe. It derives from the physics of a generally covariant extension of Matrix theory. We focus on the dynamical state space of this background independent quantum theory of gravity and matter, an infinite dimensional, complex non-linear Grassmannian. When this space is endowed with a Fubini--Study-like metric, the associated geodesic distance between any two of its points is zero. This striking mathematical result translates into a physical description of a hot, zero entropy Big Bang. The latter is then seen as a far from equilibrium, large fluctuation driven, metastable ordered transition, a ``freezing by heating'' jamming transition. Moreover, the subsequent unjamming transition could provide a mechanism for inflation while rejamming may model a Big Crunch, the final state of gravitational collapse.

  2. Simulating Smoke Filling in Big Halls by Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    W. K. Chow

    2011-01-01

    Many tall halls of big space volume have been built, and more are to be built, in many construction projects in the Far East, particularly Mainland China, Hong Kong, and Taiwan. Smoke is identified as the key hazard to handle. Consequently, smoke exhaust systems are specified in the fire codes in those areas. An update on applying Computational Fluid Dynamics (CFD) in smoke exhaust design is presented in this paper. Key points to note in CFD simulations of smoke filling due to a fire in a big hall are discussed. Mathematical aspects concerning the discretization of the partial differential equations and the algorithms for solving the velocity-pressure linked equations are briefly outlined. Results predicted by CFD with different free boundary conditions are compared with those from room fire tests. Standards on grid size, relaxation factors, convergence criteria, and false diffusion should be set up for numerical experiments with CFD.
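
    The abstract's call for standards on relaxation factors and convergence criteria can be illustrated on a toy problem. The sketch below solves a steady diffusion equation on a square hall cross-section with an under-relaxed Jacobi iteration; the grid size, relaxation factor omega, and tolerance are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy steady diffusion on a square hall cross-section (Dirichlet boundaries);
# grid size, relaxation factor, and tolerance are illustrative choices.
N, omega, tol = 64, 0.8, 1e-6
T = np.zeros((N, N))
T[0, :] = 1.0                            # "hot" smoke layer held at the ceiling

for it in range(100_000):
    # Jacobi sweep: average of the four neighbours
    T_new = 0.25 * (np.roll(T, 1, 0) + np.roll(T, -1, 0) + np.roll(T, 1, 1) + np.roll(T, -1, 1))
    T_new[0, :], T_new[-1, :], T_new[:, 0], T_new[:, -1] = 1.0, 0.0, 0.0, 0.0
    residual = np.abs(T_new - T).max()   # convergence criterion on the update size
    T += omega * (T_new - T)             # under-relaxed update
    if residual < tol:
        break

print(f"converged in {it} iterations, final residual {residual:.2e}")
```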

  3. Particle physics catalysis of thermal big bang nucleosynthesis.

    Science.gov (United States)

    Pospelov, Maxim

    2007-06-08

    We point out that the existence of metastable, τ > 10³ s, negatively charged electroweak-scale particles (X⁻) alters the predictions for lithium and other primordial elemental abundances for A > 4 via the formation of bound states with nuclei during big bang nucleosynthesis. In particular, we show that the bound states of X⁻ with helium, formed at temperatures of about T = 10⁸ K, lead to the catalytic enhancement of ⁶Li production, which is 8 orders of magnitude more efficient than the standard channel. In particle physics models where subsequent decay of X⁻ does not lead to large nonthermal big bang nucleosynthesis effects, this directly translates to the level of sensitivity to the number density of long-lived X⁻ particles (τ > 10⁵ s) relative to entropy of n_X⁻/s ≲ 3×10⁻¹⁷, which is one of the most stringent probes of electroweak scale remnants known to date.

  4. Slippery Rock University

    Science.gov (United States)

    Arnhold, Robert W.

    2008-01-01

    Slippery Rock University (SRU), located in western Pennsylvania, is one of 14 state-owned institutions of higher education in Pennsylvania. The university has a rich tradition of providing professional preparation programs in special education, therapeutic recreation, physical education, and physical therapy for individuals with disabilities.…

  5. Rock-hard coatings

    NARCIS (Netherlands)

    Muller, M.

    2007-01-01

    Aircraft jet engines have to be able to withstand infernal conditions. Extreme heat and bitter cold tax coatings to the limit. Materials expert Dr Ir. Wim Sloof fits atoms together to develop rock-hard coatings. The latest invention in this field is known as ceramic matrix composites. Sloof has sign

  6. Umhlanga Rocks coastal defense

    NARCIS (Netherlands)

    De Jong, L.; De Jong, B.; Ivanova, M.; Gerritse, A.; Rietberg, D.; Dorrepaal, S.

    2014-01-01

    The eThekwini coastline is a vulnerable coastline subject to chronic erosion and damage due to sea level rise. In 2007 a severe storm caused major physical and economic damage along the coastline, proving the need for action. Umhlanga Rocks is a densely populated premium holiday destination on the e

  7. Microcraters on lunar rocks.

    Science.gov (United States)

    Morrison, D. A.; Mckay, D. S.; Heiken, G. H.; Moore, H. J.

    1972-01-01

    Microcrater frequency distributions have been obtained for nine Apollo rocks and an exterior chip of an Apollo 12 rock. The frequency distributions indicate that five of the Apollo 14 rocks were tumbled more than once exposing different rock faces whereas four were not tumbled and represent a single exposure interval. The cumulative frequency of craters per square centimeter was extended below optical resolution limits using a SEM scan of an exterior chip of breccia 12073. No craters with central pit diameters less than 15 microns were seen in a total area of 0.44 sq cm. A detailed SEM scan of crystal faces and glassy crater liners revealed no microcraters equal to or larger than the resolution limit of 5 microns. An upper limit of 170 craters per sq cm with central pit diameters larger than 5 microns was set. The slope of the cumulative frequency curve for craters with central pit diameters less than about 75 microns is less than that obtained by other workers.

  8. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  9. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  10. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  11. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  12. Cuttability Assessment of Selected Rocks Through Different Brittleness Values

    Science.gov (United States)

    Dursun, Arif Emre; Gokay, M. Kemal

    2016-04-01

    Prediction of cuttability is a critical issue for successful execution of tunnel or mining excavation projects. Rock cuttability is also used to determine specific energy, which is defined as the work done by the cutting force to excavate a unit volume of yield. Specific energy is a meaningful inverse measure of cutting efficiency, since it simply states how much energy must be expended to excavate a unit volume of rock. Brittleness is a fundamental rock property applied in drilling and rock excavation, and one of the most crucial rock features for rock excavation. For this reason, determination of relations between cuttability and brittleness will help rock engineers. This study aims to estimate the specific energy from different brittleness values of rocks by means of simple and multiple regression analyses. In this study, rock cutting, rock property, and brittleness index tests were carried out on 24 different rock samples with different strength values, including marble, travertine, and tuff, collected from sites around Konya Province, Turkey. Four previously used brittleness concepts were evaluated in this study, denoted as B 1 (ratio of compressive to tensile strength), B 2 (ratio of the difference between compressive and tensile strength to the sum of compressive and tensile strength), B 3 (area under the stress-strain line in relation to compressive and tensile strength), and B 9 = S 20, the percentage of fines in the Norwegian University of Science and Technology (NTNU) model, as well as B 9p (B 9 as predicted from uniaxial compressive, Brazilian tensile, and point load strengths of rocks using multiple regression analysis). The results suggest that the proposed simple regression-based prediction models including B 3, B 9, and B 9p outperform the other models including B 1 and B 2 and can be used for more accurate and reliable estimation of specific energy.
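
    The simple and multiple regression approach used in the study can be sketched with ordinary least squares. The data below are synthetic stand-ins for the 24 samples and the fitted coefficients are not the paper's; the point is only the mechanics of fitting specific energy (SE) against one or several brittleness indices.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic stand-ins for the 24 rock samples: two brittleness indices and specific energy
m = 24
B1 = rng.uniform(5, 25, m)                       # compressive/tensile strength ratio
B3 = rng.uniform(50, 400, m)                     # area-under-curve brittleness
SE = 2.0 + 0.04 * B3 + rng.normal(0, 1, m)       # assumed "true" relation for the demo

# Simple regression SE ~ B3 and multiple regression SE ~ B1 + B3 via least squares
for name, X in [("B3", np.c_[np.ones(m), B3]), ("B1 + B3", np.c_[np.ones(m), B1, B3])]:
    coef, *_ = np.linalg.lstsq(X, SE, rcond=None)
    r2 = 1.0 - np.sum((SE - X @ coef) ** 2) / np.sum((SE - SE.mean()) ** 2)
    print(f"SE ~ {name}: coefficients {np.round(coef, 3)}, R^2 = {r2:.3f}")
```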

  13. Nuclear reactor kinetics and plant control

    CERN Document Server

    Oka, Yoshiaki

    2013-01-01

    Understanding time-dependent behaviors of nuclear reactors and the methods of their control is essential to the operation and safety of nuclear power plants. This book provides graduate students, researchers, and engineers in nuclear engineering comprehensive information on both the fundamental theory of nuclear reactor kinetics and control and the state-of-the-art practice in actual plants, as well as the idea of how to bridge the two. The first part focuses on understanding fundamental nuclear kinetics. It introduces delayed neutrons, fission chain reactions, point kinetics theory, reactivit

  14. Slow clean-up for fast reactor

    Science.gov (United States)

    Banks, Michael

    2008-05-01

    The year 2300 is so distant that one may be forgiven for thinking of it only in terms of science fiction. But this is the year that workers at the Dounreay power station in Northern Scotland - the UK's only centre for research into "fast" nuclear reactors - term as the "end point" by which time the site will be completely clear of radioactive material. More than 180 facilities - including the iconic dome that housed the Dounreay Fast Reactor (DFR) - were built at at the site since it opened in 1959, with almost 50 having been used to handle radioactive material.

  15. Modeling of Reactor Kinetics and Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Johnson; Scott Lucas; Pavel Tsvetkov

    2010-09-01

    In order to model a full fuel cycle in a nuclear reactor, it is necessary to simulate the short time-scale kinetic behavior of the reactor as well as the long time-scale dynamics that occur with fuel burnup. The former is modeled using the point kinetics equations, while the latter is modeled by coupling fuel burnup equations with the kinetics equations. When the equations are solved simultaneously with a nonlinear equation solver, the end result is a code with the unique capability of modeling transients at any time during a fuel cycle.
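
    The coupling described here, fast point kinetics solved simultaneously with slow burnup, can be sketched by integrating the joint system with a stiff solver. All constants and the linear reactivity-versus-inventory feedback in the sketch below are illustrative assumptions, not the code system described in the report.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder constants: one delayed group plus a single fissile-inventory burnup equation
beta, lam, Lam = 0.0065, 0.08, 1.0e-5
sigma_a, alpha = 1.0e-6, 0.002           # burnup rate constant; reactivity worth of depletion

def coupled(t, y):
    n, c, N = y                          # flux level, precursors, fissile inventory
    rho = alpha * (N - 1.0)              # toy feedback: reactivity falls as fuel depletes
    dn = (rho - beta) / Lam * n + lam * c
    dc = beta / Lam * n - lam * c
    dN = -sigma_a * n * N                # slow burnup driven by the flux
    return [dn, dc, dN]

y0 = [1.0, beta / (lam * Lam), 1.0]      # critical steady state with fresh fuel
sol = solve_ivp(coupled, (0.0, 1000.0), y0, method="LSODA", rtol=1e-8)
print(f"flux {sol.y[0, -1]:.3f}, fuel remaining {sol.y[2, -1]:.4f}")
```

    A stiff method such as LSODA is the natural choice here because the kinetics evolve on millisecond scales while burnup evolves over hours to days.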

  16. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as introducing some applications in big data optimization for both interested academics and practitioners, to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, have been explored in this book.

  17. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  18. Organizational Design Challenges Resulting From Big Data

    Directory of Open Access Journals (Sweden)

    Jay R. Galbraith

    2014-04-01

    Business firms and other types of organizations are feverishly exploring ways of taking advantage of the big data phenomenon. This article discusses firms that are at the leading edge of developing a big data analytics capability. Firms that are currently enjoying the most success in this area are able to use big data not only to improve their existing businesses but to create new businesses as well. Putting a strategic emphasis on big data requires adding an analytics capability to the existing organization. This transformation process results in power shifting to analytics experts and in decisions being made in real time.

  19. The big de Rham–Witt complex

    DEFF Research Database (Denmark)

    Hesselholt, Lars

    2015-01-01

    This paper gives a new and direct construction of the multi-prime big de Rham–Witt complex, which is defined for every commutative and unital ring; the original construction by Madsen and myself relied on the adjoint functor theorem and accordingly was very indirect. The construction given here... It is the existence of these divided Frobenius operators that makes the new construction of the big de Rham–Witt complex possible. It is further shown that the big de Rham–Witt complex behaves well with respect to étale maps, and finally, the big de Rham–Witt complex of the ring of integers is explicitly evaluated...

  20. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop.This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. This book is also aimed at those who know Hadoop and want to build some intelligent applications over Big data with R packages. It would be helpful if readers have basic knowledge of R.

  1. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang' an University School of Information Engineering, Xi' an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi' an (China)

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  2. Environmental Consequences of Big Nasty Impacts on the Early Earth

    Science.gov (United States)

    Zahnle, K. J.

    2015-12-01

    The geological record of the Archean Earth is spattered with impact spherules from a dozen or so major cosmic collisions involving Earth and asteroids or comets (Lowe, Byerly 1986, 2015). Extrapolation of the documented deposits suggests that most of these impacts were as big or bigger than the Chicxulub event that famously ended the reign of the thunder lizards. As the Archean impacts were greater, the environmental effects were also greater. The number and magnitude of the impacts is bounded by the lunar record. There are no lunar craters bigger than Chicxulub that date to Earth's mid-to-late Archean. Chance dictates that Earth experienced ~10 impacts bigger than Chicxulub between 2.5 Ga and 3.5 Ga, the biggest of which were ~30-100X more energetic than Chicxulub. To quantify the thermal consequences of big impacts on old Earth, we model the global flow of energy from the impact into the environment. The model presumes that a significant fraction of the impact energy goes into ejecta that interact with the atmosphere. Much of this energy is initially in rock vapor, melt, and high speed particles. (i) The upper atmosphere is heated by ejecta as they reenter the atmosphere. The mix of hot air, rock vapor, and hot silicates cools by thermal radiation. Rock raindrops fall out as the upper atmosphere cools. (ii) The energy balance of the lower atmosphere is set by radiative exchange with the upper atmosphere and with the surface, and by evaporation of seawater. Subsequent cooling is governed by condensation of water vapor. (iii) The oceans are heated by thermal radiation and rock rain and cooled by evaporation. Surface waters become hot and salty; if a deep ocean remains it is relatively cool. Subsequently water vapor condenses to replenish the oceans with hot fresh water (how fresh depending on continental weathering, which might be rather rapid under the circumstances). (iv) The surface temperature of dry land is presumed to be the same as the lower atmosphere. A

  3. Environmental Consequences of Big Nasty Impacts on the Early Earth

    Science.gov (United States)

    Zahnle, Kevin

    2015-01-01

    The geological record of the Archean Earth is spattered with impact spherules from a dozen or so major cosmic collisions involving Earth and asteroids or comets (Lowe, Byerly 1986, 2015). Extrapolation of the documented deposits suggests that most of these impacts were as big or bigger than the Chicxulub event that famously ended the reign of the thunder lizards. As the Archean impacts were greater, the environmental effects were also greater. The number and magnitude of the impacts is bounded by the lunar record. There are no lunar craters bigger than Chicxulub that date to Earth's mid-to-late Archean. Chance dictates that Earth experienced no more than approximately 10 impacts bigger than Chicxulub between 2.5 and 3.5 billion years ago, the biggest of which were approximately 30-100 times more energetic, comparable to the Orientale impact on the Moon (1×10²⁶ joules). To quantify the thermal consequences of big impacts on old Earth, we model the global flow of energy from the impact into the environment. The model presumes that a significant fraction of the impact energy goes into ejecta that interact with the atmosphere. Much of this energy is initially in rock vapor, melt, and high speed particles. (i) The upper atmosphere is heated by ejecta as they reenter the atmosphere. The mix of hot air, rock vapor, and hot silicates cools by thermal radiation. Rock raindrops fall out as the upper atmosphere cools. (ii) The energy balance of the lower atmosphere is set by radiative exchange with the upper atmosphere and with the surface, and by evaporation of seawater. Subsequent cooling is governed by condensation of water vapor. (iii) The oceans are heated by thermal radiation and rock rain and cooled by evaporation. Surface waters become hot and salty; if a deep ocean remains it is relatively cool. Subsequently water vapor condenses to replenish the oceans with hot fresh water (how fresh depending on continental weathering, which might be rather rapid

  4. Big Brother Has Bigger Say

    Institute of Scientific and Technical Information of China (English)

    Yang Wei

    2009-01-01

    156 delegates from all walks of life in Guangdong province composed the Guangdong delegation for the NPC this year. The import and export value of Guangdong makes up one-third of national total value, and accounts for one-eighth of national economic growth. Guangdong province has maintained its top spot in import and export value among China's many provinces and cities for several years, commonly referred to as "Big Brother". At the same time, it is the region where the global financial crisis has hit China hardest.

  5. Big Data and surveillance, part 1: Definitions and discussions concerning Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go into the theme, the definitions, and the various questions related to big data in more depth. In this first part I will try to set out the essentials of Big Data theory and terminology.

  6. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  7. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  8. Field Geologist: An Android App for Measuring Rock Outcroppings

    Science.gov (United States)

    Baird, J.; Chiu, M. T.; Huang, X.; de Lanerolle, T. R.; Morelli, R.; Gourley, J. R.

    2011-12-01

    Field Geologist is a mobile Android app that measures, plots, and exports strike and dip data in the field. When the phone is placed on the steepest part of the rock, it automatically detects dip, strike, latitude and longitude. It includes a drop-down menu to record the type of rock. The app's initial screen displays a compass with an interior dip/strike symbol that always points toward the dip direction. Tapping the compass stores a data point in the phone's database. The points can be displayed on a Google map and uploaded to a server, from where they can be retrieved in CSV format and imported into a spreadsheet.
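
    The underlying measurement is simple vector geometry: with the phone lying flat on the bedding surface, its sensors give the orientation of the plane's unit normal, from which dip and strike follow directly. A minimal sketch of that computation (hypothetical code, not taken from the app; it assumes the normal has already been rotated into East-North-Up coordinates using the phone's gravity and magnetic-field sensors):

        import math

        def strike_and_dip(normal_enu):
            """Strike/dip (degrees) from a bedding-plane unit normal in
            East-North-Up coordinates."""
            e, n, u = normal_enu
            dip = math.degrees(math.acos(abs(u)))              # angle from horizontal
            dip_direction = math.degrees(math.atan2(e, n)) % 360.0
            strike = (dip_direction - 90.0) % 360.0            # right-hand rule
            return strike, dip

        # Example: a plane dipping 30 degrees toward the east
        n = (math.sin(math.radians(30)), 0.0, math.cos(math.radians(30)))
        print(strike_and_dip(n))   # ≈ (0.0, 30.0): north-south strike, 30° east dip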

  9. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    Science.gov (United States)

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  10. Slurry reactor design studies

    Energy Technology Data Exchange (ETDEWEB)

    Fox, J.M.; Degen, B.D.; Cady, G.; Deslate, F.D.; Summers, R.L. (Bechtel Group, Inc., San Francisco, CA (USA)); Akgerman, A. (Texas A and M Univ., College Station, TX (USA)); Smith, J.M. (California Univ., Davis, CA (USA))

    1990-06-01

    The objective of these studies was to perform a realistic evaluation of the relative costs of tubular fixed-bed and slurry reactors for methanol, mixed alcohols and Fischer-Tropsch syntheses under conditions where they would realistically be expected to operate. The slurry Fischer-Tropsch reactor was, therefore, operated at low H{sub 2}/CO ratio on gas directly from a Shell gasifier. The fixed-bed reactor was operated on 2.0 H{sub 2}/CO ratio gas after adjustment by shift and CO{sub 2} removal. Every attempt was made to give each reactor the benefit of its optimum design condition and correlations were developed to extend the models beyond the range of the experimental pilot plant data. For the methanol design, comparisons were made for a recycle plant with high methanol yield, this being the standard design condition. It is recognized that this is not necessarily the optimum application for the slurry reactor, which is being proposed for a once-through operation, coproducing methanol and power. Consideration is also given to the applicability of the slurry reactor to mixed alcohols, based on conditions provided by Lurgi for an Octamix{trademark} plant using their standard tubular fixed-bed reactor technology. 7 figs., 26 tabs.

  11. Musical Structure as Narrative in Rock

    Directory of Open Access Journals (Sweden)

    John Fernando Encarnacao

    2011-09-01

    Full Text Available In an attempt to take a fresh look at the analysis of form in rock music, this paper uses Susan McClary’s (2000) idea of ‘quest narrative’ in Western art music as a starting point. While much pop and rock adheres to the basic structure of the establishment of a home territory, episodes or adventures away, and then a return, my study suggests three categories of rock music form that provide alternatives to common combinations of verses, choruses and bridges through which the quest narrative is delivered. Labyrinth forms present more than the usual number of sections to confound our sense of ‘home’, and consequently of ‘quest’. Single-cell forms use repetition to suggest either a kind of stasis or to disrupt our expectations of beginning, middle and end. Immersive forms blur sectional divisions and invite more sensual and participatory responses to the recorded text. With regard to all of these alternative approaches to structure, Judy Lochhead’s (1992) concept of ‘forming’ is called upon to underline rock music forms that unfold as process, rather than map received formal constructs. Central to the argument are a couple of crucial definitions. Following Theodore Gracyk (1996), it is not songs, as such, but particular recordings that constitute rock music texts. Additionally, narrative is understood not in direct relation to the lyrics of a song, nor in terms of artists’ biographies or the trajectories of musical styles, but considered in terms of musical structure. It is hoped that this outline of non-narrative musical structures in rock may have applications not only to other types of music, but to other time-based art forms.

  12. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures" a

  13. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2016-09-13

    Cancer is a complex disease and unfortunately understanding how the components of the cancer system work does not help understand the behavior of the system as a whole. In the words of the Greek philosopher Aristotle, "the whole is greater than the sum of its parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand, and a large part of the data are in the form of medical images. The opportunity now is to draw insight about the whole, to the benefit of each individual patient. For the oncologic patient, big data analysis is in its early days, but several useful applications can be envisaged, including development of imaging biomarkers to predict disease outcome, assessing the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimizing patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact the diagnostic pathway of the oncologic patient.

  14. Gas cooled fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1972-06-01

    Although most of the development work on fast breeder reactors has been devoted to the use of liquid metal cooling, interest has been expressed for a number of years in alternative breeder concepts using other coolants. One of a number of concepts in which interest has been retained is the Gas-Cooled Fast Reactor (GCFR). As presently envisioned, it would operate on the uranium-plutonium mixed oxide fuel cycle, similar to that used in the Liquid Metal Fast Breeder Reactor (LMFBR), and would use helium gas as the coolant.

  15. Microfluidic electrochemical reactors

    Science.gov (United States)

    Nuzzo, Ralph G [Champaign, IL; Mitrovski, Svetlana M [Urbana, IL

    2011-03-22

    A microfluidic electrochemical reactor includes an electrode and one or more microfluidic channels on the electrode, where the microfluidic channels are covered with a membrane containing a gas permeable polymer. The distance between the electrode and the membrane is less than 500 micrometers. The microfluidic electrochemical reactor can provide for increased reaction rates in electrochemical reactions using a gaseous reactant, as compared to conventional electrochemical cells. Microfluidic electrochemical reactors can be incorporated into devices for applications such as fuel cells, electrochemical analysis, microfluidic actuation, pH gradient formation.

  16. Fast Breeder Reactor studies

    Energy Technology Data Exchange (ETDEWEB)

    Till, C.E.; Chang, Y.I.; Kittel, J.H.; Fauske, H.K.; Lineberry, M.J.; Stevenson, M.G.; Amundson, P.I.; Dance, K.D.

    1980-07-01

    This report is a compilation of Fast Breeder Reactor (FBR) resource documents prepared to provide the technical basis for the US contribution to the International Nuclear Fuel Cycle Evaluation. The eight separate parts deal with the alternative fast breeder reactor fuel cycles in terms of energy demand, resource base, technical potential and current status, safety, proliferation resistance, deployment, and nuclear safeguards. An Annex compares the cost of decommissioning light-water and fast breeder reactors. Separate abstracts are included for each of the parts.

  17. Multiplicity features of adiabatic autothermal reactors

    Energy Technology Data Exchange (ETDEWEB)

    Lovo, M.; Balakotaiah, V. (Houston Univ., TX (United States). Dept. of Chemical Engineering)

    1992-01-01

    In this paper singularity theory, large activation energy asymptotics, and numerical methods are used to present a comprehensive study of the steady-state multiplicity features of three classical adiabatic autothermal reactor models: tubular reactor with internal heat exchange, tubular reactor with external heat exchange, and the CSTR with external heat exchange. Specifically, the authors derive the exact uniqueness-multiplicity boundary, determine typical cross-sections of the bifurcation set, and classify the different types of bifurcation diagrams of conversion vs. residence time. Asymptotic (limiting) models are used to determine analytical expressions for the uniqueness boundary and the ignition and extinction points. The analytical results are used to present simple, explicit and accurate expressions defining the boundary of the region of autothermal operation in the physical parameter space.
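
    The flavour of these multiplicity results can be reproduced with the simplest such model: an adiabatic CSTR with a first-order exothermic reaction, where the conversion balance x = τ·k(T)·(1 − x) with T = T0 + ΔTad·x admits one or three steady states depending on the residence time τ. A small numerical sketch (illustrative parameter values, not those of the paper):

        import numpy as np

        k0, E_over_R = 1.0e8, 8000.0     # 1/s, K (illustrative Arrhenius parameters)
        T0, dTad = 300.0, 200.0          # K feed temperature, adiabatic temperature rise

        def residual(x, tau):
            """Zero at a steady state of the adiabatic CSTR."""
            T = T0 + dTad * x
            return tau * k0 * np.exp(-E_over_R / T) * (1.0 - x) - x

        # Count steady states via sign changes of the residual on a fine grid
        x = np.linspace(1e-6, 1.0 - 1e-6, 20001)
        for tau in (1.0, 5.0, 500.0):    # s, residence times below, inside, above the fold
            r = residual(x, tau)
            n_roots = int(np.sum(np.sign(r[:-1]) != np.sign(r[1:])))
            print(f"tau = {tau:6.1f} s -> {n_roots} steady state(s)")

    With these numbers the middle residence time yields three steady states (extinguished, unstable middle, ignited), the hallmark of the ignition/extinction behaviour the authors classify.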

  18. Oklo reactors and implications for nuclear science

    CERN Document Server

    Davis, E D; Sharapov, E I

    2014-01-01

    We summarize the nuclear physics interest in the Oklo natural nuclear reactors, focusing particularly on developments over the past two decades. Modeling of the reactors has become increasingly sophisticated, employing Monte Carlo simulations with realistic geometries and materials that can generate both the thermal and epithermal fractions. The water content and the temperatures of the reactors have been uncertain parameters. We discuss recent work pointing to lower temperatures than earlier assumed. Nuclear cross sections are input to all Oklo modeling and we discuss a parameter, the $^{175}$Lu ground state cross section for thermal neutron capture leading to the isomer $^{176\mathrm{m}}$Lu, that warrants further investigation. Studies of the time dependence of dimensionless fundamental constants have been a driver for much of the recent work on Oklo. We critically review neutron resonance energy shifts and their dependence on the fine structure constant $\alpha$ and the ratio $X_q=m_q/\Lambda$ (where $m_...

  19. Preliminary geologic map of the Big Costilla Peak area, Taos County, New Mexico, and Costilla County, Colorado

    Science.gov (United States)

    Fridrich, Christopher J.; Shroba, Ralph R.; Hudson, Adam M.

    2012-01-01

    This map covers the Big Costilla Peak, New Mex.–Colo. quadrangle and adjacent parts of three other 7.5-minute quadrangles: Amalia, New Mex.–Colo., Latir Peak, New Mex., and Comanche Point, New Mex. The study area is in the southwesternmost part of that segment of the Sangre de Cristo Mountains known as the Culebra Range; the Taos Range segment lies to the southwest of Costilla Creek and its tributary, Comanche Creek. The map area extends over all but the northernmost part of the Big Costilla horst, a late Cenozoic uplift of Proterozoic (1.7-Ga and less than 1.4-Ga) rocks that is largely surrounded by down-faulted middle to late Cenozoic (about 40 Ma to about 1 Ma) rocks exposed at significantly lower elevations. This horst is bounded on the northwest side by the San Pedro horst and Culebra graben, on the northeast and east sides by the Devils Park graben, and on the southwest side by the (about 30 Ma to about 25 Ma) Latir volcanic field. The area of this volcanic field, at the north end of the Taos Range, has undergone significantly greater extension than the area to the north of Costilla Creek. The horsts and grabens discussed above are all peripheral structures on the eastern flank of the San Luis basin, which is the axial part of the (about 26 Ma to present) Rio Grande rift at the latitude of the map. The Raton Basin lies to the east of the Culebra segment of the Sangre de Cristo Mountains. This foreland basin formed during, and is related to, the original uplift of the Sangre de Cristo Mountains, which was driven by tectonic contraction of the Laramide (about 70 Ma to about 40 Ma) orogeny. Renewed uplift and structural modification of these mountains has occurred during formation of the Rio Grande rift. Surficial deposits in the study area include alluvial, mass-movement, and glacial deposits of middle Pleistocene to Holocene age.

  20. Pitted Rock Named Ender

    Science.gov (United States)

    1997-01-01

    This image was taken by the Sojourner rover's right front camera on Sol 33. The rock in the foreground, nicknamed 'Ender', is pitted and marked by a subtle horizontal texture. The bright material on the top of the rock is probably wind-deposited dust. The Pathfinder Lander is seen in the distance at right. The lander camera is the cylindrical object on top of the deployed mast.Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and managed the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech).

  1. Rock and mineral magnetism

    CERN Document Server

    O’Reilly, W

    1984-01-01

    The past two decades have witnessed a revolution in the earth sciences. The quantitative, instrument-based measurements and physical models of geophysics, together with advances in technology, have radically transformed the way in which the Earth, and especially its crust, is described. The study of the magnetism of the rocks of the Earth's crust has played a major part in this transformation. Rocks, or more specifically their constituent magnetic minerals, can be regarded as a measuring instrument provided by nature, which can be employed in the service of the earth sciences. Thus magnetic minerals are a recording magnetometer; a goniometer or protractor, recording the directions of flows, fields and forces; a clock; a recording thermometer; a position recorder; a strain gauge; an instrument for geological surveying; a tracer in climatology and hydrology; a tool in petrology. No instrument is linear, or free from noise and systematic errors, and the performance of nature's instrument must be assessed and ...

  2. Uranium in alkaline rocks

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, M.; Wollenberg, H.; Strisower, B.; Bowman, H.; Flexser, S.; Carmichael, I.

    1978-04-01

    Geologic and geochemical criteria were developed for the occurrence of economic uranium deposits in alkaline igneous rocks. A literature search, a limited chemical analytical program, and visits to three prominent alkaline-rock localities (Ilimaussaq, Greenland; Pocos de Caldas, Brazil; and Powderhorn, Colorado) were made to establish criteria to determine if a site had some uranium resource potential. From the literature, four alkaline-intrusive occurrences of differing character were identified as type-localities for uranium mineralization, and the important aspects of these localities were described. These characteristics were used to categorize and evaluate U.S. occurrences. The literature search disclosed 69 U.S. sites, encompassing nepheline syenite, alkaline granite, and carbonatite. It was possible to compare two-thirds of these sites to the type localities. A ranking system identified ten of the sites as most likely to have uranium resource potential.

  3. Why Big Data Is a Big Deal (Ⅱ)

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    A new group of data mining technologies promises to change forever the way we sift through our vast stores of data, making it faster and cheaper. Some of the technologies are actively being used by people on the bleeding edge who need the technology now, like those involved in creating Web-based services that are driven by social media. They're also heavily contributing to these projects. In other vertical industries, businesses are realizing that much more of their value proposition is information-based than they had previously thought, which will allow big data technologies to gain traction quickly, Olofson says. Couple that with affordable hardware and software, and enterprises find themselves in a perfect storm of business transformation opportunities.

  4. Big Data – Big Deal for Organization Design?

    Directory of Open Access Journals (Sweden)

    Janne J. Korhonen

    2014-04-01

    Full Text Available Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition of a new stratum in the organization, resulting in greater organizational complexity. Requisite organization could serve as an objective, verifiable criterion for what qualifies as a genuine new strategic emphasis. Such a criterion is necessary for research on the co-evolution of strategy and structure.

  5. Numerical Homogenization of Jointed Rock Masses Using Wave Propagation Simulation

    Science.gov (United States)

    Gasmi, Hatem; Hamdi, Essaïeb; Bouden Romdhane, Nejla

    2014-07-01

    Homogenization in fractured rock analyses is essentially based on the calculation of equivalent elastic parameters. In this paper, a new numerical homogenization method that was programmed by means of a MATLAB code, called HLA-Dissim, is presented. The developed approach simulates a discontinuity network of real rock masses based on the International Society for Rock Mechanics (ISRM) scanline field mapping methodology. Then, it evaluates a series of classic joint parameters to characterize discontinuity density (RQD, specific length of discontinuities). A pulse wave, characterized by its amplitude, central frequency, and duration, is propagated from a source point to a receiver point of the simulated jointed rock mass using a complex recursive method for evaluating the transmission and reflection coefficients for each simulated discontinuity. The seismic parameters, such as delay, velocity, and attenuation, are then calculated. Finally, the equivalent medium model parameters of the rock mass are computed numerically while taking into account the natural discontinuity distribution. This methodology was applied to 17 bench fronts from six aggregate quarries located in Tunisia, Spain, Austria, and Sweden. It allowed characterization of the rock mass discontinuity network, the resulting seismic performance, and the equivalent medium stiffness. The relationship between the equivalent Young's modulus and rock discontinuity parameters was also analyzed. For these different bench fronts, the proposed numerical approach was also compared to several empirical formulas, based on RQD and fracture density values, published in previous research studies, showing its usefulness and efficiency in rapidly estimating the Young's modulus of the equivalent medium for wave propagation analysis.
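
    As a rough illustration of the wave-based homogenization idea (a sketch only, not the HLA-Dissim code; the rock and joint properties below are invented for the example), each discontinuity can be treated as a displacement discontinuity of specific stiffness κ that transmits a plane wave with magnitude |T| = 1/√(1 + (ωZ/2κ)²) and adds a small delay, from which an equivalent velocity and stiffness follow:

        import numpy as np

        rho, v_intact = 2600.0, 4500.0                  # kg/m^3, m/s, intact rock
        Z = rho * v_intact                              # seismic impedance
        kappa = 5.0e10                                  # Pa/m, joint specific stiffness
        omega = 2.0 * np.pi * 500.0                     # rad/s, pulse central frequency

        L = 20.0                                        # m, source-to-receiver path
        n_joints = 4                                    # from the simulated scanline map

        T_single = 1.0 / np.sqrt(1.0 + (omega * Z / (2.0 * kappa)) ** 2)
        T_total = T_single ** n_joints                  # serial transmission

        delay = Z / (2.0 * kappa)                       # s per joint, low-frequency limit
        t_total = L / v_intact + n_joints * delay
        v_eq = L / t_total                              # equivalent-medium velocity
        M_eq = rho * v_eq ** 2                          # plane-wave modulus
        print(f"|T| = {T_total:.3f}, V_eq = {v_eq:.0f} m/s, M_eq = {M_eq/1e9:.1f} GPa")

    Here ρV² gives the plane-wave modulus rather than the Young's modulus the paper reports, but the trend is the same: more joints, or softer ones, lower both the transmitted amplitude and the equivalent stiffness.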

  6. Opportunities for Big Data – WPA Trust

    NARCIS (Netherlands)

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for

  7. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  8. The Big Sleep in the Woods

    Institute of Scientific and Technical Information of China (English)

    Wang Yufeng

    2002-01-01

    Now it's the time of the big sleep for the bees and the bears. Even the buds of the plants whose leaves fall off share in it. But the intensity of this winter sleep, or hibernation, depends on who's doing it.The big sleep of the bears ,for instance ,would probably be thought of as a

  9. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  10. Revolutionary BIG BANG

    Institute of Scientific and Technical Information of China (English)

    Liu Yan

    2015-01-01

    In the boom years of Ordos I met one of the city's "opinion leaders"; having returned from the United States and seen the outside world, he had a broad knowledge of luxury goods and a distinctive taste for them. He led the shopping trends of a small circle in that mysterious city of wealth, whose members bought Big Bang watches one piece after another. At the time I did not really understand their fascination with this watch, until repeated visits to the Baselworld watch fair showed me the imagination behind the Big Bang. Yes, the Big Bang is indeed full of charm. The evolution of the Big Bang: in 2005 the Big Bang collection was born; in 2006 came the Big Bang All Black, whose "all black" concept made the Big Bang purer and simpler. From case to dial, the watch's seamless matte texture and its layering of blacks in different materials embody the Zen idea of "visible invisibility".

  11. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  12. Structuring the Curriculum around Big Ideas

    Science.gov (United States)

    Alleman, Janet; Knighton, Barbara; Brophy, Jere

    2010-01-01

    This article provides an inside look at Barbara Knighton's classroom teaching. She uses big ideas to guide her planning and instruction and gives other teachers suggestions for adopting the big idea approach and ways for making the approach easier. This article also represents a "small slice" of a dozen years of collaborative research,…

  13. A novel concept for CRIEC-driven subcritical research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Nieto, M.; Miley, G.H. [Illinois Univ., Fusion Studies Lab., Dept. of Nuclear, Plasma, and Radiological Engineering, Urbana, IL (United States)

    2001-07-01

    A novel scheme is proposed to drive a low-power subcritical fuel assembly by means of a long Cylindrical Radially-convergent Inertial Electrostatic Confinement (CRIEC) device used as a neutron source. The concept is inherently safe in the sense that the fuel assembly remains subcritical at all times. Previous work has been done on the possible implementation of the CRIEC as a subcritical assembly driver for power reactors. However, it has been found that the present technology and stage of development of IEC-based neutron sources cannot meet the neutron flux requirements to drive a system as big as a power reactor. Nevertheless, smaller systems, such as research and training reactors, could be successfully driven with neutron flux levels that IEC devices seem more likely to achieve in the near future. The need for custom-made, expensive nuclear fission fuel, as in the case of the TRIGA reactors, is eliminated, and the CRIEC presents substantial advantages with respect to accelerator-driven subcritical reactors in terms of simplicity and cost. In the present paper, a conceptual design for a research/training CRIEC-driven subcritical assembly is presented, emphasizing the description, principle of operation and performance of the CRIEC neutron source, highlighting its advantages and discussing some key issues that require study for the implementation of this concept. (author)

  14. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Cees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  15. Reactor BR2. Introduction

    Energy Technology Data Exchange (ETDEWEB)

    Gubel, P

    2001-04-01

    The BR2 is a materials testing reactor and is still one of SCK-CEN's important nuclear facilities. After an extensive refurbishment to compensate for the ageing of the installation, the reactor was restarted in April 1997. During the last three years, the availability of the installation was maintained at an average level of 97.6 percent. In the year 2000, the reactor was operated for a total of 104 days at a mean power of 56 MW. In 2000, most irradiation experiments were performed in the CALLISTO PWR loop. The report describes irradiations achieved or under preparation in 2000, including the development of advanced facilities and concept studies for new programmes. An overview of the scientific irradiation programmes as well as of the R and D programme of the BR2 reactor in 2000 is given.

  16. Reactor Neutrino Spectra

    CERN Document Server

    Hayes, A C

    2016-01-01

    We present a review of the antineutrino spectra emitted from reactors. Knowledge of these spectra and their associated uncertainties is crucial for neutrino oscillation studies. The spectra used to date have been determined either by conversion of measured electron spectra to antineutrino spectra or by summing over all of the thousands of transitions that make up the spectra, using modern databases as input. The uncertainties in the subdominant corrections to beta-decay plague both methods, and we provide estimates of these uncertainties. Improving on current knowledge of the antineutrino spectra from reactors will require new experiments. Such experiments would also address the so-called reactor neutrino anomaly and the possible origin of the shoulder observed in the antineutrino spectra measured in recent high-statistics reactor neutrino experiments.

  17. New reactor type proposed

    CERN Multimedia

    2003-01-01

    "Russian scientists at the Research Institute of Nuclear Power Engineering in Moscow are hoping to develop a new reactor that will use lead and bismuth as fuel instead of uranium and plutonium" (1/2 page).

  18. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  19. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-03-20

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  20. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  1. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  2. Review Study of Mining Big Data

    Directory of Open Access Journals (Sweden)

    Mohammad Misagh Javaherian

    2016-06-01

    Full Text Available Big data is a term for extensive and complex data sets that include both structured and unstructured information. Data can come from everywhere: sensors collecting environmental data, social networking sites, computer images and recordings, and so on; this information is known as big data. Valuable knowledge can be extracted from this big data using data mining, a method for finding interesting patterns and logical models in information at a wide scale. This article presents the types of big data and the open problems of such extensive information in the form of a chart. Issues in the data-centered model, in addition to big data itself, are also analyzed.

  3. Rock in Rio: forever young

    OpenAIRE

    Ricardo Ferreira Freitas; Flávio Lins Rodrigues

    2014-01-01

    The purpose of this article is to discuss the role of Rock in Rio: The Musical, as herald of the megafestival Rock in Rio. Driven by the success that musicals have reached in Brazil, we believe that the design of this spectacle of music, dance and staging renews the brand of the rock festival, since it adds the force of young and healthy bodies to its concept. Moreover, the musical provides Rock in Rio with some distance from the controversial trilogy of sex, drugs and rock and roll, a strong mark of past festivals around the world. Thus, the musical expands the possibilities of growth for the brand.

  4. Rock in Rio: forever young

    Directory of Open Access Journals (Sweden)

    Ricardo Ferreira Freitas

    2014-12-01

    Full Text Available The purpose of this article is to discuss the role of Rock in Rio: The Musical, as herald of the megafestival Rock in Rio. Driven by the success that musicals have reached in Brazil, we believe that the design of this spectacle of music, dance and staging renews the brand of the rock festival, since it adds the force of young and healthy bodies to its concept. Moreover, the musical provides Rock in Rio with some distance from the controversial trilogy of sex, drugs and rock and roll, a strong mark of past festivals around the world. Thus, the musical expands the possibilities of growth for the brand.

  5. Rock blasting and explosives engineering

    Energy Technology Data Exchange (ETDEWEB)

    Persson, P.-A.; Holmberg, R.; Lee, J. (New Mexico Institute of Mining and Technology, Socorro, NM (United States). Research Center for Energetic Materials)

    1994-01-01

    The book covers the practical engineering aspects of different kinds of rock blasting. It includes a thorough analysis of the cost of the entire process of tunneling by drilling and blasting compared with full-face boring. It covers the economics of the entire rock blasting operation and its dependence on the size of excavation. The book highlights the fundamentals of rock mechanics, shock waves and detonation, initiation and mechanics of rock motion. It describes the engineering design principles and computational techniques for many separate mining methods and rock blasting operations. 274 refs.

  6. Advances and Applications of Rock Physics for Hydrocarbon Exploration

    Directory of Open Access Journals (Sweden)

    Valle-Molina C.

    2012-10-01

    Full Text Available Integration of geological and geophysical information at different scales and with different features is the key to establishing relationships between the petrophysical and elastic characteristics of the rocks in a reservoir. It is very important to present the fundamentals and current methodologies of rock physics analysis applied to hydrocarbon exploration to engineers and Mexican students. This work represents an effort to train oil-exploration personnel through a review of the subjects of rock physics. The main aim is to show updated improvements and applications of rock physics in exploration seismology. Most of the methodologies presented in this document concern the study of the physical and geological mechanisms that impact the elastic properties of reservoir rocks, based on the characterization of rock specimens and geophysical borehole information. Predictions of rock properties (lithology, porosity, fluids in the voids) can be performed using 3D seismic data that must be properly calibrated with experimental measurements on rock cores and with seismic well log data.
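
    A representative example of the kind of relationship such rock physics work relies on is Gassmann fluid substitution, which predicts how the saturated bulk modulus of a porous rock changes with the pore fluid (a standard textbook relation; the sandstone values below are illustrative):

        def gassmann_k_sat(k_dry, k_min, k_fluid, phi):
            """Saturated bulk modulus from Gassmann's equation (moduli in Pa)."""
            a = (1.0 - k_dry / k_min) ** 2
            b = phi / k_fluid + (1.0 - phi) / k_min - k_dry / k_min ** 2
            return k_dry + a / b

        # Illustrative quartz sandstone: 25% porosity, 12 GPa dry-frame modulus
        k_min, k_dry, phi = 36.6e9, 12.0e9, 0.25
        for fluid, k_fl in (("brine", 2.8e9), ("gas", 0.04e9)):
            k_sat = gassmann_k_sat(k_dry, k_min, k_fl, phi)
            print(f"{fluid:>5}: K_sat = {k_sat / 1e9:.1f} GPa")

    The drop from roughly 16.5 GPa (brine) to about 12.1 GPa (gas) is the sort of fluid signature that properly calibrated 3D seismic data can detect.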

  7. The Big Occulting Steerable Satellite (BOSS)

    CERN Document Server

    Copi, C J; Copi, Craig J.; Starkman, Glenn D.

    1999-01-01

    Natural (such as lunar) occultations have long been used to study sources on small angular scales, while coronagraphs have been used to study high contrast sources. We propose launching the Big Occulting Steerable Satellite (BOSS), a large steerable occulting satellite to combine both of these techniques. BOSS will have several advantages over standard occulting bodies. BOSS would block all but about 4e-5 of the light at 1 micron in the region of interest around the star for planet detection. Because the occultation occurs outside the telescope, scattering inside the telescope does not degrade this performance. BOSS could be combined with a space telescope at the Earth-Sun L2 point to yield very long integration times, in excess of 3000 seconds. If placed in Earth orbit, integration times of 160--1600 seconds can be achieved from most major telescope sites for objects in over 90% of the sky. Applications for BOSS include direct imaging of planets around nearby stars. Planets separated by as little as 0.1--0....

  8. BEBC, the Big European Bubble Chamber

    CERN Multimedia

    1971-01-01

    The vessel of the Big European Bubble Chamber, BEBC, was installed at the beginning of the 1970s. The large stainless-steel vessel, measuring 3.7 metres in diameter and 4 metres in height, was filled with 35 cubic metres of liquid (hydrogen, deuterium or a neon-hydrogen mixture), whose sensitivity was regulated by means of a huge piston weighing 2 tonnes. During each expansion, the trajectories of the charged particles were marked by a trail of bubbles, where liquid reached boiling point as they passed through it. The first images were recorded in 1973 when BEBC, equipped with the largest superconducting magnet in service at the time, first received beam from the PS. In 1977, the bubble chamber was exposed to neutrino and hadron beams at higher energies of up to 450 GeV after the SPS came into operation. By the end of its active life in 1984, BEBC had delivered a total of 6.3 million photographs to 22 experiments devoted to neutrino or hadron physics. Around 600 scientists from some fifty laboratories through...

  9. Nuclear reactor situation in Japan following the major earthquake of March 11, 2011: status as of March 12, 2011, 8:00 PM; Situation des reacteurs nucleaires au Japon suite au seisme majeur survenu le 11 mars 2011. Point de situation du 12 mars 2011 a 20 heures

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    This situation note is based on the information available to the crisis centre of the French Institute for Radiation Protection and Nuclear Safety (IRSN) on March 12, 2011, at 8:00 PM. The situation of reactors No. 1, 2 and 3 of the Fukushima I site is briefly presented, together with the progress of the accident management actions. The operating principles of a BWR-type reactor and of a PWR-type reactor are presented in an appendix, as well as the confinement principle specific to the Mark I-type BWR reactors designed by General Electric. The meteorological forecasts of the day are presented in a figure. (J.S.)

  10. Future Reactor Experiments

    OpenAIRE

    He, Miao

    2013-01-01

    The measurement of the neutrino mixing angle $\theta_{13}$ opens a gateway for the next generation of experiments to measure the neutrino mass hierarchy and the leptonic CP-violating phase. Future reactor experiments will focus on mass hierarchy determination and the precision measurement of mixing parameters. The mass hierarchy can be determined from the disappearance of reactor electron antineutrinos, based on the interference effect of two separated oscillation modes. Relative and absolute measure...
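
    The interference mentioned here enters through the standard three-flavour survival probability for reactor electron antineutrinos, where the slightly different phases of the $\Delta m^2_{31}$ and $\Delta m^2_{32}$ modes carry the hierarchy information. A sketch of that textbook formula (the oscillation parameter values are illustrative, and normal ordering is assumed):

        import numpy as np

        s12sq, s13sq = 0.307, 0.022      # sin^2(theta_12), sin^2(theta_13)
        dm2_21, dm2_31 = 7.5e-5, 2.5e-3  # eV^2; normal ordering assumed
        dm2_32 = dm2_31 - dm2_21

        def survival(L_m, E_MeV):
            """P(nu_e-bar -> nu_e-bar); Delta_ij = 1.267 dm2[eV^2] L[m] / E[MeV]."""
            d21, d31, d32 = (1.267 * dm2 * L_m / E_MeV
                             for dm2 in (dm2_21, dm2_31, dm2_32))
            sin2_2t12 = 4.0 * s12sq * (1.0 - s12sq)
            sin2_2t13 = 4.0 * s13sq * (1.0 - s13sq)
            return (1.0
                    - (1.0 - s13sq) ** 2 * sin2_2t12 * np.sin(d21) ** 2
                    - sin2_2t13 * ((1.0 - s12sq) * np.sin(d31) ** 2
                                   + s12sq * np.sin(d32) ** 2))

        print(survival(L_m=53000.0, E_MeV=4.0))   # ~53 km baseline, JUNO-like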

  11. Reactor Neutrino Experiments

    OpenAIRE

    Cao, Jun

    2007-01-01

    Precisely measuring $\\theta_{13}$ is one of the highest priority in neutrino oscillation study. Reactor experiments can cleanly determine $\\theta_{13}$. Past reactor neutrino experiments are reviewed and status of next precision $\\theta_{13}$ experiments are presented. Daya Bay is designed to measure $\\sin^22\\theta_{13}$ to better than 0.01 and Double Chooz and RENO are designed to measure it to 0.02-0.03. All are heading to full operation in 2010. Recent improvements in neutrino moment measu...

  12. Department of Reactor Technology

    DEFF Research Database (Denmark)

    Risø National Laboratory, Roskilde

    The general development of the Department of Reactor Technology at Risø during 1981 is presented, and the activities within the major subject fields are described in some detail. Lists of staff, publications, and computer programs are included.

  13. Three dimensional simulation for Big Hill Strategic Petroleum Reserve (SPR).

    Energy Technology Data Exchange (ETDEWEB)

    Ehgartner, Brian L. (Sandia National Laboratories, Albuquerque, NM); Park, Byoung Yoon; Sobolik, Steven Ronald (Sandia National Laboratories, Albuquerque, NM); Lee, Moo Yul (Sandia National Laboratories, Albuquerque, NM)

    2005-07-01

    3-D finite element analyses were performed to evaluate the structural integrity of caverns located at the Strategic Petroleum Reserve's Big Hill site. State-of-the-art analyses simulated the current site configuration and considered additional caverns. The addition of 5 caverns, bringing the site to full capacity, and a full dome containing 31 caverns were both modeled. Operations including both normal and cavern workover pressures and cavern enlargement due to leaching were modeled to account for as many as 5 future oil drawdowns. Under the modeled conditions, caverns were placed very close to the edge of the salt dome. The web of salt separating the caverns and the web of salt between the caverns and the edge of the salt dome were reduced due to leaching. The impacts on cavern stability, underground creep closure, surface subsidence and infrastructure, and well integrity were quantified. The analyses included recently derived damage criteria obtained from testing of Big Hill salt cores. The results show that, from a structural viewpoint, many additional caverns can be safely added to Big Hill.

  14. NETIMIS: Dynamic Simulation of Health Economics Outcomes Using Big Data.

    Science.gov (United States)

    Johnson, Owen A; Hall, Peter S; Hulme, Claire

    2016-02-01

    Many healthcare organizations are now making good use of electronic health record (EHR) systems to record clinical information about their patients and the details of their healthcare. Electronic data in EHRs is generated by people engaged in complex processes within complex environments, and their human input, albeit shaped by computer systems, is compromised by many human factors. These data are potentially valuable to health economists and outcomes researchers but are sufficiently large and complex to be considered part of the new frontier of 'big data'. This paper describes emerging methods that draw together data mining, process modelling, activity-based costing and dynamic simulation models. Our research infrastructure includes safe links to Leeds hospital's EHRs with 3 million secondary and tertiary care patients. We created a multidisciplinary team of health economists, clinical specialists, and data and computer scientists, and developed a dynamic simulation tool called NETIMIS (Network Tools for Intervention Modelling with Intelligent Simulation; http://www.netimis.com) suitable for visualization of both human-designed and data-mined processes, which can then be used for 'what-if' analysis by stakeholders interested in costing, designing and evaluating healthcare interventions. We present two examples of model development to illustrate how dynamic simulation can be informed by big data from an EHR. We found the tool provided a focal point for multidisciplinary teamwork, helping the team iteratively and collaboratively 'deep dive' into big data.

  15. Constraining Big Bang lithium production with recent solar neutrino data

    CERN Document Server

    Takács, Marcell P; Szücs, Tamás; Zuber, Kai

    2015-01-01

    The $^3$He($\alpha,\gamma$)$^7$Be reaction affects not only the production of $^7$Li in Big Bang nucleosynthesis, but also the fluxes of $^7$Be and $^8$B neutrinos from the Sun. This double role is exploited here to constrain the former by the latter. A number of recent experiments on $^3$He($\alpha,\gamma$)$^7$Be provide precise cross section data at E = 0.5-1.0 MeV center-of-mass energy. However, there is a scarcity of precise data at Big Bang energies, 0.1-0.5 MeV, and below. This problem can be alleviated, based on precisely calibrated $^7$Be and $^8$B neutrino fluxes from the Sun that are now available, assuming the neutrino flavour oscillation framework to be correct. These fluxes and the standard solar model are used here to determine the $^3$He($\alpha,\gamma$)$^7$Be astrophysical S-factor at the solar Gamow peak, S(23$^{+6}_{-5}$ keV) = 0.548 ± 0.054 keV b. This new data point is then included in a re-evaluation of the $^3$He($\alpha,\gamma$)$^7$Be S-factor at Big Bang energies, following an approach recently developed for this reaction in the c...

  16. ROCK inhibitor prevents the dedifferentiation of human articular chondrocytes

    Energy Technology Data Exchange (ETDEWEB)

    Matsumoto, Emi [Department of Orthopaedic Surgery, Science of Functional Recovery and Reconstruction, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, 2-5-1 Shikatacho, Kitaku, Okayama 700-8558 (Japan); Furumatsu, Takayuki, E-mail: matino@md.okayama-u.ac.jp [Department of Orthopaedic Surgery, Science of Functional Recovery and Reconstruction, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, 2-5-1 Shikatacho, Kitaku, Okayama 700-8558 (Japan); Kanazawa, Tomoko; Tamura, Masanori; Ozaki, Toshifumi [Department of Orthopaedic Surgery, Science of Functional Recovery and Reconstruction, Okayama University Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, 2-5-1 Shikatacho, Kitaku, Okayama 700-8558 (Japan)

    2012-03-30

    Highlights: ► ROCK inhibitor stimulates chondrogenic gene expression of articular chondrocytes. ► ROCK inhibitor prevents the dedifferentiation of monolayer-cultured chondrocytes. ► ROCK inhibitor enhances the redifferentiation of cultured chondrocytes. ► ROCK inhibitor is useful for preparation of un-dedifferentiated chondrocytes. ► ROCK inhibitor may be a useful reagent for chondrocyte-based regeneration therapy. -- Abstract: Chondrocytes lose their chondrocytic phenotypes in vitro. The Rho family GTPase ROCK, involved in organizing the actin cytoskeleton, modulates the differentiation status of chondrocytic cells. However, the optimum method to prepare a large number of un-dedifferentiated chondrocytes is still unclear. In this study, we investigated the effect of ROCK inhibitor (ROCKi) on the chondrogenic property of monolayer-cultured articular chondrocytes. Human articular chondrocytes were subcultured in the presence or absence of ROCKi (Y-27632). The expression of chondrocytic marker genes such as SOX9 and COL2A1 was assessed by quantitative real-time PCR analysis. Cellular morphology and viability were evaluated. Chondrogenic redifferentiation potential was examined by a pellet culture procedure. The expression level of SOX9 and COL2A1 was higher in ROCKi-treated chondrocytes than in untreated cells. Chondrocyte morphology varied from a spreading form to a round shape in a ROCKi-dependent manner. In addition, ROCKi treatment stimulated the proliferation of chondrocytes. The deposition of safranin O-stained proteoglycans and type II collagen was highly detected in chondrogenic pellets derived from ROCKi-pretreated chondrocytes. Our results suggest that ROCKi prevents the dedifferentiation of monolayer-cultured chondrocytes, and may be a useful reagent to maintain chondrocytic phenotypes in vitro for chondrocyte

  17. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became the unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...
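
    The recovery mechanism described here, re-trying only the transiently failed jobs of a task, mirrors TCP/IP's re-sending of dropped packets and takes only a few lines to sketch (the job runner below is a hypothetical stand-in for the Grid middleware, not ATLAS code):

        import random

        def run_job(job_id):
            """Hypothetical Grid job submission with transient failures."""
            if random.random() < 0.2:                 # 20% transient failure rate
                raise RuntimeError(f"job {job_id} failed transiently")
            return f"output-{job_id}"

        def process_task(n_jobs, max_attempts=5):
            """Split a task into jobs; re-try failures for reliable completion."""
            pending, results = list(range(n_jobs)), {}
            for _ in range(max_attempts):
                failed = []
                for job in pending:
                    try:
                        results[job] = run_job(job)
                    except RuntimeError:
                        failed.append(job)            # only failed jobs are re-sent
                if not failed:
                    break
                pending = failed
            return results

        print(len(process_task(1000)), "of 1000 jobs completed")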

  18. Evidence of the Big Fix

    CERN Document Server

    Hamada, Yuta; Kawana, Kiyoharu

    2014-01-01

    We give evidence of the Big Fix. The theory of wormholes and multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by the experimental data, and we show that it is indeed true for the Higgs vacuum expectation value $v_{h}$. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self coupling are fixed when we vary $v_{h}$. It turns out that the existence of the atomic nuclei plays a crucial role to maximize the entropy. This is reminiscent of the anthropic principle; however, it is required by the fundamental law in our case.

  19. Evidence of the big fix

    Science.gov (United States)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

    We give evidence of the Big Fix. The theory of wormholes and multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by the experimental data, and we show that it is indeed true for the Higgs vacuum expectation value vh. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary vh. It turns out that the existence of the atomic nuclei plays a crucial role to maximize the entropy. This is reminiscent of the anthropic principle; however, it is required by the fundamental law in our case.

  20. Big Book of Apple Hacks

    CERN Document Server

    Seibold, Chris

    2008-01-01

    Bigger in size, longer in length, broader in scope, and even more useful than our original Mac OS X Hacks, the new Big Book of Apple Hacks offers a grab bag of tips, tricks and hacks to get the most out of Mac OS X Leopard, as well as the new line of iPods, iPhone, and Apple TV. With 125 entirely new hacks presented in step-by-step fashion, this practical book is for serious Apple computer and gadget users who really want to take control of these systems. Many of the hacks take you under the hood and show you how to tweak system preferences, alter or add keyboard shortcuts, mount drives and