WorldWideScience

Sample records for big rock point reactor

  1. RETRAN operational transient analysis of the Big Rock Point plant boiling water reactor

    International Nuclear Information System (INIS)

    Sawtelle, G.R.; Atchison, J.D.; Farman, R.F.; VandeWalle, D.J.; Bazydlo, H.G.

    1983-01-01

    Energy Incorporated used the RETRAN computer code to model and calculate nine Consumers Power Company Big Rock Point Nuclear Power Plant transients. RETRAN, a best-estimate, one-dimensional, homogeneous-flow thermal-equilibrium code, is applicable to FSAR Chapter 15 transients for Conditions I through IV. The BWR analyses were performed in accordance with USNRC Standard Review Plan criteria and in response to the USNRC Systematic Evaluation Program. The RETRAN Big Rock Point model was verified by comparison to plant startup test data. This paper discusses the unique modeling techniques used in RETRAN to model this steam-drum-type BWR. Transient analysis results are also presented.

  2. Big Rock Point severe accident management strategies

    International Nuclear Information System (INIS)

    Brogan, B.A.; Gabor, J.R.

    1996-01-01

    In December 1994, the Nuclear Energy Institute (NEI) issued guidance relative to the formal industry position on Severe Accident Management (SAM) approved by the NEI Strategic Issues Advisory Committee on November 4, 1994. This paper summarizes how Big Rock Point (BRP) has addressed and continues to address SAM strategies. The historical accounting portion of this presentation includes a description of how the following projects identified and defined the current Big Rock Point SAM strategies: the 1981 Level 3 Probabilistic Risk Assessment; the development of the Plant Specific Technical Guidelines, from which the symptom-oriented Emergency Operating Procedures (EOPs) were developed; the Control Room Design Review; and the recent completion of the Individual Plant Evaluation (IPE). In addition to this historical account, the paper describes the present activities that continue to stress SAM strategies.

  3. Big Rock Point: 35 years of electrical generation

    International Nuclear Information System (INIS)

    Petrosky, T.D.

    1998-01-01

    On September 27, 1962, the 75 MWe boiling water reactor of the Big Rock Point Nuclear Power Station, designed and built by General Electric, went critical for the first time. The US Atomic Energy Commission (AEC) and the plant operator, Consumers Power, had designed the plant also as a research reactor. The first studies were devoted to fuel behavior, higher burnup, and materials research. The reactor was also used for medical technology: Co-60 radiation sources were produced for the treatment of more than 120,000 cancer patients. After the accident at the Three Mile Island-2 nuclear generating unit in 1979, Big Rock Point went through an extensive backfitting phase. Personnel from numerous other American nuclear power plants were trained at the simulator of Big Rock Point. The plant was permanently shut down on August 29, 1997 after more than 35 years of operation and a cumulative electric power production of 13,291 GWh. A period of five to seven years is estimated for decommissioning and demolition work up to the 'green field' stage. (orig.) [de]
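    As a rough plausibility check, the lifetime figures quoted above can be combined into an approximate capacity factor. The sketch below assumes the 75 MWe rating, a full 35-year span, and an average year of 8,766 hours; it illustrates the arithmetic only and is not a figure from the source.

```python
# Rough lifetime capacity-factor check for Big Rock Point, using the
# figures quoted above (75 MWe rating, ~35 years, 13,291 GWh produced).
# The exact 35-year span and the use of the net electrical rating are
# simplifying assumptions for illustration.

RATED_MW = 75.0          # net electrical rating quoted in the abstract
YEARS = 35.0             # approximate operating life (1962-1997)
HOURS_PER_YEAR = 8766.0  # average year length including leap years
PRODUCED_GWH = 13291.0   # cumulative generation quoted above

max_gwh = RATED_MW * YEARS * HOURS_PER_YEAR / 1000.0  # theoretical maximum
capacity_factor = PRODUCED_GWH / max_gwh

print(f"Lifetime capacity factor ~ {capacity_factor:.0%}")  # roughly 58%
```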

  4. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

    Exarhos, C.A.; Van Swam, L.F.; Wahlquist, F.P.

    1981-12-01

    This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurements of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics that might restrict the continued irradiation of the fuel.

  5. Free Release Standards Utilized at Big Rock Point

    International Nuclear Information System (INIS)

    Robert P. Wills

    2000-01-01

    The decommissioning of Consumers Energy's Big Rock Point (BRP) site involves decommissioning its 75-MW boiling water reactor and all of the associated facilities. Consumers Energy is committed to restoring the site to greenfield conditions. This commitment means that when the decommissioning is complete, all former structures will have been removed, and the site will be available for future use without radiological restrictions. BRP's radiation protection management staff determined that the typical methods used to comply with U.S. Nuclear Regulatory Commission (NRC) regulations for analyzing volumetric material for radionuclides would not fulfill the demands of a facility undergoing decommissioning. The challenge at hand is to comply with regulatory requirements and put into production a large-scale bulk release program. This report describes Consumers Energy's planned approach to the regulatory aspects of free release.

  6. Risks due to fires at Big Rock Point

    International Nuclear Information System (INIS)

    Brinsfield, W.A.; Blanchard, D.P.

    1983-01-01

    The unique, older design of the Big Rock Point nuclear plant is such that fires contribute significantly to the probability of core damage predicted in the probabilistic risk assessment performed for this plant. The methodology employed to determine this contribution reflects the unique, as-constructed plant design, while systematically and logically addressing the true effect of fires on the operation of the plant and the safety of the public. As a result of the methodology utilized in the PRA, recommendations are made which minimize the risk of core damage due to fires. Included in these recommendations is a proposal for equipment and controls to be included on the Big Rock Point alternate shutdown panel.

  7. 78 FR 58570 - Environmental Assessment; Entergy Nuclear Operations, Inc., Big Rock Point

    Science.gov (United States)

    2013-09-24

    ... Assessment; Entergy Nuclear Operations, Inc., Big Rock Point AGENCY: Nuclear Regulatory Commission. ACTION... applicant or the licensee), for the Big Rock Point (BRP) Independent Spent Fuel Storage Installation (ISFSI... Rock Point (BRP) Independent Spent Fuel Storage Installation (ISFSI). II. Environmental Assessment (EA...

  8. Big Rock Point restoration project BWR major component removal, packaging and shipping - planning and experience

    International Nuclear Information System (INIS)

    Milner, T.; Dam, S.; Papp, M.; Slade, J.; Slimp, B.; Nurden, P.

    2001-01-01

    The Big Rock Point boiling water reactor (BWR) at Charlevoix, MI was permanently shut down on August 29th 1997. In 1999 BNFL Inc.'s Reactor Decommissioning Group (RDG) was awarded a contract by Consumers Energy (CECo) for the Big Rock Point (BRP) Major Component Removal (MCR) project. BNFL Inc. RDG has teamed with MOTA, Sargent and Lundy and MDM Services to plan and execute MCR in support of the facility restoration project. The facility restoration project will be completed by 2005. Key to the success of the project has been the integration of best available demonstrated technology into a robust and responsive project management approach, which places emphasis on safety and quality assurance in achieving project milestones linked to time and cost. To support decommissioning of the BRP MCR activities, a reactor vessel (RV) shipping container is required. Discussed in this paper is the design and fabrication of a 10 CFR Part 71 Type B container necessary to ship the BRP RV. The container to be used for transportation of the RV to the burial site was designed as an Exclusive Use Type B package for shipment and burial at the Barnwell, South Carolina (SC) disposal facility. (author)

  9. Integrated plant-safety assessment, Systematic Evaluation Program: Big Rock Point Plant (Docket No. 50-155)

    International Nuclear Information System (INIS)

    1983-09-01

    The Systematic Evaluation Program was initiated in February 1977 by the US Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  10. Spatial distribution of radionuclides in Lake Michigan biota near the Big Rock Point Nuclear Plant

    International Nuclear Information System (INIS)

    Wahlgren, M.A.; Yaguchi, E.M.; Nelson, D.M.; Marshall, J.S.

    1974-01-01

    A survey was made of four groups of biota in the vicinity of the Big Rock Point Nuclear Plant near Charlevoix, Michigan, to determine their usefulness in locating possible sources of plutonium and other radionuclides to Lake Michigan. This 70 MW boiling-water reactor, located on the Lake Michigan shoreline, was chosen because its fuel contains recycled plutonium, and because it routinely discharges very low-level radioactive wastes into the lake. Samples of crayfish (Orconectes sp.), green algae (Chara sp. and Cladophora sp.), and an aquatic macrophyte (Potamogeton sp.) were collected in August 1973, at varying distances from the discharge and analyzed for 239,240Pu, 90Sr, and five gamma-emitting radionuclides. Comparison samples of reactor waste solution have also been analyzed for these radionuclides. Comparisons of the spatial distributions of the extremely low radionuclide concentrations in biota clearly indicated that 137Cs, 134Cs, 65Zn, and 60Co were released from the reactor; their concentrations decreased exponentially with increasing distance from the discharge. Conversely, concentrations of 239,240Pu, 95Zr, and 90Sr showed no correlation with distance, suggesting that any input from Big Rock was insignificant with respect to the atmospheric origin of these isotopes. The significance of these results is discussed, particularly with respect to current public debate over the possibility of local environmental hazards associated with the use of plutonium as a nuclear fuel. (U.S.)
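    The exponential fall-off with distance reported for the reactor-released nuclides can be illustrated with a simple log-linear fit of C(d) = C0 * exp(-d/L). The data below are synthetic, chosen only to mimic such a trend; the fitted decay length L and source concentration C0 are illustrative, not values from the study.

```python
import math

# Log-linear least-squares fit of an exponential fall-off with distance,
# the kind of trend reported above for reactor-released radionuclides.
# The data are synthetic, not measurements from the survey.

distances = [0.1, 0.5, 1.0, 2.0, 4.0]   # km from the discharge (made up)
conc = [50.0, 33.0, 20.0, 8.0, 1.3]     # concentration in biota (made up)

# Least-squares line through (d, ln C): slope = -1/L, intercept = ln C0
n = len(distances)
sx = sum(distances)
sy = sum(math.log(c) for c in conc)
sxx = sum(d * d for d in distances)
sxy = sum(d * math.log(c) for d, c in zip(distances, conc))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
c0 = math.exp((sy - slope * sx) / n)
L = -1.0 / slope

print(f"fitted C0 ~ {c0:.1f}, decay length L ~ {L:.2f} km")
```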

  11. Big Rock Point Nuclear Plant. Annual operating report for 1976

    International Nuclear Information System (INIS)

    1977-01-01

    Net electrical power generated was 244,492.9 MWH with the reactor on line 4,405 hrs. Information is presented concerning operations, power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, release of radioactive materials, reportable occurrences, and fuel performance

  12. Integrated plant safety assessment. Systematic evaluation program, Big Rock Point Plant (Docket No. 50-155). Final report

    International Nuclear Information System (INIS)

    1984-05-01

    The Systematic Evaluation Program was initiated in February 1977 by the U.S. Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. The review provides (1) an assessment of how these plants compare with current licensing safety requirements relating to selected issues, (2) a basis for deciding how these differences should be resolved in an integrated plant review, and (3) a documented evaluation of plant safety when the supplement to the Final Integrated Plant Safety Assessment Report has been issued. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  13. Big Rock Point Nuclear Plant. Semiannual operations report No. 22, January--June 1975

    International Nuclear Information System (INIS)

    1975-01-01

    Net electrical power generated was 50,198.2 MWH(e) with the reactor on line 922.6 hrs. Information is presented concerning power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, and abnormal occurrences. (FS)

  14. Big Rock Point Nuclear Plant. 23rd semiannual report of operations, July--December 1976

    International Nuclear Information System (INIS)

    1976-01-01

    Net electrical power generated was 240,333.9 MWh(e) with the reactor on line 4,316.6 hr. Information is presented concerning operation, power generation, shutdowns, corrective maintenance, chemistry and radiochemistry, occupational radiation exposure, release of radioactive materials, changes, tests, experiments, and environmental monitoring

  15. Technical evaluation of the proposed changes in the technical specifications for emergency power sources for the Big Rock Point nuclear power plant

    International Nuclear Information System (INIS)

    Latorre, V.R.

    1979-12-01

    The technical evaluation is presented for the proposed changes to the Technical Specifications for emergency power sources for the Big Rock Point nuclear power plant. The criteria used to evaluate the acceptability of the changes include those delineated in IEEE Std-308-1974, and IEEE Std-450-1975 as endorsed by US NRC Regulatory Guide 1.129

  16. Big Bang Day : Physics Rocks

    CERN Multimedia

    Brian Cox; John Barrowman; Eddie Izzard

    2008-01-01

    Is particle physics the new rock 'n' roll? The fundamental questions about the nature of the universe that particle physics hopes to answer have attracted the attention of some very high profile and unusual fans. Alan Alda, Ben Miller, Eddie Izzard, Dara O'Briain and John Barrowman all have interests in this branch of physics. Brian Cox, CERN physicist and former member of the 90s band D:Ream, tracks down some very well known celebrity enthusiasts and takes a light-hearted look at why this subject can appeal to all of us.

  17. Turning points in reactor design

    International Nuclear Information System (INIS)

    Beckjord, E.S.

    1995-01-01

    This article provides some historical perspective on nuclear reactor design, beginning with PWR development for naval propulsion and the first commercial application at Yankee Rowe. Five turning points in reactor design and some safety problems associated with them are reviewed: (1) stability of Dresden-1, (2) ECCS, (3) PRA, (4) TMI-2, and (5) advanced passive LWR designs. While the emphasis is on the thermal-hydraulic aspects, the discussion is also about reactor systems.

  18. Turning points in reactor design

    Energy Technology Data Exchange (ETDEWEB)

    Beckjord, E.S.

    1995-09-01

    This article provides some historical perspective on nuclear reactor design, beginning with PWR development for naval propulsion and the first commercial application at Yankee Rowe. Five turning points in reactor design and some safety problems associated with them are reviewed: (1) stability of Dresden-1, (2) ECCS, (3) PRA, (4) TMI-2, and (5) advanced passive LWR designs. While the emphasis is on the thermal-hydraulic aspects, the discussion is also about reactor systems.

  19. Big Bang as a Critical Point

    Directory of Open Access Journals (Sweden)

    Jakub Mielczarek

    2017-01-01

    This article addresses the issue of possible gravitational phase transitions in the early universe. We suggest that a second-order phase transition observed in the Causal Dynamical Triangulations approach to quantum gravity may have cosmological relevance. The phase transition interpolates between a nongeometric crumpled phase of gravity and an extended phase with classical properties. A transition of this kind has been postulated earlier in the context of geometrogenesis in the Quantum Graphity approach to quantum gravity. We show that critical behavior may also be associated with a signature change in Loop Quantum Cosmology, which occurs as a result of quantum deformation of the hypersurface deformation algebra. In the considered cases, classical space-time originates at the critical point associated with a second-order phase transition. The relation between the gravitational phase transitions and the corresponding change of symmetry is underlined.

  20. Indian Point Reactor Startup and Performance

    Energy Technology Data Exchange (ETDEWEB)

    Deddens, J. C.; Batch, M. L.

    1963-09-15

    The testing program for the Indian Point Reactor is discussed, including the thermal and hydraulic evaluation of the primary coolant system. Analyses of fuel loading and initial criticality, measurement of operating coefficients of reactivity, control rod group reactivity worths, and xenon evaluation are presented. (R.E.U.)

  1. Reactor coolant flow measurements at Point Lepreau

    International Nuclear Information System (INIS)

    Brenciaglia, G.; Gurevich, Y.; Liu, G.

    1996-01-01

    The CROSSFLOW ultrasonic flow measurement system manufactured by AMAG is fully proven as reliable and accurate when applied to large piping in defined geometries for such applications as feedwater flow measurement. Its application to direct reactor coolant flow (RCF) measurements - both individual channel flows and bulk flows such as pump suction flow - has been well established through recent work by AMAG at Point Lepreau, with application to other reactor types (e.g., PWR) imminent. At Point Lepreau, measurements have been demonstrated at full power; improvements to consistently meet ±1% accuracy are in progress. The development and recent customization of CROSSFLOW for RCF measurement at Point Lepreau are described in this paper; typical measurement results are included. (author)

  2. Researchers solve big mysteries of pebble bed reactor

    Energy Technology Data Exchange (ETDEWEB)

    Shams, Afaque; Roelofs, Ferry; Komen, E.M.J. [Nuclear Research and Consultancy Group (NRG), Petten (Netherlands); Baglietto, Emilio [Massachusetts Institute of Technology, Cambridge, MA (United States). Dept. of Nuclear Science and Engineering; Sgro, Titus [CD-adapco, London (United Kingdom). Technical Marketing

    2014-03-15

    The PBR is one type of high-temperature reactor; it allows high-temperature operation while preventing the fuel from melting (bringing huge safety margins to the reactor) and offers high electricity efficiency. The design is also highly scalable; a plant could be designed to be as large or small as needed, and can even be made mobile, allowing it to be used onboard a ship. In a PBR, small particles of nuclear fuel, embedded in a moderating graphite pebble, are dropped into the reactor as needed. At the bottom, the pebbles can be removed simply by opening a small hatch and letting gravity pull them down. To cool the reactor and create electricity, helium gas is pumped through the reactor to pull heat out, which is then run through generators. One of the most difficult problems to deal with has been the possible appearance of local temperature hotspots within the pebble bed, heating to the point of melting the graphite moderators surrounding the fuel. Obviously, constructing a reactor and experimenting to investigate this possibility is out of the question. Instead, nuclear engineers have been attempting to simulate a PBR with various CFD codes. The thermo-dynamic analysis to simulate realistic conditions in a pebble bed is described and the results are shown. (orig.)

  3. Application Study of Self-balanced Testing Method on Big Diameter Rock-socketed Piles

    Directory of Open Access Journals (Sweden)

    Qing-biao WANG

    2013-07-01

    Through a technological test of the self-balanced testing method on big-diameter rock-socketed piles at the broadcasting centre building of Tai'an, this paper studies and analyzes the selection of the balance position, the production and installation of the load cell, the selection and installation of the displacement sensor, the loading steps, the stability conditions, and the determination of bearing capacity in the self-balanced testing process. The paper summarizes the key technology and engineering experience of the self-balanced testing method for big-diameter rock-socketed piles and also analyzes the difficult technical problems that urgently need to be resolved. The conclusions of the study have important significance for the popularization and application of the self-balanced testing method and for similar projects.

  4. Rock siting of nuclear power plants from a reactor safety standpoint. Status report October 1974

    International Nuclear Information System (INIS)

    1975-01-01

    The aim of this study is to clarify the advantages and disadvantages of an underground nuclear power plant from a reactor safety point of view, compared to a plant above ground. Principles for the technical design of a rock-sited BWR nuclear power plant are presented. Questions of sabotage and of closing down the plant at the end of the operational period are also treated. (K.K.)

  5. Response characteristics of reactor building on weathered soft rock ground

    International Nuclear Information System (INIS)

    Hirata, Kazuta; Tochigi, Hitoshi

    1991-01-01

    The purpose of this study is to investigate the seismic stability of nuclear power plants on layered soft bedrock grounds, focusing on the seismic response of reactor buildings. In this case, soft bedrock grounds refer to weathered soft bedrocks, several tens of meters thick, overlying hard bedrocks. Under this condition, there are two subjects regarding the estimation of the seismic response of reactor buildings. One is the estimation of the seismic response of the surface ground, and the other is the estimation of soil-structure interaction characteristics for structures embedded in layered grounds with a low impedance ratio between the surface ground and the bedrock. Paying attention to these subjects, many cases of seismic response analysis were carried out, and the following facts were clarified. In soft rock grounds overlying hard bedrocks, it was proved that the response acceleration was larger than in the case of uniform hard bedrocks. A simplified sway-and-rocking model was proposed to consider soil-structure interaction. It was proved that the response of reactor buildings was small when the effect of embedment was considered. (K.I.)

  6. Recent advances in analysis and prediction of Rock Falls, Rock Slides, and Rock Avalanches using 3D point clouds

    Science.gov (United States)

    Abellan, A.; Carrea, D.; Jaboyedoff, M.; Riquelme, A.; Tomas, R.; Royan, M. J.; Vilaplana, J. M.; Gauvin, N.

    2014-12-01

    The acquisition of dense terrain information using well-established 3D techniques (e.g. LiDAR, photogrammetry) and the use of new mobile platforms (e.g. Unmanned Aerial Vehicles), together with increasingly efficient post-processing workflows for image treatment (e.g. Structure From Motion), are opening up new possibilities for analysing, modeling and predicting rock slope failures. Examples of applications span scales ranging from the monitoring of small changes at an unprecedented level of detail (e.g. sub-millimeter-scale deformation under lab-scale conditions) to the detection of slope deformation at regional scale. In this communication we will show the main accomplishments of the Swiss National Foundation project "Characterizing and analysing 3D temporal slope evolution" carried out at the Risk Analysis group (Univ. of Lausanne) in close collaboration with the RISKNAT and INTERES groups (Univ. of Barcelona and Univ. of Alicante, respectively). We have recently developed a series of innovative approaches for rock slope analysis using 3D point clouds; some examples include the development of semi-automatic methodologies for the identification and extraction of rock-slope features such as discontinuities, type of material, rockfall occurrence and deformation. Moreover, we have been improving our knowledge of progressive rupture characterization thanks to several algorithms; some examples include the computation of 3D deformation, the use of filtering techniques on permanently based TLS, the use of rock slope failure analogies at different scales (laboratory simulations, monitoring at a glacier's front, etc.), and the modelling of the influence of external forces such as precipitation on the acceleration of the deformation rate. We have also been interested in the analysis of rock slope deformation prior to the occurrence of fragmental rockfalls and the interaction of this deformation with the spatial location of future events.
In spite of these recent advances
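    The simplest of the 3D-deformation computations mentioned above is a cloud-to-cloud (C2C) comparison between two survey epochs. The sketch below uses tiny synthetic clouds and brute-force nearest-neighbour search for clarity; production workflows use spatial indexing (k-d trees) and more robust metrics such as M3C2, and the 0.1 m threshold is an arbitrary illustration.

```python
import math

# Cloud-to-cloud (C2C) change detection between two epochs of a point
# cloud: for each point in the later epoch, find the distance to its
# nearest neighbour in the reference epoch. Large distances flag change
# (e.g. a rockfall scar or a displaced block). Brute force for clarity.

def c2c_distances(reference, compared):
    """For each point in `compared`, distance to its nearest point in `reference`."""
    return [min(math.dist(p, q) for q in reference) for p in compared]

# Reference epoch: a small synthetic patch of "slope" points on a grid
epoch1 = [(x * 0.5, y * 0.5, 0.0) for x in range(5) for y in range(5)]

# Second epoch: the same patch, but one point displaced 0.4 m outward
epoch2 = list(epoch1)
epoch2[12] = (epoch2[12][0], epoch2[12][1], 0.4)

d = c2c_distances(epoch1, epoch2)
changed = [i for i, v in enumerate(d) if v > 0.1]  # 0.1 m change threshold
print(changed)  # only the displaced point exceeds the threshold
```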

  7. Sideloading - Ingestion of Large Point Clouds into the Apache Spark Big Data Engine

    Directory of Open Access Journals (Sweden)

    J. Boehm

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
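    The "sideloading" pattern described above — distribute only file paths, and let each worker open its file with an ordinary single-CPU reader — can be sketched as follows. Python's built-in map() stands in for Spark's rdd.map(); with PySpark the same reader would be passed to something like sc.parallelize(paths).flatMap(read_points). The binary format here (packed x, y, z doubles) is a toy stand-in for real formats such as LAS/LAZ, and all file names are invented for the example.

```python
import os
import struct
import tempfile

# Single-node reader: parse one binary file of little-endian (x, y, z)
# doubles into tuples. In the sideloading pattern this runs on a worker.
def read_points(path):
    points = []
    with open(path, "rb") as f:
        while chunk := f.read(24):          # 3 * 8 bytes per point
            points.append(struct.unpack("<3d", chunk))
    return points

# Create two toy input files standing in for large point-cloud tiles
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(2):
    path = os.path.join(tmpdir, f"tile{i}.bin")
    with open(path, "wb") as f:
        for j in range(3):
            f.write(struct.pack("<3d", float(i + j), float(j), 0.0))
    paths.append(path)

# Driver side: map the reader over the path list (the ingestion step).
# On a cluster, this map is what gets distributed across the nodes.
tiles = list(map(read_points, paths))
total_points = sum(len(t) for t in tiles)
print(total_points)  # 6 points ingested across 2 "tiles"
```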

  8. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    Science.gov (United States)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.

  9. Reactor dosimetry calibrations in the Big Ten critical assembly

    International Nuclear Information System (INIS)

    Barr, D.W.; Hansen, G.E.

    1977-01-01

    Eleven irradiations of foil packs located in the central region of Big Ten were made for the Interlaboratory Reaction Rate Program. Each irradiation was at a nominal 10^15 fluence and the principal fluence monitor was the National Bureau of Standards' double fission chamber containing 235U and 238U deposits, located at the center of Big Ten. Secondary monitors consisted of three external fission chambers and two internal foil sets containing Au, In, and Al. Activities of one set were counted at the LASL and the other at the Hanford Engineering Development Laboratory. The uncertainty in relative fluence for each irradiation was ±0.3%.

  10. Research reactor put Canada in the nuclear big time

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    The history of the NRX reactor is briefly recounted. When NRX started up in 1947, it was the most powerful neutron source in the world. It is now the oldest research reactor still operating. NRX had to be rebuilt after an accident in 1952, and its calandria was changed again in 1970. Loops in NRX were used to test fuel for the Nautilus submarine, and the first zircaloy pressure tube in the world. At the present time, NRX is in a 'hot standby' condition as a backup to the NRU reactor, which is used mainly for isotope production. NRX will be decommissioned after completion and startup of the new MAPLE-X reactor

  11. High-Temperature Gas-Cooled Test Reactor Point Design

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Laboratory; Bayless, Paul David [Idaho National Laboratory; Nelson, Lee Orville [Idaho National Laboratory; Gougar, Hans David [Idaho National Laboratory; Kinsey, James Carl [Idaho National Laboratory; Strydom, Gerhard [Idaho National Laboratory; Kumar, Akansha [Idaho National Laboratory

    2016-04-01

    A point design has been developed for a 200 MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched UCO fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technological readiness level, licensing approach and costs.

  12. Optimal configuration of spatial points in the reactor cell

    International Nuclear Information System (INIS)

    Bosevski, T.

    1968-01-01

    An optimal configuration of spatial points was chosen with respect to the total number needed for integration of reactions in the reactor cell. The previously developed code VESTERN was used for numerical verification of the method on a standard reactor cell. The code applies the collision probability method for calculating the neutron flux distribution. It is shown that the total number of spatial points is half the respective number of spatial zones needed for determination of the number of reactions in the cell, with the preset precision. This result shows the direction for further condensing of the procedure for calculating the space-energy distribution of the neutron flux in a reactor cell. [sr]

  13. Evaluation of the integrity of reactor vessels designed to ASME Code, Sections I and/or VIII

    International Nuclear Information System (INIS)

    Hoge, K.G.

    1976-01-01

    A documented review of nuclear reactor pressure vessels designed to ASME Code, Sections I and/or VIII is made. The review is primarily concerned with the design specifications and quality assurance programs utilized for the reactor vessel construction and the status of power plant material surveillance programs, pressure-temperature operating limits, and inservice inspection programs. The following ten reactor vessels for light-water power reactors are covered in the report: Indian Point Unit No. 1, Dresden Unit No. 1, Yankee Rowe, Humboldt Bay Unit No. 3, Big Rock Point, San Onofre Unit No. 1, Connecticut Yankee, Oyster Creek, Nine Mile Point Unit No. 1, and La Crosse.

  14. A 'big-mac' high converting water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Ronen, Y; Dali, Y [Ben-Gurion Univ. of the Negev, Beersheba (Israel). Dept. of Nuclear Engineering

    1996-12-01

    An effort is currently being made to get rid of plutonium, so at this time a scientific study of a high converting reactor may seem out of place. However, it is our opinion that the future of nuclear energy lies, among other things, in the clever utilization of plutonium. It is also our opinion that one of the best ways to utilize plutonium is in high converting water reactors (authors).

  15. A two-point kinetic model for the PROTEUS reactor

    International Nuclear Information System (INIS)

    Dam, H. van.

    1995-03-01

    A two-point reactor kinetic model for the PROTEUS reactor is developed and the results are described in terms of frequency-dependent reactivity transfer functions for the core and the reflector. It is shown that at higher frequencies space-dependent effects occur which imply failure of the one-point kinetic model. In the modulus of the transfer functions these effects become apparent above a radian frequency of about 100 s⁻¹, whereas for the phase behaviour the deviation from a point model already starts at a radian frequency of 10 s⁻¹. (orig.)

  16. Second nuclear reactor, Point Lepreau, New Brunswick

    International Nuclear Information System (INIS)

    Connelly, R.; Desjardins, L.

    1985-05-01

    This is a report of the findings, conclusions and recommendations of the Environmental Assessment Panel appointed by the Ministers of Environment of New Brunswick and Canada to review the proposal to build a second nuclear unit at Point Lepreau, New Brunswick. The Panel's mandate was to assess the environmental and related social impacts of the proposal. The Panel concludes that the project can proceed without significant adverse effects provided certain recommendations are followed. In order to understand the impacts of Lepreau II, it was necessary to review, to the extent possible, the actual effects of Lepreau I before estimating the incremental effects of Lepreau II. In so doing, the Panel made a number of recommendations that should be implemented now. The information gathered and experience gained can be applied to Lepreau II to ensure that potential impacts are reduced to a minimum and existing concerns associated with Lepreau I can be corrected.

  17. Power Trip Set-points of Reactor Protection System for New Research Reactor

    International Nuclear Information System (INIS)

    Lee, Byeonghee; Yang, Soohyung

    2013-01-01

    This paper deals with the trip set-points related to reactor power, considering the reactivity-induced accident (RIA) of a new research reactor. Possible reactivity-induced accident scenarios were simulated and the effects of the trip set-points on the critical heat flux ratio (CHFR) were calculated. Trip set-points that meet the acceptance criterion and guarantee sufficient margins from normal operation were then determined. Three different trip set-points related to reactor power are determined based on the RIA of the new research reactor at FP condition, over 0.1%FP, and under 0.1%FP. Under various reactivity insertion rates, the CHFRs are calculated and checked against the acceptance criterion. For an RIA at FP condition, the acceptance criterion can be satisfied even if only the high power set-point is used for reactor trip. Since the design of the reactor is still progressing and needs a safety margin for possible design changes, 18 MW is recommended as the high power set-point. For an RIA at 0.1%FP, a high power set-point of 18 MW and a high log rate of 10%pp/s work well and the acceptance criterion is satisfied. For operations under 0.1%FP, application of the high log rate is necessary to satisfy the acceptance criterion. Considering a possible decrease of the CHFR margin due to design changes, the high log rate is suggested to be 8%pp/s. The suggested trip set-points have been identified based on preliminary design data for the new research reactor; they will therefore be re-established as the reactor design progresses. The reactor protection system (RPS) of the new research reactor is designed for safe shutdown of the reactor and prevention of the release of radioactive material to the environment. The trip set-point of the RPS is essential for reactor safety and therefore should be determined to mitigate the consequences of accidents. At the same time, the trip set-point should secure margins from the normal operational condition to avoid

  18. Study of the stochastic point reactor kinetic equation

    International Nuclear Information System (INIS)

    Gotoh, Yorio

    1980-01-01

    A diagrammatic technique is used to solve the stochastic point reactor kinetic equation. The method gives exact results which are derived from Fokker-Planck theory. A Green's function dressed with the clouds of noise is defined, which is a transfer function of a point reactor with fluctuating reactivity. An integral equation for the correlation function of neutron power is derived using the following assumptions: 1) the Green's function should be dressed with noise, 2) only the ladder-type diagrams contribute to the correlation function. For a white noise and the one delayed neutron group approximation, the norm of the integral equation and the variance-to-mean-squared ratio are analytically obtained. (author)

  19. Solution of the reactor point kinetics equations by MATLAB computing

    Directory of Open Access Journals (Sweden)

    Singh Sudhansu S.

    2015-01-01

    The numerical solution of the point kinetics equations in the presence of Newtonian temperature feedback has been a challenging issue for analyzing reactor transients. The reactor point kinetics equations are a system of stiff ordinary differential equations which need special numerical treatment. Although a plethora of numerical methods have been introduced to solve the point kinetics equations over the years, some of the simple and straightforward methods still work very efficiently with extraordinary accuracy. As an example, it has been shown recently that the fundamental backward Euler finite difference algorithm, with its simplicity, has proven to be one of the most effective legacy methods. Complementing the backward Euler finite difference scheme, the present work demonstrates the application of the ordinary differential equation suite available in the MATLAB software package to solve the stiff reactor point kinetics equations with Newtonian temperature feedback effects very effectively by analyzing various classic benchmark cases. The fair accuracy of the results implies the efficient applicability of the MATLAB ordinary differential equation suite for solving the reactor point kinetics equations as an alternate method for future applications.
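    The stiff system referred to in this abstract can be illustrated with a minimal one-delayed-group point kinetics model. The sketch below uses SciPy's BDF integrator as a stand-in for the MATLAB ODE suite; the parameter values are illustrative only and are not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-delayed-group point kinetics (illustrative parameters, no feedback):
#   dn/dt = ((rho - beta)/Lambda) * n + lam * C
#   dC/dt = (beta/Lambda) * n - lam * C
beta, Lambda, lam = 0.0065, 1e-4, 0.08
rho = 0.003  # step reactivity insertion, below prompt critical (rho < beta)

def rhs(t, y):
    n, C = y
    return [((rho - beta) / Lambda) * n + lam * C,
            (beta / Lambda) * n - lam * C]

# Start from the critical equilibrium: n = 1, C = beta * n / (lam * Lambda)
y0 = [1.0, beta / (lam * Lambda)]
sol = solve_ivp(rhs, (0.0, 1.0), y0, method="BDF", rtol=1e-8, atol=1e-10)
print(sol.y[0, -1])  # relative power after 1 s: prompt jump, then slow rise
```

    The implicit BDF method takes large steps despite the millisecond-scale prompt time constant; an explicit solver would need far more steps on the same problem.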

  20. Fast neutron reactors: the safety point of view

    International Nuclear Information System (INIS)

    Laverie, M.; Avenas, M.

    1984-01-01

    All versions of nuclear reactors present favourable and unfavourable characteristics from the point of view of safety. The safety of the installations is obtained by making efforts to utilize in the best possible way those which are favourable and by taking proper steps in the face of those which are unfavourable. The present article shows how this general principle has been applied as regards the fast neutron reactors of integrated design which have been developed in France, taking into account the specific features of this version. A qualitative method to compare the safety of this version with that of pressurized water reactors, which have been widely put to the test commercially all over the world, is presented. These analyses make, generally speaking, several positive characteristics stand out for these fast neutron reactors from the safety aspect.

  1. Innovations and Enhancements for a Consortium of Big-10 University Research and Training Reactors. Final Report

    International Nuclear Information System (INIS)

    Brenizer, Jack

    2011-01-01

    The Consortium of Big-10 University Research and Training Reactors was by design a strategic partnership of seven leading institutions. We received the support of both our industry and DOE laboratory partners. Investments in reactor, laboratory and program infrastructure, allowed us to lead the national effort to expand and improve the education of engineers in nuclear science and engineering, to provide outreach and education to pre-college educators and students and to become a key resource of ideas and trained personnel for our U.S. industrial and DOE laboratory collaborators.

  2. Decontamination of the Douglas Point reactor, May 1983

    International Nuclear Information System (INIS)

    Lesurf, J.E.; Stepaniak, R.; Broad, L.G.; Barber, W.G.

    1983-01-01

    The Douglas Point reactor primary heat transport system, including the fuel, was successfully decontaminated by the CAN-DECON process in 1975. A second decontamination, also using the CAN-DECON process, was successfully performed in May 1983. This paper outlines the need for the decontamination, the process used, the results obtained, and the benefits to station maintenance and operation.

  3. Review of Kaganove's solution for the reactor point kinetics equations

    International Nuclear Information System (INIS)

    Couto, R.T.; Santo, A.C.F. de.

    1993-09-01

    A review of Kaganove's method for solving the reactor point kinetics equations is performed. This was the method chosen to calculate the power in ATR, a computer program for the analysis of reactivity transients. The reasons for this choice and the adaptation of the method to the purposes of ATR are presented. (author)

  4. CRED REA Fish Team Stationary Point Count Surveys at Kaula Rock, Main Hawaiian Islands, 2006

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Stationary Point Counts at 4 stations at each survey site were surveyed as part of Rapid Ecological Assessments (REA) conducted at 2 sites around Kaula Rock in the...

  5. Fractional neutron point kinetics equations for nuclear reactor dynamics

    International Nuclear Information System (INIS)

    Espinosa-Paredes, Gilberto; Polo-Labarrios, Marco-A.; Espinosa-Martinez, Erick-G.; Valle-Gallegos, Edmundo del

    2011-01-01

    The fractional point-neutron kinetics model for the dynamic behavior of a nuclear reactor is derived and analyzed in this paper. The fractional model retains the main dynamic characteristics of the neutron motion, in which the relaxation time associated with a rapid variation in the neutron flux contains a fractional order, acting as exponent of the relaxation time, to obtain the best representation of nuclear reactor dynamics. The physical interpretation of the fractional order is related to non-Fickian effects from the neutron diffusion equation point of view. The numerical approximation to the solution of the fractional neutron point kinetics model, which can be represented as a multi-term high-order linear fractional differential equation, is calculated by reducing the problem to a system of ordinary and fractional differential equations. The numerical stability of the fractional scheme is investigated in this work. Results for neutron dynamic behavior for both positive and negative reactivity and for different values of the fractional order are shown and compared with the classic neutron point kinetics equations. Additionally, a review of the literature on the neutron point kinetics equations is presented, encompassing papers written in English on this research topic (as well as some books and technical reports) published from 1940 to 2010.

  6. Rock siting of nuclear power plants from a reactor safety standpoint

    International Nuclear Information System (INIS)

    1975-11-01

    The study has aimed at surveying the advantages and disadvantages of a rock-sited nuclear power plant from a reactor safety standpoint. The studies performed are almost entirely concentrated on the BWR alternative. The design of a nuclear power plant in rock judged most appropriate has been studied in greater detail, and a relatively extensive safety analysis has been made. It is found that the presented technical design of the rock-sited alternative is sufficiently advanced to form a basis for further project work. The chosen technical design of the reactor plant demands a cavern with a 45-50 metre span. Caverns with such spans, without strengthening efforts, are used in mines, but have not previously been used for industrial plants. Studies of the stability of such caverns show that a safety level is attainable corresponding to the safety required for the other parts of the nuclear power plant. The conditions are that the rock is of high quality, that necessary strengthening measures are taken, and that careful studies of the rock are made before and during the blasting, and also during operation of the plant. When locating a rock-sited nuclear power plant, the same criteria must be considered as for an above-ground plant, with additional stronger demands on rock quality. The presented rock-sited nuclear power plant has been assessed to cost 20% more in total construction costs than a corresponding above-ground plant. The motivations for rock siting also depend on whether a condensing plant for electricity production only, or a plant for combined power production and district heating, is considered. The latter would under certain circumstances make rock siting more attractive. (author)

  7. PARALLEL PROCESSING OF BIG POINT CLOUDS USING Z-ORDER-BASED PARTITIONING

    Directory of Open Access Journals (Sweden)

    C. Alis

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest

  8. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    Science.gov (United States)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail with more bits yielding finer partitioning. 
We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest neighbour algorithm
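    The bit-interleaving described in the abstract can be sketched in a few lines; the function below is a plain-Python illustration (not the Apache Spark implementation) and reproduces the abstract's worked example, where the grid cell (x = 1, y = 3) encodes to 11.

```python
def morton2d(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of x and y into a 2D Z-order (Morton) code.

    Bit i of x lands at position 2*i, bit i of y at position 2*i + 1.
    """
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)
        code |= ((y >> i) & 1) << (2 * i + 1)
    return code

# The abstract's example: grid cell (x = 1 = 01b, y = 3 = 11b) -> 1011b = 11
print(morton2d(1, 3))  # 11
```

    Because nearby grid cells share Morton-code prefixes, truncating the code to fewer bits per dimension yields the coarser, locality-preserving partitions the abstract describes.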

  9. Incoherent SSI Analysis of Reactor Building using 2007 Hard-Rock Coherency Model

    International Nuclear Information System (INIS)

    Kang, Joo-Hyung; Lee, Sang-Hoon

    2008-01-01

    Many strong earthquake recordings show the response motions at building foundations to be less intense than the corresponding free-field motions. To account for these phenomena, the concept of spatial variation, or wave incoherence, was introduced. Several approaches for its application to practical analysis and design as part of the soil-structure interaction (SSI) effect have been developed. However, conventional wave incoherency models did not reflect the characteristics of earthquake data from hard-rock sites, and their application to practical nuclear structures on hard-rock sites was not sufficiently justified. This paper is focused on the impact of the hard-rock coherency model proposed in 2007 on the incoherent SSI analysis results for a nuclear power plant (NPP) structure. A typical reactor building of a pressurized water reactor (PWR) type NPP is modeled, considering both surface and embedded foundations. The model is also assumed to be located on medium-hard rock and hard-rock sites. The SSI analysis results are obtained and compared for coherent and incoherent input motions. The structural responses considering rocking and torsion effects are also investigated.

  10. Methods for solving the stochastic point reactor kinetic equations

    International Nuclear Information System (INIS)

    Quabili, E.R.; Karasulu, M.

    1979-01-01

    Two new methods are presented for analysis of the statistical properties of nonlinear outputs of a point reactor to stochastic non-white reactivity inputs. They are Bourret's approximation and logarithmic linearization. The results have been compared with the exact results, previously obtained in the case of Gaussian white reactivity input. It was found that when the reactivity noise has short correlation time, Bourret's approximation should be recommended because it yields results superior to those yielded by logarithmic linearization. When the correlation time is long, Bourret's approximation is not valid, but in that case, if one can assume the reactivity noise to be Gaussian, one may use the logarithmic linearization. (author)

  11. Automatic extraction of discontinuity orientation from rock mass surface 3D point cloud

    Science.gov (United States)

    Chen, Jianqin; Zhu, Hehua; Li, Xiaojun

    2016-10-01

    This paper presents a new method for extracting discontinuity orientation automatically from a rock mass surface 3D point cloud. The proposed method consists of four steps: (1) automatic grouping of discontinuity sets using an improved K-means clustering method, (2) discontinuity segmentation and optimization, (3) discontinuity plane fitting using the Random Sample Consensus (RANSAC) method, and (4) coordinate transformation of the discontinuity plane. The method is first validated using the point cloud of a small piece of a rock slope acquired by photogrammetry. The extracted discontinuity orientations are compared with ones measured in the field. Then it is applied to publicly available LiDAR data of a road-cut rock slope at the Rockbench repository. The extracted discontinuity orientations are compared with those from the method proposed by Riquelme et al. (2014). The results show that the presented method is reliable and of high accuracy, and can meet engineering needs.
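    Step (3) of the abstract, plane fitting with RANSAC, can be sketched as follows. This is a generic minimal RANSAC plane fit on synthetic data, not the authors' implementation; the parameter values are illustrative.

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.05, rng=None):
    """Fit a plane to an (N, 3) point cloud by RANSAC.

    Returns (unit normal, offset d, boolean inlier mask) for n . p + d = 0.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best = None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue  # degenerate (collinear) sample
        n = n / norm
        d = -n @ p0
        inliers = np.abs(points @ n + d) < tol  # distance-to-plane test
        if inliers.sum() > best_inliers.sum():
            best_inliers, best = inliers, (n, d)
    return best[0], best[1], best_inliers

# Synthetic test: a noisy horizontal plane z ~ 0 plus a few gross outliers
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, (500, 2)), rng.normal(0, 0.01, 500)])
pts[:20, 2] += 5.0  # outliers well off the plane
normal, d, mask = ransac_plane(pts, rng=0)
print(abs(normal[2]))  # close to 1: the recovered normal is near the z-axis
```

    For a discontinuity surface, the recovered unit normal would then be converted to a dip direction and dip angle, which is the coordinate transformation of step (4).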

  12. End point control of an actinide precipitation reactor

    International Nuclear Information System (INIS)

    Muske, K.R.

    1997-01-01

    The actinide precipitation reactors in the nuclear materials processing facility at Los Alamos National Laboratory are used to remove actinides and other heavy metals from the effluent streams generated during the purification of plutonium. These effluent streams consist of hydrochloric acid solutions, ranging from one to five molar in concentration, in which actinides and other metals are dissolved. The actinides present are plutonium and americium. Typical actinide loadings range from one to five grams per liter. The most prevalent heavy metals are iron, chromium, and nickel, which come from stainless steel. Removal of these metals from solution is accomplished by hydroxide precipitation during the neutralization of the effluent. An end point control algorithm for the semi-batch actinide precipitation reactors at Los Alamos National Laboratory is described. The algorithm is based on an equilibrium solubility model of the chemical species in solution. This model is used to predict the amount of base hydroxide necessary to reach the end point of the actinide precipitation reaction. The model parameters are updated by on-line pH measurements.
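    The base-demand prediction in the abstract can be caricatured by a purely stoichiometric sketch: the hydroxide required is the acid to be neutralized plus the hydroxide consumed in precipitating each dissolved metal. The function, species, and concentrations below are hypothetical illustrations; the actual algorithm uses an equilibrium solubility model with parameters updated by on-line pH measurements.

```python
# Hypothetical stoichiometric sketch of the end-point base demand;
# not the Los Alamos equilibrium solubility model.
def base_demand_mol(volume_l, acid_molarity, metals):
    """Moles of OH- to neutralize the acid and precipitate the metals.

    metals: list of (concentration mol/L, OH- per ion) pairs,
    e.g. Fe3+ -> Fe(OH)3 consumes 3 OH- per ion.
    """
    acid = volume_l * acid_molarity
    precipitation = volume_l * sum(c * n_oh for c, n_oh in metals)
    return acid + precipitation

# 10 L of 2 M HCl carrying 0.02 M Fe3+ and 0.01 M Pu4+ (illustrative numbers)
print(base_demand_mol(10.0, 2.0, [(0.02, 3), (0.01, 4)]))  # 21.0 mol OH-
```

    In practice the equilibrium model matters because hydroxide solubility products, not simple stoichiometry, set where each species actually precipitates as the pH rises.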

  13. Method of nuclear reactor control using a variable temperature load dependent set point

    International Nuclear Information System (INIS)

    Kelly, J.J.; Rambo, G.E.

    1982-01-01

    A method and apparatus for controlling a nuclear reactor in response to a variable average reactor coolant temperature set point is disclosed. The set point is dependent upon percent of full power load demand. A manually-actuated 'droop mode' of control is provided whereby the reactor coolant temperature is allowed to drop below the set point temperature a predetermined amount, wherein the control is switched from reactor control rods exclusively to feedwater flow.

  14. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  15. Big slow movers: a look at weathered-rock slides in Western North Carolina

    Science.gov (United States)

    Rebecca S. Latham; Richard M. Wooten; Anne C. Witt; Stephen J. Fuemmeler; Kenneth a. Gillon; Thomas J. Douglas; Jennifer B. Bauer; Barton D. Clinton

    2007-01-01

    The North Carolina Geological Survey (NCGS) is currently implementing a landslide hazard-mapping program in western North Carolina authorized by the North Carolina Hurricane Recovery Act of 2005. To date, over 2700 landslides and landslide deposits have been documented. A small number of these landslides are relatively large, slow-moving, weathered-rock slides...

  16. High Temperature Gas-Cooled Test Reactor Point Design: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bayless, Paul David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nelson, Lee Orville [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gougar, Hans David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinsey, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    A point design has been developed for a 200-MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched uranium oxycarbide fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technology readiness level, licensing approach, and costs of the test reactor point design.

  17. High Temperature Gas-Cooled Test Reactor Point Design: Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Sterbentz, James William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bayless, Paul David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Nelson, Lee Orville [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gougar, Hans David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-01

    A point design has been developed for a 200-MW high-temperature gas-cooled test reactor. The point design concept uses standard prismatic blocks and 15.5% enriched uranium oxycarbide fuel. Reactor physics and thermal-hydraulics simulations have been performed to characterize the capabilities of the design. In addition to the technical data, overviews are provided on the technology readiness level, licensing approach, and costs of the test reactor point design.

  18. Seismic capacities of masonry walls at the big rock point nuclear generating plant

    International Nuclear Information System (INIS)

    Wesley, D.A.; Bunon, H.; Jenkins, R.B.

    1984-01-01

    An evaluation to determine the ability of selected concrete block walls in the vicinity of essential equipment to withstand seismic excitation was conducted. The seismic input to the walls was developed in accordance with the Systematic Evaluation Program (SEP) site-specific response spectra for the site. Time-history inputs to the walls were determined from the response of the turbine building complex. Analyses were performed to determine the capacities of the walls to withstand both in-plane and transverse seismic loads. Transverse load capacities were determined from time-history analyses of nonlinear two-dimensional analytical models of the walls. Separate inputs were used at the tops and bottoms of the walls to reflect the amplification through the building. The walls were unreinforced vertically, with one exception, and have unsupported heights as high as 20'-8''. Also, cantilever walls as high as 11'-2'' were included in the evaluation. Factors of safety were based on stability of the walls for the transverse response, and on code allowable stresses (Reference 1) for the in-plane response.

  19. Balancing on the Edge: An Approach to Leadership and Resiliency that Combines Rock Climbing with Four Key Touch Points

    Science.gov (United States)

    Winkler, Harold E.

    2005-01-01

    In this article, the author compares leadership and resiliency with rock climbing. It describes the author's personal experience on a rock climbing adventure with his family and how it required application of similar elements as that of leadership and resiliency. The article contains the following sections: (1) Being Resilient; (2) Points of…

  20. Properties of uranium and thorium in host rocks of multi-metal (Ag, Pb, U, Cu, Bi, Z, F) Big Kanimansur deposit (Tajikistan)

    International Nuclear Information System (INIS)

    Fayziev, A.R.

    2007-01-01

    The host rocks of the multi-metal Big Kanimansur deposit contain elevated average uranium and thorium contents, exceeding clarke values by factors of 7 and 2.5, respectively. A second property of the radioactive element distribution is the low thorium-to-uranium ratio. These criteria can be used as prospecting signs for the flanks and depths of known ore fields, as well as for new areas of multi-metal mineralisation.

  1. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. 
Then, discontinuity set orientation is calculated using Kernel Density Estimation and
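The K-Nearest-Neighbor/PCA facet-identification step described above can be sketched with a brute-force local normal estimator. This is a minimal illustration, not the authors' Matlab tool; all function names and parameter values here are our own.

```python
import numpy as np

def estimate_normals(points, k=12):
    """Estimate a unit normal at each point via PCA on its k nearest neighbours.

    The eigenvector of the local covariance matrix with the smallest
    eigenvalue approximates the normal of a locally planar patch.
    """
    normals = np.empty_like(points)
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(d)[:k]]           # k nearest neighbours (brute force)
        cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
        w, v = np.linalg.eigh(cov)                 # eigenvalues in ascending order
        normals[i] = v[:, 0]                       # smallest-eigenvalue eigenvector
    return normals

# Synthetic outcrop facet: a noisy plane z ~ 0
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 1, 200),
                       rng.uniform(0, 1, 200),
                       rng.normal(0, 1e-3, 200)])
n = estimate_normals(pts)
# All estimated normals should be close to +/- (0, 0, 1)
print(np.abs(n[:, 2]).min())
```

Points sharing near-parallel normals can then be clustered into discontinuity sets, which is where the orientation statistics of the next step take over.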

  2. A highly accurate benchmark for reactor point kinetics with feedback

    International Nuclear Information System (INIS)

    Ganapol, B. D.; Picca, P.

    2010-10-01

    This work applies the concept of convergence acceleration, also known as extrapolation, to the solution of the reactor kinetics equations describing nuclear reactor transients. The method features simplicity: an approximate finite difference formulation is constructed and converged to high accuracy from knowledge of how the error term behaves. Through Romberg extrapolation, we demonstrate its high accuracy for a variety of imposed reactivity insertions found in the literature, as well as for nonlinear temperature and fission product feedback. A unique feature of the proposed method, called the RKE/R(omberg) algorithm, is interval bisection to ensure high accuracy. (Author)
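The idea of extrapolating a low-order finite-difference solution can be sketched for one delayed-neutron group: solve with backward differences at two step sizes and combine them in the first Richardson/Romberg column. The kinetics parameters below are illustrative, not the paper's.

```python
import numpy as np

# One-group point kinetics: dn/dt = ((rho - beta)/Lam) n + lam C
#                           dC/dt = (beta/Lam) n - lam C
beta, Lam, lam, rho = 0.0065, 1e-3, 0.08, 0.003   # illustrative parameters
A = np.array([[(rho - beta) / Lam, lam],
              [beta / Lam,        -lam]])
x0 = np.array([1.0, beta / (lam * Lam)])           # equilibrium precursor level
T = 0.1

def implicit_euler(h):
    """First-order backward-Euler solution at time T with step h."""
    x = x0.copy()
    M = np.eye(2) - h * A
    for _ in range(int(round(T / h))):
        x = np.linalg.solve(M, x)
    return x

# Reference solution from the matrix exponential (eigen-decomposition)
w, V = np.linalg.eig(A)
exact = (V @ np.diag(np.exp(w * T)) @ np.linalg.inv(V) @ x0).real

xh  = implicit_euler(1e-3)
xh2 = implicit_euler(5e-4)
x_extrap = 2.0 * xh2 - xh   # first Romberg/Richardson column: O(h) -> O(h^2)

print(abs(xh[0] - exact[0]), abs(x_extrap[0] - exact[0]))
```

Because backward Euler has a first-order error term, the combination `2*x(h/2) - x(h)` cancels it, which is the same error-behaviour argument the Romberg table exploits at higher orders.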

  3. Development of uniform hazard response spectra for rock sites considering line and point sources of earthquakes

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Kushwaha, H.S.

    2001-12-01

    Traditionally, the seismic design basis ground motion has been specified by normalised response spectral shapes and peak ground acceleration (PGA). The mean recurrence interval (MRI) used to be computed for the PGA only. It is shown that the MRIs associated with such response spectra are not the same at all frequencies. The present work develops uniform hazard response spectra, i.e. spectra having the same MRI at all frequencies, for line and point sources of earthquakes by using a large number of strong motion accelerograms recorded on rock sites. The sensitivity of the results to changes in various parameters is also presented. This work is an extension of an earlier work for areal sources of earthquakes. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)

  4. An accurate solution of point reactor neutron kinetics equations of multi-group of delayed neutrons

    International Nuclear Information System (INIS)

    Yamoah, S.; Akaho, E.H.K.; Nyarko, B.J.B.

    2013-01-01

    Highlights: ► An analytical solution is proposed to solve the point reactor kinetics equations (PRKE). ► The method is based on formulating a coefficient matrix of the PRKE. ► The method was applied to solve the PRKE for six groups of delayed neutrons. ► Results show good agreement with other traditional methods in the literature. ► The method is accurate and efficient for solving the point reactor kinetics equations. - Abstract: The understanding of the time-dependent behaviour of the neutron population in a nuclear reactor, in response to either a planned or unplanned change in the reactor conditions, is of great importance to the safe and reliable operation of the reactor. In this study, an accurate analytical solution of the point reactor kinetics equations with multiple groups of delayed neutrons for specified reactivity changes is proposed to calculate the change in neutron density. The method is based on formulating a coefficient matrix of the homogeneous differential equations of the point reactor kinetics and calculating the eigenvalues and corresponding eigenvectors of that matrix. A small time interval is chosen within which the reactivity stays relatively constant. The analytical method was applied to solve the point reactor kinetics equations with six groups of delayed neutrons for a representative thermal reactor. Problems of step, ramp and temperature-feedback reactivities are computed and the results compared with other traditional methods. The comparison shows that the method presented in this study is accurate and efficient for solving the point reactor kinetics equations with multiple groups of delayed neutrons.
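The coefficient-matrix/eigen-decomposition approach can be sketched for a single constant-reactivity interval as follows. The six-group delayed-neutron constants and generation time below are illustrative placeholders, not values taken from the paper.

```python
import numpy as np

# Six-group delayed-neutron data (illustrative thermal-reactor values)
beta_i = np.array([2.11e-4, 1.395e-3, 1.222e-3, 2.645e-3, 8.32e-4, 1.69e-4])
lam_i  = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])
beta, Lam = beta_i.sum(), 5e-4
rho = 0.5 * beta                 # step insertion of half a dollar

# Coefficient matrix of the homogeneous point kinetics system
A = np.zeros((7, 7))
A[0, 0] = (rho - beta) / Lam
A[0, 1:] = lam_i
A[1:, 0] = beta_i / Lam
A[1:, 1:] = -np.diag(lam_i)

# x(t) = V exp(w t) V^{-1} x(0), exact while rho is held constant
x0 = np.concatenate([[1.0], beta_i / (lam_i * Lam)])   # critical equilibrium
w, V = np.linalg.eig(A)

def propagate(t):
    return (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V) @ x0).real

n_1s = propagate(1.0)[0]
print(n_1s)   # prompt jump followed by slow growth on the stable period
```

For time-varying reactivity, the same propagation is applied over each small interval in which the reactivity is treated as constant, rebuilding `A` at every step.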

  5. Programme of hot points eradication (Co-60) led on French PWR type reactors

    International Nuclear Information System (INIS)

    Rocher, A.; Ridoux, P.; Anthoni, S.; Brun, C.

    1998-01-01

    The question of hot points (pellets rich in cobalt-59 or cobalt-60 in a PWR type reactor) is examined from the radiation protection point of view. The purpose is to determine how the elimination of these hot points can contribute to optimizing radiation protection. (N.C.)

  6. Brit Crit: Turning Points in British Rock Criticism 1960-1990

    DEFF Research Database (Denmark)

    Gudmundsson, Gestur; Lindberg, U.; Michelsen, M.

    2002-01-01

    The article examines the development of rock criticism in the United Kingdom from the perspective of a Bourdieuan field-analysis. Early British rock critics, like Nik Cohn, were international pioneers; a few years later there was a strong American influence, but British rock criticism has always had nationally specific traits, and there have been more profound paradigm shifts than in American rock criticism. This is primarily explained by the fact that American rock criticism is more strongly connected to general cultural history, while UK rock criticism has been more alienated from dominant culture and more linked to youth culture. However, in the UK too, rock criticism has been part and parcel of the legitimation of rock culture and has moved closer to dominant fields and positions in the cultural hierarchy.

  7. Points of emphasis and objectives of reactor safety research

    International Nuclear Information System (INIS)

    Krewer, K.H.

    1982-01-01

    Reactor safety research is part of the currently running second programme on energy research and energy engineering, with which the Federal Government pursues a whole bundle of economic and ecological aims: medium- and long-term assurance of energy supply; provision and efficient utilization of energy at favourable overall economic cost; improvement of technological performance; and consideration of the requirements of environmental protection, the careful treatment of resources, and the protection of the population and personnel from the risks of the conversion and use of energy. (orig.) [de]

  8. Numerical solution of the point reactor kinetics equations with fuel burn-up and temperature feedback

    International Nuclear Information System (INIS)

    Tashakor, S.; Jahanfarnia, G.; Hashemi-Tilehnoee, M.

    2010-01-01

    The point reactor kinetics equations are solved numerically using one group of delayed neutrons, with fuel burn-up and temperature feedback included. To calculate the fraction of one-group delayed neutrons, a set of differential equations is solved by an implicit time method. Using the point reactor kinetics equations, changes in mean neutron density, temperature, and reactivity are calculated at different times during reactor operation. The variations of reactivity, temperature, and maximum power with time are compared with the predictions of other methods.
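A minimal sketch of this kind of coupled calculation, assuming one delayed-neutron group, adiabatic heating, and a linear negative temperature coefficient; all constants are illustrative and the simple fixed-point treatment of the implicit step is ours, not the paper's scheme.

```python
import numpy as np

# One delayed-neutron group with adiabatic temperature feedback:
#   dn/dt = ((rho(T) - beta)/Lam) n + lam C
#   dC/dt = (beta/Lam) n - lam C
#   dT/dt = K n                       (adiabatic heating)
#   rho(T) = rho0 - alpha (T - T0)    (negative feedback)
beta, Lam, lam = 0.0065, 1e-3, 0.08
rho0, alpha, K, T0 = 0.004, 5e-5, 100.0, 300.0

def step(x, h):
    """One backward-Euler step; the nonlinear feedback is handled by a few
    fixed-point iterations on the implicit equations."""
    n, C, T = x
    n_new, C_new, T_new = n, C, T
    for _ in range(20):
        rho = rho0 - alpha * (T_new - T0)
        n_new = (n + h * lam * C_new) / (1.0 - h * (rho - beta) / Lam)
        C_new = (C + h * (beta / Lam) * n_new) / (1.0 + h * lam)
        T_new = T + h * K * n_new
    return np.array([n_new, C_new, T_new])

x = np.array([1.0, beta / (lam * Lam), T0])   # start critical at T0
h = 1e-4
hist = []
for _ in range(int(2.0 / h)):
    x = step(x, h)
    hist.append(x[0])

# Power rises on the positive insertion; the temperature rise then removes
# the reactivity and the excursion turns over.
print(max(hist), hist[-1])
```

The feedback term is what makes the system nonlinear and motivates the implicit treatment: an explicit step of the stiff prompt term would demand far smaller step sizes.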

  9. Generalized saddle point condition for ignition in a tokamak reactor with temperature and density profiles

    International Nuclear Information System (INIS)

    Mitari, O.; Hirose, A.; Skarsgard, H.M.

    1989-01-01

    In this paper, the concept of a generalized ignition contour map is extended to the realistic case of a plasma with temperature and density profiles in order to study access to ignition in a tokamak reactor. The generalized saddle point is found to lie between the Lawson and ignition conditions. If the height of the operation path with Goldston L-mode scaling is greater than that of the generalized saddle point, a reactor can reach ignition with this scaling in the case with no confinement degradation due to alpha-particle heating. In this sense, the saddle point given in general form is a new criterion for reaching ignition. Peaking the plasma temperature and density profiles can lower the height of the generalized saddle point and help a reactor reach ignition. With this in mind, the authors can judge whether next-generation tokamaks, such as the Compact Ignition Tokamak, Tokamak Ignition/Burn Experimental Reactor, Next European Torus, Fusion Experimental Reactor, International Tokamak Reactor, and AC Tokamak Reactor, can reach ignition with realistic profile parameters and an L-mode scaling law

  10. Evaluation of the integrity of SEP reactor vessels

    International Nuclear Information System (INIS)

    Hoge, K.G.

    1979-12-01

    A documented review is presented of the integrity of the 11 reactor pressure vessels covered in the Systematic Evaluation Program. This review deals primarily with the design specifications and quality assurance programs used in the vessel construction and the status of material surveillance programs, pressure-temperature operating limits, and inservice inspection programs of the applicable plants. Several generic items such as PWR overpressurization protection and BWR nozzle and safe-end cracking also are evaluated. The 11 vessels evaluated include Dresden Units 1 and 2, Big Rock Point, Haddam Neck, Yankee Rowe, Oyster Creek, San Onofre 1, LaCrosse, Ginna, Millstone 1, and Palisades

  11. Supergene destruction of a hydrothermal replacement alunite deposit at Big Rock Candy Mountain, Utah: Mineralogy, spectroscopic remote sensing, stable-isotope, and argon-age evidences

    Science.gov (United States)

    Cunningham, Charles G.; Rye, Robert O.; Rockwell, Barnaby W.; Kunk, Michael J.; Councell, Terry B.

    2005-01-01

    Big Rock Candy Mountain is a prominent center of variegated altered volcanic rocks in west-central Utah. It consists of the eroded remnants of a hypogene alunite deposit that, at ∼21 Ma, replaced intermediate-composition lava flows. The alunite formed in steam-heated conditions above the upwelling limb of a convection cell that was one of at least six spaced at 3- to 4-km intervals around the margin of a monzonite stock. Big Rock Candy Mountain is horizontally zoned outward from an alunite core to respective kaolinite, dickite, and propylite envelopes. The altered rocks are also vertically zoned from a lower pyrite–propylite assemblage upward through assemblages successively dominated by hypogene alunite, jarosite, and hematite, to a flooded silica cap. This hydrothermal assemblage is undergoing natural destruction in a steep canyon downcut by the Sevier River in Marysvale Canyon. Integrated geological, mineralogical, spectroscopic remote sensing using AVIRIS data, Ar radiometric, and stable isotopic studies trace the hypogene origin and supergene destruction of the deposit and permit distinction of primary (hydrothermal) and secondary (weathering) processes. This destruction has led to the formation of widespread supergene gypsum in cross-cutting fractures and as surficial crusts, and to natrojarosite, that gives the mountain its buff coloration along ridges facing the canyon. A small spring, Lemonade Spring, with a pH of 2.6 and containing Ca, Mg, Si, Al, Fe, Mn, Cl, and SO4, also occurs near the bottom of the canyon. The 40Ar/39Ar age (21.32±0.07 Ma) of the alunite is similar to that for other replacement alunites at Marysvale. However, the age spectrum contains evidence of a 6.6-Ma thermal event that can be related to the tectonic activity responsible for the uplift that led to the downcutting of Big Rock Candy Mountain by the Sevier River. This ∼6.6 Ma event also is present in the age spectrum of supergene natrojarosite forming today, and probably

  12. INDIAN POINT REACTOR REACTIVITY AND FLUX DISTRIBUTION MEASUREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Batch, M. L.; Fischer, F. E.

    1963-11-15

    The reactivity of the Indian Point core was measured near zero reactivity at various shim and control rod patterns. Flux distribution measurements were also made, and the results are expressed in terms of power peaking factors and normalized detector response during rod withdrawal. (D.L.C.)

  13. A small modular fast reactor as starting point for industrial deployment of fast reactors

    International Nuclear Information System (INIS)

    Chang, Yoon I.; Lo Pinto, Pierre; Konomura, Mamoru

    2006-01-01

    The current commercial reactors based on light water technology provide 17% of the electricity worldwide owing to their reliability, safety and competitive economics. In the near term, next-generation reactors are expected to be of an evolutionary type, benefiting from extensive LWR experience feedback and offering further improved economics and safety provisions. For the long term, however, sustainable energy production will be required, due to the continuous increase of human activity, environmental concerns such as the greenhouse effect, and the need for alternatives to fossil fuels as long-term energy resources. Therefore, future-generation commercial reactors should meet criteria of sustainability that the current generation cannot fully satisfy. In addition to the current objectives of economics and safety, waste management, resource extension and public acceptance become the other major objectives among the sustainability criteria. From this perspective, two questions can be raised: what reactor type can meet the sustainability criteria, and how to proceed to an effective deployment in harmony with the high reliability and availability of the current nuclear reactor fleet. There seems to be an international consensus that the fast spectrum reactor, notably the sodium-cooled system, is most promising to meet all of the long-term sustainability criteria. As for the latter question, we propose that a small modular fast reactor project could become a base on which to prepare the industrial infrastructure. The paper has the following contents: - Introduction; - SMFR project; - Core design; - Supercritical CO2 Brayton cycle; - Near-term reference plant; - Advanced designs; - Conclusions. To summarize, the sodium-cooled fast reactor is currently recognized as the technology of choice for long-term nuclear energy expansion, but some research and development is required to optimize and validate advanced design solutions. A small modular fast reactor can satisfy some existing near-term market niches

  14. Experimental study of radiation dose rate at different strategic points of the BAEC TRIGA Research Reactor.

    Science.gov (United States)

    Ajijul Hoq, M; Malek Soner, M A; Salam, M A; Haque, M M; Khanom, Salma; Fahad, S M

    2017-12-01

    The 3 MW TRIGA Mark-II Research Reactor of the Bangladesh Atomic Energy Commission (BAEC) has been in operation for about thirty years since its commissioning in 1986. In accordance with the demands of fundamental nuclear research, the reactor operates at different power levels, utilizing a number of experimental facilities. To ensure the safety of reactor operating personnel and radiation workers, it is necessary to know the radiation level at the different strategic points of the reactor where they often work. In the present study, neutron, beta and gamma dose rates at different strategic points of the reactor facility were measured at a reactor power of 2.4 MW to estimate the rise in radiation level due to operational activities. A high radiation dose was observed at the measurement position of the piercing beam port, caused by neutron leakage; accordingly, the dose rate at that position was measured at different reactor power levels. The study also presents gamma dose rate measurements at a fixed position on the reactor pool top surface for different reactor power levels under both Natural Convection Cooling Mode (NCCM) and Forced Convection Cooling Mode (FCCM). Results show that the radiation dose rate is higher for NCCM than for FCCM and increases with reactor power. Concerning radiological safety for working personnel and the general public, the dose level monitoring and experimental analysis performed here provide baseline data and can support code verification for the reactor. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Operating point considerations for the Reference Theta-Pinch Reactor (RTPR)

    International Nuclear Information System (INIS)

    Krakowski, R.A.; Miller, R.L.; Hagenson, R.L.

    1976-01-01

    Aspects of the continuing engineering design-point reassessment and optimization of the Reference Theta-Pinch Reactor (RTPR) are discussed. An updated interim design point is presented which achieves a favorable energy balance and involves relaxed technological requirements that nonetheless satisfy more rigorous physics and engineering constraints

  16. Liquid infiltration through the boiling-point isotherm in a desiccating fractured rock matrix

    International Nuclear Information System (INIS)

    Phillips, O.M.

    1994-01-01

    Over a long time interval, the integrity of the radioactive waste repository proposed at Yucca Mountain may be compromised by corrosion accelerated by intermittent wetting, which could occur by episodic infiltration of meteoric water from above through the fracture network. A simple two-dimensional model is constructed for the infiltration of liquid water down a fracture in a permeable rock matrix, beyond the boiling-point isotherm. The water may derive from episodic infiltration or from the condensation of steam above a desiccating region. Boiling of the water in the fracture is maintained by heat transfer from the surrounding superheated matrix blocks. There are two intrinsic length scales in this situation: (1) l_s = ρ_l q_o L/(k_m β), which is such that the total heat flow over this lateral distance balances that needed for evaporation of the liquid water infiltration, and (2) the thermal diffusion distance l_θ = (k_m t)^(1/2), which increases with time after the onset of infiltration. The primary results are: (a) for two-dimensional infiltration down an isolated fracture or fault, the depth of penetration below the (undisturbed) boiling-point isotherm is given by (1/2)π^(1/2)(l_s l_θ)^(1/2), and so increases as t^(1/4). Immediately following the onset of infiltration, penetration is rapid, but quickly slows. This behavior continues until l_θ (and D) become comparable with l_s. (b) With continuing infiltration down an isolated fracture or cluster of fractures, when l_θ >> l_s the temperature distribution becomes steady and the penetration distance stabilizes at a value proportional to l_s. (c) Effects such as three-dimensionality of the liquid flow paths and flow rates, matrix infiltration, etc., appear to reduce the penetration distance
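The early-time penetration law above can be evaluated directly once l_s and the thermal diffusivity are given. A small sketch with arbitrary placeholder values, confirming the t^(1/4) growth:

```python
import math

def penetration_depth(t, l_s, kappa_m):
    """Depth below the boiling-point isotherm, D = (1/2) sqrt(pi) sqrt(l_s * l_theta),
    with thermal diffusion length l_theta = sqrt(kappa_m * t)."""
    l_theta = math.sqrt(kappa_m * t)
    return 0.5 * math.sqrt(math.pi) * math.sqrt(l_s * l_theta)

# t^(1/4) scaling: sixteen times longer gives only twice the depth
d1  = penetration_depth(1.0e5, l_s=10.0, kappa_m=1.0e-6)
d16 = penetration_depth(1.6e6, l_s=10.0, kappa_m=1.0e-6)
print(d16 / d1)   # -> 2.0
```

This fourth-root dependence is why penetration is rapid immediately after onset and then slows so markedly, as the abstract notes.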

  17. Space nuclear reactor concepts for avoidance of a single point failure

    International Nuclear Information System (INIS)

    El-Genk, M. S.

    2007-01-01

    This paper presents three space nuclear reactor concepts for future exploration missions requiring electrical power of tens to hundreds of kW for 7-10 years. These concepts avoid a single point failure in reactor cooling, and they could be used with a host of energy conversion technologies. The first is a lithium or sodium heat-pipe-cooled reactor. The heat pipes operate at a fraction of their prevailing capillary or sonic limit; thus, when a number of heat pipes fail, those in the adjacent modules remove their heat load, keeping the reactor core adequately cooled. The second is a reactor with a circulating liquid metal coolant. The reactor core is divided into six identical sectors, each with a separate energy conversion loop. The sectors in the reactor core are neutronically coupled, but hydraulically decoupled. Thus, when a sector experiences a loss of coolant, the fission power generated in it is removed by the circulating coolant in the adjacent sectors. In this case, however, the reactor fission power would have to decrease to avoid exceeding the design temperature limits in the sector with the failed loop. These two reactor concepts are used with energy conversion technologies such as advanced Thermoelectric (TE), Free Piston Stirling Engines (FPSE), and Alkali Metal Thermal-to-Electric Conversion (AMTEC). Gas cooled reactors are a better choice for use with Closed Brayton Cycle (CBC) engines, as in the third reactor concept presented in the paper. It has a sectored core cooled with a binary mixture of He-Xe (40 g/mole). Each of the three sectors in the reactor has its own CBC and is neutronically, but not hydraulically, coupled to the other sectors

  18. SALLY, Dynamic Behaviour of Reactor Cooling Channel by Point Model

    International Nuclear Information System (INIS)

    Reiche, Chr.; Ziegenbein, D.

    1981-01-01

    1 - Nature of the physical problem solved: The dynamical behaviour of a cooling channel is calculated. Starting from an equilibrium state a perturbation is introduced into the system. That may be an outer reactivity perturbation or a change in the coolant velocity or in the coolant temperature. The neutron kinetics is treated in the framework of the one-point model. The cooling channel consists of a cladded and cooled fuel rod. The temperature distribution is taken into account as an array above a mesh of radial zones and axial layers. Heat transfer is considered in radial direction only, the thermodynamical coupling of the different layers is obtained by the coolant flow. The thermal material parameters are considered to be temperature independent. Reactivity feedback is introduced by means of reactivity coefficients for fuel, canning, and coolant. Doppler broadening is included. The first cooling cycle can be taken into account by a simple model. 2 - Method of solution: The integration of the point kinetics equations is done numerically by the P11 scheme. The system of temperature equations with constant heat resistance coefficients is solved by the method of factorization. 3 - Restrictions on the complexity of the problem: Given limits are: 10 radial fuel zones, 25 axial layers, 6 groups of delayed neutrons

  19. Measurement and analysis of pressure tube elongation in the Douglas Point reactor

    International Nuclear Information System (INIS)

    Causey, A.R.; MacEwan, S.R.; Jamieson, H.C.; Mitchell, A.B.

    1980-02-01

    Elongations of zirconium alloy pressure tubes in CANDU reactors, which occur as a result of neutron-irradiation-induced creep and growth, have been measured over the past 6 years, and the consequences of these elongations have recently been analysed. Elongation rates, previously deduced from extensive measurements of elongations of cold-worked Zircaloy-2 pressure tubes in the Pickering reactors, have been modified to apply to the pressure tubes in the Douglas Point (DP) reactor by taking into account measured differences in texture and dislocation density. Using these elongation rates, and structural data unique to the DP reactor, the analysis predicts elongation behaviour which is in good agreement with pressure tube elongations measured during the ten years of reactor operation. (Auth)

  20. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    International Nuclear Information System (INIS)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined
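The kind of bookkeeping such a point exposure model performs can be illustrated for a single nuclide under constant flux. The cross-sections and flux values below are illustrative placeholders, not PREMOR data.

```python
import numpy as np

# Minimal batch-exposure bookkeeping in the spirit of a point depletion model:
# each nuclide density follows dN/dt = production - sigma_a * phi * N.
sigma_f = 585e-24   # U-235 thermal fission cross-section, cm^2 (illustrative)
sigma_a = 681e-24   # U-235 absorption cross-section, cm^2 (illustrative)
phi = 3e13          # flux, n/cm^2/s
N0 = 1e21           # initial U-235 number density, 1/cm^3

t = np.linspace(0.0, 3.15e7, 1000)       # roughly one year of exposure
N = N0 * np.exp(-sigma_a * phi * t)      # closed-form single-nuclide burnup
fissions = sigma_f / sigma_a * (N0 - N)  # cumulative fissions per cm^3
print(N[-1] / N0)
```

A survey code like the one described tracks dozens of such densities simultaneously, with production terms chaining nuclides together and each resident feed batch exposed to the same flux.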

  1. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.

  2. An Approach for Automatic Orientation of Big Point Clouds from the Stationary Scanners Based on the Spherical Targets

    Directory of Open Access Journals (Sweden)

    YAO Jili

    2015-04-01

    Full Text Available Terrestrial laser scanning (TLS) technology offers high-speed data acquisition, large point cloud volumes and long measuring distances. However, it also has disadvantages, such as distance limitations in target detection, lag in point cloud processing, low automation and poor suitability for long-distance topographic survey. To address these issues, we put forward a method for long-range target detection in the orientation of big point clouds. The method first searches the point cloud rings that contain targets according to their engineering coordinate system. The detected rings are then divided into sectors, so that targets can be detected in a very short time and the central coordinates of the targets obtained. Finally, the position and orientation parameters of the scanner are calculated and the point clouds in the scanner's own coordinate system (SOCS) are converted into the engineering coordinate system. The method can be applied on ordinary computers for long-distance topographic survey (with distances between scanner and targets ranging from 180 to 700 m) in mountainous areas, using targets with a radius of 0.162 m.
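Once a candidate sector has been isolated, the central coordinates of a spherical target can be recovered with a linear least-squares sphere fit. This is a generic sketch on synthetic data, not the authors' algorithm.

```python
import numpy as np

def fit_sphere(pts):
    """Linear least-squares sphere fit using ||p||^2 = 2 c.p + (r^2 - ||c||^2)."""
    A = np.column_stack([2.0 * pts, np.ones(len(pts))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    c = sol[:3]
    r = np.sqrt(sol[3] + c @ c)
    return c, r

# Scanned points on a target sphere of radius 0.162 m centred at (5, 2, 1)
rng = np.random.default_rng(1)
u = rng.normal(size=(500, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)        # unit directions
pts = np.array([5.0, 2.0, 1.0]) + 0.162 * u + rng.normal(0, 1e-4, (500, 3))

c, r = fit_sphere(pts)
print(c.round(3), round(r, 3))
```

Because the model is linear in the unknowns, the fit needs no initial guess, which suits fully automatic pipelines of the kind the abstract describes.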

  3. Determination of melting point of mixed-oxide fuel irradiated in a fast breeder reactor

    International Nuclear Information System (INIS)

    Tachibana, Toshimichi

    1985-01-01

    The melting point of fuel is important for setting its maximum in-reactor temperature in fuel design. The fuel melting point is measured broadly by two methods: the filament method and the capsule sealing method. The only instance of measuring the melting point of irradiated mixed oxide (U, Pu)O2 fuel by the filament method is by GE in the United States. The capsule sealing method, while an excellent means, makes it difficult to weld-seal the irradiated fuel in a capsule within the cell. In the fast reactor development program, a remotely operated melting point measuring apparatus, which capsule-seals the mixed (U, Pu)O2 fuel irradiated in the experimental FBR Joyo, was installed in the cell, and the melting point was measured for the first time in the world. (Mori, K.)

  4. Preliminary Demonstration Reactor Point Design for the Fluoride Salt-Cooled High-Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Betzler, Benjamin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carbajo, Juan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Greenwood, Michael Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Jeffrey J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Robb, Kevin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Terrell, Jerry W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    Development of the Fluoride Salt-Cooled High-Temperature Reactor (FHR) Demonstration Reactor (DR) is a necessary intermediate step to enable commercial FHR deployment through disruptive and rapid technology development and demonstration. The FHR DR will utilize known, mature technology to close remaining gaps to commercial viability. Lower risk technologies are included in the initial FHR DR design to ensure that the reactor can be built, licensed, and operated within an acceptable budget and schedule. These technologies include tristructural-isotropic (TRISO) particle fuel, replaceable core structural material, the use of that same material for the primary and intermediate loops, and tube-and-shell heat exchangers. This report provides an update on the development of the FHR DR. At this writing, the core neutronics and thermal hydraulics have been developed and analyzed. The mechanical design details are still under development and are described to their current level of fidelity. It is anticipated that the FHR DR can be operational within 10 years because of the use of low-risk, near-term technology options.

  5. Preliminary Demonstration Reactor Point Design for the Fluoride Salt-Cooled High-Temperature Reactor

    International Nuclear Information System (INIS)

    Qualls, A. L.; Betzler, Benjamin R.; Brown, Nicholas R.; Carbajo, Juan; Greenwood, Michael Scott; Hale, Richard Edward; Harrison, Thomas J.; Powers, Jeffrey J.; Robb, Kevin R.; Terrell, Jerry W.

    2015-01-01

    Development of the Fluoride Salt-Cooled High-Temperature Reactor (FHR) Demonstration Reactor (DR) is a necessary intermediate step to enable commercial FHR deployment through disruptive and rapid technology development and demonstration. The FHR DR will utilize known, mature technology to close remaining gaps to commercial viability. Lower risk technologies are included in the initial FHR DR design to ensure that the reactor can be built, licensed, and operated within an acceptable budget and schedule. These technologies include tristructural-isotropic (TRISO) particle fuel, replaceable core structural material, the use of that same material for the primary and intermediate loops, and tube-and-shell heat exchangers. This report provides an update on the development of the FHR DR. At this writing, the core neutronics and thermal hydraulics have been developed and analyzed. The mechanical design details are still under development and are described to their current level of fidelity. It is anticipated that the FHR DR can be operational within 10 years because of the use of low-risk, near-term technology options.

  6. Improving Site Characterization for Rock Dredging using a Drilling Parameter Recorder and the Point Load Test

    Science.gov (United States)

    1994-09-01

    materials. Also, available data from drilling rates in the mining and tunneling industries (Howarth and Rowlands 1987, Somerton 1959) indicate a...selected uniform natural rock materials and several man-made rock simulants were used to obtain drilling parameter records for materials of known...Dredging Seminar, Atlantic City, NJ, May 1993. Western Dredging Association (WEDA) and Texas A&M University. Somerton, W. H. (1959). "A laboratory study of

  7. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  8. The tipping point how little things can make a big difference

    CERN Document Server

    Gladwell, Malcolm

    2002-01-01

    The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire. Just as a single sick person can start an epidemic of the flu, so too can a small but precisely targeted push cause a fashion trend, the popularity of a new product, or a drop in the crime rate. This widely acclaimed bestseller, in which Malcolm Gladwell explores and brilliantly illuminates the tipping point phenomenon, is already changing the way people throughout the world think about selling products and disseminating ideas.

  9. Compact reversed-field pinch reactors (CRFPR): sensitivity study and design-point determination

    International Nuclear Information System (INIS)

    Hagenson, R.L.; Krakowski, R.A.

    1982-07-01

If the costing assumptions upon which the positive assessment of conventional large superconducting fusion reactors is based prove overly optimistic, approaches that promise considerably increased system power density and reduced mass utilization will be required. These more compact reactor embodiments generally must operate with reduced shield thickness and resistive magnets. Because of the unique magnetic topology associated with the Reversed-Field Pinch (RFP), the compact reactor embodiment for this approach is particularly attractive from the viewpoint of low-field resistive coils operating with Ohmic losses that can be made small relative to the fusion power. A comprehensive system model is developed and described for a steady-state, compact RFP reactor (CRFPR). This model is used to select a unique cost-optimized design point that will be used for a conceptual engineering design. The cost-optimized CRFPR design presented herein would operate with system power densities and mass utilizations that are comparable to fission power plants and an order of magnitude more favorable than the conventional approaches to magnetic fusion power. The sensitivity of the base-case design point to changes in plasma transport, profiles, beta, blanket thickness, normal vs superconducting coils, and fuel cycle (DT vs DD) is examined. The RFP approach is found to yield a point design for a high-power-density reactor that is surprisingly resilient to changes in key, but relatively unknown, physics and systems parameters

  10. PKI, Gamma Radiation Reactor Shielding Calculation by Point-Kernel Method

    International Nuclear Information System (INIS)

    Li Chunhuai; Zhang Liwu; Zhang Yuqin; Zhang Chuanxu; Niu Xihua

    1990-01-01

1 - Description of program or function: This code calculates gamma-ray radiation shielding problems in geometric space. 2 - Method of solution: PKI uses a point-kernel integration technique, describes the shielding geometry using a geometric-configuration method with coordinate conversion, and makes use of the calculated results of the reactor primary shielding and the coolant flow regularity in the loop system
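The point-kernel idea the record describes can be illustrated with a minimal sketch. This is not the PKI code itself; the function name and the numeric values below (source strength, attenuation coefficient, buildup factor) are illustrative assumptions. The kernel for an isotropic point source behind a slab shield is B·S·exp(−μt)/(4πr²).

```python
import math

def point_kernel_flux(source_strength, mu, thickness, distance, buildup=1.0):
    """Gamma flux from an isotropic point source behind a slab shield.

    source_strength : photons/s emitted by the point source
    mu              : linear attenuation coefficient of the shield (1/cm)
    thickness       : shield thickness along the ray (cm)
    distance        : source-to-detector distance (cm)
    buildup         : buildup factor accounting for scattered photons
    """
    return buildup * source_strength * math.exp(-mu * thickness) / (4.0 * math.pi * distance ** 2)

# Illustrative numbers: 1e10 photons/s source, 10 cm of concrete
# (mu ~ 0.15 /cm at ~1 MeV), detector at 100 cm, buildup factor 2.5
flux = point_kernel_flux(1e10, 0.15, 10.0, 100.0, buildup=2.5)
```

A full point-kernel code such as the one described integrates this kernel over a distributed source region; the single-source evaluation above is the building block.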

  11. Application of the fractional neutron point kinetic equation: Start-up of a nuclear reactor

    International Nuclear Information System (INIS)

    Polo-Labarrios, M.-A.; Espinosa-Paredes, G.

    2012-01-01

Highlights: ► Neutron density behavior at reactor start-up with fractional neutron point kinetics. ► There is a relaxation time associated with a rapid variation in the neutron flux. ► The physical interpretation of the fractional order is related to non-Fickian effects. ► The effect of the anomalous diffusion coefficient and the relaxation time is analyzed. ► Neutron density is related to the speed and duration of the control-rod lifting. - Abstract: In this paper we present the behavior of the variation of neutron density when the nuclear reactor power is increased, using the fractional neutron point kinetic (FNPK) equation with a single group of delayed neutron precursors. It is considered that there is a relaxation time associated with a rapid variation in the neutron flux, and the physical interpretation of the fractional order is related to non-Fickian effects from the point of view of the neutron diffusion equation. We analyze the case of increasing the nuclear reactor power during a cold start-up, a process of inserting reactivity by lifting control rods discontinuously. The results show that for short time scales of the start-up, the neutron density behavior with FNPK shows sub-diffusive effects governed by the control-rod velocity. For large time scales, the classical neutron point kinetics equation over-predicts the neutron density relative to the FNPK.

  12. Application of point kinetic model in the study of fluidized bed reactor dynamic

    International Nuclear Information System (INIS)

    Borges, Volnei; Vilhena, Marco Tullio de; Streck, Elaine E.

    1995-01-01

In this work the dynamical behavior of the fluidized bed nuclear reactor is analysed. The main goal is to study the effect of the acceleration term in the point kinetic equations. Numerical simulations considering constant acceleration are reported. (author). 7 refs, 4 figs

  13. Innovations and enhancements in neutronic analysis of the Big-10 university research and training reactors based on the AGENT code system

    International Nuclear Information System (INIS)

    Hursin, M.; Shanjie, X.; Burns, A.; Hopkins, J.; Satvat, N.; Gert, G.; Tsoukalas, L. H.; Jevremovic, T.

    2006-01-01

Introduction. This paper summarizes salient aspects of the 'virtual' reactor system developed at Purdue Univ., emphasizing efficient neutronic modeling through AGENT (Arbitrary Geometry Neutron Transport), a deterministic neutron transport code. DOE's Big-10 Innovations in Nuclear Infrastructure and Education (INIE) Consortium was launched in 2002 to enhance scholarship activities pertaining to university research and training reactors (URTRs). Existing and next-generation URTRs are powerful campus tools for nuclear engineering as well as a number of disciplines that include, but are not limited to, medicine, biology, material science, and food science. Advancing new computational environments for the analysis and configuration of URTRs is an important Big-10 INIE aim. Specifically, Big-10 INIE has pursued development of a 'virtual' reactor, an advanced computational environment to serve as a platform on which to build operations, utilization (research and education), and systemic analysis of URTR physics. The 'virtual' reactor computational system will integrate computational tools addressing the URTR core and near-core physics (transport, dynamics, fuel management and fuel configuration); thermal-hydraulics; beam-line, in-core and near-core experiments; instrumentation and controls; and confinement/containment and security issues. Such an integrated computational environment does not currently exist. The 'virtual' reactor is designed to allow researchers and educators to configure and analyze their systems to optimize experiments, fuel locations for flux shaping, and detector selection and configuration. (authors)

  14. Point design for deuterium-deuterium compact reversed-field pinch reactors

    International Nuclear Information System (INIS)

    Dabiri, A.E.; Dobrott, D.R.; Gurol, H.; Schnack, D.D.

    1984-01-01

A deuterium-deuterium (D-D) reversed-field pinch (RFP) reactor may be made comparable in size and cost to a deuterium-tritium (D-T) reactor at the expense of a high thermal heat load on the first wall. This heat load results from the larger fraction of fusion power carried by charged particles in the D-D reaction as compared to the D-T reaction. The heat load may be reduced by increasing the reactor size and hence the cost. In addition to this 'degraded' design, the size may be kept small by means of a higher-heat-load wall, or by means of a toroidal divertor, in which case most of the heat load seen by the wall is in the form of radiation. Point designs are developed for these approaches, and cost studies are performed and compared with a D-T reactor. The results indicate that the cost of electricity of a D-D RFP reactor is about 20% higher than that of a D-T RFP reactor. This increased cost could be offset by the inherent safety features of the D-D fuel cycle

  15. Numerical simulation of stochastic point kinetic equation in the dynamical system of nuclear reactor

    International Nuclear Information System (INIS)

    Saha Ray, S.

    2012-01-01

Highlights: ► In this paper stochastic neutron point kinetic equations have been analyzed. ► The Euler–Maruyama method and the strong order 1.5 Taylor method have been discussed. ► These methods are applied to the solution of stochastic point kinetic equations. ► Comparisons between the results of these methods and others are presented in tables. ► Graphs for neutron and precursor sample paths are also presented. -- Abstract: In the present paper, numerical approximation methods applied to efficiently calculate the solution of the stochastic point kinetic equations in nuclear reactor dynamics are investigated. A system of Itô stochastic differential equations has been analyzed to model the neutron density and the delayed neutron precursors in a point nuclear reactor. The resulting system of Itô stochastic differential equations is solved over each time step. The methods are verified by considering different initial conditions, experimental data and constant reactivities. The computational results indicate that the methods are simple and suitable for solving stochastic point kinetic equations. In this article, a numerical investigation is made in order to observe the random oscillations in neutron and precursor population dynamics in subcritical and critical reactors.
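The Euler–Maruyama scheme named in the abstract can be sketched for a scalar Itô SDE. This is a minimal illustration, not the paper's coupled neutron/precursor system, and the drift/diffusion coefficients below are arbitrary illustrative values.

```python
import math
import random

def euler_maruyama(x0, a, b, t_end, n_steps, seed=0):
    """Euler-Maruyama integration of the scalar Ito SDE dX = a*X dt + b*X dW.

    Each step adds the drift a*X*dt plus the diffusion b*X*dW, where the
    Wiener increment dW is drawn from N(0, dt).
    """
    rng = random.Random(seed)
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Wiener increment ~ N(0, dt)
        x = x + a * x * dt + b * x * dw
    return x

# One sample path of geometric Brownian motion over [0, 1]
x_final = euler_maruyama(1.0, 0.05, 0.2, 1.0, 1000)
```

Setting the diffusion coefficient b to zero recovers the deterministic forward-Euler solution, which is a convenient sanity check on any Euler–Maruyama implementation.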

  16. Fast Computation of the Two-Point Correlation Function in the Age of Big Data

    Science.gov (United States)

    Pellegrino, Andrew; Timlin, John

    2018-01-01

    We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute the auto- and cross-correlation statistics, and allow the user to calculate the three-dimensional and angular correlation functions. Additionally, the code automatically divides the user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate comparable speed with other clustering codes, and code accuracy compared to known and analytic results.
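The pair-counting core of a two-point correlation code can be sketched in a few lines. This is a brute-force O(N²) illustration, not the parallel C implementation the record describes; the "natural" estimator shown is one of several in common use (the record's code may use Landy–Szalay or another form).

```python
import itertools
import math

def pair_counts(points, bins):
    """Count point pairs whose 3D separation falls in each distance bin.

    bins is a sorted list of bin edges; counts[i] covers [bins[i], bins[i+1]).
    """
    counts = [0] * (len(bins) - 1)
    for p, q in itertools.combinations(points, 2):
        r = math.dist(p, q)
        for i in range(len(bins) - 1):
            if bins[i] <= r < bins[i + 1]:
                counts[i] += 1
                break
    return counts

def xi_natural(data, randoms, bins):
    """Natural estimator xi(r) = (DD/RR) * (n_R/n_D)^2 - 1."""
    dd = pair_counts(data, bins)
    rr = pair_counts(randoms, bins)
    norm = (len(randoms) / len(data)) ** 2
    return [d / r * norm - 1.0 if r else 0.0 for d, r in zip(dd, rr)]
```

Production codes replace the nested loops with tree or grid structures and parallel pair counting; the estimator algebra stays the same.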

  17. Lattice Boltzmann Simulations of Fluid Flow in Continental Carbonate Reservoir Rocks and in Upscaled Rock Models Generated with Multiple-Point Geostatistics

    Directory of Open Access Journals (Sweden)

    J. Soete

    2017-01-01

Microcomputed tomography (μCT) and Lattice Boltzmann Method (LBM) simulations were applied to continental carbonates to quantify fluid flow. Fluid flow characteristics in these complex carbonates with multiscale pore networks are unique, and the applied method allows studying their heterogeneity and anisotropy. 3D pore network models were introduced to single-phase flow simulations in Palabos, a software tool for particle-based modelling of classic computational fluid dynamics. In addition, permeability simulations were also performed on rock models generated with multiple-point geostatistics (MPS). This allowed assessing the applicability of MPS in upscaling high-resolution porosity patterns into large rock models that exceed the volume limitations of the μCT. Porosity and tortuosity control fluid flow in these porous media. Micro- and mesopores influence flow properties at larger scales in continental carbonates. Upscaling with MPS is therefore necessary to overcome volume-resolution problems of CT scanning equipment. The presented LBM-MPS workflow is applicable to other lithologies, comprising different pore types, shapes, and pore networks altogether. The lack of straightforward porosity-permeability relationships in complex carbonates highlights the necessity for a 3D approach. 3D fluid flow studies provide the best understanding of flow through porous media, which is of crucial importance in reservoir modelling.
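A first step in such μCT workflows is computing porosity from a segmented voxel volume. The toy function below is an illustrative sketch only (real volumes are large numeric arrays, and the segmentation itself is the hard part); it simply takes the fraction of voxels flagged as pore space.

```python
def porosity(voxels):
    """Porosity of a segmented volume: fraction of pore voxels.

    voxels is a nested list [z][y][x] of booleans, True = pore space.
    """
    flat = [v for plane in voxels for row in plane for v in row]
    return sum(flat) / len(flat)

# Toy 2x2x2 volume with 3 pore voxels out of 8
vol = [[[True, False], [False, True]],
       [[False, False], [True, False]]]
phi = porosity(vol)  # 3/8
```

Tortuosity and permeability, the other quantities the abstract discusses, require flow simulation (e.g. LBM) rather than simple voxel counting.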

  18. Schools K-12, This is a point feature class of Schools within Rock County. This data does not contain religious or parochial schools, or schools affiliated with churches., Published in 2005, Rock County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Schools K-12 dataset current as of 2005. This is a point feature class of Schools within Rock County. This data does not contain religious or parochial schools, or...

  19. Aging management program of the reactor building concrete at Point Lepreau Generating Station

    Directory of Open Access Journals (Sweden)

    Gendron T.

    2011-04-01

In order for New Brunswick Power Nuclear (NBPN) to control the risks of degradation of the concrete reactor building at the Point Lepreau Generating Station (PLGS), the development of an aging management plan (AMP) was initiated. The intention of this plan was to determine the requirements for specific structural components of concrete of the reactor building that require regular inspection and maintenance to ensure the safe and reliable operation of the plant. The document is currently in draft form and presents an integrated methodology for the application of an AMP for the concrete of the reactor building. The current AMP addresses the reactor building structure and various components, such as joint sealant and liners, that are integral to the structure. It does not include internal components housed within the structure. This paper provides background information regarding the document developed and the strategy developed to manage potential degradation of the concrete of the reactor building, as well as specific programs and preventive and corrective maintenance activities initiated.

  20. Aging management program of the reactor building concrete at Point Lepreau Generating Station

    Science.gov (United States)

    Aldea, C.-M.; Shenton, B.; Demerchant, M. M.; Gendron, T.

    2011-04-01

    In order for New Brunswick Power Nuclear (NBPN) to control the risks of degradation of the concrete reactor building at the Point Lepreau Generating Station (PLGS) the development of an aging management plan (AMP) was initiated. The intention of this plan was to determine the requirements for specific structural components of concrete of the reactor building that require regular inspection and maintenance to ensure the safe and reliable operation of the plant. The document is currently in draft form and presents an integrated methodology for the application of an AMP for the concrete of the reactor building. The current AMP addresses the reactor building structure and various components, such as joint sealant and liners that are integral to the structure. It does not include internal components housed within the structure. This paper provides background information regarding the document developed and the strategy developed to manage potential degradation of the concrete of the reactor building, as well as specific programs and preventive and corrective maintenance activities initiated.

  1. The ARIES-I high-field-tokamak reactor: Design-point determination and parametric studies

    International Nuclear Information System (INIS)

    Miller, R.L.

    1989-01-01

    The multi-institutional ARIES study has examined the physics, technology, safety, and economic issues associated with the conceptual design of a tokamak magnetic-fusion reactor. The ARIES-I variant envisions a DT-fueled device based on advanced superconducting coil, blanket, and power-conversion technologies and a modest extrapolation of existing tokamak physics. A comprehensive systems and trade study has been conducted as an integral and ongoing part of the reactor assessment in order to identify an acceptable design point to be subjected to detailed analysis and integration as well as to characterize the ARIES-I operating space. Results of parametric studies leading to the identification of such a design point are presented. 15 refs., 6 figs., 2 tabs

  2. A new integral method for solving the point reactor neutron kinetics equations

    International Nuclear Information System (INIS)

    Li Haofeng; Chen Wenzhen; Luo Lei; Zhu Qian

    2009-01-01

A numerical integral method that efficiently provides the solution of the point kinetics equations, using a better basis function (BBF) to approximate the neutron density within each time-step integration, is described and investigated. The approach is based on an exact analytic integration of the neutron density equation, where the stiffness of the equations is overcome by a fully implicit formulation. The procedure is tested using a variety of reactivity functions, including step reactivity insertion, ramp input and oscillatory reactivity changes. The solution of the better-basis-function method is compared to other analytical and numerical solutions of the point reactor kinetics equations. The results show that selecting a better basis function can improve the efficiency and accuracy of this integral method. The better-basis-function method can be used in real-time forecasting for power reactors in order to prevent reactivity accidents.
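The stiffness and the fully implicit treatment the abstract mentions can be illustrated with a minimal backward-Euler step for one-group point kinetics. This is the standard textbook scheme, not the paper's better-basis-function method, and the kinetic parameters in the example are illustrative values only.

```python
def point_kinetics_step(n, c, rho, beta, lam, Lam, dt):
    """One backward-Euler step of one-delayed-group point kinetics:

        dn/dt = ((rho - beta)/Lam) * n + lam * c
        dc/dt = (beta/Lam) * n - lam * c

    Solves the 2x2 implicit system (I - dt*A) x_new = x_old exactly
    via Cramer's rule, which is what makes the scheme stable for stiff
    parameter combinations (small Lam).
    """
    a = (rho - beta) / Lam
    b = beta / Lam
    m11, m12 = 1.0 - dt * a, -dt * lam
    m21, m22 = -dt * b, 1.0 + dt * lam
    det = m11 * m22 - m12 * m21
    n_new = (n * m22 - m12 * c) / det
    c_new = (m11 * c - m21 * n) / det
    return n_new, c_new

# Critical reactor (rho = 0): the steady state n = 1, c = beta/(lam*Lam)
# should persist under repeated implicit steps
beta, lam, Lam = 0.0065, 0.0785, 1e-4
n, c = 1.0, beta / (lam * Lam)
for _ in range(1000):
    n, c = point_kinetics_step(n, c, 0.0, beta, lam, Lam, 0.01)
```

An explicit scheme with the same dt = 0.01 s would be unstable here, since the prompt time constant Lam/beta is of order 0.015 s; the implicit step has no such restriction.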

  3. Detection of gaseous heavy water leakage points in CANDU 6 pressurized heavy water reactors

    International Nuclear Information System (INIS)

    Park, T-K.; Jung, S-H.

    1996-01-01

During reactor operation, the heavy-water-filled primary coolant system in a CANDU 6 Pressurized Heavy Water Reactor (PHWR) may leak via components and mechanical joints through routine operations of the plant, during inadvertent operations, etc. Early detection of leak points is therefore important to maintain plant safety and economy. There are many independent systems to monitor and recover heavy water leakage in a CANDU 6 PHWR. Methodology for early detection, based on operating experience from these systems, is investigated in this paper. In addition, the four symptoms of D2O leakage, the associated process for clarifying and verifying the leakage, and the probable points of leakage are discussed. (author)

  4. An efficient technique for the point reactor kinetics equations with Newtonian temperature feedback effects

    International Nuclear Information System (INIS)

    Nahla, Abdallah A.

    2011-01-01

Highlights: → An efficient technique for the nonlinear reactor kinetics equations is presented. → This method is based on Backward Euler or Crank Nicholson and the fundamental matrix. → Stability of the efficient technique is defined and discussed. → This method is applied to the point kinetics equations with six groups of delayed neutrons. → Step, ramp, sinusoidal and temperature feedback reactivities are discussed. - Abstract: The point reactor kinetics equations for multiple groups of delayed neutrons in the presence of Newtonian temperature feedback effects are a system of stiff nonlinear ordinary differential equations which have no exact analytical solution. The efficient technique for this nonlinear system is based on converting the nonlinear system to a linear one via the predicted value of reactivity, and solving this linear system using the fundamental matrix of the homogeneous linear differential equations. The nonlinear point reactor kinetics equations are rewritten in matrix form, and the solution of this matrix form is introduced. This solution contains the exponential function of a variable coefficient matrix, which contains the unknown variable, reactivity. The predicted values of reactivity in explicit form are determined by replacing the exponential function of the coefficient matrix with one of two kinds of rational approximation, Backward Euler or Crank Nicholson. The nonlinear point kinetics equations are thereby changed to a linear system of homogeneous differential equations. The fundamental matrix of this linear system is calculated using the eigenvalues and the corresponding eigenvectors of the coefficient matrix. Stability of the efficient technique is defined and discussed. The efficient technique is applied to the point kinetics equations with six groups of delayed neutrons and step, ramp, sinusoidal and temperature feedback reactivities. The results of these efficient techniques are compared with the traditional methods.
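In scalar form, the two rational approximations of the exponential named in the abstract are easy to state. This is a sketch of the general idea only, not the paper's matrix formulation: Backward Euler approximates exp(x) by 1/(1 − x), and Crank Nicholson by the (1,1) Padé form (1 + x/2)/(1 − x/2).

```python
import math

def exp_backward_euler(x):
    """Backward-Euler rational approximation of exp(x): 1/(1 - x).

    First-order accurate; valid for x < 1.
    """
    return 1.0 / (1.0 - x)

def exp_crank_nicolson(x):
    """Crank-Nicolson / (1,1)-Pade approximation of exp(x):
    (1 + x/2) / (1 - x/2). Second-order accurate."""
    return (1.0 + 0.5 * x) / (1.0 - 0.5 * x)

# For small x both track exp(x); Crank-Nicolson is markedly closer
x = 0.01
be_err = abs(exp_backward_euler(x) - math.exp(x))
cn_err = abs(exp_crank_nicolson(x) - math.exp(x))
```

In the paper's setting x is replaced by dt times the coefficient matrix, and the rational form turns each matrix exponential into a linear solve.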

  5. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  6. The extension of the SWS period or CANDU reactors with particular reference to Douglas Point

    International Nuclear Information System (INIS)

    Bennett, C.R.

    1985-01-01

The foregoing approach to determining the fate of a concrete containment building is worth much consideration. The expenditure of $10^8, or its escalated equivalent, is too much to pay for the probable saving of a fraction of a statistical life. The unquestioning adoption of the dogma of reactor dismantlement displays a complete misunderstanding of the numerics of ''risk'', and even of the place of reactor dismantling in the spectrum of nuclear risk: the risk of reactor dismantling lies more than an order of magnitude lower in that spectrum. The most altruistic criterion for any engineering activity is the achievement of the greatest expected net benefit (or the least expected net detriment) when all the consequences of the activity are taken into account. As has been shown, this criterion leads to the conclusion that, at least for CANDU reactors and particularly Douglas Point, there is apparently no reason why the S.W.S. period should not be extended indefinitely

  7. Compliance Monitoring of Underwater Blasting for Rock Removal at Warrior Point, Columbia River Channel Improvement Project, 2009/2010

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, Thomas J.; Johnson, Gary E.; Woodley, Christa M.; Skalski, J. R.; Seaburg, Adam

    2011-05-10

The U.S. Army Corps of Engineers, Portland District (USACE) conducted the 20-year Columbia River Channel Improvement Project (CRCIP) to deepen the navigation channel between Portland, Oregon, and the Pacific Ocean to allow transit of fully loaded Panamax ships (100 ft wide, 600 to 700 ft long, and draft 45 to 50 ft). In the vicinity of Warrior Point, between river miles (RM) 87 and 88 near St. Helens, Oregon, the USACE conducted underwater blasting and dredging to remove 300,000 yd³ of a basalt rock formation to reach a depth of 44 ft in the Columbia River navigation channel. The purpose of this report is to document methods and results of the compliance monitoring study for the blasting project at Warrior Point in the Columbia River.

  8. The asymptotic behaviour of a critical point reactor in the absence of a controller

    International Nuclear Information System (INIS)

    Bansal, N.K.; Borgwaldt, H.

    1976-11-01

A method is presented to calculate the first and second moments of the neutron and precursor populations for a critical reactor system described by point kinetic equations and possessing inherent reactivity fluctuations. The equations have been linearised on the assumption that the system has a large average neutron population and that the amplitude of the reactivity fluctuations is sufficiently small. The reactivity noise is assumed to be band-limited white noise with a corner frequency higher than all the time constants of the system. Explicit expressions for the exact time development of the moments have been obtained for the case of a reactor without reactivity feedback and with one group of delayed neutrons. It is found that the expected values of the neutron and delayed neutron precursor numbers tend asymptotically to stationary values, whereas the mean square deviations increase linearly with time at an extremely low rate. (orig.)

  9. Pilot program: NRC severe reactor accident incident response training manual. Overview and summary of major points

    International Nuclear Information System (INIS)

    McKenna, T.J.; Martin, J.A. Jr.; Giitter, J.G.; Miller, C.W.; Hively, L.M.; Sharpe, R.W.; Watkins

    1987-02-01

    Overview and Summary of Major Points is the first in a series of volumes that collectively summarize the U.S. Nuclear Regulatory Commission (NRC) emergency response during severe power reactor accidents and provide necessary background information. This volume describes elementary perspectives on severe accidents and accident assessment. Other volumes in the series are: Volume 2-Severe Reactor Accident Overview; Volume 3- Response of Licensee and State and Local Officials; Volume 4-Public Protective Actions-Predetermined Criteria and Initial Actions; Volume 5 - U.S. Nuclear Regulatory Commission. Each volume serves, respectively, as the text for a course of instruction in a series of courses for NRC response personnel. These materials do not provide guidance or license requirements for NRC licensees. The volumes have been organized into these training modules to accommodate the scheduling and duty needs of participating NRC staff. Each volume is accompanied by an appendix of slides that can be used to present this material

  10. A 3D clustering approach for point clouds to detect and quantify changes at a rock glacier front

    Science.gov (United States)

    Micheletti, Natan; Tonini, Marj; Lane, Stuart N.

    2016-04-01

Terrestrial Laser Scanners (TLS) are extensively used in geomorphology to remotely sense landforms and surfaces of any type and to derive digital elevation models (DEMs). Modern devices are able to collect many millions of points, so that working on the resulting dataset is often troublesome in terms of computational effort. Indeed, it is not unusual that raw point clouds are filtered prior to DEM creation, so that only a subset of points is retained and the interpolation process becomes less of a burden. Whilst this procedure is in many cases necessary, it entails a considerable loss of valuable information. First, even without eliminating points, the common interpolation of points to a regular grid causes a loss of potentially useful detail. Second, it inevitably causes the transition from 3D information to only 2.5D data, where each (x,y) pair must have a unique z-value. Vector-based DEMs (e.g. triangulated irregular networks) partially mitigate these issues, but still require a set of parameters to be set and impose a considerable burden in terms of calculation and storage. For the reasons above, being able to perform geomorphological research directly on point clouds would be profitable. Here, we propose an approach to identify erosion and deposition patterns on a very active rock glacier front in the Swiss Alps to monitor sediment dynamics. The general aim is to set up a semiautomatic method to isolate mass movements using 3D feature identification directly from LiDAR data. An ultra-long-range LiDAR RIEGL VZ-6000 scanner was employed to acquire point clouds during three consecutive summers. In order to isolate single clusters of erosion and deposition we applied Density-Based Spatial Clustering of Applications with Noise (DBSCAN), previously successfully employed by Tonini and Abellan (2014) in a similar case for rockfall detection. DBSCAN requires two input parameters, strongly influencing the number, shape and size of the detected clusters: the minimum number of
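DBSCAN itself is simple enough to sketch for small point clouds. The brute-force version below, with toy data, is an illustration of the algorithm only, not the study's implementation or its eps/min_samples choices; as in common implementations, min_samples counts the point itself.

```python
import math

def dbscan_3d(points, eps, min_samples):
    """Minimal brute-force DBSCAN for small 3D point clouds.

    Returns one label per point: 0, 1, ... for clusters, -1 for noise.
    A point is a core point if at least min_samples points (itself
    included) lie within eps of it; clusters grow from core points.
    """
    n = len(points)
    labels = [None] * n
    neighbors = [
        [j for j in range(n) if math.dist(points[i], points[j]) <= eps]
        for i in range(n)
    ]
    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        if len(neighbors[i]) < min_samples:
            labels[i] = -1  # provisionally noise; may become a border point
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neighbors[i])
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reclaimed as border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            if len(neighbors[j]) >= min_samples:
                seeds.extend(neighbors[j])  # expand from core points only
    return labels

# Two compact clumps and one isolated point (illustrative coordinates)
pts = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0),
       (5, 5, 5), (5.1, 5, 5), (5, 5.1, 5),
       (20, 20, 20)]
labels = dbscan_3d(pts, eps=0.5, min_samples=2)
```

For millions of TLS points a spatial index (k-d tree or grid) replaces the O(N²) neighbor search, but the cluster-growing logic is unchanged.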

  11. Numerical Solution of Fractional Neutron Point Kinetics Model in Nuclear Reactor

    Directory of Open Access Journals (Sweden)

    Nowak Tomasz Karol

    2014-06-01

This paper presents results concerning solutions of the fractional neutron point kinetics model for a nuclear reactor. The proposed model consists of a bilinear system of fractional and ordinary differential equations. Three methods to solve the model are presented and compared. The first entails application of the discrete Grünwald-Letnikov definition of the fractional derivative in the model. The second involves building an analog scheme in the FOMCON Toolbox in the MATLAB environment. The third is the method proposed by Edwards. The impact of selected parameters on the model's response was examined. The results for typical inputs were discussed and compared.
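The discrete Grünwald–Letnikov definition named as the first method can be sketched directly. The functions below are an illustrative minimal implementation (not the paper's MATLAB scheme): the fractional derivative of order α at the newest sample is a weighted sum over the full history, with binomial weights generated by a simple recurrence. For α = 1 it reduces to the ordinary backward difference.

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_j = (-1)^j * C(alpha, j), j = 0..n,
    via the recurrence w_j = w_{j-1} * (1 - (alpha + 1)/j)."""
    w = [1.0]
    for j in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / j))
    return w

def gl_derivative(f_samples, alpha, h):
    """GL approximation of the alpha-order derivative at the last sample,
    using the full history f_samples[0..N] taken at uniform step h."""
    n = len(f_samples) - 1
    w = gl_weights(alpha, n)
    return sum(w[j] * f_samples[n - j] for j in range(n + 1)) / h ** alpha

# alpha = 1 recovers the backward difference: d/dt of f(t) = t is 1
d = gl_derivative([0.0, 0.1, 0.2, 0.3], 1.0, 0.1)
```

The growing memory term (the sum over all past samples) is what makes fractional kinetics models more expensive per step than the classical point kinetics equations.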

  12. Theory of fluctuations and parametric noise in a point nuclear reactor model

    International Nuclear Information System (INIS)

    Rodriguez, M.A.; San Miguel, M.; Sancho, J.M.

    1984-01-01

    We present a joint description of internal fluctuations and parametric noise in a point nuclear reactor model in which delayed neutrons and a detector are considered. We obtain kinetic equations for the first moments and define effective kinetic parameters which take into account the effect of parametric Gaussian white noise. We comment on the validity of Langevin approximations for this problem. We propose a general method to deal with weak but otherwise arbitrary non-white parametric noise. Exact kinetic equations are derived for Gaussian non-white noise. (author)

  13. Determination of the protection set-points lines for the Angra-1 reactor core

    International Nuclear Information System (INIS)

    Furieri, E.B.

    1980-03-01

In this work several thermo-hydraulic calculations were performed to obtain protection set-point lines for the Angra-1 reactor core, in order to compare them with the values presented by the vendor in the FSAR. These lines are the locus of points where the minimum DNBR = 1.3 and power = 1.18 × nominal power, as functions of ΔTm and Tm, the coolant temperature difference between hot and cold legs and the average coolant temperature. A computation scheme was developed using COBRA-IIIF as a subroutine of a new main program, with new subroutines added in order to obtain the desired DNBR. The solution is obtained through a convergence procedure using parameters estimated in a sensitivity study. (author)

  14. Fundamental study on long-term stability of rock from the macroscopic point of view

    International Nuclear Information System (INIS)

    Okubo, Seisuke

    2004-02-01

In fiscal year 1994, when this project was started, a pneumatic creep testing machine was modified. At the end of fiscal year 1994, Inada granite was purchased, and preliminary tests such as P-wave velocity measurement and Schmidt hammer testing were carried out. Throughout fiscal year 1995, a specimen of Tage tuff under water-saturated conditions was loaded uniaxially in the pneumatic creep testing machine. In fiscal year 1995, uniaxial compression and tension tests and a short-term creep test of Inada granite were also carried out on servo-controlled testing machines to obtain the complete stress-strain curves. A hydraulic creep testing machine, planned for use in the following year, was modified for long-term creep testing. Finally, a constitutive equation of variable-compliance type was examined based on the experimental results. In fiscal year 1996, creep, compression and tension tests were carried out, and two types of pressure maintenance equipment (hydraulic and pneumatic) were developed and examined. In fiscal year 1997, creep, compression and tension tests, among others, were again carried out on the basis of the results to date. The experimental results of long-term creep testing of Tage tuff and middle-term creep testing of Inada granite were described; in both creep tests, samples were submerged in water. In fiscal year 1998, creep testing of Tage tuff was conducted, and results of relatively short-term (middle-term) creep tests conducted on a servo-controlled testing machine were also described. The sample rock was Sirahama sandstone, which showed considerably large creep strain at stress levels as low as 17% of the uniaxial compressive strength. Results of triaxial compression tests and uniaxial tension tests, including unloading-reloading tests, were described. In the fiscal years 1999-2002, creep testing of Tage tuff was continuously conducted.
A multi-cylinder hydraulic creep testing machine

  15. Real-time simulation of response to load variation for a ship reactor based on point-reactor double regions and lumped parameter model

    Energy Technology Data Exchange (ETDEWEB)

    Wang Qiao; Zhang De [Department of Nuclear Energy Science and Engineering, Naval University of Engineering, Wuhan 430033 (China); Chen Wenzhen, E-mail: Cwz2@21cn.com [Department of Nuclear Energy Science and Engineering, Naval University of Engineering, Wuhan 430033 (China); Chen Zhiyun [Department of Nuclear Energy Science and Engineering, Naval University of Engineering, Wuhan 430033 (China)

    2011-05-15

    Research highlights: > We calculate the variation of the main parameters of the reactor core with Simulink. > The Simulink calculation software (SCS) deals well with the stiff problem. > High calculation precision is reached in less time, and the results can be easily displayed. > Quick calculation of ship-reactor transients can be achieved by this method. - Abstract: Based on the point-reactor double-region and lumped-parameter model, the Simulink calculation software (SCS) is adopted to calculate the variation of the main physical and thermal-hydraulic parameters of the reactor core while the second-loop load of the nuclear power plant is increased or decreased quickly. The calculation results are compared with those of a three-dimensional simulation program. It is indicated that the SCS deals well with the stiff problem of the point-reactor kinetics equations and the coupled problem of neutronics and thermal-hydraulics. High calculation precision is reached in less time, and quick calculation of the parameters of the response to a load disturbance for the ship reactor can be achieved. A clear image of the calculation results can also be displayed quickly by the SCS, which is significant and important for guaranteeing safe reactor operation.
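    The stiff coupling between the prompt-neutron and precursor time scales that the abstract's Simulink solver must handle can be illustrated with a minimal point-kinetics sketch (one delayed group, illustrative constants rather than the paper's double-region model), using an implicit BDF method as a stand-in for Simulink's stiff solvers:

```python
from scipy.integrate import solve_ivp

# One-group point-reactor kinetics (hypothetical parameters, not the
# paper's double-region model): dn/dt = (rho - beta)/Lambda * n + lam*C,
# dC/dt = beta/Lambda * n - lam*C.  The system is stiff because the
# generation time LAMBDA is orders of magnitude shorter than 1/lam,
# so an implicit (BDF) solver is used.
BETA, LAM, LAMBDA = 0.0065, 0.08, 1.0e-4  # beta, precursor decay const (1/s), gen. time (s)

def kinetics(t, y, rho):
    n, c = y
    return [(rho - BETA) / LAMBDA * n + LAM * c,
            BETA / LAMBDA * n - LAM * c]

# Start critical with equilibrium precursors, then apply a small step
# (a crude stand-in for a quick load change) of +0.1 beta reactivity.
y0 = [1.0, BETA / (LAMBDA * LAM)]
sol = solve_ivp(kinetics, (0.0, 10.0), y0, args=(0.1 * BETA,),
                method="BDF", rtol=1e-8, atol=1e-10)
print(sol.y[0, -1])   # power rises above its initial value
```

The same two equations integrated with an explicit solver would need step sizes on the order of LAMBDA; the implicit method is what makes the stiff problem tractable, which is the point the abstract makes for the SCS.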

  16. Still Bay Point-Production Strategies at Hollow Rock Shelter and Umhlatuzana Rock Shelter and Knowledge-Transfer Systems in Southern Africa at about 80-70 Thousand Years Ago

    Science.gov (United States)

    Lombard, Marlize

    2016-01-01

    It has been suggested that technological variations associated with Still Bay assemblages of southern Africa have not been addressed adequately. Here we present a study developed to explore regional and temporal variations in Still Bay point-production strategies. We applied our approach in a regional context to compare the Still Bay point assemblages from Hollow Rock Shelter (Western Cape) and Umhlatuzana Rock Shelter (KwaZulu-Natal). Our interpretation of the point-production strategies implies inter-regional point-production conventions, but also highlights variability and intra-regional knapping strategies used for the production of Still Bay points. These strategies probably reflect flexibility in the organisation of knowledge-transfer systems at work during the later stages of the Middle Stone Age between about 80 ka and 70 ka in South Africa. PMID:27942012

  17. Azo-Dyes mixture degradation in a fixed bed biofilm reactor packed with volcanic porous rock

    International Nuclear Information System (INIS)

    Contreras-Blancas, E.; Cobos-Vasconcelos, D. de los; Juarez-Ramirez, C.; Poggi-Varaldo, H. M.; Ruiz-Ordaz, N.; Galindez-Mayer, J.

    2009-01-01

    Textile industries discharge great amounts of dyes and dyeing-process auxiliaries, which pollute streams and water bodies. Several dyes, especially the ones containing the azo group, can cause harmful effects to different organisms including humans. Through bacterial and mammalian tests, azo dyes or their derived aromatic amines have shown cell genotoxicity. The purpose of this work was to evaluate the effect of air flow rate on azo-dyes mixture biodegradation by a microbial community immobilized in a packed bed reactor. (Author)

  18. Azo-Dyes mixture degradation in a fixed bed biofilm reactor packed with volcanic porous rock

    Energy Technology Data Exchange (ETDEWEB)

    Contreras-Blancas, E.; Cobos-Vasconcelos, D. de los; Juarez-Ramirez, C.; Poggi-Varaldo, H. M.; Ruiz-Ordaz, N.; Galindez-Mayer, J.

    2009-07-01

    Textile industries discharge great amounts of dyes and dyeing-process auxiliaries, which pollute streams and water bodies. Several dyes, especially the ones containing the azo group, can cause harmful effects to different organisms including humans. Through bacterial and mammalian tests, azo dyes or their derived aromatic amines have shown cell genotoxicity. The purpose of this work was to evaluate the effect of air flow rate on azo-dyes mixture biodegradation by a microbial community immobilized in a packed bed reactor. (Author)

  19. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)]

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with special reference to any future inquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors, and an attempt is made to tackle the problem that participation in a public inquiry comes far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries, but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  20. Toward a Learning Health-care System - Knowledge Delivery at the Point of Care Empowered by Big Data and NLP.

    Science.gov (United States)

    Kaggal, Vinod C; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P; Ross, Jason L; Chaudhry, Rajeev; Buntrock, James D; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, i.e., the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating the LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and in real time for knowledge delivery. Additionally, significant clinical information is embedded in free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects, including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the big data infrastructure with two other environments; it significantly outperformed the others in computing speed, demonstrating its value in making the LHS a possibility in the near future.

  1. Analytic method study of point-reactor kinetic equation when cold start-up

    International Nuclear Information System (INIS)

    Zhang Fan; Chen Wenzhen; Gui Xuewen

    2008-01-01

    The reactor cold start-up is a process of inserting reactivity by lifting control rods discontinuously. Inserting too much reactivity will cause a short period and may cause an overpressure accident in the primary loop. It is therefore very important to understand the rule of neutron density variation and to find out the relationships among the speed of lifting the control rod and the duration and speed of the neutron density response; it is also helpful for operators to grasp this rule in order to avoid a start-up accident. This paper starts from the one-group delayed neutron point-reactor kinetics equations and provides their analytic solution when reactivity is introduced by lifting control rods discontinuously. The analytic expression is validated by comparison with practical data, and it is shown that the analytic solution agrees well with the numerical solution. Using this analytical solution, the relationships between the neutron density response and the speed and duration of control rod lifting are also studied. By comparing the results with those obtained under step-inserted reactivity, useful conclusions are drawn.
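    For a single reactivity step (one stage of a discontinuous rod lift), the one-group analytic solution the abstract builds on can be written down explicitly and checked against a numerical integration, mirroring the abstract's analytic-versus-numerical comparison. All kinetics parameters below are illustrative, not the paper's:

```python
import numpy as np
from scipy.integrate import solve_ivp

# One delayed group, step reactivity RHO < beta (illustrative constants).
BETA, LAM, LAMBDA = 0.0065, 0.08, 1e-4
RHO = 0.2 * BETA

# Analytic solution: n(t) = A1*exp(w1*t) + A2*exp(w2*t), where w1, w2 are
# roots of  LAMBDA*w^2 + (BETA - RHO + LAMBDA*LAM)*w - LAM*RHO = 0,
# and A1, A2 follow from n(0) = 1 with equilibrium precursors, which
# gives n'(0) = RHO/LAMBDA.
w1, w2 = np.roots([LAMBDA, BETA - RHO + LAMBDA * LAM, -LAM * RHO])
a1 = (RHO / LAMBDA - w2) / (w1 - w2)
a2 = 1.0 - a1
analytic = lambda t: a1 * np.exp(w1 * t) + a2 * np.exp(w2 * t)

# Numerical reference solution of the same two-equation system.
rhs = lambda t, y: [(RHO - BETA) / LAMBDA * y[0] + LAM * y[1],
                    BETA / LAMBDA * y[0] - LAM * y[1]]
sol = solve_ivp(rhs, (0, 5), [1.0, BETA / (LAMBDA * LAM)],
                method="BDF", rtol=1e-9, atol=1e-12, dense_output=True)
print(analytic(5.0), sol.sol(5.0)[0])  # the two agree closely
```

One root is large and negative (the prompt jump), the other small and positive (the stable period), which is why a too-large step drives the period dangerously short.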

  2. Development of a point-kinetic verification scheme for nuclear reactor applications

    Energy Technology Data Exchange (ETDEWEB)

    Demazière, C., E-mail: demaz@chalmers.se; Dykin, V.; Jareteg, K.

    2017-06-15

    In this paper, a new method that can be used for checking the proper implementation of time- or frequency-dependent neutron transport models and for verifying their ability to recover some basic reactor physics properties is proposed. This method makes use of the application of a stationary perturbation to the system at a given frequency and extraction of the point-kinetic component of the system response. Even for strongly heterogeneous systems for which an analytical solution does not exist, the point-kinetic component follows, as a function of frequency, a simple analytical form. The comparison between the extracted point-kinetic component and its expected analytical form provides an opportunity to verify and validate neutron transport solvers. The proposed method is tested on two diffusion-based codes, one working in the time domain and the other working in the frequency domain. As long as the applied perturbation has a non-zero reactivity effect, it is demonstrated that the method can be successfully applied to verify and validate time- or frequency-dependent neutron transport solvers. Although the method is demonstrated in the present paper in a diffusion theory framework, higher order neutron transport methods could be verified based on the same principles.
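    The "simple analytical form" that the extracted point-kinetic component must follow is, in the zero-power limit, the reactor transfer function. A one-group sketch (illustrative constants, not tied to any particular solver) shows the plateau behaviour such a verification would check:

```python
import numpy as np

# Zero-power reactor transfer function for one delayed-neutron group:
#   G0(w) = 1 / (jw * (LAMBDA + BETA/(jw + LAM)))
# (illustrative constants).  In the plateau region LAM << w << BETA/LAMBDA
# its magnitude is approximately 1/BETA; this is the kind of analytical
# frequency dependence an extracted point-kinetic component can be
# compared against.
BETA, LAM, LAMBDA = 0.0065, 0.08, 1e-4

def g0(omega):
    jw = 1j * omega
    return 1.0 / (jw * (LAMBDA + BETA / (jw + LAM)))

plateau = abs(g0(2.0))        # LAM << 2 rad/s << BETA/LAMBDA = 65 rad/s
print(plateau, 1.0 / BETA)    # plateau magnitude is close to 1/BETA
```

Below the plateau the magnitude rises as 1/omega (zero-power reactors have no restoring feedback at low frequency), so checking both regions against the analytical curve exercises the solver across the stiff frequency range.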

  3. UABUC - Single energy point model burnup computer code for water reactors

    International Nuclear Information System (INIS)

    El-Meshad, Y.; Morsy, S.; El-Osery, I.A.

    1981-01-01

    UABUC is a single-energy-point reactor burnup computer program in FORTRAN. The program calculates the change in the isotopic composition of the uranium fuel as a function of irradiation time, together with all associated quantities such as the average point flux, the conversion ratio, the macroscopic fuel cross sections, and the point reactivity profile. A step-wise time analytical solution was developed for the nonlinear first-order burnup differential equations. The 'Westcott' convention for effective cross sections was used, except for plutonium-240 and uranium-238. For plutonium-240, an effective microscopic cross section was derived from direct physical arguments, taking into account the self-shielding effect of plutonium-240 as well as the 1 eV resonance absorption. For uranium-238, an effective cross section reflecting the effects of fast fission and resonance absorption was used. The fission products were treated in three groups with cross sections of 50, 300, and 800 barns; the yields in the groups were treated as functions of the type of fissionable nuclide, the effective neutron temperature, and the epithermal index. Xenon-135 and samarium-149 were treated separately as functions of irradiation time. (author)
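    The step-wise analytic treatment of the burnup equations can be sketched for a two-nuclide chain (U-235 depletion, Pu-239 build-up from U-238 capture): within each time step the flux is held constant so each equation has a closed-form solution. The cross sections, flux and inventories below are placeholders, not UABUC's Westcott-averaged values:

```python
import numpy as np

# Step-wise analytic burnup sketch.  Within a step of length dt the flux
# PHI is constant, so U-235 decays exponentially and Pu-239 relaxes
# toward a saturation level set by U-238 capture (U-238 itself treated
# as constant over the step).  All numbers are illustrative.
BARN = 1e-24                                  # cm^2
SIG235, SIG238, SIG239 = 600*BARN, 2.7*BARN, 1000*BARN   # absorption
PHI = 3e13                                    # n/cm^2/s
N238 = 1.0                                    # normalised U-238 inventory

def step(n235, n239, dt):
    """Advance one burnup step analytically (constant flux over dt)."""
    n235_new = n235 * np.exp(-SIG235 * PHI * dt)
    sat = SIG238 * N238 / SIG239              # Pu-239 saturation level
    n239_new = sat + (n239 - sat) * np.exp(-SIG239 * PHI * dt)
    return n235_new, n239_new

n235, n239 = 0.03, 0.0
for _ in range(10):                           # ten 100-day steps
    n235, n239 = step(n235, n239, 100 * 86400.0)
print(n235, n239)   # U-235 depletes; Pu-239 builds toward saturation
```

Chaining such closed-form steps is what makes a step-wise analytic scheme cheap compared with integrating the full nonlinear system numerically.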

  4. Fully 3D printed integrated reactor array for point-of-care molecular diagnostics.

    Science.gov (United States)

    Kadimisetty, Karteek; Song, Jinzhao; Doto, Aoife M; Hwang, Young; Peng, Jing; Mauk, Michael G; Bushman, Frederic D; Gross, Robert; Jarvis, Joseph N; Liu, Changchun

    2018-06-30

    Molecular diagnostics that involve nucleic acid amplification tests (NAATs) are crucial for the prevention and treatment of infectious diseases. In this study, we developed a simple, inexpensive, disposable, fully 3D printed microfluidic reactor array that is capable of carrying out extraction, concentration and isothermal amplification of nucleic acids in a variety of body fluids. The method allows rapid molecular diagnostic tests for infectious diseases at the point of care. A simple leak-proof polymerization strategy was developed to integrate flow-through nucleic acid isolation membranes into microfluidic devices, yielding a multifunctional diagnostic platform. Static coating technology was adopted to improve the biocompatibility of our 3D printed device. We demonstrated the suitability of our device for both end-point colorimetric qualitative detection and real-time fluorescence quantitative detection. We applied our diagnostic device to the detection of Plasmodium falciparum in plasma samples and Neisseria meningitidis in cerebrospinal fluid (CSF) samples by loop-mediated isothermal amplification (LAMP) within 50 min. The detection limits were 100 fg for P. falciparum and 50 colony-forming units (CFU) for N. meningitidis per reaction, which are comparable to those of benchtop instruments. This rapid and inexpensive 3D printed device has great potential for point-of-care molecular diagnosis of infectious diseases in resource-limited settings. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. The application of polynomial chaos methods to a point kinetics model of MIPR: An Aqueous Homogeneous Reactor

    International Nuclear Information System (INIS)

    Cooling, C.M.; Williams, M.M.R.; Nygaard, E.T.; Eaton, M.D.

    2013-01-01

    Highlights: • A point kinetics model for the Medical Isotope Production Reactor is formulated. • Reactivity insertions are simulated using this model. • Polynomial chaos is used to simulate uncertainty in reactor parameters. • The computational efficiency of polynomial chaos is compared to that of Monte Carlo. -- Abstract: This paper models a conceptual Medical Isotope Production Reactor (MIPR) using a point kinetics model which is used to explore power excursions in the event of a reactivity insertion. The effect of uncertainty of key parameters is modelled using intrusive polynomial chaos. It is found that the system is stable against reactivity insertions and power excursions are all bounded and tend towards a new equilibrium state due to the negative feedbacks inherent in Aqueous Homogeneous Reactors (AHRs). The Polynomial Chaos Expansion (PCE) method is found to be much more computationally efficient than that of Monte Carlo simulation in this application
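    The abstract's efficiency comparison can be illustrated with a non-intrusive polynomial-chaos sketch (note that the paper itself uses an intrusive formulation; only the idea of expanding an uncertain response in orthogonal polynomials is shared). A toy uncertain response is projected onto probabilists' Hermite polynomials by Gauss-Hermite quadrature and its statistics compared with plain Monte Carlo:

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as H

# Toy uncertain response f(xi) with xi ~ N(0,1); exact statistics are
# known (lognormal), which lets the PCE result be checked.
a = 0.1
f = lambda xi: np.exp(a * xi)

order, nq = 6, 20
x, w = H.hermegauss(nq)                 # Gauss-Hermite nodes/weights
w = w / np.sqrt(2 * np.pi)              # normalise to the N(0,1) measure

# Spectral projection: c_k = E[f(xi) He_k(xi)] / k!
c = np.array([np.sum(w * f(x) * H.hermeval(x, np.eye(order + 1)[k]))
              / factorial(k) for k in range(order + 1)])
pce_mean = c[0]
pce_var = sum(factorial(k) * c[k] ** 2 for k in range(1, order + 1))

# Monte Carlo reference: many more model evaluations for comparable
# accuracy, which is the efficiency argument made in the abstract.
rng = np.random.default_rng(0)
mc = f(rng.standard_normal(200_000))
print(pce_mean, mc.mean())              # both near exp(a**2 / 2)
```

Here the PCE needs 20 model evaluations (the quadrature nodes) against 200,000 Monte Carlo samples; for an expensive reactor model that gap is the computational saving reported.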

  6. Traceological analysis of a singular artefact: The rock crystal point from O Achadizo (Boiro, A Coruña, Galicia)

    Directory of Open Access Journals (Sweden)

    Juan Luis Fernández Marchena

    2016-09-01

    In this paper we present the data obtained from a use-wear study of a rock crystal tool from the O Achadizo hill fort (Boiro, A Coruña, Galicia). This tool was located in shell midden A, dated to the Second Iron Age, and is of particular importance because of its pointed morphology and the configuration evidence on its perimeter. We carried out a macroscopic and microscopic analysis to obtain as much data on this piece as possible. Macroscopically we identified retouching as well as an impact fracture, and at the microscopic level we found several series of striations on the ventral face which are not in keeping with the use of the piece as a projectile tip. We decided to generate several "gigapixel" images of different areas of the tool, in order to record the order and arrangement of these striations and to understand their origin. We identified differential orientation of the striations in the various sectors of the tool, suggesting a technical origin. The combination of the macro- and microscopic analysis of both faces has allowed us to functionally interpret the tool as a sharp element.

  7. Burnup performance of rock-like oxide (ROX) fuel in small pebble bed reactor with accumulative fuel loading scheme

    International Nuclear Information System (INIS)

    Simanullang, Irwan Liapto; Obara, Toru

    2017-01-01

    Highlights: • Burnup performance using ROX fuel in a PBR with accumulative fuel loading scheme was analyzed. • Initial excess reactivity was suppressed by reducing 235U enrichment in the start-up condition. • A negative temperature coefficient was achieved in all conditions of the PBR with accumulative fuel loading scheme using ROX fuel. • The core lifetime of the PBR with accumulative fuel loading scheme was shorter with ROX fuel than with UO2 fuel. • In the PBR with accumulative fuel loading scheme using ROX fuel, the achieved discharged burnup can be as high as that for UO2 fuel. - Abstract: The Japan Atomic Energy Agency (JAEA) has proposed rock-like oxide (ROX) fuel as a new, once-through type fuel concept. Here, burnup performance using ROX fuel was simulated in a pebble bed reactor with an accumulative fuel loading scheme. The MVP-BURN code was used for the burnup calculation. Fuel of 5 g-HM/pebble with 20% 235U enrichment was selected as the optimum composition. Discharged burnup could reach up to 218 GWd/t, with a core lifetime of about 8.4 years. However, high excess reactivity occurred in the initial condition; initial fuel enrichment was therefore reduced from 20% to 4.65% to counter the initial excess reactivity. The operation period was reduced by the decrease of initial fuel enrichment, but the maximum discharged burnup was 198 GWd/t. Burnup performance of ROX fuel in this reactor concept was compared with that of UO2 fuel obtained previously. Discharged burnup for ROX fuel in the PBR with an accumulative fuel loading scheme was as high as for UO2 fuel, and the maximum power density could be lowered by introducing ROX fuel. However, the PBR core lifetime was shorter with ROX fuel than with UO2 fuel. A negative temperature coefficient was achieved for both UO2 and ROX fuels throughout the operation period.

  8. Evaluation of thermal physical properties for fast reactor fuels. Melting point and thermal conductivities

    International Nuclear Information System (INIS)

    Kato, Masato; Morimoto, Kyoichi; Komeno, Akira; Nakamichi, Shinya; Kashimura, Motoaki; Abe, Tomoyuki; Uno, Hiroki; Ogasawara, Masahiro; Tamura, Tetsuya; Sugata, Hirotada; Sunaoshi, Takeo; Shibata, Kazuya

    2006-10-01

    The Japan Atomic Energy Agency has developed a fast breeder reactor (FBR), and plutonium-uranium mixed oxide (MOX) fuel having low density and 20-30% Pu content has been used in the FBR Monju. Americium accumulates in plutonium during long-term storage, and the Am content will increase up to 2-3% in the MOX. It is essential to evaluate the influence of Am content on the physical properties of MOX for the future development of the FBR. In this study, melting points and thermal conductivities, which are important data for fuel design, were measured systematically over a wide range of compositions, and the effects of accumulated Am were evaluated. The solidus temperatures of MOX were measured as functions of Pu content, oxygen-to-metal (O/M) ratio and Am content using the thermal arrest technique. Each sample was sealed in a tungsten capsule in vacuum for the solidus temperature measurement; for MOX with Pu content of more than 30%, a rhenium inner capsule was used to prevent reaction between the MOX and the tungsten. The results confirmed that the melting points of MOX decrease with increasing Pu content and increase slightly with decreasing O/M ratio. The effect of Am content on the fuel design was negligibly small for Am contents up to 3%. Thermal conductivities of MOX were evaluated from thermal diffusivities measured by the laser flash method and heat capacities calculated by Neumann-Kopp's law. The thermal conductivity of MOX decreased slightly at temperatures below 1173 K with increasing Am content. The effect of Am accumulated in long-term-storage fuel was evaluated from the melting points and thermal conductivities measured in this study. It is concluded that the increase of Am in the fuel barely affects the fuel design for Am contents of less than 3%. (author)
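    The evaluation route described, thermal conductivity from laser-flash diffusivity together with a Neumann-Kopp heat capacity, amounts to k = alpha * rho * cp. The sketch below uses assumed placeholder values throughout, not the measured JAEA data:

```python
# Thermal conductivity from laser-flash data: k = alpha * rho * cp,
# with cp of (U,Pu)O2 estimated by Neumann-Kopp's rule as the
# mole-fraction average of the end members.  All numbers below are
# illustrative placeholders, not the measured values of the abstract.
def neumann_kopp(cp_uo2, cp_puo2, y_pu):
    """Heat capacity of (U(1-y),Pu(y))O2 by Neumann-Kopp mixing."""
    return (1.0 - y_pu) * cp_uo2 + y_pu * cp_puo2

cp = neumann_kopp(cp_uo2=280.0, cp_puo2=270.0, y_pu=0.3)  # J/(kg K), assumed
alpha = 8.0e-7     # thermal diffusivity, m^2/s (assumed laser-flash result)
rho = 10.4e3       # density, kg/m^3 (assumed, near-theoretical-density MOX)
k = alpha * rho * cp   # W/(m K)
print(k)
```

The laser-flash method yields alpha directly, so any composition effect (such as the small Am dependence reported) enters k through both the measured diffusivity and the mixed heat capacity.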

  9. Change-point analysis of geophysical time-series: application to landslide displacement rate (Séchilienne rock avalanche, France)

    Science.gov (United States)

    Amorese, D.; Grasso, J.-R.; Garambois, S.; Font, M.

    2018-05-01

    The rank-sum multiple change-point method is a robust statistical procedure designed to search for the optimal number and location of change points in an arbitrary continuous or discrete sequence of values; as such, it can be used to analyse time-series data. Twelve years of robust data sets for the Séchilienne (French Alps) rockslide show a continuous increase in average displacement rate from 50 to 280 mm per month over the 2004-2014 period, followed by a strong decrease back to 50 mm per month in the 2014-2015 period. Where previous studies tentatively suggest possible kinematic phases, they rely solely on empirical threshold values. In this paper, we analyse how the use of a statistical algorithm for change-point detection helps to better understand time phases in landslide kinematics. First, we test the efficiency of the statistical algorithm on geophysical benchmark data (stream flows and Northern Hemisphere temperatures), these data sets having already been analysed with independent statistical tools. Second, we apply the method to 12-yr daily time-series of the Séchilienne landslide, for rainfall and displacement data from 2003 December to 2015 December, in order to quantitatively extract changes in landslide kinematics. We find two strong significant discontinuities in the weekly cumulated rainfall values: an average rainfall rate increase is resolved in 2012 April and a decrease in 2014 August. Four robust changes are highlighted in the displacement time-series (2008 May, 2009 November-December-2010 January, 2012 September and 2014 March), the 2010 one being preceded by a significant but weak rainfall rate increase (in 2009 November). Accordingly, we are able to quantitatively define five kinematic stages for the Séchilienne rock avalanche during this period. The synchronization between the rainfall and displacement rate, only resolved at the end of 2009 and beginning of 2010, corresponds to a remarkable change (fourfold
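    A single-change-point version of the rank-based idea (the paper's procedure handles multiple change points) can be sketched with a Pettitt-type Mann-Whitney statistic on a synthetic displacement-rate series mimicking the 50 to 280 mm/month step:

```python
import numpy as np

# Rank-based single change-point detector (Pettitt statistic):
#   U_t = sum_{i<=t} sum_{j>t} sign(x_j - x_i)
# The change point is the index t maximising |U_t|.  This is a sketch of
# the rank-sum idea, not the paper's multiple-change-point algorithm.
def rank_change_point(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    u = np.array([np.sign(x[t + 1:, None] - x[:t + 1][None, :]).sum()
                  for t in range(n - 1)])
    return int(np.argmax(np.abs(u)))

rng = np.random.default_rng(1)
# Synthetic monthly displacement rate: ~50 mm/month stepping up to
# ~280 mm/month after index 59 (values loosely inspired by the abstract).
series = np.concatenate([rng.normal(50, 5, 60), rng.normal(280, 20, 40)])
print(rank_change_point(series))   # index near the true break at t = 59
```

Because only ranks (signs of pairwise differences) enter U_t, the detector is robust to outliers and to the non-Gaussian noise typical of displacement and rainfall records, which is the motivation for a rank-sum method in this setting.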

  10. Analytical solution of point kinetics equations for linear reactivity variation during the start-up of a nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    Palma, Daniel A.P. [CEFET QUIMICA de Nilopolis/RJ, 21941-914 Rio de Janeiro (Brazil)], E-mail: agoncalves@con.ufrj.br; Martinez, Aquilino S.; Goncalves, Alessandro C. [COPPE/UFRJ - Programa de Engenharia Nuclear, Rio de Janeiro (Brazil)

    2009-09-15

    The analytical solution of point kinetics equations with a group of delayed neutrons is useful in predicting the variation of neutron density during the start-up of a nuclear reactor. In the practical case of an increase of nuclear reactor power resulting from the linear insertion of reactivity, the exact analytical solution cannot be obtained. Approximate solutions have been obtained in previous articles, based on considerations that need to be verifiable in practice. In the present article, an alternative analytic solution is presented for the point kinetics equations in which the only approximation consists of disregarding the term of the second derivative of neutron density with respect to time. The results proved satisfactory when applied to practical situations in the start-up of a nuclear reactor through control rod withdrawal.

  11. Analytical solution of point kinetics equations for linear reactivity variation during the start-up of a nuclear reactor

    International Nuclear Information System (INIS)

    Palma, Daniel A.P.; Martinez, Aquilino S.; Goncalves, Alessandro C.

    2009-01-01

    The analytical solution of point kinetics equations with a group of delayed neutrons is useful in predicting the variation of neutron density during the start-up of a nuclear reactor. In the practical case of an increase of nuclear reactor power resulting from the linear insertion of reactivity, the exact analytical solution cannot be obtained. Approximate solutions have been obtained in previous articles, based on considerations that need to be verifiable in practice. In the present article, an alternative analytic solution is presented for the point kinetics equations in which the only approximation consists of disregarding the term of the second derivative of neutron density with respect to time. The results proved satisfactory when applied to practical situations in the start-up of a nuclear reactor through control rod withdrawal.

  12. NJOY processed multigroup library for fast reactor applications and point data library for MCNP - Experience and validation

    International Nuclear Information System (INIS)

    Kim Jung-Do; Gil Choong-Sup

    1996-01-01

    A JEF-1-based 50-group cross section library for fast reactor applications and a point data library for the continuous-energy Monte Carlo code MCNP have been generated using the NJOY91.38 system. They have been examined by analyzing measured integral quantities, such as criticality and central reaction rate ratios, for 8 small fast critical assemblies. (author). 9 refs, 2 figs, 10 tabs

  13. Computation of point reactor dynamics equations with thermal feedback via weighted residue method

    International Nuclear Information System (INIS)

    Suo Changan; Liu Xiaoming

    1986-01-01

    Point reactor dynamics equations with six groups of delayed neutrons have been computed via the weighted-residual method, in which the delta function was taken as the weighting function, and a parabola with or without an exponential factor as the trial function for insertions of large or smaller reactivity, respectively. The reactivity inserted into the core can vary with time, including insertions in the form of step functions, polynomials up to second power, and sine functions. Thermal feedback based on a single-flow-channel model was added, and the thermal equations concerned were treated using a backward-difference technique. A WRK code has been worked out, including automatic selection of the time step based on an input error requirement and automatic switching between the computations for large and for smaller reactivity. When the power varies slowly and there is no feedback, the results are not sensitive to the chosen time step. Finally, comparison of the relevant results has shown that the agreement is quite good.

  14. The role of point defect clusters in reactor pressure vessel embrittlement

    International Nuclear Information System (INIS)

    Stoller, R.E.

    1993-01-01

    Radiation-induced point defect clusters (PDCs) are a plausible source of matrix hardening in reactor pressure vessel (RPV) steels in addition to copper-rich precipitates. These PDCs can be of either interstitial or vacancy type, and could exist in either 2-D or 3-D shapes, e.g. small loops, voids, or stacking fault tetrahedra. The formation and evolution of PDCs are primarily determined by the displacement damage rate and irradiation temperature, and there is experimental evidence that the size distributions of these clusters are also influenced by impurities such as copper. A theoretical model has been developed to investigate the potential role of PDCs in RPV embrittlement. The model includes a detailed description of the interstitial cluster population; vacancy clusters are treated in a more approximate fashion. The model has been used to examine a broad range of irradiation and material parameters. The results indicate that the magnitude of the hardening increment due to these clusters can be comparable to that attributed to copper precipitates. Both interstitial- and vacancy-type defects contribute to this hardening, with their relative importance determined by the specific irradiation conditions.

  15. Neutron density fluctuations in point reactor systems with dichotomic reactivity noise

    International Nuclear Information System (INIS)

    Sako, Okitsugu

    1984-01-01

    The exactly solvable stochastic point reactor model systems are analyzed through the stochastic Liouville equation. Three kinds of model systems are treated: (1) a linear system without delayed neutrons, (2) a linear system with one group of delayed neutrons, and (3) a nonlinear system with direct power feedback. Exact expressions for the fluctuations of the neutron density, such as the moments, the autocorrelation function and the power spectral density, are derived for the case where the colored reactivity noise is described by a dichotomic, or two-state, Markov process with arbitrary correlation time and intensity, and the effects of the finite correlation time and intensity of the noise on the neutron density fluctuations are investigated. The influence of the presence of delayed neutrons and the effect of the nonlinearity of the system on the neutron density fluctuations are also elucidated. When the reactivity correlation time is very short, the correlation time has almost no effect on the power spectral density, and the relative fluctuation of the neutron density in the stationary state is not affected very much by the presence of delayed neutrons or by the nonlinearity of the system. On the other hand, if the reactivity correlation time is very long, the effect of the reactivity noise on the power spectral density appears at very low frequency, and the presence of delayed neutrons has the effect of reducing the neutron density fluctuations. (author)
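    Model (1), the linear system without delayed neutrons, can be simulated directly by driving the point-reactor equation with a two-state (random telegraph) Markov reactivity; all parameters below are illustrative:

```python
import numpy as np

# Monte Carlo sketch of a linear point reactor without delayed neutrons,
# dn/dt = rho(t)/LAMBDA * n, driven by dichotomic (two-state Markov,
# "random telegraph") reactivity noise rho(t) = +/- RHO0 with switching
# rate NU, so the correlation time is 1/(2*NU).  Illustrative parameters.
rng = np.random.default_rng(2)
DT, NSTEP = 1e-3, 200_000
LAMBDA = 1e-3            # neutron generation time, s
RHO0 = 2e-4              # noise amplitude
NU = 5.0                 # switching rate, 1/s

state = 1.0
log_n = 0.0
states = np.empty(NSTEP)
for k in range(NSTEP):
    if rng.random() < NU * DT:       # Poisson switching between the two states
        state = -state
    states[k] = state
    log_n += state * RHO0 / LAMBDA * DT   # integrate d(ln n)/dt = rho/LAMBDA

print(states.mean(), np.exp(log_n))  # noise averages near zero; n wanders
```

Sample paths like this one give the empirical autocorrelation and power spectral density against which the exact expressions derived in the paper can be checked.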

  16. Contrasting Nature of Magnetic Anomalies over Thin Sections Made out of Barrandien’s Basaltic Rocks Points to their Origin

    Czech Academy of Sciences Publication Activity Database

    Kletetschka, Günther; Pruner, Petr; Schnabl, Petr; Šifnerová, Kristýna

    -, special issue (2012), s. 69-70 ISSN 1335-2806. [Castle meeting New Trends in Geomagnetism : Paleo, rock and environmental magnetism/13./. 17.06.2012-23.06.2012, Zvolen] R&D Projects: GA ČR GAP210/10/2351 Institutional support: RVO:67985831 Keywords : magnetic anomalies * thin sections * volcanic rocks Subject RIV: DE - Earth Magnetism, Geodesy, Geography http://gauss.savba.sk/GPIweb/conferences/Castle2012/abstrCastle.pdf

  17. Discovery Of A Major Contradiction In Big Bang Cosmology Points To The New Cosmic Center Universe Model

    CERN Document Server

    Gentry, R V

    2003-01-01

    The BAL z=3.91 quasar's high Fe/O ratio has led to a reexamination of the big bang's spacetime expansion postulate and the discovery that it predicts a CBR redshift of z>36000 instead of the widely accepted z~1000. This result leads to an expansion-predicted CBR temperature of only T = 0.08K, which is contradicted by the experimental T = 2.73K. Contrary to long-held belief, these results strongly suggest that the F-L expanding spacetime paradigm, with its expansion redshifts, is not the correct relativistic description of the universe. This conclusion agrees with the earlier finding (gr-qc/9806061) that the universe is relativistically governed by the Einstein static spacetime solution of the field equations, not the F-L solution. Disproof of expansion redshifts removes the only support for the Cosmological Principle, thus showing that the spherical symmetry of the cosmos demanded by the Hubble redshift relation can no longer be attributed to the universe being the same everywhere. The Cosmological Principle is flaw...

  18. Protection set-points lines for the reactor core and considerations about power distribution and peak factors

    International Nuclear Information System (INIS)

    Furieri, E.B.

    1981-01-01

    In order to assure reactor core integrity during slow operational transients (power excursions above the nominal value and high coolant temperature), the formation of a steam film on the fuel rods (DNB - Departure from Nucleate Boiling) must be avoided. The protection set-point lines present the points where the DNBR (the ratio between the critical heat flux q sub(DNB) and the local heat flux q' sub(local)) is equal to 1.30, corrected by peak factors and uncertainties, as a function of ΔTr and T sub(R), respectively the coolant temperature rise and the average coolant temperature in the reactor pressure vessel. The set-point curves were determined using a new version of the COBRA-IIIF (CUPRO) computer code, implemented with new subroutines and a linearized convergence scheme. Practical results for the Angra-1 core were obtained and compared with the results from the manufacturer. (E.G.) [pt]
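    The set-point criterion reduces to evaluating the DNBR, the ratio of critical to peak local heat flux, against the 1.30 limit; the flux values and peaking factor in this sketch are assumed, not Angra-1 values:

```python
# DNBR as used for the protection set-point lines:
#   DNBR = q''_DNB / (q''_local_avg * peak_factor),
# with the protection limit at 1.30.  All numbers are illustrative.
def dnbr(q_dnb, q_local_avg, peak_factor):
    """Ratio of critical (DNB) heat flux to peak local heat flux."""
    return q_dnb / (q_local_avg * peak_factor)

ratio = dnbr(q_dnb=3.0e6, q_local_avg=0.9e6, peak_factor=2.3)  # W/m^2, assumed
print(ratio, ratio >= 1.30)   # margin check against the 1.30 criterion
```

Sweeping such an evaluation over coolant temperature rise and average coolant temperature (with the critical heat flux from a correlation such as the one embedded in COBRA-IIIF) is what traces out the set-point lines described in the abstract.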

  19. A CAREM reactor's design evaluation from the nuclear safety point of view

    International Nuclear Information System (INIS)

    Kay, J.M.; Felizia, E.R.; Navarro, N.R.; Caruso, G.J.

    1990-01-01

    The main objective of this work is to define adequate rules for the design of CAREM reactor safety systems and processes, aiming to assure compliance with the CALIN 'Radiological Criteria' regulations in relation to accidents considered in the CAREM reactor design. (Author) [es

  20. Stress analysis of neutral beam pivot point bellows for tokamak fusion test reactor

    International Nuclear Information System (INIS)

    Johnson, J.J.; Benda, B.J.; Tiong, L.W.

    1983-01-01

    The neutral beam pivot point bellows serves as an airtight flexible linkage between the torus duct and the neutral beam transition duct in Princeton University's Tokamak Fusion Test Reactor. The bellows considered here is basically rectangular in cross section with rounded corners, a unique shape. Its overall external dimensions are about 28 in. (about 711 mm) X about 35 in. (about 889 mm). The bellows is formed from 18 convolutions and is of the nested ripple type. It is about 11 in. (about 279 mm) in length, composed of Inconel 718, and each leaf has a thickness of 0.034 in. (0.86 mm). The bellows is subjected to a series of design loading conditions -- vacuum, vacuum + 2 psi (0.12 MPa), vacuum + stroke (10,000 cycles), vacuum + temperature increase + extension, extension to a stress of 120 ksi (827 MPa), and a series of rotational loading conditions induced in the bellows by alignment of the neutral beam injector. A stress analysis of the bellows was performed by the finite element method -- locations and magnitudes of maximum stresses were calculated for all of the design loading conditions for comparison with allowable values and to help guide placement of strain gauges during proof testing. A typical center convolution and end convolution were analyzed. Loading conditions were separated into symmetric and antisymmetric cases about the planes of symmetry of the cross-section. Iterative linear analyses were performed, i.e. compressive loading conditions led to predicted overlap of the leaves from linear analysis, and restraints were added to prevent such overlap. This effect was found to have a substantial influence on stress prediction and needed to be taken into account. A total of eleven loading conditions and seven models were analyzed. The results showed peak stresses to be within allowable limits and the number of allowable cycles to be greater than the design condition

  1. Water-quality effects on phytoplankton species and density and trophic state indices at Big Base and Little Base Lakes, Little Rock Air Force Base, Arkansas, June through August, 2015

    Science.gov (United States)

    Driver, Lucas; Justus, Billy

    2016-01-01

    Big Base and Little Base Lakes are located on Little Rock Air Force Base, Arkansas, and their close proximity to a dense residential population and an active military/aircraft installation makes the lakes vulnerable to water-quality degradation. The U.S. Geological Survey (USGS) conducted a study from June through August 2015 to investigate the effects of water quality on phytoplankton species and density and trophic state in Big Base and Little Base Lakes, with particular regard to nutrient concentrations. Nutrient concentrations, trophic-state indices, and the large proportion of phytoplankton biovolume composed of cyanobacteria indicate that eutrophic conditions were prevalent in Big Base and Little Base Lakes, particularly in August 2015. Cyanobacteria densities and biovolumes measured in this study likely pose a low to moderate risk of adverse algal toxicity, and the high proportion of filamentous cyanobacteria in the lakes, in relation to other algal groups, is important from a fisheries standpoint because these algae are a poor food source for many aquatic taxa. In both lakes, total nitrogen to total phosphorus (N:P) ratios declined over the sampling period as total phosphorus concentrations increased relative to nitrogen concentrations. The N:P ratios in the August samples (20:1 and 15:1 in Big Base and Little Base Lakes, respectively) and other indications of eutrophic conditions are of concern and suggest that exposure of the two lakes to additional nutrients could cause unfavorable dissolved-oxygen conditions and increase the risk of cyanobacteria blooms and associated cyanotoxin issues.

  2. Cross-section requirements for reactor neutron flux measurements from the user's point of view

    International Nuclear Information System (INIS)

    Mas, P.; Lloret, R.

    1978-01-01

    The dosimetry of materials-testing irradiations involves many practical problems: fluence measurement, knowledge of the spectrum, choice of a suitable cross-section set, damage-rate determination, and transposition from test reactor to power reactor. Among these problems, we consider that an interim recommendation concerning the differential cross sections of some fluence detectors should be issued, and that more accessible benchmarks need to be available in order to correlate cross sections and computer codes. (author)

  3. Impact of mesh points number on the accuracy of deterministic calculations of control rods worth for Tehran research reactor

    International Nuclear Information System (INIS)

    Boustani, Ehsan; Amirkabir University of Technology, Tehran; Khakshournia, Samad

    2016-01-01

    In this paper two different computational approaches, a deterministic and a stochastic one, were used for calculation of the control rods worth of the Tehran research reactor. For the deterministic approach the MTRPC package, composed of the WIMS code and the diffusion code CITVAP, was used, while for the stochastic one the Monte Carlo code MCNPX was applied. On comparing our results obtained by the Monte Carlo approach with those previously reported in the Safety Analysis Report (SAR) of the Tehran research reactor, produced by the deterministic approach, large discrepancies were seen. To uncover the root cause of these discrepancies, some efforts were made and it was finally discerned that the number of spatial mesh points in the deterministic approach was the critical cause of these discrepancies. Therefore, mesh optimization was performed for different regions of the core such that the results of the deterministic approach based on the optimized mesh points are in good agreement with those obtained by the Monte Carlo approach.

  4. Impact of mesh points number on the accuracy of deterministic calculations of control rods worth for Tehran research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Boustani, Ehsan [Nuclear Science and Technology Research Institute (NSTRI), Tehran (Iran, Islamic Republic of); Amirkabir University of Technology, Tehran (Iran, Islamic Republic of). Energy Engineering and Physics Dept.; Khakshournia, Samad [Amirkabir University of Technology, Tehran (Iran, Islamic Republic of). Energy Engineering and Physics Dept.

    2016-12-15

    In this paper two different computational approaches, a deterministic and a stochastic one, were used for calculation of the control rods worth of the Tehran research reactor. For the deterministic approach the MTRPC package, composed of the WIMS code and the diffusion code CITVAP, was used, while for the stochastic one the Monte Carlo code MCNPX was applied. On comparing our results obtained by the Monte Carlo approach with those previously reported in the Safety Analysis Report (SAR) of the Tehran research reactor, produced by the deterministic approach, large discrepancies were seen. To uncover the root cause of these discrepancies, some efforts were made and it was finally discerned that the number of spatial mesh points in the deterministic approach was the critical cause of these discrepancies. Therefore, mesh optimization was performed for different regions of the core such that the results of the deterministic approach based on the optimized mesh points are in good agreement with those obtained by the Monte Carlo approach.
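
    The mesh-refinement study described in this record can be sketched as a simple convergence loop: double the number of spatial mesh points until the computed rod worth stops changing within a tolerance. Here `rod_worth` is a hypothetical stand-in for a diffusion-code run (e.g. CITVAP), mimicking a second-order discretization error; the numbers are illustrative, not Tehran research reactor data.

    ```python
    # Sketch of a mesh-refinement convergence check for control-rod worth.
    # `rod_worth` stands in for a deterministic diffusion-code calculation;
    # it mimics an O(1/N^2) spatial discretization error (values illustrative).

    def rod_worth(n_mesh: int, exact: float = 2500.0) -> float:
        """Hypothetical rod worth in pcm with a second-order mesh error."""
        return exact + 8.0e5 / n_mesh**2

    def converge_mesh(n0: int = 10, tol_pcm: float = 10.0, n_max: int = 10000):
        """Double the mesh until successive worths agree within tol_pcm."""
        n, prev = n0, rod_worth(n0)
        while n < n_max:
            n *= 2
            cur = rod_worth(n)
            if abs(cur - prev) < tol_pcm:
                return n, cur
            prev = cur
        raise RuntimeError("mesh not converged")

    n, worth = converge_mesh()
    print(n, round(worth, 1))
    ```

    In practice the reference value playing the role of `exact` comes from the Monte Carlo calculation, and the refinement is done per core region rather than globally.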

  5. Kimberley rock art dating project

    International Nuclear Information System (INIS)

    Walsh, G.L.; Morwood, M.

    1997-01-01

    The art's additional value, unequalled by traditionally recognised artefacts, is its permanent pictorial documentation, presenting a 'window' into the otherwise intangible elements of the perceptions, vision and mind of prehistoric cultures. Unfortunately its potential in establishing the Kimberley archaeological 'big picture' still remains largely unrecognised. Some of the findings of the Kimberley Rock Art Dating Project, using AMS and optically stimulated luminescence (OSL) dating techniques, are outlined. It is anticipated that these findings will encourage involvement by a greater diversity of specialist disciplines, tying their findings into levels of this art sequence as a primary reference point. The sequence represents a sound basis for selecting specific defined images for targeted detailed study by a range of dating techniques. This effectively removes the undesirable ad hoc sampling of 'apparently old paintings', a process which must otherwise unavoidably remain the case with researchers working on most global bodies of rock art

  6. Fluoride Salt-Cooled High-Temperature Demonstration Reactor Point Design

    Energy Technology Data Exchange (ETDEWEB)

    Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Betzler, Benjamin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Carbajo, Juan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Jeffrey J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Robb, Kevin R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Terrell, Jerry W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wysocki, Aaron J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-02-01

    The fluoride salt-cooled high-temperature reactor (FHR) demonstration reactor (DR) is a concept for a salt-cooled reactor with 100 megawatts of thermal output (MWt). It would use tristructural-isotropic (TRISO) particle fuel within prismatic graphite blocks. FLiBe (2 LiF-BeF2) is the reference primary coolant. The FHR DR is designed to be small, simple, and affordable. Development of the FHR DR is a necessary intermediate step to enable near-term commercial FHRs. Lower risk technologies are purposely included in the initial FHR DR design to ensure that the reactor can be built, licensed, and operated within an acceptable budget and schedule. These technologies include TRISO particle fuel, replaceable core structural material, the use of that same material for the primary and intermediate loops, and tube-and-shell primary-to-intermediate heat exchangers. Several preconceptual and conceptual design efforts that have been conducted on FHR concepts bear a significant influence on the FHR DR design. Specific designs include the Oak Ridge National Laboratory (ORNL) advanced high-temperature reactor (AHTR) with 3400 MWt and 1500 megawatts of electric output (MWe), as well as a 125 MWt small modular AHTR (SmAHTR) from ORNL. Other important examples are the Mk1 pebble bed FHR (PB-FHR) concept from the University of California, Berkeley (UCB), and an FHR test reactor design developed at the Massachusetts Institute of Technology (MIT). The MIT FHR test reactor is based on a prismatic fuel platform and is directly relevant to the present FHR DR design effort. These FHR concepts are based on reasonable assumptions for credible commercial prototypes. The FHR DR concept also directly benefits from the operating experience of the Molten Salt Reactor Experiment (MSRE), as well as the detailed design efforts for a large molten salt reactor concept and its breeder variant, the Molten Salt Breeder Reactor. The FHR DR technology is most representative of the 3400 MWt AHTR

  7. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  8. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing......, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  9. Research of the Rock Art from the point of view of geography: the neolithic painting of the Mediterranean area of the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    Cruz Berrocal, María

    2004-12-01

    Full Text Available The rock art of the Mediterranean Arch (which includes what are conventionally called Levantine Rock Art, Schematic Rock Art and Macroschematic Rock Art, among other styles), inscribed as World Heritage in 1998, is studied from the point of view of the Archaeology of Landscape. The information sources used were field work, cartographic analysis and analysis in GIS, besides two rock art archives: the UNESCO Document and the Corpus of Levantine Cave Painting (Corpus de Pintura Rupestre Levantina). The initial hypothesis was that this rock art was involved in the process of neolithisation of the eastern part of Iberia, of which it is a symptom and a result, and that it must be understood as an element of landscape construction. If this is true, it would have a concrete distribution in the form of locational patterns. Through statistical procedures and heuristic approaches, it has been demonstrated that there is a structure of the neolithic landscape, defined by rock art, which it is possible to interpret functionally and economically.

    The rock art of the Mediterranean Arch (which includes the styles conventionally known as Levantine Art, Schematic Art and Macroschematic Art, among others), declared World Heritage in 1998, is studied from the point of view of its location. The information sources used were field work, cartographic review and analysis in a Geographic Information System, in addition to two rock art archives: the UNESCO Dossier and the Corpus of Levantine Cave Painting. The initial hypothesis was that this rock art is embedded in the process of neolithisation of the peninsular Levant, of which it is a symptom and a result, and must be understood as an element of landscape construction, from which it follows that it should present a determinable distribution in the form of locational patterns. By means of statistical contrasts and descriptions as well as

  10. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  11. Reactor

    International Nuclear Information System (INIS)

    Toyama, Masahiro; Kasai, Shigeo.

    1978-01-01

    Purpose: To provide an LMFBR type reactor wherein effusion of coolant through a loop contact portion is reduced even when fuel assemblies float up, and misloading of reactor core constituent elements is prevented, thereby improving reactor safety. Constitution: The reactor core constituents are secured in the reactor by utilizing the pressure difference between the high-pressure cooling chamber and the low-pressure cooling chamber. A resistance part is formed at the upper part of a connecting pipe, which connects the low-pressure cooling chamber and the lower surface of the reactor core constituent. This resistance part is formed such that the internal cross-sectional area of the connecting pipe increases stepwise toward the upper part, and the cylinder is enlarged so that it follows the inner surface of the connecting pipe. (Aizawa, K.)

  12. Importance of Sodium Fuel Interaction in Fast Reactor Safety Evaluation - CEA Point of View

    International Nuclear Information System (INIS)

    Tanguy, P.

    1976-01-01

    The consequences of interactions between molten metal (aluminium-uranium alloy) and water have long been a subject of concern for those in charge of reactor safety, following accidents observed or induced in certain reactors (BORAX, SL1, SPERT 1 D). In such accidents, as in similar cases occurring in traditional industries (aluminium foundries, steel works, paper mills...), the contact between the hot liquid product and the coolant entails rapid vaporization of the latter with effects identical to those of an explosive. Although chemical reactions of water decomposition occur in some cases, the main phenomenon is the conversion of the thermal energy stored in the hot substance into mechanical energy. Despite the fact that a molten oxide fuel differs from an aluminium-uranium alloy, as does sodium from water, the consequences of possible contact between the molten mixed uranium and plutonium oxide and sodium must be carefully studied since such a contact may occur in accident conditions in sodium-cooled fast neutron reactors. The essential purpose of an evaluation of reactor safety in accident conditions is in fact to ensure the containment of dangerous products. Consequently, any phenomenon likely to endanger containment barriers must be carefully examined. In conclusion: Whereas an accident within an assembly seems to show little likelihood of creating conditions seriously endangering fuel containment, the gravity of problems associated with an overall accident on the core is worthy of thorough and attentive study. In the case of an overall accident on the core of a fast reactor, the interaction between the molten fuel and the sodium is of consequence at two levels. The first is the retention of mechanical energy, which may be considerable. The second is the recovery of fuel fragments in an overall cooled configuration but where local cooling problems may give rise to interaction. A greater effort is required in performing tests and mastering their results to

  13. Reactor

    International Nuclear Information System (INIS)

    Ikeda, Masaomi; Kashimura, Kazuo; Inoue, Kazuyuki; Nishioka, Kazuya.

    1979-01-01

    Purpose: To facilitate the construction of a reactor containment building, whereby inspections of the outer wall of the reactor container after the completion of the construction of the reactor building can be easily carried out. Constitution: In a reactor accommodated in a container encircled by a building wall, a space is provided between the container and the building wall encircling the container, and a metal wall is provided in the space so that it is fitted in the building wall in an attachable and detachable manner. (Aizawa, K.)

  14. Fort St. Vrain high temperature gas-cooled reactor. Pt. 12. The dew point moisture monitor testing program

    Energy Technology Data Exchange (ETDEWEB)

    Olson, H.G. (Colorado State Univ., Fort Collins (USA). Dept. of Mechanical Engineering); Brey, H.L. (Public Service Co. of Colorado, Denver (USA)); Swart, F.E. (Gas-Cooled Reactor Associates, La Jolla, CA (USA)); Forbis, J.M. (Storage Technology Corp., Louisville, CO (USA))

    1982-09-01

    Moisture ingress into the core volume could cause damaging reactions with the moderator-reflector graphite and burnable poison; therefore, a dew point moisture monitoring system has been developed with the basic design criteria that a plant protective system trip is signaled after the system detects high primary coolant helium moisture levels and that the system is able to correctly identify which of two steam generator loops is leaking. Modifications to the sample supplies to the monitors were necessary to reduce the system's unsatisfactory response time at lower reactor power levels.

  15. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  16. The ARIES-III D-3He tokamak reactor: Design-point determination and parametric studies

    International Nuclear Information System (INIS)

    Bathke, C.G.; Werley, K.A.; Miller, R.L.; Krakowski, R.A.; Santarius, J.F.

    1991-01-01

    The multi-institutional ARIES study has generated a conceptual design of another tokamak fusion reactor in a series that varies the assumed advances in technology and physics. The ARIES-III design uses a D-3He fuel cycle and requires advances in technology and physics for economic attractiveness. The optimal design was characterized through systems analyses for eventual conceptual engineering design. Results from the systems analysis are summarized, and a comparison with the high-field, D-T fueled ARIES-I is included. 11 refs., 5 figs

  17. Cost-constrained design point for the Reversed-Field Pinch Reactor (RFPR)

    International Nuclear Information System (INIS)

    Hagenson, R.L.; Krakowski, R.A.

    1978-01-01

    A broad spectrum of Reversed-Field Pinch Reactor (RFPR) operating modes are compared on an economics basis. An RFPR with superconducting coils and an air-core poloidal field transformer optimizes to give a minimum cost system when compared to normal-conducting coils and the iron-core transformer used in earlier designs. An interim design is described that exhibits a thermally stable, unrefueled, 21 s burn (burnup 50 percent) with an energy containment time equal to 200 times the Bohm time, which is consistent with present-day tokamak experiments. This design operates near the minimum energy state (Θ = B_θ(r_w)/[B_z] = 2.0 and F = B_z(r_w)/[B_z] = 1.0 from the High Beta Model) of the RFP configuration. This cost-optimized design produces a reactor of 1.5-m minor radius and 12.8-m major radius that generates 1000 MWe (net) with a recirculating power fraction of 0.15 at a direct capital cost of 970 $/kWe

  18. A feasibility study to determine the functionality of a novel rocking kiln - fluidized bed reactor for the treatment of waste

    International Nuclear Information System (INIS)

    Mohamad Azman Che Mat Isa; Muhd Noor Muhd Yunus; Mohamad Puad Abu; Shahazrin Mohd Nasir; Mohd fairus Abdul Farid

    2004-01-01

    Rotary kilns have been widely used in incineration and studied by many researchers. Solid wastes of various shapes, sizes and heat values can be fed into a rotary kiln either in batches or continuously. Waste combustion in a rotary kiln involves a rotation method, and the residence time depends on the length and diameter of the rotary kiln and the total stoichiometric air given to the system. The rocking system is another technology used in incinerators. In the rocking system, internal elements in the combustion chamber move to transport and mix the burning waste so that all combustible material in the waste is fully burnt. Another incinerator technology is the fluidized bed. This method uses air to fluidize the sand, thus enhancing the combustion process. The total air is controlled in order to obtain a suitable fluidized condition. This preliminary study was conducted to assess the feasibility of an incinerator system in which three components, viz. the rotary kiln, the rocking system and the fluidized bed, are combined. This research was also conducted to obtain preliminary data on parameters of the three components, such as the suitable temperature, the angle of the kiln, residence time, total air for fluidization, rocking speed and the devolatilization rate. The samples used in this research were palm oil kernel shells. The results of the studies showed that the palm oil kernel shells combusted evenly using the new parameters. (Author)

  19. Solution of Point Reactor Neutron Kinetics Equations with Temperature Feedback by Singularly Perturbed Method

    Directory of Open Access Journals (Sweden)

    Wenzhen Chen

    2013-01-01

    Full Text Available The singularly perturbed method (SPM) is proposed to obtain the analytical solution for the delayed supercritical process of a nuclear reactor with temperature feedback and a small step reactivity inserted. The relation between the reactivity and time is derived. Also, the neutron density (or power) and the average density of delayed neutron precursors as functions of reactivity are presented. The variations of neutron density (or power) and temperature with time are calculated, plotted, and compared with those given by the accurate solution and other analytical methods. It is shown that the results by the SPM are valid and accurate over a large range and that the SPM is simpler than the methods in the previous literature.
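
    The problem class treated in this record can be illustrated with a direct numerical integration: point kinetics with one delayed-neutron group and linear temperature feedback, stepped forward with explicit Euler. This is a companion sketch, not the SPM of the paper, and all parameter values (β, λ, Λ, feedback and heating coefficients) are illustrative.

    ```python
    # Point reactor kinetics with one delayed group and temperature feedback:
    #   dn/dt = (rho - beta)/Lambda * n + lam * C
    #   dC/dt = beta/Lambda * n - lam * C
    #   dT/dt = Kc * n,   rho(t) = rho0 - alpha * T   (negative feedback)
    # Explicit Euler integration; all parameters are illustrative.

    def transient(rho0=0.002, t_end=20.0, dt=1.0e-4):
        beta, lam, Lam = 0.0065, 0.08, 1.0e-4  # delayed fraction, decay const (1/s), generation time (s)
        alpha, Kc = 5.0e-5, 0.5                # feedback coeff (dk/k per K), heating constant (K/s per unit power)
        n, C, T = 1.0, beta / (lam * Lam), 0.0  # start at equilibrium precursor level
        for _ in range(int(t_end / dt)):
            rho = rho0 - alpha * T             # feedback erodes the inserted reactivity
            n, C, T = (n + ((rho - beta) / Lam * n + lam * C) * dt,
                       C + (beta / Lam * n - lam * C) * dt,
                       T + Kc * n * dt)
        return n, T

    n, T = transient()
    print(round(n, 3), round(T, 2))
    ```

    With the step reactivity below β the process is delayed supercritical: the power first makes a prompt jump, then rises on the delayed-neutron time scale while the temperature rise gradually cancels the inserted reactivity, which is the self-limiting behaviour the SPM captures analytically.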

  20. Heart tissue of harlequin (hq)/Big Blue mice has elevated reactive oxygen species without significant impact on the frequency and nature of point mutations in nuclear DNA

    Energy Technology Data Exchange (ETDEWEB)

    Crabbe, Rory A. [Department of Biology, University of Western Ontario, London, Ontario, N6A 5B7 (Canada); Hill, Kathleen A., E-mail: khill22@uwo.ca [Department of Biology, University of Western Ontario, London, Ontario, N6A 5B7 (Canada)

    2010-09-10

    Age is a major risk factor for heart disease, and cardiac aging is characterized by elevated mitochondrial reactive oxygen species (ROS) with compromised mitochondrial and nuclear DNA integrity. To assess links between increased ROS levels and mutations, we examined in situ levels of ROS and cII mutation frequency, pattern and spectrum in the heart of harlequin (hq)/Big Blue mice. The hq mouse is a model of premature aging with mitochondrial dysfunction and increased risk of oxidative stress-induced heart disease with the means for in vivo mutation detection. The hq mutation produces a significant downregulation in the X-linked apoptosis-inducing factor gene (Aif) impairing both the antioxidant and oxidative phosphorylation functions of AIF. Brain and skin of hq disease mice have elevated frequencies of point mutations in nuclear DNA and histopathology characterized by cell loss. Reports of associated elevations in ROS in brain and skin have mixed results. Herein, heart in situ ROS levels were elevated in hq disease compared to AIF-proficient mice (p < 0.0001) yet, mutation frequency and pattern were similar in hq disease, hq carrier and AIF-proficient mice. Heart cII mutations were also assessed 15 days following an acute exposure to an exogenous ROS inducer (10 mg paraquat/kg). Acute paraquat exposure with a short mutant manifestation period was insufficient to elevate mutation frequency or alter mutation pattern in the post-mitotic heart tissue of AIF-proficient mice. Paraquat induction of ROS requires mitochondrial complex I and thus is likely compromised in hq mice. Results of this preliminary survey and the context of recent literature suggest that determining causal links between AIF deficiency and the premature aging phenotypes of specific tissues is better addressed with assay of mitochondrial ROS and large-scale changes in mitochondrial DNA in specific cell types.

  1. Heart tissue of harlequin (hq)/Big Blue mice has elevated reactive oxygen species without significant impact on the frequency and nature of point mutations in nuclear DNA

    International Nuclear Information System (INIS)

    Crabbe, Rory A.; Hill, Kathleen A.

    2010-01-01

    Age is a major risk factor for heart disease, and cardiac aging is characterized by elevated mitochondrial reactive oxygen species (ROS) with compromised mitochondrial and nuclear DNA integrity. To assess links between increased ROS levels and mutations, we examined in situ levels of ROS and cII mutation frequency, pattern and spectrum in the heart of harlequin (hq)/Big Blue mice. The hq mouse is a model of premature aging with mitochondrial dysfunction and increased risk of oxidative stress-induced heart disease with the means for in vivo mutation detection. The hq mutation produces a significant downregulation in the X-linked apoptosis-inducing factor gene (Aif) impairing both the antioxidant and oxidative phosphorylation functions of AIF. Brain and skin of hq disease mice have elevated frequencies of point mutations in nuclear DNA and histopathology characterized by cell loss. Reports of associated elevations in ROS in brain and skin have mixed results. Herein, heart in situ ROS levels were elevated in hq disease compared to AIF-proficient mice (p < 0.0001) yet, mutation frequency and pattern were similar in hq disease, hq carrier and AIF-proficient mice. Heart cII mutations were also assessed 15 days following an acute exposure to an exogenous ROS inducer (10 mg paraquat/kg). Acute paraquat exposure with a short mutant manifestation period was insufficient to elevate mutation frequency or alter mutation pattern in the post-mitotic heart tissue of AIF-proficient mice. Paraquat induction of ROS requires mitochondrial complex I and thus is likely compromised in hq mice. Results of this preliminary survey and the context of recent literature suggest that determining causal links between AIF deficiency and the premature aging phenotypes of specific tissues is better addressed with assay of mitochondrial ROS and large-scale changes in mitochondrial DNA in specific cell types.

  2. Testing of the rectangular pivot-point bellows for the PPPL tokamak fusion test reactor

    International Nuclear Information System (INIS)

    Haughian, J.; Lou, K.; Greer, J.; Fong, M.; Scalise, D.T.

    1983-12-01

    The Neutral Beam Pivot Point Bellows (PPB) is installed in the duct which connects the Neutral Beam Enclosure to the Torus. This bellows, located at the pivot point, must fit the severely limited space available at the pivot-point location. Consequently, it has to be made rectangular in cross section with a large inside area for beam access. This leads to small convolutions with high stress concentrations. The function of the bellows is to permit change in the angular positioning of the neutral beam line with respect to the Tokamak, to isolate the Neutral Beam Line from the deflection of the Torus during bake-out, and to allow for all misalignments. Internally the bellows will hold a vacuum along with gases such as hydrogen or deuterium. Test parameters are described

  3. One group neutron flux at a point in a cylindrical reactor cell calculated by Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Kocic, A. [Institute of Nuclear Sciences Vinca, Beograd (Serbia and Montenegro)]

    1974-01-15

    Mean values of the neutron flux over material regions and the neutron flux at space points in a cylindrical annular cell (one group model) have been calculated by Monte Carlo. The results are compared with those obtained by an improved collision probability method (author)
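    The collision-estimator approach used in such one-group Monte Carlo calculations can be sketched in a few lines. The sketch below is a simplification under stated assumptions: the cross-section, scattering probability, cell radius and axial point source are all illustrative values (not from the paper), and the annular multi-region cell is collapsed to a single homogeneous cylinder. A true point-flux estimate, as in the paper, would additionally need a next-event (point-detector) estimator.

```python
import math
import random

random.seed(1)

# One-group, one-region Monte Carlo with a collision estimator of the
# scalar flux in radial bins of a homogeneous cylinder.  All parameters
# below are hypothetical.
SIG_T = 1.0      # total macroscopic cross-section (1/cm)
C = 0.7          # scattering probability per collision (absorption = 1 - C)
R = 2.0          # outer radius of the cell (cm)
H = 1.0          # reference height for per-unit-height flux normalisation (cm)
NBINS = 4
NHIST = 20000

collisions = [0] * NBINS

for _ in range(NHIST):
    x = y = z = 0.0                      # isotropic point source on the axis
    alive = True
    while alive:
        mu = 2.0 * random.random() - 1.0           # isotropic direction
        phi = 2.0 * math.pi * random.random()
        s = math.sqrt(1.0 - mu * mu)
        d = -math.log(random.random()) / SIG_T     # sampled flight length
        x += d * s * math.cos(phi)
        y += d * s * math.sin(phi)
        z += d * mu
        r = math.hypot(x, y)
        if r >= R:                                 # neutron leaks out radially
            break
        collisions[min(int(r / R * NBINS), NBINS - 1)] += 1
        alive = random.random() < C                # absorbed with prob 1 - C

# collision estimator: flux_i ~ collisions_i / (SIG_T * V_i * NHIST)
flux = []
for i in range(NBINS):
    r0, r1 = i * R / NBINS, (i + 1) * R / NBINS
    flux.append(collisions[i] / (SIG_T * math.pi * (r1**2 - r0**2) * H * NHIST))

print(flux)   # z-integrated flux per unit height, highest near the source
```

    The collision estimator is the simplest flux tally; region-averaged fluxes such as those reported above fall out of it directly, whereas point values require variance-reduction machinery.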

  4. Systematic evaluation program, status summary report

    International Nuclear Information System (INIS)

    1983-01-01

    Status reports are presented on the systematic evaluation program for the Big Rock Point reactor, Dresden-1 reactor, Dresden-2 reactor, Ginna-1 reactor, Connecticut Yankee reactor, LACBWR reactor, Millstone-1 reactor, Oyster Creek-1 reactor, Palisades-1 reactor, San Onofre-1 reactor, and Rowe Yankee reactor

  5. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical-specification-related setpoints and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. The descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes a discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR

  6. Comparison of one-dimensional and point kinetics for various light water reactor transients

    International Nuclear Information System (INIS)

    Naser, J.A.; Lin, C.; Gose, G.C.; McClure, J.A.; Matsui, Y.

    1985-01-01

    The objective of this paper is to compare results from three kinetics options: (1) point kinetics; (2) point kinetics obtained by not changing the shape function; and (3) one-dimensional kinetics, for various transients in both BWRs and PWRs. A systematic evaluation of the one-dimensional kinetics calculation and its alternatives is performed to determine the status of these models and to identify additional development work. In addition, for PWRs, the NODEP-2 - NODETRAN and SIMULATE - SIMTRAN paths for calculating kinetics parameters are compared. This type of comparison has not been performed before and is needed to properly evaluate the RASP methodology, of which these codes are a part. It should be noted that RASP is in an early pre-release stage and this is the first serious attempt to examine the consistency between these two similar but different methods of generating physics parameters for the RETRAN computer code

  7. The testing of the Rectangular Pivot-point bellows for the PPPL Tokamak fusion test reactor

    International Nuclear Information System (INIS)

    Haughian, J.; Fong, M.; Greer, J.; Lou, K.; Scalise, D.T.

    1983-01-01

    The Neutral Beam Pivot Point Bellows (PPB) is installed in the duct which connects the Neutral Beam Enclosure to the Torus. This bellows, located at the pivot point, must fit the severely limited space available at the pivot-point location. Consequently, it has to be made rectangular in cross section with a large inside area for beam access. This leads to small convolutions with high stress concentrations. The function of the bellows is to permit change in the angular positioning of the neutral beam line with respect to the Tokamak, to isolate the Neutral Beam Line from the deflection of the Torus during bakeout, and to allow for all misalignments. Internally the bellows will contain a vacuum along with gases such as hydrogen or deuterium. Externally, air or nitrogen gas will be present. It is constructed of Inconel 718 convolutions welded together to provide a clear rectangular opening of 23.4 by 32.2 inches, joined to an Inconel 625 flange at each end

  8. Super critical water reactors

    International Nuclear Information System (INIS)

    Dumaz, P.; Antoni, O; Arnoux, P.; Bergeron, A; Renault, C.; Rimpault, G.

    2005-01-01

    Water is used as coolant and moderator in most of the nuclear power plants currently in operation. In pressurized water reactors (PWR) and boiling water reactors (BWR), water is kept below its critical point (221 bar, 374 °C), which limits the efficiency of the thermodynamic energy-conversion cycle (to about 33%). By crossing the critical point and using supercritical water, the attainable pressures and temperatures allow significant efficiency gains. In addition, supercritical water has important properties: in particular, vapor and liquid can no longer coexist, so there is no boiling crisis, one of the phenomena that limits the specific power of PWRs and BWRs. Since the 1950s, supercritical water reactors have been studied in more or less detail and then neglected. Since the early 1990s, this type of concept has attracted renewed interest, and within the international Generation IV framework, supercritical water reactors have been retained as one of the main options for study as Generation IV reactors. The CEA has been active in this field through participation in a European program, the HPLWR (High Performance Light Water Reactor); the associated R and D studies focus on neutronics, thermal-hydraulics and materials. The CEA intends to pursue a limited R and D effort in this field, within the framework of international cooperation, with preference for the study of fast-spectrum versions. (author)

  9. End points in discharge cleaning on TFTR (Tokamak Fusion Test Reactor)

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, D.; Dylla, H.F.; Bell, M.G.; Blanchard, W.R.; Bush, C.E.; Gettelfinger, G.; Hawryluk, R.J.; Hill, K.W.; Janos, A.C.; Jobes, F.C.

    1989-07-01

    It has been found necessary to perform a series of first-wall conditioning steps prior to successful high power plasma operation in the Tokamak Fusion Test Reactor (TFTR). This series begins with glow discharge cleaning (GDC) and is followed by pulse discharge cleaning (PDC). During machine conditioning, the production of impurities is monitored by a Residual Gas Analyzer (RGA). PDC is made in two distinct modes: Taylor discharge cleaning (TDC), where the plasma current is kept low (15-50 kA) and of short duration (50 ms) by means of a relatively high prefill pressure, and aggressive PDC, where lower prefill pressure and higher toroidal field result in higher current (200-400 kA) limited by disruptions at q(a) of about 3 at about 250 ms. At a constant repetition rate of 12 discharges/minute, the production rate of H2O, CO, or other impurities has been found to be an unreliable measure of progress in cleaning. However, the ability to produce aggressive PDC with substantial limiter heating, but without the production of x-rays from runaway electrons, is an indication that TDC is no longer necessary after about 10^5 pulses. During aggressive PDC, the uncooled limiters are heated by the plasma from the bakeout temperature of 150 °C to about 250 °C over a period of three to eight hours. This limiter heating is important to enhance the rate at which H2O is removed from the graphite limiter. 14 refs., 3 figs., 1 tab.

  10. End points in discharge cleaning on TFTR [Tokamak Fusion Test Reactor]

    International Nuclear Information System (INIS)

    Mueller, D.; Dylla, H.F.; Bell, M.G.

    1989-07-01

    It has been found necessary to perform a series of first-wall conditioning steps prior to successful high power plasma operation in the Tokamak Fusion Test Reactor (TFTR). This series begins with glow discharge cleaning (GDC) and is followed by pulse discharge cleaning (PDC). During machine conditioning, the production of impurities is monitored by a Residual Gas Analyzer (RGA). PDC is made in two distinct modes: Taylor discharge cleaning (TDC), where the plasma current is kept low (15-50 kA) and of short duration (50 ms) by means of a relatively high prefill pressure, and aggressive PDC, where lower prefill pressure and higher toroidal field result in higher current (200-400 kA) limited by disruptions at q(a) of about 3 at about 250 ms. At a constant repetition rate of 12 discharges/minute, the production rate of H2O, CO, or other impurities has been found to be an unreliable measure of progress in cleaning. However, the ability to produce aggressive PDC with substantial limiter heating, but without the production of x-rays from runaway electrons, is an indication that TDC is no longer necessary after about 10^5 pulses. During aggressive PDC, the uncooled limiters are heated by the plasma from the bakeout temperature of 150 °C to about 250 °C over a period of three to eight hours. This limiter heating is important to enhance the rate at which H2O is removed from the graphite limiter. 14 refs., 3 figs., 1 tab

  11. Point Source contamination approach for hydrological risk assessment of a major hypothetical accident from second research reactor at Inshas site

    International Nuclear Information System (INIS)

    Sadek, M.A.; Tawfik, F.S.

    2002-01-01

    The point source contamination mechanism and the deterministic conservative approach have been implemented to demonstrate the hazards of hydrological pollution due to a major hypothetical accident in the second research reactor at Inshas. The radioactive inventory is assumed to be dissolved in 75% of the cooling water (25% is lost) and to come directly into contact with groundwater, moving down gradient. Five radioisotopes (I-129, Sr-90, Ru-106, Cs-134 and Cs-137) of the entire inventory are found to be persistent and environmentally significant. Their downstream spread indices (C_max, the maximum concentration at the focus of the moving ellipse; delta, the pollution duration at different distances; A, the polluted area at different distances; and X_min, the safety distance from the reactor) were calculated based on analytical solutions of the convection-dispersion partial differential equation for absorbable and decaying species. The largest downstream contamination range was found for Sr-90 and Ru-106, but even these posed no significant hazard. The geochemical and hydrological parameters of the water-bearing formations play a great role in buffering and limiting the radiation effects, reducing the retention of the radioisotopes by several orders of magnitude over the polluted distances. Sensitivity analysis of the computed pollution ranges shows low sensitivity to possible variations in the activity of the nuclide inventory, dispersivity and saturated thickness, and high sensitivity to possible variations in groundwater velocity and retention factors
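    As a rough illustration of the kind of analytical solution such spread indices are built from, below is the one-dimensional instantaneous-pulse solution of the advection-dispersion equation with linear retardation and radioactive decay. All parameter values are hypothetical, and the actual assessment used site-specific, multi-dimensional forms; this is only a minimal sketch of the governing physics.

```python
import math

def pulse_concentration(x, t, M=1.0, v=0.5, D=0.1, R=2.0, lam=1e-3):
    """Aqueous concentration C(x, t) for an instantaneous unit-cross-section
    pulse of mass M governed by

        R dC/dt = D d2C/dx2 - v dC/dx - lam R C,

    where v is the pore velocity, D the dispersion coefficient, R the
    retardation factor and lam the decay constant (all values hypothetical)."""
    veff, Deff = v / R, D / R              # retarded velocity and dispersion
    gauss = math.exp(-(x - veff * t) ** 2 / (4.0 * Deff * t))
    return (M / R) / math.sqrt(4.0 * math.pi * Deff * t) * gauss * math.exp(-lam * t)

# The plume centre moves at v/R, so sorption (large R) shortens the polluted
# range while decay shrinks the peak - the qualitative result reported above.
peak = pulse_concentration(x=0.25 * 10.0, t=10.0)   # at the moving centre
print(peak)
```

    Indices like C_max follow by evaluating such a solution at the plume centre, and quantities like the safety distance X_min by finding where the concentration falls below a regulatory limit.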

  12. Generalized treatment of point reactor kinetics driven by random reactivity fluctuations via the Wiener-Hermite functional method

    International Nuclear Information System (INIS)

    Behringer, K.

    1991-02-01

    In a recent paper by Behringer et al. (1990), the Wiener-Hermite Functional (WHF) method has been applied to point reactor kinetics excited by Gaussian random reactivity noise under stationary conditions, in order to calculate the neutron steady-state value and the neutron power spectral density (PSD) in a second-order (WHF-2) approximation. For simplicity, delayed neutrons and any feedback effects have been disregarded. The present study is a straightforward continuation of the previous one, treating the problem more generally by including any number of delayed neutron groups. For the case of white reactivity noise, the accuracy of the approach is determined by comparison with the exact solution available from the Fokker-Planck method. In the numerical comparisons, the first-order (WHF-1) approximation of the PSD is also considered. (author) 4 figs., 10 refs

  13. Application of the Wiener-Hermite functional method to point reactor kinetics driven by random reactivity fluctuations

    International Nuclear Information System (INIS)

    Behringer, K.; Pineyro, J.; Mennig, J.

    1990-06-01

    The Wiener-Hermite functional (WHF) method has been applied to the point reactor kinetic equation excited by Gaussian random reactivity noise under stationary conditions. Delayed neutrons and any feedback effects are disregarded. The neutron steady-state value and the power spectral density (PSD) of the neutron flux have been calculated in a second-order (WHF-2) approximation. Two noise-source cases are considered, one of them being low-pass white noise. In both cases the WHF-2 approximation of the neutron PSDs leads to relatively simple analytical expressions. The accuracy of the approach is determined by comparison with exact solutions of the problem. The investigations show that the WHF method is a powerful approximative tool for studying the nonlinear effects in the stochastic differential equation. (author) 5 figs., 29 refs
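    The stochastic model underlying these two WHF studies (point kinetics, no delayed neutrons, no feedback) can also be simulated directly. The sketch below uses an Euler-Maruyama scheme with a purely illustrative noise amplitude, and checks the Ito-calculus property that, for zero-mean white reactivity noise, the ensemble-mean neutron density stays at its initial value even though individual paths wander strongly; it is a numerical sanity check, not a reproduction of the papers' WHF calculations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Point reactor kinetics without delayed neutrons or feedback:
#   dn/dt = rho(t) * n / Lambda,   rho(t) = zero-mean Gaussian white noise.
# In the Ito interpretation this is driftless geometric Brownian motion,
# so E[n(t)] = n(0).  The amplitude a = sigma/Lambda is illustrative only.
a = 0.5                      # combined noise amplitude sigma/Lambda [s^-1/2]
T, nstep, npath = 1.0, 1000, 20000
dt = T / nstep

n = np.ones(npath)           # normalised neutron density, one value per path
for _ in range(nstep):
    n += a * n * rng.normal(0.0, np.sqrt(dt), npath)   # Euler-Maruyama step

print(n.mean(), n.std())     # mean stays near 1; spread grows with a^2 * t
```

    The PSD studied in the papers can be estimated from such simulated paths with a periodogram, which is a common way to cross-check approximate analytical spectra like the WHF-2 expressions.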

  14. World must build two atomic reactors each day the next hundred years. [Summary of and commentary on book, 'Mankind at the Turning Point'

    Energy Technology Data Exchange (ETDEWEB)

    1974-07-24

    In summarizing and commenting on the ideas presented in Mesarovic and Pestel's book "Mankind at the Turning Point", it is pointed out that the global energy crisis makes comprehensive long-term planning a necessity. Assuming, optimistically, that nuclear power alone is able to supply the total projected energy demand in 100 years, it is stated that this will require 3000 nuclear power stations, each with 8 fast breeder reactors totalling 100 GW(t). This means a net rate of construction of four reactors per week, which, allowing for a 30-year life, in turn means two reactors per day, every day, for the next hundred years. Fueling these reactors will require the production and transport of 15 × 10^6 kg of 239Pu per year. It is therefore obvious that the energy crisis is not only a technological, but also a political, social, and even psychological problem.
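    The quoted build rates follow from simple arithmetic, which can be checked directly; the sketch below only reproduces the figures given in the summary above.

```python
# Sanity check of the reviewer's build-rate arithmetic.
stations = 3000
reactors_per_station = 8
reactors = stations * reactors_per_station      # 24,000 reactors in total

build_per_week = reactors / 100 / 52            # build-out spread over 100 years
replace_per_day = reactors / 30 / 365           # 30-year life => replacement rate

print(round(build_per_week, 1), round(replace_per_day, 1))   # → 4.6 2.2
```

    So the "four reactors per week" is the bare build-out rate, while the "two reactors per day" is the sustained rate once 30-year replacements dominate.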

  16. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. This value is defined not only by extracting value from huge data sets, as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data, in an innovative manner, using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on worldwide processes.

  17. Performance evaluation of a natural treatment system for small communities, composed of a UASB reactor, maturation ponds (baffled and unbaffled) and a granular rock filter in series.

    Science.gov (United States)

    Dias, D F C; Passos, R G; Rodrigues, V A J; de Matos, M P; Santos, C R S; von Sperling, M

    2018-02-01

    Post-treatment of anaerobic reactor effluent with maturation ponds is a good option for small to medium-sized communities in tropical climates. The treatment line investigated, operating in Brazil with an equivalent capacity to treat domestic sewage from 250 inhabitants, comprised an upflow anaerobic sludge blanket reactor followed by two shallow maturation ponds (unbaffled and baffled) and a granular rock filter (decreasing grain size) in series, requiring an area of only 1.5 m^2 per inhabitant. With an overall hydraulic retention time of only 6.7 days, the performance was excellent for a natural treatment system. Based on over two years of continuous monitoring, median removal efficiencies were: biochemical oxygen demand = 93%, chemical oxygen demand = 79%, total suspended solids = 87%, ammonia = 43% and Escherichia coli = 6.1 log units. The final effluent complied with European discharge standards and WHO guidelines for some forms of irrigation, and appears to be a suitable alternative for treating domestic sewage from small communities in warm areas, especially in developing countries.

  18. A fast and sensitive method for evaluating nuclides migration characteristics in rock medium by using micro-channel reactor concept

    Science.gov (United States)

    Okuyama, Keita; Sasahira, Akira; Noshita, Kenji; Yoshida, Takuma; Kato, Kazuyuki; Nagasaki, Shinya; Ohe, Toshiaki

    Experimental evaluation of the barrier performance of geologic disposal requires relatively long testing periods and chemically stable conditions. We have developed a new technique, the micro mock-up method, to provide a fast and sensitive means of measuring both nuclide diffusivity and sorption coefficient within a day, overcoming these disadvantages of the conventional method. In this method, a Teflon plate having a micro channel (10-200 μm deep, 2-4 mm wide) is placed just beneath the rock sample plate, and radionuclide solution is injected into the channel at a constant rate. The breakthrough curve is measured until a steady state is reached. The outlet flux at steady state, however, does not equal the inlet flux because of matrix diffusion into the rock body. This inlet-outlet difference is simply related to the effective diffusion coefficient (De) and the distribution coefficient (Kd) of the rock sample. A fitting procedure is then used to estimate Kd and De by comparing the observations with the theoretical curve of the two-dimensional diffusion-advection equation. In the present study, we measured De of 3H using both the micro mock-up method and the conventional through-diffusion method for comparison. The De values obtained by the two methods for a granite sample (Inada area of Japan) were essentially identical, 1.0 × 10^-11 and 9.0 × 10^-12 m^2/s, but the testing periods differed greatly: 10 h and 3 days, respectively. We also measured the breakthrough curve of 85Sr, and the resulting Kd and De agreed well with previous values obtained by batch sorption experiments with crushed samples. This experimental evidence and the above advantages show that the micro mock-up method, based on the microreactor concept, is powerful and highly advantageous compared to the conventional method.

  19. Dynamic analysis of multiple nuclear-coupled boiling channels based on a multi-point reactor model

    International Nuclear Information System (INIS)

    Lee, J.D.; Pan Chin

    2005-01-01

    This work investigates the non-linear dynamics and stabilities of a multiple nuclear-coupled boiling channel system based on a multi-point reactor model using the Galerkin nodal approximation method. The nodal approximation method for the multiple boiling channels developed by Lee and Pan [Lee, J.D., Pan, C., 1999. Dynamics of multiple parallel boiling channel systems with forced flows. Nucl. Eng. Des. 192, 31-44] is extended to address the two-phase flow dynamics in the present study. The multi-point reactor model, modified from Uehiro et al. [Uehiro, M., Rao, Y.F., Fukuda, K., 1996. Linear stability analysis on instabilities of in-phase and out-of-phase modes in boiling water reactors. J. Nucl. Sci. Technol. 33, 628-635], is employed to study a multiple-channel system with unequal steady-state neutron density distribution. Stability maps, non-linear dynamics and effects of major parameters on the multiple nuclear-coupled boiling channel system subject to a constant total flow rate are examined. This study finds that the void-reactivity feedback and neutron interactions among subcores are coupled and their competing effects may influence the system stability under different operating conditions. For those cases with strong neutron interaction conditions, by strengthening the void-reactivity feedback, the nuclear-coupled effect on the non-linear dynamics may induce two unstable oscillation modes, the supercritical Hopf bifurcation and the subcritical Hopf bifurcation. Moreover, for those cases with weak neutron interactions, by quadrupling the void-reactivity feedback coefficient, period-doubling and complex chaotic oscillations may appear in a three-channel system under some specific operating conditions. A unique type of complex chaotic attractor may evolve from the Rossler attractor because of the coupled channel-to-channel thermal-hydraulic and subcore-to-subcore neutron interactions. Such a complex chaotic attractor has an embedding dimension of 5 and the

  20. National demonstration of full reactor coolant system (RCS) chemical decontamination at Indian Point 2

    Energy Technology Data Exchange (ETDEWEB)

    Trovato, S.A.; Parry, J.O. [Consolidated Edison Co., New York, NY (United States)]

    1995-03-01

    Key to the safe and efficient operation of the nation's civilian nuclear power plants is the performance of maintenance activities within regulations and guidelines for personnel radiation exposure. However, maintenance activities, often performed in areas of relatively high radiation fields, will increase as the nation's plants age. With the Nuclear Regulatory Commission (NRC) having lowered the allowable radiation exposure to plant workers in 1994 and considering further reductions and regulations in the future, it is imperative that new techniques be developed and applied to reduce personnel exposure. Full primary system chemical decontamination technology offers the potential to be the single most effective method of maintaining workers' exposure "as low as reasonably achievable" (ALARA) while greatly reducing plant operation and maintenance (O&M) costs. A three-phase program, underway since 1987, has as its goal to demonstrate that full RCS decontamination is a viable technology for reducing general plant radiation levels without threatening the long-term reliability and operability of a plant. This paper discusses the research leading to, and plans for, a National Demonstration of Full RCS Chemical Decontamination at the Indian Point 2 nuclear generating station in 1995.

  1. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  2. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation before Big Data, it was most recently referred to as the "information explosion". In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  3. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data…

  4. Reactors

    DEFF Research Database (Denmark)

    Shah, Vivek; Vaz Salles, Marcos António

    2018-01-01

    The requirements for OLTP database systems are becoming ever more demanding. Domains such as finance and computer games increasingly mandate that developers be able to encode complex application logic and control transaction latencies in in-memory databases. At the same time, infrastructure engineers in these domains need to experiment with and deploy OLTP database architectures that ensure application scalability and maximize resource utilization in modern machines. In this paper, we propose a relational actor programming model for in-memory databases as a novel, holistic approach towards … -level function calls. In contrast to classic transactional models, however, reactors allow developers to take advantage of intra-transaction parallelism and state encapsulation in their applications to reduce latency and improve locality. Moreover, reactors enable a new degree of flexibility in database…

  5. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  6. Toward a Learning Health-care System – Knowledge Delivery at the Point of Care Empowered by Big Data and NLP

    Science.gov (United States)

    Kaggal, Vinod C.; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J.; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P.; Ross, Jason L.; Chaudhry, Rajeev; Buntrock, James D.; Liu, Hongfang

    2016-01-01

    The concept of optimizing health care by understanding and generating knowledge from previous evidence, i.e., the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating the LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and in real time for knowledge delivery. Additionally, significant clinical information is embedded in free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects, including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the big data infrastructure against two other environments; it significantly outperformed them in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future. PMID:27385912

  7. Rock History and Culture

    OpenAIRE

    Gonzalez, Éric

    2013-01-01

    Two ambitious works written by French-speaking scholars tackle rock music as a research object, from different but complementary perspectives. Both are a definite must-read for anyone interested in the contextualisation of rock music in western popular culture. In Une histoire musicale du rock (i.e. A Musical History of Rock), rock music is approached from the point of view of the people – musicians and industry – behind the music. Christophe Pirenne endeavours to examine that field from a m...

  8. Reactor

    International Nuclear Information System (INIS)

    Fujibayashi, Toru.

    1976-01-01

    Object: To provide a boiling water reactor which can enhance quake resistance and flatten the power distribution. Structure: Four or more fuel bundles, each consisting of a plurality of fuel rods arranged in a lattice with upper and lower portions supported by tie-plates, are bundled together and covered by a square channel box. The control rod is movably arranged within the space formed by adjoining channel boxes. A spacer of trapezoidal section is disposed at the central portion of the side of the channel box over substantially its full height, and a neutron instrumentation tube is disposed in the central portion inside the channel box. Thus, when a horizontal load is exerted by an earthquake or the like, the spacers come into contact with each other, supporting the channel boxes and preventing abnormal vibrations. (Furukawa, Y.)

  9. Interest in 100% MOX future reactors as seen from the fuel fabrication and from the Pu manager point of view

    International Nuclear Information System (INIS)

    Golinelli, C.; Guillet, J.L.; Nigon, J.L.

    1996-01-01

    Today, plutonium recycling in PWR-type reactors has reached the industrial phase. But, in a competitive market, cost reduction can be achieved by improving fuel performance and fuel management. That is why research on future MOX reactors is still carried out around the world, and particularly in France. As a matter of fact, future MOX reactors can be more competitive if in-reactor utilization is improved. This solution should certainly be the next step in re-using the plutonium recovered from reprocessed spent fuel. (O.M.)

  10. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  11. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  12. Neutron behavior, reactor control, and reactor heat transfer. Volume four

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    Volume four covers neutron behavior (neutron absorption, how big are nuclei, neutron slowing down, neutron losses, the self-sustaining reactor), reactor control (what is controlled in a reactor, controlling neutron population, is it easy to control a reactor, range of reactor control, what happens when the fuel burns up, controlling a PWR, controlling a BWR, inherent safety of reactors), and reactor heat transfer (heat generation in a nuclear reactor, how is heat removed from a reactor core, heat transfer rate, heat transfer properties of the reactor coolant)

  13. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  14. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  15. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  16. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that have never been reached before. Big data is generally characterized by three factors: volume, velocity and variety. These factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  17. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  18. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  19. Rock fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, W.S.; Green, S.J.; Hakala, W.W.; Hustrulid, W.A.; Maurer, W.C. (eds.)

    1976-01-01

    Experts in rock mechanics, mining, excavation, drilling, tunneling and use of underground space met to discuss the relative merits of a wide variety of rock fragmentation schemes. Information is presented on novel rock fracturing techniques; tunneling using electron beams, thermocorer, electric spark drills, water jets, and diamond drills; and rock fracturing research needs for mining and underground construction. (LCL)

  20. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  1. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  2. Study of radiation exposure rate on the measurement points in Kartini reactor hall as based to determine operation safety parameters (KBO)

    International Nuclear Information System (INIS)

    Mahrus Salam; Elisabeth Supriyatni; Fajar Panuntun

    2016-01-01

    In the operation of a nuclear facility there are safety parameters: conservative maximum limits set to ensure that all uncertainties in the facility's operational safety analysis (measurement uncertainty, response time, and calculation uncertainty) have been taken into account, while still lying within the values allowed under normal operating conditions. The radiation exposure rate was calculated at five measurement points (50 cm above the water surface of the reactor pool, above the interim storage (bulk shielding), the reactor deck, the thermal column, and the subcritical facility) and compared to the operation safety parameters (KBO) of the Kartini reactor. The exposure rate values were obtained by calculating the source term of the core radioactivity, the attenuation produced by the radiation shielding, and the measurement distance. The calculation gives a gamma exposure rate of 96.91 mR/hr at 50 cm above the pool water surface (KBO<100 mR/hr), 1.70 mR/hr on the bulk shielding deck (KBO<2.5 mR/hr), 5.73 mR/hr on the reactor deck (KBO<10 mR/hr), 2.73 mR/hr at the thermal column (KBO<10 mR/hr) and 1.148 mR/hr at the subcritical facility (KBO<2.5 mR/hr). The gamma exposure rates at all five measurement locations are below the operation safety parameters (KBO), which means the reactor is safe to operate. (author)
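The pass/fail comparison reported above is simple enough to restate as a check: each computed gamma exposure rate must fall below its KBO limit. The values (in mR/hr) are taken directly from the record; the helper function is illustrative only.

```python
# Each entry maps a measurement point to (computed rate, KBO limit) in mR/hr,
# as reported in the record above.
rates = {
    "50 cm above pool surface": (96.91, 100.0),
    "bulk shielding deck":      (1.70,  2.5),
    "reactor deck":             (5.73,  10.0),
    "thermal column":           (2.73,  10.0),
    "subcritical facility":     (1.148, 2.5),
}

def within_kbo(rates):
    """True when every measurement point stays below its KBO limit."""
    return all(measured < limit for measured, limit in rates.values())

# within_kbo(rates) -> True: all five points satisfy their KBO limits
```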

  3. Advanced fuel designs for existing and future generations of reactors: driving factors from technical and economic points of view

    International Nuclear Information System (INIS)

    Hesketh, Kevin

    2003-01-01

    This paper reviews the current state of advanced fuel research and development and considers advanced fuel development work in the context of the technical and economic drivers. The scope encompasses evolutionary development for existing light water reactors (LWRs), radical developments for LWRs, most of which are focused on more efficient plutonium consumption and on longer term developments in relation to thermal and fast reactor fuels. The review concludes that there is a gap between near-term research and development to support utilities and the long-term work that focuses on goals such as improved plutonium utilisation, waste reduction, improved proliferation resistance and strategic independence

  4. Seismic analysis of APR1400 RCS for site envelope using big mass method

    International Nuclear Information System (INIS)

    Kim, J. Y.; Jeon, J. H.; Lee, D. H.; Park, S. H.

    2002-01-01

    One of the design concepts of the APR1400 is a site envelope covering various soil sites as well as rock sites. The KSNPs are constructed on rock sites, where only translational excitations are directly transferred to the plant. On soil sites, by contrast, rotational motions also affect the responses of the structures. In this study, a Big Mass Method is used to consider rotational motions, in addition to translational ones, as excitations at the foundation in order to obtain seismic responses of the APR1400 RCS main components. Seismic analyses of the APR1400 excited simultaneously by translational and rotational motions were performed. The results show that the effect of soil sites is not significant for the design of the main components and supports of the RCS, but it may be considerable for the design of reactor vessel internals, piping, and nozzles, which have lower natural frequencies.
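The idea behind the Big Mass Method can be illustrated with a toy two-degree-of-freedom model: a very large mass attached to the support is driven by the force F(t) = M_big · a_target(t), so the support node accelerates at essentially the prescribed excitation history, which is how a rotational or translational motion can be imposed as an equivalent force. All numerical values here are assumptions for illustration, not APR1400 data.

```python
import math

def big_mass_demo(M_big=1e6, m=1.0, k=100.0, dt=1e-4, t_end=1.0):
    """Return the worst-case error between the imposed and target accelerations."""
    a_target = lambda t: math.sin(2.0 * math.pi * t)  # prescribed support motion
    x = v = 0.0   # big (support) mass position / velocity
    y = w = 0.0   # attached structure position / velocity
    max_err, t = 0.0, 0.0
    while t < t_end:
        F = M_big * a_target(t)            # driving force on the big mass
        ax = (F - k * (x - y)) / M_big     # support acceleration ~= a_target
        ay = k * (x - y) / m               # structure responds to support motion
        max_err = max(max_err, abs(ax - a_target(t)))
        v += ax * dt; x += v * dt          # semi-implicit Euler step
        w += ay * dt; y += w * dt
        t += dt
    return max_err
```

Because the spring force is negligible next to M_big·a_target(t), the support tracks the target acceleration to within a tiny error, which is why attaching a sufficiently big mass lets an acceleration time history be applied through ordinary force loading.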

  5. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of this bachelor thesis is to describe the Big Data topic and the OLAP aggregation operations for decision support that are applied to such data using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and the issues involved in implementing them. This is followed by an overall evaluation of the work and the possibilities for future use of the resulting system.

  6. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative techniques and technologies to capture, store, distribute, manage and analyse petabyte-scale or larger data sets with high velocity and diverse structures. Big data can be structured, unstructured or semi-structured, rendering conventional data management techniques inadequate. Data are generated from many different sources and can arrive in the system at various rates. In order to handle this...

  7. International conference on opportunities and challenges for water cooled reactors in the 21. century. PowerPoint presentations

    International Nuclear Information System (INIS)

    2009-01-01

    Water Cooled Reactors have been the keystone of the nuclear industry in the 20th Century. As we move into the 21st Century and face new challenges such as the threat of climate change or the large growth in world energy demand, nuclear energy has been singled out as one of the sources that could substantially and sustainably contribute to powering the world. As the nuclear community worldwide looks into the future with the development of advanced and innovative reactor designs and fuel cycles, it becomes important to explore the role Water Cooled Reactors (WCRs) will play in this future. To support the future role of WCRs, substantial design and development programmes are underway in a number of Member States to incorporate additional technology improvements into advanced nuclear power plant (NPP) designs. One of the key features of advanced nuclear reactor designs is their improved safety, due to a reduction in the probability and consequences of accidents and to an increase in the operator time allowed to better assess and properly react to abnormal events. A systematic approach and the experience of many years of successful operation have allowed designers to focus their design efforts and develop safer, more efficient and more reliable designs, and to optimize plant availability and cost through improved maintenance programs and simpler operation and inspection practices. Because many of these advanced WCR designs will be built in countries with no previous nuclear experience, it is also important to establish a forum to facilitate the exchange of information on the infrastructure and technical issues associated with the sustainable deployment of advanced nuclear reactors and their application to the optimization of maintenance of operating nuclear power plants. This international conference seeks to be all-inclusive, bringing together the policy, economic and technical decision-makers and the stakeholders in the nuclear industry such as operators, suppliers

  8. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  9. Determination of the parameters of the point kinetics equation of a nuclear reactor by the quasilinearization technique

    International Nuclear Information System (INIS)

    Tanomaru, N.

    1979-12-01

    The problem of parameter identification in a point model of a thermal reactor is dealt with using the quasilinearization technique. The model considers one group of delayed neutrons and a strongly non-linear temperature feedback in the reactivity. The prompt neutron generation time and a parameter of the fuel temperature reactivity coefficient equation are identified simultaneously, using discrete measurements of the reactor power during the transient produced by a change in the external reactivity. The influence of the choice of external reactivity disturbance, of the initial guesses for the two parameter values, of the interval between measurements, and of the measurement noise level on the accuracy and convergence rate of the method is analysed. For noiseless or low-noise measurements, the method proved to be very effective. (Author)
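The forward model whose parameters the paper identifies, point kinetics with one delayed-neutron group and a temperature feedback on reactivity, can be sketched as below. All numerical values (beta, lam, LAMBDA, alpha, and the lumped heat-balance coefficients kappa and h) are illustrative assumptions, not the parameters from the paper.

```python
# One-delayed-group point kinetics with a linearized fuel-temperature
# reactivity feedback, integrated by forward Euler from an equilibrium state.

def simulate(rho_ext, t_end=10.0, dt=1e-4):
    beta = 0.0065         # delayed-neutron fraction
    lam = 0.08            # precursor decay constant [1/s]
    LAMBDA = 1e-4         # prompt-neutron generation time [s] (a parameter to identify)
    alpha = -1e-5         # fuel-temperature reactivity coefficient [1/K] (to identify)
    kappa, h = 50.0, 1.0  # lumped heat production / removal coefficients
    n = 1.0                       # normalized power
    c = beta / (lam * LAMBDA)     # precursor concentration at equilibrium
    dT = 0.0                      # fuel temperature rise above equilibrium [K]
    t = 0.0
    while t < t_end:
        rho = rho_ext + alpha * dT                       # net reactivity
        dn = ((rho - beta) / LAMBDA * n + lam * c) * dt  # dn/dt
        dc = (beta / LAMBDA * n - lam * c) * dt          # dC/dt
        ddT = (kappa * (n - 1.0) - h * dT) * dt          # heat balance
        n, c, dT = n + dn, c + dc, dT + ddT
        t += dt
    return n
```

A positive external reactivity step raises the power until the negative temperature feedback compensates; sampling n over such a transient provides the kind of discrete power measurements that the quasilinearization procedure fits.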

  10. Burnable absorber-integrated Guide Thimble (BigT) - 1. Design concepts and neutronic characterization on the fuel assembly benchmarks

    International Nuclear Information System (INIS)

    Yahya, Mohd-Syukri; Yu, Hwanyeal; Kim, Yonghee

    2016-01-01

    This paper presents the conceptual designs of a new burnable absorber (BA) for the pressurized water reactor (PWR), named the 'Burnable absorber-integrated Guide Thimble' (BigT). The BigT integrates BA materials into the standard guide thimble of a PWR fuel assembly. Neutronic sensitivities and practical design considerations of the BigT concept are highlighted in the first half of the paper. Specifically, the BigT concepts are characterized with respect to variations in BA material and spatial self-shielding. In addition, the BigT replaceability requirement, bottom-end design specifications and thermal-hydraulic considerations are also deliberated. Much of the second half of the paper is devoted to demonstrating the practical viability of the BigT absorbers via comparative evaluations against conventional BA technologies in representative 17x17 and 16x16 fuel assembly lattices. In the 17x17 lattice evaluations, all three BigT variants are benchmarked against Westinghouse's existing BA technologies, while in the 16x16 assembly analyses, the BigT designs are compared against the traditional integral gadolinia-urania rod design. All analyses clearly show that the BigT absorbers perform as well as the commercial BA technologies in terms of reactivity and power peaking management. In addition, it has been shown that sufficiently high control rod worth can be obtained with the BigT absorbers in place. All neutronic simulations were completed using the Monte Carlo Serpent code with the ENDF/B-VII.0 library. (author)

  11. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  12. Rock Art

    Science.gov (United States)

    Henn, Cynthia A.

    2004-01-01

    There are many interpretations for the symbols that are seen in rock art, but no decoding key has ever been discovered. This article describes one classroom's experiences with a lesson on rock art--making their rock art and developing their own personal symbols. This lesson allowed for creativity, while giving an opportunity for integration…

  13. Technical evaluation report on the monitoring of electric power to the reactor protection system for the Nine Mile Point Nuclear Station, Unit 1 (Docket No. 50-220)

    International Nuclear Information System (INIS)

    Selan, J.C.

    1984-01-01

    This report documents the technical evaluation of the monitoring of electric power to the reactor protection system (RPS) at the Nine Mile Point Nuclear Station, Unit 1. The evaluation is to determine if the proposed design modification will protect the RPS from abnormal voltage and frequency conditions which could be supplied from the power supplies and will meet certain requirements set forth by the Nuclear Regulatory Commission. The proposed design modifications will protect the RPS from sustained abnormal voltage and frequency conditions from the supplying sources

  14. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seed of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data with due regard for public values. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (DAMD). The analysis shows that (re)use of data in new contexts involves a multifaceted trade-off not only between economic rationales and quality considerations, but also control over sensitive personal data and the ethical implications for the citizen. In the DAMD case, data are used on the one hand 'in the service of a good cause' to...

  15. Magnetostratigraphy of a Marine Triassic-Jurassic Boundary Section, Kennecott Point, Queen Charlotte Islands: Implications for the Temporal Correlation of a 'Big Five' Mass Extinction Event.

    Science.gov (United States)

    Hilburn, I. A.; Kirschvink, J. L.; Ward, P. D.; Haggart, J. W.; Raub, T. D.

    2008-12-01

    Several causes have been proposed for Triassic-Jurassic (T-J) boundary extinctions, including global ocean anoxia/euxinia, an impact event, and/or eruption of the massive Central Atlantic Magmatic Province (CAMP), but poor intercontinental correlation makes testing these difficult. Sections at Kennecott Point, Queen Charlotte Islands, British Columbia span the late Norian through Rhaetian (Triassic) and into the earliest Hettangian (Jurassic) and provide the best integrated magneto- and chemostratigraphic framework for placing necessary temporal constraints upon the T-J mass extinctions. At Kennecott Point, turnover of radiolaria and ammonoids define the T-J boundary marine extinction and are coincident with a 2 ‰ negative excursion in δ13Corg similar in magnitude to that observed at Ferguson Hill (Muller Canyon), Nevada (1, 2). With Conodont Alteration Index values in the 1-2 range, Kennecott Point provides the ideal setting for use of magnetostratigraphy to tie the marine isotope excursion into the chronostratigraphic framework of the Newark, Hartford, and Fundy Basins. In the summer of 2005, we collected a ~1m resolution magnetostratigraphic section from 105 m of deep marine, silt- and sandstone turbidites and interbedded mudstones, spanning the T-J boundary at Kennecott Point. Hybrid progressive demagnetization - including zero-field, low-temperature cycling; low-field AF cleaning; and thermal demagnetization in ~25°C steps to 445°C under flowing N2 gas (3) - first removed a Northerly, steeply inclined component interpreted to be a Tertiary overprint, revealing an underlying dual-polarity component of moderate inclination. Five major polarity zones extend through our section, with several short, one-sample reversals interspersed amongst them. Comparison of this pattern with other T-J boundary sections (4-6) argues for a Northern hemisphere origin of our site, albeit with large vertical-axis rotations. A long normal chron bounds the T-J boundary punctuated

  16. Development of artificial soft rock

    International Nuclear Information System (INIS)

    Kishi, Kiyoshi

    1995-01-01

    When the foundation bedrock lies deeper than the level at which structures are installed, or when part of the bedrock consists of weathered or crushed rock, a sound artificial foundation is often made by replacing that part with concrete. At the Kashiwazaki-Kariwa Nuclear Power Station of Tokyo Electric Power Co., Inc., however, the foundation bedrock consists of mudstone, and the stiffness of concrete is large compared with that of the surrounding rock. A suitable substitute material should have nearly the same stiffness as the surrounding soft rock together with long-term stability, and good workability and economy are also required; artificial soft rocks were therefore developed. Soil mortar, which reliably attains physical property values similar to those of the Nishiyama mudstone, was selected as the substitute material. Its hardening mechanism, long-term stability and manufacturing plant are reported. Its application to the foundations of the Kashiwazaki-Kariwa Nuclear Power Station, including the on-site verification test and the use under the No. 7 reactor building and at other locations, is described. (K.I.)

  17. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  18. Determination of average molecular weights on organic reactor coolants. I.- Freezing-point depression method for benzene solutions

    International Nuclear Information System (INIS)

    Carreira, M.

    1965-01-01

    As a working method for determining the changes in molecular mass that may occur on irradiation (pyrolytic-radiolytic decomposition) of polyphenyl reactor coolants, a cryoscopic technique has been developed which combines the basic simplicity of Beckmann's method with some experimental refinements taken from the equilibrium methods. A total of 18 runs were made on samples of naphthalene, biphenyl, and the commercial mixtures OM-2 (Progil) and Santowax-R (Monsanto), with an average deviation from the theoretical molecular mass of 0.6%. (Author) 7 refs
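The cryoscopic method rests on the freezing-point depression relation ΔT = Kf·m, so the average molar mass follows directly from the measured depression. A minimal sketch, using the standard handbook value of Kf for benzene; the sample numbers are illustrative, not data from the paper.

```python
KF_BENZENE = 5.12  # cryoscopic constant of benzene [K*kg/mol], handbook value

def molar_mass(solute_g, solvent_kg, delta_T):
    """Average molar mass [g/mol] from a freezing-point depression delta_T [K]."""
    molality = delta_T / KF_BENZENE          # mol of solute per kg of solvent
    return solute_g / (molality * solvent_kg)

# Example: 0.640 g of biphenyl (M = 154.2 g/mol) in 0.0500 kg of benzene
# depresses the freezing point by about 5.12 * (0.640/154.2) / 0.0500 = 0.425 K,
# so molar_mass(0.640, 0.0500, 0.425) recovers ~154 g/mol.
```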

  19. Comparative study on nutrient removal of agricultural non-point source pollution for three filter media filling schemes in eco-soil reactors.

    Science.gov (United States)

    Du, Fuyi; Xie, Qingjie; Fang, Longxiang; Su, Hang

    2016-08-01

    Nutrients (nitrogen and phosphorus) from agricultural non-point source (NPS) pollution have been increasingly recognized as a major contributor to the deterioration of water quality in recent years. The purpose of this article is to investigate the differences in the interception of nutrients from agricultural NPS pollution by eco-soil reactors with different filling schemes. Parallel laboratory-scale eco-soil reactors were created and filled with filter media such as grit, zeolite, limestone, and gravel. Three filling schemes were adopted: increasing-sized filling (I-filling), decreasing-sized filling (D-filling), and blend-sized filling (B-filling). The systems were operated intermittently with simulated rainstorm runoff. Nutrient removal efficiency, biomass accumulation and the vertical dissolved oxygen (DO) distribution were used to assess the performance of the eco-soil. The results showed that the B-filling reactor presented an ideal DO profile for partial nitrification-denitrification across the eco-soil, and B-filling was the most stable of the three fillings in the change of biofilm accumulation with depth. Simultaneous and highest removals of NH4(+)-N (57.74-70.52%), total nitrogen (43.69-54.50%), and total phosphorus (42.50-55.00%) were obtained with B-filling, demonstrating the efficiency of the blended filling scheme of eco-soil for oxygen transfer and biomass accumulation in coping with agricultural NPS pollution.

  20. Adobe photoshop quantification (PSQ) rather than point-counting: A rapid and precise method for quantifying rock textural data and porosities

    Science.gov (United States)

    Zhang, Xuefeng; Liu, Bo; Wang, Jieqiong; Zhang, Zhe; Shi, Kaibo; Wu, Shuanglin

    2014-08-01

    Commonly used petrological quantification methods are visual estimation, point-counting, and image analysis. In this article, however, an Adobe Photoshop-based analysis method (PSQ) is recommended for quantifying rock textural data and porosities. Adobe Photoshop provides versatile tools for selecting an area of interest, and the pixel count of a selection can be read and used to calculate its area percentage. Adobe Photoshop can therefore be used to rapidly quantify textural components, such as the content of grains, cements, and porosities, including total porosity and porosities of different genetic types. This method is named Adobe Photoshop Quantification (PSQ). The workflow of the PSQ method is illustrated using oolitic dolomite samples from the Triassic Feixianguan Formation, Northeastern Sichuan Basin, China, as an example. The method was tested by comparison with Folk's and Shvetsov's "standard" diagrams. In both cases, there is close agreement between the "standard" percentages and those determined by the PSQ method, with small counting and operator errors, small standard deviations and high confidence levels. The porosities quantified by PSQ were evaluated against those determined by the whole-rock helium gas expansion method to test the specimen errors. The results show that the porosities quantified by PSQ correlate well with those determined by the conventional helium gas expansion method. The generally small discrepancies (mostly ranging from -3% to 3%) are caused by microporosity, which leads to a systematic underestimation of about 2%, and/or by macroporosity, which causes underestimation or overestimation in different cases. Adobe Photoshop can thus be used to quantify rock textural components and porosities; the method has been shown to be precise and accurate, and it is time-saving compared with the usual methods.
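The core of the PSQ idea, reading a selection's pixel count and converting it to an area percentage, can be sketched outside Photoshop with a boolean mask standing in for a selection (illustrative only; the mask values are made up):

```python
# Area fraction of a selected component = its pixel count / total pixel count.
def area_percentage(mask):
    """mask: 2-D list of booleans marking the pixels of the selected component."""
    selected = sum(sum(1 for px in row if px) for row in mask)
    total = sum(len(row) for row in mask)
    return 100.0 * selected / total

# Example: a 4x4 thin-section image in which 4 of 16 pixels fall in pore space,
# giving a porosity of 25%.
pores = [[True,  False, False, False],
         [False, True,  False, False],
         [False, False, True,  False],
         [False, False, False, True]]
# area_percentage(pores) -> 25.0
```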

  1. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  2. Supplementary control points for reactor shutdown without access to the main control room (International Electrotechnical Commission Standard Publication 965:1989)

    International Nuclear Information System (INIS)

    Kubalek, J.; Hajek, B.

    1993-01-01

    This standard establishes the requirements for supplementary Control Points provided to enable the operating staff to shut down the reactor and maintain the plant in a safe shut-down condition when the main control room is no longer available. This standard covers the functional selection, design and organization of the man/machine interface. It also establishes requirements for procedures which systematically verify and validate the functional design of supplementary control points. The requirements reflect the application of human engineering principles as they apply to man/machine interface. This standard does not cover special emergency response centres (e.g. a Technical Support Centre). It also does not include the detailed equipment design. Unavailability of the main control room controls due to intentionally man-induced events is not considered

  3. Online stress corrosion crack and fatigue usages factor monitoring and prognostics in light water reactor components: Probabilistic modeling, system identification and data fusion based big data analytics approach

    Energy Technology Data Exchange (ETDEWEB)

    Mohanty, Subhasish M. [Argonne National Lab. (ANL), Argonne, IL (United States); Jagielo, Bryan J. [Argonne National Lab. (ANL), Argonne, IL (United States); Oakland Univ., Rochester, MI (United States); Iverson, William I. [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Illinois at Urbana-Champaign, Champaign, IL (United States); Bhan, Chi Bum [Argonne National Lab. (ANL), Argonne, IL (United States); Pusan National Univ., Busan (Korea, Republic of); Soppet, William S. [Argonne National Lab. (ANL), Argonne, IL (United States); Majumdar, Saurin M. [Argonne National Lab. (ANL), Argonne, IL (United States); Natesan, Ken N. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-10

    Nuclear reactors in the United States account for roughly 20% of the nation's total electric energy generation, and maintaining their safety with regard to key component structural integrity is critical not only for the long-term use of such plants but also for the safety of personnel and the public living around them. Early detection of damage signatures, such as stress corrosion cracking and material degradation from thermal-mechanical loading in safety-critical components, is a necessary requirement for long-term and safe operation of nuclear power plant systems.

  4. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and of how these features drive paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
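
    The spurious-correlation point can be illustrated numerically: with a fixed sample size, the maximum sample correlation between a target and a growing set of completely independent noise features keeps increasing. A small simulation, with all names and parameters our own rather than the article's:

```python
# Spurious correlation in high dimensions: even when every feature is
# independent noise, the maximum sample correlation with a target grows
# with the number of features. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 50  # sample size

def max_abs_corr(p: int) -> float:
    """Max |sample correlation| between a target and p independent noise features."""
    y = rng.standard_normal(n)
    X = rng.standard_normal((n, p))
    corrs = [np.corrcoef(y, X[:, j])[0, 1] for j in range(p)]
    return max(abs(c) for c in corrs)

small, large = max_abs_corr(10), max_abs_corr(10000)
print(small, large)  # the second value is typically much larger
```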

  5. DELIGHT-B/REDEL, point reactivity burnup code for high-temperature gas-cooled reactor cells

    International Nuclear Information System (INIS)

    Shindo, Ryuiti; Watanabe, Takashi.

    1977-03-01

    The code DELIGHT-2 was previously developed to analyze cell burnup characteristics and to produce few-group constants for core burnup calculations in high-temperature gas-cooled reactors. In that code, the burnup dependence of the burnable poison, boron-10, is treated with a spatially homogeneous model. In actuality, however, the burnable poison is used in the form of homogeneous rods or uniform rods of small granular poison and graphite, to control the reactivity and power distribution. Precise analysis of the burnup characteristics is thus difficult because of the heterogeneity introduced by the configuration of the poison rods. In cell burnup calculation, DELIGHT-B, a modification of DELIGHT-2, takes this heterogeneous effect into consideration. The auxiliary code REDEL, a reduced version of DELIGHT-B used in combination with the three-dimensional diffusion code CITATION, is for core burnup calculation with the macroscopic cross section model. (auth.)

  6. 'Escher' Rock

    Science.gov (United States)

    2004-01-01

    [figure removed for brevity, see original site] Figure 1: Chemical Changes in 'Endurance' Rocks. This false-color image taken by NASA's Mars Exploration Rover Opportunity shows a rock dubbed 'Escher' on the southwestern slopes of 'Endurance Crater.' Scientists believe the rock's fractures, which divide the surface into polygons, may have been formed by one of several processes. They may have been caused by the impact that created Endurance Crater, or they might have arisen when water left over from the rock's formation dried up. A third possibility is that much later, after the rock was formed and the crater was created, the rock became wet once again, then dried up and developed cracks. Opportunity has spent the last 14 sols investigating Escher, specifically the target dubbed 'Kirchner,' and other similar rocks with its scientific instruments. This image was taken on sol 208 (Aug. 24, 2004) by the rover's panoramic camera, using the 750-, 530- and 430-nanometer filters. The graph shows that rocks located deeper in 'Endurance Crater' are chemically altered to a greater degree than rocks located higher up. This chemical alteration is believed to result from exposure to water. Specifically, the graph compares ratios of chemicals between the deep rock dubbed 'Escher' and the more shallow rock called 'Virginia,' before (red and blue lines) and after (green line) the Mars Exploration Rover Opportunity drilled into the rocks. As the red and blue lines indicate, Escher's levels of chlorine relative to Virginia's went up, and sulfur down, before the rover dug a hole into the rocks. This implies that the surface of Escher has been chemically altered to a greater extent than the surface of Virginia. Scientists are still investigating the role water played in influencing this trend. These data were taken by the rover's alpha particle X-ray spectrometer.

  7. Spacer grid for fuel assembly of nuclear reactor comprising opposite support points made with elastic thin plates

    International Nuclear Information System (INIS)

    Feutrel, C.

    1983-01-01

    Two series of thin walls form square cells, each containing a fuel pencil. Support points are formed in the cell walls as splines obtained by cutting two parallel slots along the length of the cells. The reaction of the fuel pencil produces a deformation of the elastic splines formed in the plate, compensating for the tolerance allowed on the diameter of the pencils [fr

  8. CERN Rocks

    CERN Multimedia

    2004-01-01

    The 15th CERN Hardronic Festival took place on 17 July on the terrace of Rest 3 (Prévessin). Over 1000 people, from CERN and other International Organizations, came to enjoy the warm summer night, and to watch the best of the World's High Energy music. Jazz, rock, pop, country, metal, blues, funk and punk blasted out from 9 bands from the CERN Musiclub and Jazz club, alternating on two stages in a non-stop show.  The night reached its hottest point when The Canettes Blues Band got everybody dancing to sixties R&B tunes (pictured). Meanwhile, the bars and food vans were working at full capacity, under the expert management of the CERN Softball club, who were at the same time running a Softball tournament in the adjacent "Higgs Field". The Hardronic Festival is the main yearly CERN music event, and it is organized with the support of the Staff Association and the CERN Administration.

  9. Recreating Rocks

    DEFF Research Database (Denmark)

    Posth, Nicole R

    2008-01-01

    Nicole Posth and colleagues spent a month touring South African rock formations in their quest to understand the origin of ancient iron and silicate layers.

  10. Geology and petrography in basaltic rocks (Arapey formation) cropping out in road 30 between the Bella Union round point (27 km) and Penas cuesta (225 Km)

    International Nuclear Information System (INIS)

    Oyhantcabal, P.; Pineiro, G.

    2007-01-01

    This contribution presents a geological map of the basaltic flows of the Arapey Formation (Mesozoic) cropping out along Road 30 between the Bella Union roundabout (km 27) and Penas cuesta (km 225), together with a description of the petrographic features of the different portions of the 20 recognized flow units

  11. CFD analysis of the dynamic behaviour of a fuel rod subchannel in a supercritical water reactor with point kinetics

    International Nuclear Information System (INIS)

    Ampomah-Amoako, Emmanuel; Akaho, Edward H.K.; Nyarko, Benjamin J.B.; Ambrosini, Walter

    2013-01-01

    Highlights: • The analysis of flow stability of nuclear fuel subchannels with supercritical water is presented. • The results obtained by a CFD code are compared with those of a system code. • The model also includes heat conduction in the fuel rod and point neutron kinetics. - Abstract: The paper presents a CFD analysis of coupled neutronic-thermal hydraulic instabilities in a subchannel slice belonging to a square lattice assembly. The work represents a further phase in the assessment of the suitability of CFD codes for studies of flow stability of supercritical fluids; the research started in previous work with the analysis of bare 2D circular pipes and has already addressed 3D subchannel slices with no allowance for heat conduction or neutronic effects. In the present phase, a more realistic system is considered, dealing with a slice of a fuel assembly subchannel containing the pellet, gap and cladding regions and including the effect of inlet and outlet throttling. The adopted neutronic model is point kinetics, including six delayed neutron groups with global Doppler and fluid density feedbacks. The response of the model to perturbations applied starting from a steady-state condition at the rated power is compared with that of a similar model developed for a 1D system code. Upward, horizontal and downward flow orientations are addressed, making use of a uniform power profile and changing relevant parameters such as the gap equivalent conductance and the density reactivity coefficient. A bottom-peaked power profile is also considered in the case of vertical upward flow. Though the adopted model can still be considered simple in comparison with the actual details of fuel assemblies, the obtained results demonstrate the potential of the adopted methodology for more accurate analyses to be made with larger computational resources
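
    The point kinetics model with six delayed neutron groups mentioned above can be sketched as a small ODE system. The following is a minimal illustration using typical textbook kinetics parameters, no thermal feedback, and explicit Euler integration; it is not the coupled CFD model of the paper:

```python
# Point kinetics with six delayed neutron groups: the same kind of
# neutronic model the paper couples to its CFD subchannel, but with
# illustrative parameters and no Doppler/density feedback.
import numpy as np

beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])  # group fractions
lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])          # decay constants, 1/s
beta = beta_i.sum()
Lambda = 1e-4  # neutron generation time, s

def solve(rho: float, t_end: float, dt: float = 1e-5) -> float:
    """Integrate dP/dt = (rho - beta)/Lambda * P + sum(lam_i * C_i),
    dC_i/dt = beta_i/Lambda * P - lam_i * C_i, by explicit Euler."""
    P = 1.0
    C = beta_i / (lam_i * Lambda)  # equilibrium precursor populations at P = 1
    for _ in range(int(t_end / dt)):
        dP = ((rho - beta) / Lambda * P + (lam_i * C).sum()) * dt
        dC = (beta_i / Lambda * P - lam_i * C) * dt
        P, C = P + dP, C + dC
    return P

P_final = solve(rho=0.1 * beta, t_end=1.0)
print(P_final)  # power rises above 1 after a +10-cent reactivity step
```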

  12. Art Rocks with Rock Art!

    Science.gov (United States)

    Bickett, Marianne

    2011-01-01

    This article discusses rock art which was the very first "art." Rock art, such as the images created on the stone surfaces of the caves of Lascaux and Altimira, is the true origin of the canvas, paintbrush, and painting media. For there, within caverns deep in the earth, the first artists mixed animal fat, urine, and saliva with powdered minerals…

  13. Rock Physics

    DEFF Research Database (Denmark)

    Fabricius, Ida Lykke

    2017-01-01

    Rock physics is the discipline linking petrophysical properties as derived from borehole data to surface-based geophysical exploration data. It can involve interpretation of both elastic wave propagation and electrical conductivity, but in this chapter the focus is on elasticity. Rock physics is based on continuum mechanics, and the theory of elasticity developed for statics becomes the key to petrophysical interpretation of the velocity of elastic waves. In practice, rock physics involves interpretation of well logs, including vertical seismic profiling (VSP), and analysis of core samples. The results...

  14. Derivation of Pal-Bell equations for two-point reactors, and its application to correlation measurements at KUCA

    International Nuclear Information System (INIS)

    Murata, Naoyuki; Yamane, Yoshihiro; Nishina, Kojiro; Shiroya, Seiji; Kanda, Keiji.

    1980-01-01

    A probability is defined for the event in which m neutrons exist at time t_f in core I of a coupled-core system, originating from a neutron injected into core I at an earlier time t; we call it P_(I,I,m)(t_f|t). Similarly, P_(I,II,m)(t_f|t) is defined as the probability for m neutrons to exist in core II of the system at time t_f, originating from a neutron injected into core I at time t. A system of coupled equations is then derived for the generating functions G_(I,j)(z, t_f|t) = Σ_m P_(I,j,m)(t_f|t) z^m, where j = I, II. By similar procedures, equations are derived for the generating functions associated with the joint probability of the following events: a given combination of numbers of neutrons is detected during a given series of detection time intervals by a detector inserted in one of the cores. The above two kinds of systems of equations can be regarded as a two-point version of the Pal-Bell equations. As applications of these formulations, analysis formulas for correlation measurements, namely (1) the Feynman-alpha experiment and (2) the Rossi-alpha experiment of Orndoff type, are derived, and their feasibility is verified by experiments carried out at KUCA. (author)
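
    Of the two correlation measurements mentioned, the Feynman-alpha analysis rests on the variance-to-mean ratio of detector counts gated in windows of width T: Y(T) = Var(c)/Mean(c) - 1, which vanishes for uncorrelated (Poisson) counts and becomes positive when fission chains correlate the counts. A sketch with synthetic data, not KUCA measurements:

```python
# Feynman-Y (variance-to-mean minus one) for gated detector counts.
# Poisson gates give Y near 0; perfectly paired counts give Y near 1.
import numpy as np

def feynman_y(counts) -> float:
    """Variance-to-mean ratio minus one for an array of gate counts."""
    c = np.asarray(counts, dtype=float)
    return c.var() / c.mean() - 1.0

rng = np.random.default_rng(1)
poisson_gates = rng.poisson(lam=20.0, size=100000)   # uncorrelated source
pair_gates = 2 * rng.poisson(lam=10.0, size=100000)  # every event counted twice

y_poisson = feynman_y(poisson_gates)
y_pairs = feynman_y(pair_gates)
print(y_poisson)  # close to 0
print(y_pairs)    # close to 1: excess variance from correlated counts
```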

  15. Bubble point measurement and high pressure distillation column design for the environmentally benign separation of zirconium from hafnium for nuclear power reactor

    International Nuclear Information System (INIS)

    Minh, Le Quang; Kim, Gyeongmin; Lee, Moonyong; Park, Jongki

    2015-01-01

    We examined the feasibility of separating ZrCl4 and HfCl4 by high pressure distillation as an environmentally benign separation for structural material of nuclear power reactors. The bubble point pressures of ZrCl4 and HfCl4 mixtures were determined experimentally using a constant-volume equilibrium cell over the high pressure and temperature range of 2.3-5.6 MPa and 440-490 °C. The experimental bubble point pressure data were correlated with the Peng-Robinson equation of state with good agreement. Based on the vapor-liquid equilibrium properties evaluated from the experimental data, the feasibility of a high pressure distillation process for the separation of ZrCl4 and HfCl4 was investigated, together with its main design conditions, through rigorous simulation using a commercial process simulator, Aspen HYSYS. An enhanced distillation configuration was also proposed to improve energy efficiency in the distillation process. The results showed that a heat-pump assisted distillation with a partial bottom flash could be a promising option for commercial separation of ZrCl4 and HfCl4, taking into account both energy and environmental advantages
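
    The Peng-Robinson equation of state used to correlate the bubble point data has the pressure-explicit pure-component form P = RT/(V - b) - a*alpha/(V^2 + 2bV - b^2). The sketch below implements this form; since the paper's ZrCl4/HfCl4 parameters are not reproduced here, it is demonstrated with well-tabulated CO2 critical constants instead:

```python
# Pure-component Peng-Robinson equation of state, pressure-explicit form.
# Demonstrated with CO2 critical constants; the paper's ZrCl4/HfCl4
# parameters are not reproduced here.
import math

R = 8.314462618  # J/(mol K)

def pr_pressure(T, V, Tc, Pc, omega):
    """Peng-Robinson pressure (Pa) at temperature T (K) and molar volume V (m^3/mol)."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / Tc)))**2
    return R * T / (V - b) - a * alpha / (V**2 + 2.0 * b * V - b**2)

# CO2: Tc = 304.13 K, Pc = 7.3773 MPa, acentric factor omega = 0.22394
P = pr_pressure(T=280.0, V=1.0e-3, Tc=304.13, Pc=7.3773e6, omega=0.22394)
print(P)  # on the order of 2 MPa for this gas-like state
```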

  16. Spectral shift reactor control method

    International Nuclear Information System (INIS)

    Impink, A.J. Jr.

    1981-01-01

    A method of operating a nuclear reactor having a core and coolant displacer elements arranged in the core. A reactor coolant temperature set point is established at which it is desired to operate the reactor, and first reactor coolant temperature band limits are provided within which the set point is located and within which it is desired to operate the reactor. The method is characterized in that the reactor coolant displacer elements are moved relative to the reactor core to adjust the volume of reactor coolant in the core as the reactor coolant temperature approaches the first band limits, thereby maintaining the reactor coolant temperature near the set point and within the first band limits
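
    The claimed control logic can be caricatured as a dead-band rule: hold the displacer elements while the coolant temperature is well inside the band, and move them as the temperature approaches either limit. The directions, threshold fraction, and names below are assumptions for illustration only, not the patent's specification:

```python
# Dead-band sketch of the spectral shift control logic described above.
# Which direction of displacer motion raises or lowers temperature is an
# assumption here; the patent only specifies moving the elements as the
# temperature approaches the band limits.
def displacer_action(t_coolant, t_setpoint, band_halfwidth, approach_fraction=0.8):
    """Return 'insert', 'withdraw', or 'hold' for the displacer elements."""
    deviation = t_coolant - t_setpoint
    threshold = approach_fraction * band_halfwidth
    if deviation > threshold:       # nearing the upper band limit
        return "insert"
    if deviation < -threshold:      # nearing the lower band limit
        return "withdraw"
    return "hold"

print(displacer_action(301.0, 300.0, 2.0))  # hold: well inside the band
print(displacer_action(301.8, 300.0, 2.0))  # insert: approaching the upper limit
```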

  17. Rocking pneumonia

    OpenAIRE

    Rijkers, Ger T.; Rodriguez Gomez, Maria

    2017-01-01

    Ever since Chuck Berry coined the term “rocking pneumonia” in his 1956 song “Roll over Beethoven”, pneumonia has been mentioned frequently in modern blues and rock songs. We analyzed the lyrics of these songs to examine how various elements of pneumonia have been represented in popular music, specifically the cause of pneumonia, the risk groups, comorbidity (such as the boogie woogie flu), the clinical symptoms, and treatment and outcome. Up to this day, songwriters suggest that pneumonia is ...

  18. Geological and geotechnical aspects of the foundation pit of Kaiga atomic power plant reactor building 2, Kaiga, Uttara Kannada district, Karnataka

    International Nuclear Information System (INIS)

    Katti, Vinod J.; Shah, V.L.; Pande, A.K.

    2014-01-01

    In India, nuclear power plants are constructed as per the guidelines laid down by the IAEA and AERB. Before concrete is poured into reactor building pits, they are systematically mapped and lithostructural maps are prepared for the pit base and side walls. The constraints noticed are carefully attended to with geotechnical solutions and remedies to make the foundation safe for the entire reactor life. Accordingly, the pit of Kaiga Reactor Building II was systematically mapped for its circular base and side walls. Geo-engineering solutions such as scraping out loose, foliated schistose patches, scooping out soft altered zones, filling with grout, rock-bolting rock segments with major joints and fractures, and sealing seepage points were suggested. (author)

  19. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care in particular. Our results show that although Big Data is built up as the "Holy Grail" for healthcare, small-data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  20. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Mauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  1. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  2. Rock solidification method

    International Nuclear Information System (INIS)

    Nakaya, Iwao; Murakami, Tadashi; Miyake, Takafumi; Funakoshi, Toshio; Inagaki, Yuzo; Hashimoto, Yasuhide.

    1985-01-01

    Purpose: To convert radioactive wastes into their final storage form (artificial rock) in a short period of time. Method: Burnable radioactive wastes such as spent paper, cloth and oils and activated carbon are burnt to ash in a furnace, while radioactive liquid wastes such as boric acid wastes, spent cleaning water and decontamination liquids are powdered in a drying furnace or calcining furnace. These powders are combined with silicates such as white clay, silica and glass powder and a liquid alkali such as NaOH or Ca(OH)2, and transferred to a solidifying vessel. The vessel is then set in a hydrothermal reactor, heated and pressurized, taken out after about 20 minutes, and tightly sealed. In this way, radioactive wastes are converted through hydrothermal reactions into artificial rock that is stable over long periods, giving solidification products that are insoluble in water and have an extremely low leaching rate. (Ikeda, J.)

  3. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  4. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  5. The prototype fast reactor

    International Nuclear Information System (INIS)

    Broomfield, A.M.

    1985-01-01

    The paper concerns the Prototype Fast Reactor (PFR), which is a liquid metal cooled fast reactor power station, situated at Dounreay, Scotland. The principal design features of a Fast Reactor and the PFR are given, along with key points of operating history, and health and safety features. The role of the PFR in the development programme for commercial reactors is discussed. (U.K.)

  6. The Influence Of Switching Off The Big Lamps On The Humidity Of The Operation Hall

    International Nuclear Information System (INIS)

    Wiranto, Slamet; Sriawan

    2001-01-01

    When there is no activity in the Operation Hall, the big lamps there are switched off. Because the water trap of the ventilation system is not functioning properly, the humidity of the Operation Hall then increases, and at some points in time it rises above the permitted limit value. To avoid this problem, the humidity characteristics of the Operation Hall were investigated by measuring its humidity under various conditions and situations. From these characteristics it was determined that under normal conditions the Operation Hall big lamps should be switched off, and that starting 2 days before reactor start-up all operation building lamps should be switched on for about 5 days so that the operation building humidity returns to its normal value

  7. Transporting radioactive rock

    International Nuclear Information System (INIS)

    Pearce, G.

    1990-01-01

    The case is made for exempting geological specimens from the IAEA Regulations for the Safe Transport of Radioactive Material. It is pointed out that many mineral collectors in Devon and Cornwall may be unwittingly infringing these regulations by taking naturally radioactive rocks and specimens containing uranium ores. Even if these collectors are aware that the rocks are radioactive, and many are not, few have the equipment needed to monitor activity levels. If the transport regulations were to be enforced, alarm could be generated and the regulations devalued in case of an accident. The danger from a spill of rock specimens is negligible compared with an accident involving industrial or medical radioactive substances, yet it would require similar special treatment. (UK)

  8. When Big Ice Turns Into Water It Matters For Houses, Stores And Schools All Over

    Science.gov (United States)

    Bell, R. E.

    2017-12-01

    When ice in my glass turns to water it is not bad but when the big ice at the top and bottom of the world turns into water it is not good. This new water makes many houses, stores and schools wet. It is really bad during when the wind is strong and the rain is hard. New old ice water gets all over the place. We can not get to work or school or home. We go to the big ice at the top and bottom of the world to see if it will turn to water soon and make more houses wet. We fly over the big ice to see how it is doing. Most of the big ice sits on rock. Around the edge of the big sitting on rock ice, is really low ice that rides on top of the water. This really low ice slows down the big rock ice turning into water. If the really low ice cracks up and turns into little pieces of ice, the big rock ice will make more houses wet. We look to see if there is new water in the cracks. Water in the cracks is bad as it hurts the big rock ice. Water in the cracks on the really low ice will turn the low ice into many little pieces of ice. Then the big rock ice will turn to water. That is water in cracks is bad for the houses, schools and businesses. If water moves off the really low ice, it does not stay in the cracks. This is better for the really low ice. This is better for the big rock ice. We took pictures of the really low ice and saw water leaving. The water was not staying in the cracks. Water leaving the really low ice might be good for houses, schools and stores.

  9. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  10. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the 'energy dominance' of the energy density of vacuum fluctuations in curved space-time and the presence of a singularity is discussed. It is pointed out that a de Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedmann solution may describe the history of the Universe before the Big Bang. (P.L.)

  11. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  12. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  13. One approach to accepting and transporting spent fuel from early-generation reactors with short fuel assemblies

    International Nuclear Information System (INIS)

    Peterson, R.W.; Bentz, E.J. Jr.; Bentz, C.B.

    1993-01-01

    In the early days of development of commercial nuclear power reactors in the U.S., the overall length and uranium loading of the fuel assemblies were considerably less than those of later generation facilities. In turn, some of these early facilities were designed for handling shorter casks than currently-certified casks. The spent fuel assemblies from these facilities are nearly all standard fuel within the definition in the Standard Contract (10 CFR 961) between the utilities and the U.S. Department of Energy (DOE) (the Big Rock Point fuel cross-section is outside the standard fuel dimension), and the utilities involved hold early delivery rights under DOE's oldest-fuel-first (OFF) allocation scenario. However, development of casks suitable for satisfying the acceptance and transportation requirements of some of these facilities is not currently underway in the DOE Cask System Development Program (CSDP). While the total MTU of these fuels is relatively small compared to the total program, the number of assemblies to be transported is significant, especially in the early years of operation according to the OFF allocation scenario. We therefore perceive a need for DOE to develop an approach and to implement plans to satisfy the unique acceptance and transportation requirements of these facilities. One such approach is outlined below. (author)

  14. Source rock

    Directory of Open Access Journals (Sweden)

    Abubakr F. Makky

    2014-03-01

Full Text Available West Beni Suef Concession is located in the western part of Beni Suef Basin, a relatively under-explored basin about 150 km south of Cairo. The major goal of this study is to evaluate the source rock using different techniques, such as Rock-Eval pyrolysis, vitrinite reflectance (%Ro), and well log data of some Cretaceous sequences including the Abu Roash (E, F and G) members and the Kharita and Betty formations. The BasinMod 1D program is used in this study to construct the burial history and calculate the levels of thermal maturity of the Fayoum-1X well based on calibration of measured %Ro and Tmax against the calculated %Ro model. The Total Organic Carbon (TOC) content calculated from well log data, compared with the TOC measured by Rock-Eval pyrolysis in the Fayoum-1X well, is shown to match for the shale source rock but gives high values for the limestone source rock. Therefore, a new model is derived from well log data to calculate the TOC content accurately for the limestone source rock in the study area. The organic matter in the Abu Roash (F) member is fair to excellent and capable of generating a significant amount of hydrocarbons (oil prone), produced from mixed type I/II kerogen. The generation potential of kerogen in the Abu Roash (E and G) members and the Betty Formation ranges from poor to fair, generating oil- and gas-prone hydrocarbons from mixed type II/III kerogen. Finally, the type III kerogen of the Kharita Formation has poor to very good generation potential and mainly produces gas. Thermal maturation indicators (measured %Ro, the calculated %Ro model, Tmax, and the production index, PI) show that the Abu Roash (F) member is at the onset of oil generation, whereas the Abu Roash (E and G) members and the Kharita and Betty formations have entered the peak of oil generation.

  15. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, related to storage, analytics and visualization of big data; the human dimension, covering the human aspects of big data; and the process management dimension, which addresses big data management from both a technological and a business perspective.

  16. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group with the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the TF including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  17. Intellektuaalne rock

    Index Scriptorium Estoniae

    2007-01-01

British singer-songwriter and actress Toyah Willcox, together with Bill Rieflin of R.E.M. and Pat Mastelotto of King Crimson, perform with the bands The Humans and Tuner on 25 October at Rock Café in Tallinn and on 27 October at St. John's Church (Jaani kirik) in Tartu.

  18. Igneous Rocks

    Science.gov (United States)

    Doe, Bruce R.

“Igneous Rocks was written for undergraduate geology majors who have had a year of college-level chemistry and a course in mineralogy … and for beginning graduate students. Geologists working in industry, government, or academia should find this text useful as a guide to the technical literature up to 1981 and as an overview of topics with which they have not worked but which may have unanticipated pertinence to their own projects.” So starts the preface to this textbook. As one who works part time in research on igneous rocks, especially as they relate to mineral deposits, I have been looking for such a book with this avowed purpose in a field that has a choking richness of evolving terminology and a bewildering volume of interdisciplinary literature. In addition to the standard topics of igneous petrology, the book contains a chapter on the role of igneous activity in the genesis of mineral deposits, its value to geothermal energy, and the potential of igneous rocks as an environment for nuclear waste disposal. These topics are presented rather apologetically in the preface, but the author is to be applauded for including this chapter. The apology shows just how new these interests are to petrology. Recognition is finally coming that, for example, mineral deposits are not “sports of nature,” a view held even by many economic geologists as recently as the early 1960's; instead they are perfectly ordinary geochemical features formed by perfectly ordinary geologic processes. In fact, the mineral deposits and their attendant alteration zones probably have as much to tell us about igneous rocks as the igneous rocks have to tell us about mineral deposits.

  19. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  20. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  1. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  2. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  3. The decision on the application to carry out a decommissioning project at Hinkley Point A Power Station under the Nuclear Reactors (Environmental Impact Assessment for Decommissioning) Regulations 1999

    International Nuclear Information System (INIS)

    2003-01-01

European Council Directive 85/337/EEC, as amended by Council Directive 97/11/EC, sets out a framework on the assessment of the effects of certain public and private projects on the environment. The Directive is implemented in Great Britain for decommissioning nuclear reactor projects by the Nuclear Reactors (Environmental Impact Assessment for Decommissioning) Regulations 1999. The intention of the Directive and Regulations is to involve the public through consultation in considering the potential environmental impacts of a decommissioning project, and to make the decision-making process on granting consent open and transparent. The Regulations require the licensee to undertake an environmental impact assessment, prepare an environmental statement that summarises the environmental effects of the project, and apply to the Health and Safety Executive (HSE) for consent to carry out a decommissioning project. There is an optional stage where the licensee may request from HSE an opinion on what the environmental statement should contain (called a pre-application opinion). The licensee of Hinkley Point A Power Station, Magnox Electric plc, requested a pre-application opinion and provided information in a scoping report in December 2000. HSE undertook a public consultation on the scoping report and provided its pre-application opinion in April 2001. The licensee applied to HSE for consent to carry out a decommissioning project and provided an environmental statement in December 2001. Following a public consultation on the environmental statement, HSE requested further information that was subsequently provided by the licensee. A further public consultation was undertaken on the further information that ended in March 2003. All these public consultations involved around 60 organisations. HSE granted consent to carry out a decommissioning project at Hinkley Point A Power Station under the Regulations in July 2003, and attached conditions to the Consent.
HSE took relevant

  4. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson -the particle produced very soon after the big bang and which vanished from the Universe one-hundredth of a second after the big bang, and the fate of the Universe, are all discussed. (U.K.)

  5. Evaluation of Rock Bolt Support for Polish Hard Rock Mines

    Science.gov (United States)

    Skrzypkowski, Krzysztof

    2018-03-01

The article presents different types of rock bolt support used in Polish ore mining. Individual point resin and expansion rock bolt supports are characterized, and the roof classes for zinc and lead mines and for copper ore mines are presented. Furthermore, laboratory tests of point resin rock bolt support at a geometric scale of 1:1, with a minimal fixing length of 0.6 m, were performed. Static testing of point resin rock bolt support was carried out on a laboratory test facility of the Department of Underground Mining, which simulates mine conditions for Polish ore and hard coal mining. Laboratory tests of point resin bolts were carried out especially for the ZGH Bolesław zinc and lead "Olkusz - Pomorzany" mine. The primary aim of the research was to check whether, at an anchoring length of 0.6 m achieved with one and a half resin cartridges, the "Olkusz - 20A" bolt type is able to withstand the load. The second purpose of the study was to obtain the load-displacement characteristic and determine the elastic and plastic range of the bolt. For the best simulation of mine conditions, the test station used steel cylinders with an external diameter of 0.1 m and a length of 0.6 m, with cores of rock taken from the roof of the underground excavations.

  6. White Rock

    Science.gov (United States)

    2002-01-01

(Released 19 April 2002) The Science: 'White Rock' is the unofficial name for this unusual landform which was first observed during the Mariner 9 mission in the early 1970's. As later analysis of additional data sets would show, White Rock is neither white nor dense rock. Its apparent brightness arises from the fact that the material surrounding it is so dark. Images from the Mars Global Surveyor MOC camera revealed dark sand dunes surrounding White Rock and on the floor of the troughs within it. Some of these dunes are just apparent in the THEMIS image. Although there was speculation that the material composing White Rock could be salts from an ancient dry lakebed, spectral data from the MGS TES instrument did not support this claim. Instead, the White Rock deposit may be the erosional remnant of a previously more continuous occurrence of air fall sediments, either volcanic ash or windblown dust. The THEMIS image offers new evidence for the idea that the original deposit covered a larger area. Approximately 10 kilometers to the southeast of the main deposit are some tiny knobs of similarly bright material preserved on the floor of a small crater. Given that the eolian erosion of the main White Rock deposit has produced isolated knobs at its edges, it is reasonable to suspect that the more distant outliers are the remnants of a once continuous deposit that stretched at least to this location. The fact that so little remains of the larger deposit suggests that the material is very easily eroded and simply blows away. The Story: Fingers of hard, white rock seem to jut out like icy daggers across a moody Martian surface, but appearances can be deceiving. These bright, jagged features are neither white, nor icy, nor even hard and rocky! So what are they, and why are they so different from the surrounding terrain? Scientists know that you can't always trust what your eyes see alone. You have to use other kinds of science instruments to measure things that our eyes can

  7. Big data business models: Challenges and opportunities

    Directory of Open Access Journals (Sweden)

    Ralph Schroeder

    2016-12-01

    Full Text Available This paper, based on 28 interviews from a range of business leaders and practitioners, examines the current state of big data use in business, as well as the main opportunities and challenges presented by big data. It begins with an account of the current landscape and what is meant by big data. Next, it draws distinctions between the ways organisations use data and provides a taxonomy of big data business models. We observe a variety of different business models, depending not only on sector, but also on whether the main advantages derive from analytics capabilities or from having ready access to valuable data sources. Some major challenges emerge from this account, including data quality and protectiveness about sharing data. The conclusion discusses these challenges, and points to the tensions and differing perceptions about how data should be governed as between business practitioners, the promoters of open data, and the wider public.

  8. Summary big data

    CERN Document Server

    2014-01-01

This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on patterns we identify in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  9. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  10. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers ... and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative ...

  11. Nuclear reactor buildings

    International Nuclear Information System (INIS)

    Nagashima, Shoji; Kato, Ryoichi.

    1985-01-01

Purpose: To reduce the cost of reactor buildings and satisfy severe seismic requirements for tank-type FBRs. Constitution: In conventional reactor buildings of flat-bottom embedded construction, the flat bottom is embedded entirely into the rock below the soil, down to the deck level of the reactor. As a result, although the weight of the seismic structure can be decreased, the amount of cavity excavation increases significantly, inevitably raising the plant construction cost. Instead, cross-shaped intersecting foundation mats are embedded into the bedrock to a thickness capable of withstanding earthquakes, while maintaining the arrangement of equipment around the reactor core required by the system design, such as the vertical relationship between equipment, the fuel exchange systems, and spontaneous drainage. Since the rock is hard and deforms little, embedding into the rock substrate increases the rigidity of the walls and support structures of the reactor buildings and reduces the floor response. This makes it possible to reduce cost while increasing seismic resistance. (Kamimura, M.)

  12. The Rossendorf research reactor. Operating and dismantling from a point of view of the emission control; Der Rossendorfer Forschungsreaktor. Betrieb und Rueckbau aus Sicht der Emissionsueberwachung

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, B.; Beutmann, A.; Kaden, M.; Scheibke, J. [VKTA, Dresden (Germany); Boessert, W.; Jansen, K.; Walter, M.

    2016-07-01

The Rossendorf research reactor went into operation in 1957 as the GDR's first nuclear reactor and Germany's second after the FRM in Garching. It was a heterogeneously structured, light-water moderated and cooled tank reactor of the Soviet type WWR-S. During its time of operation, it served both research and the production of radioisotopes. The history of exhaust air emission monitoring and its results are presented. With a view to the decommissioning period, selected results are discussed, and the estimated discharges are compared with those actually detected.

  13. Source rock potential of middle cretaceous rocks in Southwestern Montana

    Science.gov (United States)

    Dyman, T.S.; Palacas, J.G.; Tysdal, R.G.; Perry, W.J.; Pawlewicz, M.J.

    1996-01-01

The middle Cretaceous in southwestern Montana is composed of a marine and nonmarine succession of predominantly clastic rocks that were deposited along the western margin of the Western Interior Seaway. In places, middle Cretaceous rocks contain appreciable total organic carbon (TOC), such as 5.59% for the Mowry Shale and 8.11% for the Frontier Formation in the Madison Range. Most samples, however, exhibit less than 1.0% TOC. The genetic or hydrocarbon potential (S1+S2) of all the samples analyzed, except one, yields less than 1 mg HC/g rock, strongly indicating poor potential for generating commercial amounts of hydrocarbons. Out of 51 samples analyzed, only one (a Thermopolis Shale sample from the Snowcrest Range) showed a moderate petroleum potential of 3.1 mg HC/g rock. Most of the middle Cretaceous samples are thermally immature to marginally mature, with vitrinite reflectance ranging from about 0.4 to 0.6% Ro. Maturity is high in the Pioneer Mountains, where vitrinite reflectance averages 3.4% Ro, and at Big Sky, Montana, where vitrinite reflectance averages 2.5% Ro. At both localities, high Ro values are due to local heat sources, such as the Pioneer batholith in the Pioneer Mountains.

  14. Rock stresses (Grimsel rock laboratory)

    International Nuclear Information System (INIS)

    Pahl, A.; Heusermann, S.; Braeuer, V.; Gloeggler, W.

    1989-01-01

On the research and development project 'Rock Stress Measurements', the BGR has developed and tested several test devices and methods at the GTS for use in boreholes at a depth of 200 m, and has carried out rock mechanical and engineering geological investigations for the evaluation and interpretation of the stress measurements. For the first time, a computer for data processing was installed in the borehole together with the BGR probe. Laboratory tests on hollow cylinders were made to study the stress-deformation behavior. To validate and interpret the measurement results, some test methods were modelled using the finite-element method. The dilatometer tests yielded high values of Young's modulus, whereas laboratory tests showed lower values with a distinct deformation anisotropy. Stress measurements with the BGR probe yielded horizontal stresses higher than the theoretical overburden pressure and vertical stresses that agree well with the theoretical overburden pressure. These results are comparable to the results of the hydraulic fracturing tests, whereas stresses obtained with CSIR triaxial cells are generally lower. The detailed geological mapping of the borehole indicated relationships between stress and geology. With regard to borehole depth, different zones of rock structure, joint frequency, joint orientation, and orientation of microfissures, as well as stress magnitude, stress direction, and degree of deformation anisotropy, could be distinguished. (orig./HP) [de

  15. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

Denmark has a digital infrastructure, a registration culture, and IT-competent employees and customers that make a leading position possible, but only if companies get themselves ready for the next big data wave.

  16. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  17. Thermal Inertia of Rocks and Rock Populations

    Science.gov (United States)

    Golombek, M. P.; Jakosky, B. M.; Mellon, M. T.

    2001-01-01

    The effective thermal inertia of rock populations on Mars and Earth is derived from a model of effective inertia versus rock diameter. Results allow a parameterization of the effective rock inertia versus rock abundance and bulk and fine component inertia. Additional information is contained in the original extended abstract.

  18. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  19. Crisis analytics : big data-driven crisis response

    NARCIS (Netherlands)

    Qadir, Junaid; ur Rasool, Raihan; Zwitter, Andrej; Sathiaseelan, Arjuna; Crowcroft, Jon

    2016-01-01

    Disasters have long been a scourge for humanity. With the advances in technology (in terms of computing, communications, and the ability to process, and analyze big data), our ability to respond to disasters is at an inflection point. There is great optimism that big data tools can be leveraged to

  20. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
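The abstract's central point, that declared nominal levels can be far from actual error rates when samples are small and tail probabilities are tiny, can be illustrated with a minimal simulation sketch (this is not the authors' Edgeworth-expansion machinery; the sample size, cutoff, and seed below are illustrative assumptions):

```python
import numpy as np

# Simulate many small-sample tests under the null hypothesis (true mean = 0)
rng = np.random.default_rng(0)
n, reps = 5, 200_000        # sample size per test, number of simulated tests
z = 3.2905                  # two-sided N(0,1) critical value for alpha = 0.001

x = rng.standard_normal((reps, n))
t = x.mean(axis=1) / (x.std(axis=1, ddof=1) / np.sqrt(n))  # one-sample t statistic

# Fraction of null tests rejected when the normal cutoff is (wrongly) applied
actual = np.mean(np.abs(t) > z)
print(f"nominal alpha = 0.001, realized rate ~ {actual:.4f}")
```

Because a t statistic on only 4 degrees of freedom has far heavier tails than the normal approximation assumes, the realized false-positive rate lands well above the nominal 0.001, exactly the kind of gap the paper quantifies with higher-order approximations.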

  1. [Embracing medical innovation in the era of big data].

    Science.gov (United States)

    You, Suning

    2015-01-01

Along with the advent of the big data era worldwide, the medical field inevitably has to find its place in it. This article thoroughly introduces the basic knowledge of big data and points out that its advantages and disadvantages coexist. Although innovation in the medical field is a struggle, the current pattern of medicine will be changed fundamentally by big data. The article also shows the rapid change of relevant analyses in the big data era, depicts a promising vision of digital medicine, and offers some practical advice to surgeons.

  2. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  3. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  4. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  5. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  6. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ, and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  7. Reactivity changes in hybrid thermal-fast reactor systems during fast core flooding

    International Nuclear Information System (INIS)

    Pesic, M.

    1994-09-01

    A new space-dependent kinetic model in the adiabatic approximation, with local feedback reactivity parameters for reactivity determination in coupled systems, is proposed in this thesis. It is applied to the accident calculation of the 'HERBE' fast-thermal reactor system and compared to the usual point kinetics model with core-averaged parameters. The advantages of the new model - a more realistic picture of the reactor kinetics and dynamics during a large local reactivity perturbation under the same heat transfer conditions - are underlined. The calculated reactivity parameters of the new model are verified in experiments performed at the 'HERBE' coupled core. The model has shown that the 'HERBE' safety system can shut down the reactor safely and quickly even in the case of a highly set power trip, and even under conditions of a major partial failure of the reactor safety system (author)

  8. Big Data and central banks

    Directory of Open Access Journals (Sweden)

    David Bholat

    2015-04-01

    Full Text Available This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  9. Nuclear power in rock. Principal report

    International Nuclear Information System (INIS)

    1977-06-01

    In September 1975 the Swedish Government directed the Swedish State Power Board to study the question of rock-siting nuclear power plants. The study accounted for in this report aims at clarifying the advantages and disadvantages of siting a nuclear power plant in rock, compared to siting on ground level, considering reactor safety, war protection and sabotage. The need for nuclear power production during war situations and the closing down of nuclear power plants after terminated operation are also dealt with. (author)

  10. Favorability for uranium in tertiary sedimentary rocks, southwestern Montana

    International Nuclear Information System (INIS)

    Wopat, M.A.; Curry, W.E.; Robins, J.W.; Marjaniemi, D.K.

    1977-10-01

    Tertiary sedimentary rocks in the basins of southwestern Montana were studied to determine their favorability for potential uranium resources. Uranium in the Tertiary sedimentary rocks was probably derived from the Boulder batholith and from silicic volcanic material. The batholith contains numerous uranium occurrences and is the most favorable plutonic source for uranium in the study area. Subjective favorability categories of good, moderate, and poor, based on the number and type of favorable criteria present, were used to classify the rock sequences studied. Rocks judged to have good favorability for uranium deposits are (1) Eocene and Oligocene strata and undifferentiated Tertiary rocks in the western Three Forks basin and (2) Oligocene rocks in the Helena basin. Rocks having moderate favorability consist of (1) Eocene and Oligocene strata in the Jefferson River, Beaverhead River, and lower Ruby River basins, (2) Oligocene rocks in the Townsend and Clarkston basins, (3) Miocene and Pliocene rocks in the Upper Ruby River basin, and (4) all Tertiary sedimentary formations in the eastern Three Forks basin, and in the Grasshopper Creek, Horse Prairie, Medicine Lodge Creek, Big Sheep Creek, Deer Lodge, Big Hole River, and Bull Creek basins. The following have poor favorability: (1) the Beaverhead Conglomerate in the Red Rock and Centennial basins, (2) Eocene and Oligocene rocks in the Upper Ruby River basin, (3) Miocene and Pliocene rocks in the Townsend, Clarkston, Smith River, and Divide Creek basins, (4) Miocene through Pleistocene rocks in the Jefferson River, Beaverhead River, and Lower Ruby River basins, and (5) all Tertiary sedimentary rocks in the Boulder River, Sage Creek, Muddy Creek, Madison River, Flint Creek, Gold Creek, and Bitterroot basins

  11. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. Moreover, Big Data is recognized in the business world and, increasingly, in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  12. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  13. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  14. Water - rock interaction in different rock environments

    International Nuclear Information System (INIS)

    Lamminen, S.

    1995-01-01

    The study assesses the groundwater geochemistry and geological environment of 44 study sites for radioactive waste disposal. Initially, the study sites were divided by rock type into 5 groups: (1) acid - intermediate rocks, (2) mafic - ultramafic rocks, (3) gabbros, amphibolites and gneisses that contain calc-silicate (skarn) rocks, (4) carbonates and (5) sandstones. Separate assessments are made of acid - intermediate plutonic rocks and of a subgroup that comprises migmatites, granite and mica gneiss. These all belong to the group of acid - intermediate rocks. Within the mafic -ultramafic rock group, a subgroup that comprises mafic - ultramafic plutonic rocks, serpentinites, mafic - ultramafic volcanic rocks and volcanic - sedimentary schists is also evaluated separately. Bedrock groundwaters are classified by their concentration of total dissolved solids as fresh, brackish, saline, strongly saline and brine-class groundwaters. (75 refs., 24 figs., 3 tabs.)

  15. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  16. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  17. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  18. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is, and will increasingly be, used as a tool for everything that happens both online and offline. Of course, being online is a real habit, and Big Data is found in this medium, offering many advantages and being a real help for all consumers. In this paper we talk about Big Data as a plus in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit covered in this paper is presented in the cloud section.

  19. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  20. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  1. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security: book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release.

  2. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  3. The lack of a big picture in tuberculosis: the clinical point of view, the problems of experimental modeling and immunomodulation. The factors we should consider when designing novel treatment strategies.

    Science.gov (United States)

    Vilaplana, Cristina; Cardona, Pere-Joan

    2014-01-01

    This short review explores the large gap between clinical issues and basic science, and suggests why tuberculosis research should focus on redirecting the immune system and not only on eradicating the Mycobacterium tuberculosis bacillus. Throughout the manuscript, several concepts involved in human tuberculosis are explored in order to understand the big picture, including infection and disease dynamics, animal modeling, liquefaction, inflammation and immunomodulation. Scientists should take all these factors into account in order to answer questions with clinical relevance. Moreover, the inclusion of the concept that a strong inflammatory response is required in order to develop cavitary tuberculosis disease opens a new field for developing new therapeutic and prophylactic tools in which destruction of the bacilli may not necessarily be the final goal.

  4. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact...

  5. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  6. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  7. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing takes these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  8. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  9. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  10. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  11. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance that the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data. As a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data, and finally it sought to identify the most relevant characteristics in the management of Big Data, so as to cover everything concerning the central topic of the research. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; presenting some of the NoSQL databases, which are those that make it possible to process data in unstructured formats; and showing data models and data-analysis technologies, ending with some of the benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the Big Data environment.

  12. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, as well as the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  13. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available The objective of this paper is to assess, in light of the main works of Minsky, his view and analysis of what he called "Big Government": the huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  14. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  15. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.

  16. Generation IV reactors: reactor concepts

    International Nuclear Information System (INIS)

    Cardonnier, J.L.; Dumaz, P.; Antoni, O.; Arnoux, P.; Bergeron, A.; Renault, C.; Rimpault, G.; Delpech, M.; Garnier, J.C.; Anzieu, P.; Francois, G.; Lecomte, M.

    2003-01-01

    The liquid metal reactor concept looks promising because of its hard neutron spectrum. Sodium-cooled reactors benefit from extensive operating experience in Japan and in France. Lead-cooled reactors have serious assets concerning safety, but they require a great effort in technological research to overcome the corrosion issue, and they lack a leading country to develop this innovative technology. In the molten salt reactor concept, the salt is both the nuclear fuel and the coolant fluid. The high exit temperature of the primary salt (700 degrees Celsius) allows a high energy efficiency (44%). Furthermore, molten salts have interesting specificities concerning the transmutation of actinides: they are almost insensitive to irradiation damage, some salts can dissolve large quantities of actinides, and they are compatible with most reprocessing processes based on pyro-chemistry. The supercritical water reactor concept is based on operating temperature and pressure conditions that place water beyond its critical point. In this range water acquires some useful characteristics: boiling crisis is no longer possible, because the liquid and vapour phases cannot coexist; a high heat transfer coefficient, due to the low thermal conductivity of supercritical water; and a high overall energy efficiency, due to the high temperature of the water. Gas-cooled fast reactors, combining a hard neutron spectrum and a closed fuel cycle, open the way to a high valorization of natural uranium while minimizing ultimate radioactive wastes and proliferation risks. The very high temperature gas-cooled reactor concept is being developed with the prospect of producing hydrogen from non-fossil fuels on a large scale. This use implies a reactor delivering helium at over 1000 degrees Celsius. (A.C.)

  17. Disc cutter wear and rock texture in hard rock TBM tunneling

    International Nuclear Information System (INIS)

    Koizumi, Yu; Tsusaka, Kimikazu; Tanimoto, Chikaosa; Nakagawa, Shigeo; Fujita, Naoya

    2008-01-01

    Disc cutter wear in TBM tunneling is caused by initial fragmentation of a solid rock face (the primary fragmentation) and fragmentation of residual rock pieces between a cutterhead and the face (the secondary fragmentation). In two projects through sedimentary and granitic rocks, the authors investigated the relationships between the rate of cutter wear caused by the primary fragmentation and the point load index, grain size, and content of abrasive minerals. As a result, it was found that the tensile strength and the mineral content of the rocks significantly influenced cutter wear in both projects, and thus it is necessary to take the rock type into account. (author)

  18. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  19. Farewell to a Big and Rich Nuclear Power Club?

    International Nuclear Information System (INIS)

    Takeda, A.

    2001-01-01

    For the last few decades of the 20th century, we have seen a large number of big nuclear power plants being built and operated in a few rich countries like the United States, France, Germany, the United Kingdom, and Japan. They have standardized the 1000 MWe-type light water reactors, which have an actual generating capacity of more than 1100 MW. (author)

  20. Detector point of view of reactor internal vibrations under Gaussian coloured random forces - the problem of fitting neutron noise experimental data

    International Nuclear Information System (INIS)

    Arnal, R.S.; Martin, G.V.; Gonzalez, J.L.M.-C.

    1988-01-01

    This paper studies the local vibrations of reactor components driven by Gaussian coloured and white forces, when nonlinear vibrations arise. We also study the important problem of noise sources, their modelling, and noise propagation through the neutron field using the discrete ordinates transport theory. Finally, we study the effect of the neutron field upon the PSD (power spectral density) of the noise source and we analyse the problem of fitting neutron noise experimental data to perform pattern recognition analysis. (author)

  1. Joint EC-IAEA topical meeting on development of new structural materials for advanced fission and fusion reactor systems. PowerPoint presentations

    International Nuclear Information System (INIS)

    2009-01-01

    The key topics of the meeting are the following: Radiation damage phenomena and modelling of material properties under irradiation; On-going challenges in radiation materials science; Key material parameters and operational conditions of selected reactor designs; Microstructures and mechanical properties of nuclear structural materials; Pathways to development of new structural materials; Qualification of new structural materials; Advanced microstructure probing methods; Special emphasis is given to the application of nuclear techniques in the development and qualification of new structural materials.

  2. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of a GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which complements the ongoing DOE research agenda in carbon sequestration. The geology of the Big Sky Carbon Sequestration Partnership region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}.
Overall every sedimentary formation investigated

  3. Operating Experience in Nuclear Power Plants with Boiling-Water Reactors; Experience acquise dans l'exploitation des reacteurs a eau bouillante; Opyt ehkspluatatsii kipyashchago reaktora; Experiencia adquirida con la explotacion de reactores de agua hirviente

    Energy Technology Data Exchange (ETDEWEB)

    Ascherl, R. J. [General Electric Company, San Jose, CA (United States)

    1963-10-15

    A significant amount of operating experience has now been accumulated by boiling-water-reactor power plants. By the end of 1962, over 2200 million kWh of electricity had been generated by three plants operating on utility systems: Dresden Nuclear Power Station, Commonwealth Edison Company, Morris, Illinois; Vallecitos Atomic Power Plant, Pacific Gas and Electric Company and General Electric Company, Pleasanton, California; and Kahl Nuclear Power Station, Rheinisch Westfaelisches Elektrizitaetswerk and Bayernwerk, Kahl-am-Main, West Germany. Boiling-water-reactor power-plant performance, under routine electric-utility operating conditions, has been excellent. Reactor and plant availability and capacity factors provide a sound basis for anticipation of continuing reliable performance from boiling-water-reactor power stations. During 1963, four additional boiling-water-reactor plants will begin power operation: Big Rock Point Nuclear Plant, Consumers Power Company, Charlevoix, Michigan; Humboldt Bay Plant Nuclear Unit, Pacific Gas and Electric Company, Eureka, California; Garigliano Nuclear Power Station, Societa Elettronucleare Nazionale, Scauri, Italy; and Japan Power Demonstration Reactor, Japan Atomic Energy Research Institute, Tokai Mura, Japan. The start-up and initial operation of these plants confirms the expectation of reliable performance established by Dresden, Kahl, and Vallecitos. Performance records of Dresden, Kahl and Vallecitos have clearly proved the stability and safety of boiling-water reactors. Additionally, radiation levels within the plants and in the environs have been significantly below limits established by operating licences. The simplicity and ease of operation of boiling-water reactors have been confirmed. Load following characteristics of the Dresden dual-cycle boiling-water reactor have been excellent.
Major and minor maintenance and repair work can be accomplished by ordinary craft unions, and without undue hardship or time limits caused by

  4. Water resources in the Big Lost River Basin, south-central Idaho

    Science.gov (United States)

    Crosthwaite, E.G.; Thomas, C.A.; Dyer, K.L.

    1970-01-01

    The Big Lost River basin occupies about 1,400 square miles in south-central Idaho and drains to the Snake River Plain. The economy in the area is based on irrigation agriculture and stockraising. The basin is underlain by a diverse assemblage of rocks which range in age from Precambrian to Holocene. The assemblage is divided into five groups on the basis of their hydrologic characteristics: carbonate rocks, noncarbonate rocks, cemented alluvial deposits, unconsolidated alluvial deposits, and basalt. The principal aquifer is unconsolidated alluvial fill that is several thousand feet thick in the main valley. The carbonate rocks are the major bedrock aquifer. They absorb a significant amount of precipitation and, in places, are very permeable as evidenced by large springs discharging from or near exposures of carbonate rocks. Only the alluvium, carbonate rock and locally the basalt yield significant amounts of water. A total of about 67,000 acres is irrigated with water diverted from the Big Lost River. The annual flow of the river is highly variable and water-supply deficiencies are common. About 1 out of every 2 years is considered a drought year. In the period 1955-68, about 175 irrigation wells were drilled to provide a supplemental water supply to land irrigated from the canal system and to irrigate an additional 8,500 acres of new land. Average annual precipitation ranged from 8 inches on the valley floor to about 50 inches at some higher elevations during the base period 1944-68. The estimated water yield of the Big Lost River basin averaged 650 cfs (cubic feet per second) for the base period. Of this amount, 150 cfs was transpired by crops, 75 cfs left the basin as streamflow, and 425 cfs left as ground-water flow. A map of precipitation and estimated values of evapotranspiration were used to construct a water-yield map. A distinctive feature of the Big Lost River basin is the large interchange of water from surface streams into the ground and from the
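
The water-yield figures quoted in this record form a simple balance. The sketch below only checks that the quoted components sum to the stated total; the variable names are invented for illustration:

```python
# Water balance for the Big Lost River basin, base period 1944-68.
# All values in cfs, taken from the abstract; names are illustrative.
total_yield = 650          # estimated average annual water yield
crop_transpiration = 150   # transpired by crops
streamflow_out = 75        # left the basin as streamflow
groundwater_out = 425      # left the basin as ground-water flow

# The three components should account for the full estimated yield.
balance = crop_transpiration + streamflow_out + groundwater_out
assert balance == total_yield
print(f"balance closes at {balance} cfs")
```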

  5. Manufacture of components for Canadian reactor programs

    International Nuclear Information System (INIS)

    Perry, L.P.

    Design features, especially those relating to calandrias, are pointed out for many CANDU-type reactors and the Taiwan research reactor. The special requirements shouldered by the Canadian suppliers of heavy reactor components are analyzed. (E.C.B.)

  6. Hopi and Anasazi Alignments and Rock Art

    Science.gov (United States)

    Bates, Bryan C.

    The interaction of light and shadow on ancestral Puebloan rock art, or rock art demarcating sunrise/set horizon points that align with culturally significant dates, has long been assumed to be evidence of "intentional construct" for marking time or event by the native creator. However, anthropological rock art research requires the scientific control of cultural time, element orientation and placement, structure, and association with other rock art elements. The evaluation of five exemplars challenges the oft-held assumption that "if the interaction occurs, it therefore supports intentional construct" and thereby conveys meaning to the native culture.

  7. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
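
The abstract does not show BigDansing's actual programming interface. As a hedged illustration of the kind of declarative quality rule such a system would distribute, the sketch below detects violations of a classic functional dependency (zipcode determines city) by enumerating tuple pairs in plain Python; the data and helper name are invented:

```python
from itertools import combinations

# Toy relation with one deliberate violation of zipcode -> city.
rows = [
    {"id": 1, "zipcode": "10001", "city": "New York"},
    {"id": 2, "zipcode": "10001", "city": "NYC"},      # violates the FD
    {"id": 3, "zipcode": "60601", "city": "Chicago"},
]

def fd_violations(rows, lhs, rhs):
    """Return id pairs of rows that agree on lhs but disagree on rhs.

    This pair enumeration is exactly the kind of costly computation a
    distributed cleansing system would parallelize; here it runs locally.
    """
    return [
        (a["id"], b["id"])
        for a, b in combinations(rows, 2)
        if a[lhs] == b[lhs] and a[rhs] != b[rhs]
    ]

print(fd_violations(rows, "zipcode", "city"))  # [(1, 2)]
```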

  8. Spectral shift reactor control method

    International Nuclear Information System (INIS)

    Impink, A.J.

    1982-01-01

    A method of operating a nuclear reactor having a core and coolant displacer elements arranged in the core, in which a reactor coolant temperature set point is established at which it is desired to operate the reactor, together with first reactor coolant temperature band limits within which the set point lies. The reactor coolant displacer elements are moved relative to the reactor core to adjust the volume of reactor coolant in the core as the reactor coolant temperature approaches the first band limits, so as to maintain the reactor coolant temperature near the set point and within the first band limits. The reactivity changes associated with movement of the respective coolant displacer element clusters are calculated and compared with a calculated derived reactivity change in order to select the cluster to be moved. (author)
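
The band-limit logic described in this patent abstract can be sketched as a simple decision function. Everything numeric below (set point, band width) and the direction convention are invented for illustration; the abstract specifies none of these values:

```python
# Hypothetical band-limit sketch; values and direction convention are
# illustrative only, not taken from the patented control method.
SET_POINT = 580.0   # desired coolant temperature (arbitrary units)
BAND = 5.0          # first band limits: SET_POINT +/- BAND

def displacer_adjustment(coolant_temp):
    """Decide a displacer move as the temperature approaches the band limits.

    Returns "hold" while the temperature stays within the first band limits,
    and a move request when it reaches a limit (which cluster to move would
    be chosen by comparing calculated reactivity changes, per the abstract).
    """
    if coolant_temp >= SET_POINT + BAND:
        return "insert"     # adjust coolant volume one way
    if coolant_temp <= SET_POINT - BAND:
        return "withdraw"   # adjust coolant volume the other way
    return "hold"           # within the band: no movement needed

for t in (578.0, 586.0, 574.0):
    print(t, displacer_adjustment(t))
```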

  9. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  10. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  11. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
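
As a toy illustration of the "supervised" side of the machine-learning concepts this review introduces, here is a one-nearest-neighbour classifier in plain Python; the points and labels are invented for the example:

```python
# Supervised learning in miniature: classify a query point by the label of
# its nearest labelled training point (1-NN). Data are invented.
def nearest_neighbor(train, query):
    """train: list of ((x, y), label); return the label of the closest point."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(train, key=lambda item: dist2(item[0], query))[1]

train = [((0.0, 0.0), "low"), ((0.1, 0.2), "low"),
         ((5.0, 5.0), "high"), ((4.8, 5.2), "high")]

print(nearest_neighbor(train, (0.3, 0.1)))  # low
print(nearest_neighbor(train, (4.5, 4.9)))  # high
```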

  13. Rollerjaw Rock Crusher

    Science.gov (United States)

    Peters, Gregory; Brown, Kyle; Fuerstenau, Stephen

    2009-01-01

    The rollerjaw rock crusher melds the concepts of jaw crushing and roll crushing long employed in the mining and rock-crushing industries. Rollerjaw rock crushers have been proposed for inclusion in geological exploration missions on Mars, where they would be used to pulverize rock samples into powders in the tens of micrometer particle size range required for analysis by scientific instruments.

  14. [Utilization of Big Data in Medicine and Future Outlook].

    Science.gov (United States)

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.

  15. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  16. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their super-computer, and to the Large Hadron Collider created by Eric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is underway. Worse still, all of scientific research is in peril! Swept up in incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A gripping plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and of today's greatest scientists.

  17. Stability analysis for the Big Dee upgrade of the Doublet III tokamak

    International Nuclear Information System (INIS)

    Helton, F.J.; Luxon, J.L.

    1987-01-01

    Ideal magnetohydrodynamic stability analysis has been carried out for configurations expected in the Big Dee tokamak, an upgrade of the Doublet III tokamak into a non-circular cross-section device which began operation early in 1986. The results of this analysis support theoretical predictions as follows: Since the maximum value of beta stable to ballooning and Mercier modes, which we denote β_c, increases with inverse aspect ratio, elongation and triangularity, the Big Dee is particularly suited to obtain high values of β_c, and there exist high-β_c Big Dee equilibria for large variations in all relevant plasma parameters. The beta limits for the Big Dee are consistent with established theory as summarized in present scaling laws. High beta Big Dee equilibria are continuously accessible when approached through changes in all relevant input parameters and are structurally stable with respect to variations of input plasma parameters. Big Dee beta limits have a smooth dependence on plasma parameters such as β_p and elongation. These calculations indicate that in the actual running of the device the Big Dee high beta equilibria should be smoothly accessible. Theory predicts that the limiting plasma parameters, such as beta, total plasma current and plasma pressure, which can be obtained within the operating limits of the Big Dee are reactor relevant. Thus the Big Dee should be able to use its favourable ideal MHD scaling and controlled plasma shaping to attain reactor relevant parameters in a moderate sized device. (author)

  18. Attempt of groundwater dating using the drilled rock core. 1. Development of the rock sampling method for measurement of noble gases dissolved in interstitial water in rock

    International Nuclear Information System (INIS)

    Mahara, Yasunori

    2002-01-01

    Groundwater dating in low permeability rock is very difficult and impracticable, because collecting a groundwater sample in a borehole takes a very long time and considerable funds must be invested in producing and operating the in-situ groundwater sampler. If we can directly measure the noble gases dissolved in interstitial groundwater in a rock core, it becomes much easier to estimate the groundwater residence time. In this study, we designed and produced a high vacuum container in which the dissolved noble gases are allowed to diffuse until reaching equilibrium, and we prepared a handling manual for placing the rock core into the container and a procedure for evacuating air from the sealed container. We compared data sets of noble gas concentrations obtained from rock cores with those from groundwater samples collected from boreholes in-situ. The measured rocks are pumice-tuff rock, mud rock and hornfels, which have permeabilities of 10 -6 cm/s, 10 -9 cm/s and 10 -11 cm/s, respectively. Consequently, we concluded that the rock core method is better than the in-situ groundwater sampling method for low permeability rock. (author)

  19. Epidemiology in wonderland: Big Data and precision medicine.

    Science.gov (United States)

    Saracci, Rodolfo

    2018-03-01

    Big Data and precision medicine, two major contemporary challenges for epidemiology, are critically examined from two different angles. In Part 1, Big Data collected for research purposes (Big research Data) and Big Data used for research although collected for other primary purposes (Big secondary Data) are discussed in the light of the fundamental common requirement of data validity, prevailing over "bigness". Precision medicine is treated by developing the key point that high relative risks are as a rule required to make a variable or combination of variables suitable for prediction of disease occurrence, outcome or response to treatment; the commercial proliferation of allegedly predictive tests of unknown or poor validity is commented upon. Part 2 proposes a "wise epidemiology" approach to: (a) choosing, in a context imprinted by Big Data and precision medicine, epidemiological research projects actually relevant to population health; (b) training epidemiologists; (c) investigating the impact on clinical practices and the doctor-patient relation of the influx of Big Data and computerized medicine; and (d) clarifying whether today "health" may be redefined, as some maintain, in purely technological terms.

  20. The BIG Data Center: from deposition to integration to translation.

    Science.gov (United States)

    2017-01-04

    Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  2. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  3. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  4. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  5. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  6. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  7. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  8. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they cause the single singular events predicted by the corresponding quintessential (phantom) models to appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and the future may be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions

  9. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  10. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10²¹) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  11. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, the wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent the diversity of big data analytics workloads? Big data dwarfs are abstractions that capture frequently appearing operations in big data computing. One dwarf represen...

  12. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  13. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated, especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines by up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
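    The sorted-array-plus-bit-array idea behind IEJoin can be illustrated with a toy inequality self-join. This is a hypothetical simplification for illustration only; the function name and structure are not taken from the dissertation, and the real IEJoin uses additional offset arrays and optimizations:

    ```python
    # Toy sketch of an inequality self-join in the spirit of IEJoin (assumed
    # simplification, not the published algorithm). It finds all pairs (i, j)
    # with a[i] < a[j] and b[i] > b[j] using two sorted orders and a bit array
    # instead of comparing every pair against both predicates.

    def inequality_self_join(rows):
        """rows: list of (a, b) tuples; assumes distinct a and b values."""
        n = len(rows)
        # Position of each row in ascending-a order.
        by_a = sorted(range(n), key=lambda i: rows[i][0])
        pos_a = {i: p for p, i in enumerate(by_a)}
        # Visit rows in descending-b order: every row visited earlier
        # has a strictly larger b than the current row.
        by_b_desc = sorted(range(n), key=lambda i: rows[i][1], reverse=True)
        bits = [False] * n          # bit set <=> row already visited
        result = []
        for j in by_b_desc:
            # Set bits at a-positions below pos_a[j] mark rows i with
            # b[i] > b[j] (visited earlier) and a[i] < a[j] (smaller a).
            for p in range(pos_a[j]):
                if bits[p]:
                    result.append((by_a[p], j))
            bits[pos_a[j]] = True
        return result
    ```

    The sorting reduces both inequality predicates to positional checks against a single bit array; this sketch still scans that array linearly, whereas the actual IEJoin makes the scan cheap with space-efficient bit-array layouts, which is where its reported orders-of-magnitude speedups come from.
    
    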

  14. Nuclear reactor

    International Nuclear Information System (INIS)

    Hattori, Sadao; Sekine, Katsuhisa.

    1987-01-01

    Purpose: To decrease the thickness of a reactor container and reduce the height and plate thickness of a roof slab without using mechanical vibration stoppers. Constitution: Earthquake resistance is improved by filling fluids such as liquid metal between a reactor container and a secondary container and connecting the outer surface of the reactor container with the inner surface of the secondary container by means of bellows. That is, for horizontal seismic vibrations, horizontal loads can be supported by the secondary container without providing mechanical vibration stoppers on the reactor container, and the wall thickness can be reduced, thereby simplifying the thermal insulation structure for the reduction of thermal stresses. Further, for vertical seismic vibrations, vertical loads can be transmitted to the secondary container, enabling the wall thickness to be reduced in the same manner as for horizontal loads. By transferring the point of action of the container load applied to the roof slab to the outer circumferential portion, the intended purpose can be attained and, in addition, the radiation dose rate at the upper surface of the roof slab can be decreased. (Kamimura, M.)

  15. The fast breeder reactor

    International Nuclear Information System (INIS)

    Davis, D.A.; Baker, M.A.W.; Hall, R.S.

    1990-01-01

    Following submission of written evidence, the Energy Committee members asked questions of three witnesses from the Central Electricity Generating Board and Nuclear Electric (which will be the government-owned company running nuclear power stations after privatisation). Both questions and answers are reported verbatim. The points raised include where the responsibility for the future fast reactor programme should lie, with government only or with private enterprise or both, and the viability of fast breeder reactors in the future. The case for the fast reactor was stated as essentially strategic, not economic. This raised the issue of nuclear costs, which have both a construction and a decommissioning element. There was considerable discussion as to the cost of building a European Fast Reactor and the cost of the electricity it would generate compared with PWR-type reactors. The likely demand for fast reactors will not arrive for 20-30 years, and the need to build a fast reactor now is questioned. (UK)

  16. Globalisation, big business and the Blair government

    OpenAIRE

    Grant, Wyn

    2000-01-01

    After reviewing definitions of globalisation, this paper suggests that the ‘company state’ model is becoming increasingly important in business-government relations. It is argued that Prime Minister Blair has a particular construction of globalisation which fits in well with the agenda of big international business. However, increasing tensions have arisen in the relationship between New Labour and business, reaching crisis point in May 2000. The paper concludes by suggesting that Burnham’s de...

  17. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  18. CERN’s Summer of Rock

    CERN Multimedia

    Katarina Anthony

    2015-01-01

    When a rock star visits CERN, they don’t just bring their entourage with them. Along for the ride are legions of fans across the world – many of whom may not be the typical CERN audience. In July alone, four big acts paid CERN a visit, sharing their experience with the world: Scorpions, The Script, Kings of Leon and Patti Smith.   @TheScript tweeted: #paleofestival we had the best time! Big love. #CERN (Image: Twitter).   It all started with the Scorpions, the classic rock band whose “Wind of Change” became an anthem in the early 1990s. On 19 July, the band braved the 35-degree heat to tour the CERN site on foot – visiting the Synchrocyclotron and the new Microcosm exhibition. The rockers were very enthusiastic about the research carried out at CERN, and talked about returning in the autumn during their next tour stop. The Scorpions visit Microcosm. Two days later, The Script rolled in. This Irish pop-rock band has been hittin...

  19. Radon exhalation from granitic rocks

    International Nuclear Information System (INIS)

    Del Claro, Flávia; Paschuk, Sergei A.; Corrêa, Janine N.; Mazer, Wellington; Narloch, Danielle Cristine; Martin, Aline Cristina; Denyak, Valeriy

    2017-01-01

    Naturally occurring radionuclides such as radon (²²²Rn), its decay products and other elements from the radioactive series of uranium (²³⁸U and ²³⁵U) and thorium (²³²Th) are an important source of human exposure to natural radioactivity. The worldwide evaluation of radiobiological health effects and risks from population exposure to natural radionuclides is a growing concern. About 50% of the personal annual radiation dose is related to radionuclides such as radon (²²²Rn), thoron (²²⁰Rn), radium (²²⁶Ra), thorium (²³²Th) and potassium (⁴⁰K), which are present in modern materials commonly used in the construction of dwellings and buildings. The radioactivity of marbles and granites is of big concern since, under certain conditions, the radioactivity levels of these materials can be hazardous to the population and require the implementation of mitigation procedures. The present survey of ²²²Rn and ²²⁰Rn activity concentrations released into the air was performed using Brazilian granite rocks commercialized on the national market as well as exported to other countries. The ²²²Rn and ²²⁰Rn measurements were performed using the AlphaGUARD instant monitor and the RAD7 detector, respectively. This study was performed at the Applied Nuclear Physics Laboratory of the Federal University of Technology – Paraná (UTFPR). The obtained radon activity concentrations in air exhaled by the studied granite samples varied from 3±1 Bq/m³ to 2087±19 Bq/m³, which shows that some samples of granitic rocks represent a rather elevated health risk to the population. (author)

  20. Rocks Can Wow? Yes, Rocks Can Wow!

    Science.gov (United States)

    Hardman, Sally; Luke, Sue

    2016-01-01

    Rocks and fossils appear in the National Curriculum of England science programmes of study for children in year 3 (ages 7-8). A frequently asked question is "How do you make the classification of rocks engaging?" In response to this request from a school, a set of interactive activities was designed and organised by tutors and students…

  1. Mirror hybrid reactor studies

    International Nuclear Information System (INIS)

    Bender, D.J.

    1978-01-01

    The hybrid reactor studies are reviewed. The optimization of the point design and work on a reference design are described. The status of the nuclear analysis of fast-spectrum blankets, of systems studies for the fissile-fuel-producing hybrid reactor, and of the mechanical design of the machine is reviewed

  2. First-principles investigation of neutron-irradiation-induced point defects in B4C, a neutron absorber for sodium-cooled fast nuclear reactors

    Science.gov (United States)

    You, Yan; Yoshida, Katsumi; Yano, Toyohiko

    2018-05-01

    Boron carbide (B4C) is a leading candidate neutron absorber material for sodium-cooled fast nuclear reactors owing to its excellent neutron-capture capability. The formation and migration energies of the neutron-irradiation-induced defects, including vacancies, neutron-capture reaction products, and knocked-out atoms, were studied by density functional theory calculations. The vacancy-type defects tend to migrate to the C–B–C chains of B4C, which indicates that the icosahedral cage structures of B4C have strong resistance to neutron irradiation. We found that lithium and helium atoms had significantly lower migration barriers along the rhombohedral (111) plane of B4C than perpendicular to this plane. This implies that the helium and lithium interstitials tend to follow a two-dimensional diffusion regime in B4C at low temperatures, which explains the formation of flat disk-like helium bubbles experimentally observed in B4C pellets after neutron irradiation. The knocked-out atoms are considered to be annihilated by the recombination of close pairs of self-interstitials and vacancies.
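    The defect formation energies referred to in this abstract are conventionally obtained from supercell total energies. The following is the standard textbook form for a neutral defect, not an equation taken from the paper itself:

    ```latex
    % Standard supercell expression for the formation energy of a neutral defect X:
    %   E_f[X]              : defect formation energy
    %   E_\mathrm{tot}[X]   : total energy of the supercell containing the defect
    %   E_\mathrm{tot}[\mathrm{bulk}] : total energy of the perfect supercell
    %   n_i                 : number of atoms of species i added (n_i > 0) or removed (n_i < 0)
    %   \mu_i               : chemical potential of species i
    E_f[X] = E_\mathrm{tot}[X] - E_\mathrm{tot}[\mathrm{bulk}] - \sum_i n_i \mu_i
    ```

    Migration barriers are then typically obtained separately, as the energy difference between the saddle point and the stable defect configuration along the diffusion path.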

  3. ESR dating of the fault rocks

    International Nuclear Information System (INIS)

    Lee, Hee Kwon

    2005-01-01

    We carried out ESR dating of fault rocks collected near the nuclear reactor. The Upcheon fault zone is exposed close to the Ulzin nuclear reactor. The space-time pattern of fault activity on the Upcheon fault deduced from ESR dating of fault gouge can be summarised as follows: this fault zone was reactivated between fault breccia derived from Cretaceous sandstone and Tertiary volcanic sedimentary rocks about 2 Ma, 1.5 Ma and 1 Ma ago. After those movements, the Upcheon fault was reactivated between Cretaceous sandstone and the fault breccia zone about 800 ka ago. This fault zone was reactivated again between fault breccia derived from Cretaceous sandstone and Tertiary volcanic sedimentary rocks about 650 ka and after 125 ka ago. These data suggest that the long-term (200-500 k.y.) cyclic fault activity of the Upcheon fault zone continued into the Pleistocene. In the Ulzin area, ESR dates from the NW- and EW-trending faults range from 800 ka to 600 ka; NE- and EW-trending faults were reactivated between about 200 ka and 300 ka ago. On the other hand, ESR dates of the NS-trending fault are about 400 ka and 50 ka. Results of this research suggest that fault activity near the Ulzin nuclear reactor continued into the Pleistocene. One ESR date near the Youngkwang nuclear reactor is 200 ka

  4. Thermal instability observations during ramp tests in the Studsvik R2 reactor

    International Nuclear Information System (INIS)

    Roennberg, G.; Kjaer-Pedersen, N.

    1984-01-01

    A series of ramp tests on ENC-built BWR fuel from the Big Rock Point reactor was performed in September 1982 in the Studsvik R2 Reactor. The tests involved segmented rods with a burnup of 18 MWd/kgU, and constituted part of the Fuel Performance Improvement Program sponsored by the United States Department of Energy. Rods of different designs were tested. The reference design had solid, dished pellets and was unpressurized. The alternative designs were annular pellets and sphere-pac. Some of the rods with annular pellets were prepressurized, and some were not. During the ramp tests the rod power is controlled by a helium depressurization loop which produces a strictly linear power ramp versus time. The thermal output of the test rig is measured calorimetrically, the data immediately being recorded on a strip chart and later processed by a computer. Furthermore, elongation detectors permit the immediate recording of the rod length variation versus time. For some of the rods the thermal output remained constant for a fraction of a minute after reaching a certain value, then continued to rise, while the helium depressurization continued to proceed linearly with time. For the duration of this plateau in the thermal output curve the slope of the elongation detector signal was significantly higher than before, but fell back to its original value after the plateau. This observation was made only for the reference rods. None of the annular rods, with or without prepressurization, nor the sphere-pac rods, showed the effect. When observed, the effect occurred at about 40 kW/m. The effect is attributed to fission gas release rapidly being enhanced by thermal feedback. The increase in stored energy associated with the temperature rise in the fuel causes the delay in thermal output. The larger available internal volume and/or the prepressurization of the annular rods, and the lack of a distinct fuel-clad gap for the sphere-pac rods, prevented the effect from occurring in those other designs.

  5. Urban Big Data and the Development of City Intelligence

    Directory of Open Access Journals (Sweden)

    Yunhe Pan

    2016-06-01

    Full Text Available This study provides a definition for urban big data while exploring its features and its applications in China's city intelligence. The differences between city intelligence in China and the “smart city” concept in other countries are compared to highlight the unique definition and model of China's city intelligence in this paper. Furthermore, this paper examines the role of urban big data in city intelligence by showing that it not only serves as the cornerstone of this trend but also plays a core role in the diffusion of city intelligence technology and serves as an inexhaustible resource for the sustained development of city intelligence. This study also points out the challenges in shaping and developing China's urban big data. Considering the supporting and core role that urban big data plays in city intelligence, the study then expounds on the key points of urban big data, including infrastructure support, urban governance, public services, and economic and industrial development. Finally, this study points to city intelligence as an ideal policy tool for advancing the goals of China's urban development. In conclusion, it is imperative that China make full use of its unique advantages—including the nation's current state of development and resources, geographical advantages, and good human relations—in subjective and objective conditions to promote the development of city intelligence through the proper application of urban big data.

  6. Rock slope design guide.

    Science.gov (United States)

    2011-04-01

    This Manual is intended to provide guidance for the design of rock cut slopes, rockfall catchment, and rockfall controls. Recommendations presented in this manual are based on research presented in Shakoor and Admassu (2010) entitled Rock Slop...

  7. Rock Slope Design Criteria

    Science.gov (United States)

    2010-06-01

    Based on the stratigraphy and the type of slope stability problems, the flat-lying, Paleozoic age, sedimentary rocks of Ohio were divided into three design units: 1) competent rock design unit consisting of sandstones, limestones, and siltstones ...

  8. The Rock Cycle

    Science.gov (United States)

    Singh, Raman J.; Bushee, Jonathan

    1977-01-01

    Presents a rock cycle diagram suitable for use at the secondary or introductory college levels which separates rocks formed on and below the surface, includes organic materials, and separates products from processes. (SL)

  9. ESR dating of fault rocks

    International Nuclear Information System (INIS)

    Lee, Hee Kwon

    2003-02-01

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs. grain size shows a plateau for grains below critical size; these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected near the Gori nuclear reactor. Most of the ESR signals of fault rocks collected from the basement are saturated. This indicates that the last movement of the faults had occurred before the Quaternary period. However, ESR dates from the Oyong fault zone range from 370 to 310 ka. Results of this research suggest that long-term cyclic fault activity of the Oyong fault zone continued into the Pleistocene
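    The dating relation described in this abstract (age as the ratio of equivalent dose to dose rate) is the standard ESR/trapped-charge dating equation. Written out, with the conventional symbols rather than any notation from this record:

    ```latex
    % Standard ESR age equation:
    %   T       : time since the ESR signal was last zeroed (the fault movement)
    %   D_E     : equivalent dose, the laboratory dose needed to reproduce
    %             the natural ESR signal intensity
    %   \dot{D} : dose rate from ionizing radiation in the surrounding rocks
    T = \frac{D_E}{\dot{D}}
    ```

    The grain-size plateau test mentioned in the abstract checks that $D_E$ (and hence $T$) stops varying below a critical grain size, indicating those grains were fully reset at the time of faulting.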

  10. ESR dating of fault rocks

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Kwon [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2003-02-15

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs. grain size shows a plateau for grains below critical size; these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected near the Gori nuclear reactor. Most of the ESR signals of fault rocks collected from the basement are saturated. This indicates that the last movement of the faults had occurred before the Quaternary period. However, ESR dates from the Oyong fault zone range from 370 to 310 ka. Results of this research suggest that long-term cyclic fault activity of the Oyong fault zone continued into the Pleistocene.

  11. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using traditional data processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, prevent diseases, combat crime, etc., we require bigger data sets compared with smaller ones. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper there is an overview of Hadoop architecture, different tools used for big data, and its security issues.

  12. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  13. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  14. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  15. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  16. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  17. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  18. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented, and belletristic. Volume 5 RG covers the basics (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  19. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented, and belletristic. Volume 8 conveys, in an understandable way, relativity theory, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  20. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented, and belletristic. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  1. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented, and belletristic. Volume 7, besides an introduction, treats many current aspects of quantum mechanics (e.g. teleportation) and electrodynamics (e.g. electrosmog), as well as climate issues and chaos theory.

  2. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short-range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generica...

  3. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction into the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have only just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  4. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 × 10¹²/cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed into a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed into a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects the two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  5. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  6. Flow of catalyst particles in a flue gas desulfurization plant; mass transfer in the domain of a detached flow - two examples (desulfurization, HTGR type reactor) for the application of big computers solving technical problems

    International Nuclear Information System (INIS)

    Achenbach, E.

    1988-01-01

    The research work of the Institute for Reactor Components is mainly experimental in character. Where possible, the experiments are accompanied by numerical calculations. This has the advantage of rendering parameter studies faster and more economical than is the case with experiments, so that physical contexts can become more apparent. However, these calculations are no substitute for experiments. The application of numerical calculations in connection with experimental results can now be demonstrated with two examples. The examples have been selected with the aim of making the presentation of the results sufficiently interesting for all those participating at the colloquium. The theoretical and experimental results are presented in the form of short films. (orig.) [de

  7. Pore-scale analysis of electrical properties in thinly bedded rock using digital rock physics

    International Nuclear Information System (INIS)

    Sun, Jianmeng; Zhao, Jianpeng; Liu, Xuefeng; Chen, Hui; Jiang, LiMing; Zhang, JinYan

    2014-01-01

    We investigated the electrical properties of laminated rock consisting of macro-porous layers and micro-porous layers based on digital rock technology. Due to the bedding effect and anisotropy, the traditional Archie equations cannot adequately describe the electrical behavior of laminated rock. The RI-Sw curve of laminated rock shows a nonlinear relationship and can be divided into two linear segments with different saturation exponents. Laminated sand-shale sequences and laminated sands of different porosity or grain size will yield macroscopic electrical anisotropy. Numerical simulation and theoretical analysis lead to the conclusion that the electrical anisotropy coefficient of laminated rock is a strong function of water saturation. The function curve can be divided into three segments by the turning point. Therefore, the electrical behavior of laminated rock should be considered in oil exploration and development. (paper)
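    The two-segment RI-Sw behavior described in this abstract can be sketched with Archie's resistivity index RI = Sw^(-n) using a piecewise saturation exponent. The turning point and exponent values below are hypothetical illustration values, not figures from the paper:

    ```python
    def resistivity_index(sw, sw_turn=0.55, n_low=2.2, n_high=1.6):
        """Archie resistivity index RI = Sw**(-n) with a piecewise
        saturation exponent n, giving two linear segments in log-log
        space as the abstract describes for laminated rock.
        sw_turn, n_low, and n_high are illustrative values only."""
        n = n_high if sw >= sw_turn else n_low
        return sw ** (-n)

    # RI equals 1 at full water saturation and rises as Sw falls,
    # with a change of slope at the turning point.
    print(resistivity_index(1.0))  # → 1.0
    print(resistivity_index(0.3) > resistivity_index(0.6))  # → True
    ```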

  8. New reactor concepts

    International Nuclear Information System (INIS)

    Meskens, G.; Govaerts, P.; Baugnet, J.-M.; Delbrassine, A.

    1998-11-01

    The document gives a summary of new nuclear reactor concepts from a technological point of view. Belgium supports the development of the European Pressurized-Water Reactor, which is an evolutionary concept based on the European experience in Pressurized-Water Reactors. A reorientation of the Belgian choice for this evolutionary concept may be required in case that a decision is taken to burn plutonium, when the need for flexible nuclear power plants arises or when new reactor concepts can demonstrate proved benefits in terms of safety and cost

  9. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  10. Nuclear reactors

    International Nuclear Information System (INIS)

    Barre, Bertrand

    2015-10-01

    After some remarks on the nuclear fuel, on the chain reaction control, on fuel loading and unloading, this article proposes descriptions of the design, principles and operations of different types of nuclear reactors as well as comments on their presence and use in different countries: pressurized water reactors (design of the primary and secondary circuits, volume and chemistry control, backup injection circuits), boiling water reactors, heavy water reactors, graphite and boiling water reactors, graphite-gas reactors, fast breeder reactors, and fourth generation reactors (definition, fast breeding). For these last ones, six concepts are presented: sodium-cooled fast reactor, lead-cooled fast reactor, gas-cooled fast reactor, high temperature gas-cooled reactor, supercritical water-cooled reactor, and molten salt reactor

  11. Nuclear reactor PBMR and cogeneration; Reactor nuclear PBMR y cogeneracion

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez S, J. R.; Alonso V, G., E-mail: ramon.ramirez@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    In recent years the cost of nuclear reactor designs for electricity generation has increased, with current costs around 5000 USD per installed kW, so a big nuclear plant requires investments on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big reactor by dividing the power into parts and dividing the components into modules to lower production costs; one module can be built and completed before the next is begun, deferring the long-term investment and thus reducing investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, since the thermal energy of the reactor can feed other processes such as water desalination, steam generation for process industries like petrochemicals, or even hydrogen production for use as fuel. In this work the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor is described. (Author)

  12. Calculation system for physical analysis of boiling water reactors

    International Nuclear Information System (INIS)

    Bouveret, F.

    2001-01-01

    Although Boiling Water Reactors generate a quarter of worldwide nuclear electricity, they have been little studied in France. Interest in these reactors is now emerging, so the aim of the work presented here is to contribute to defining a core calculation methodology with CEA (Commissariat a l'Energie Atomique) codes. Vapour production in the reactor core entails technological options that differ greatly from those of the pressurised water reactor. We analyse the main physical phenomena for BWRs and offer solutions that take them into account. BWR fuel assembly heterogeneity causes steep thermal flux gradients. The two-dimensional collision probability method with exact boundary conditions makes it possible to calculate the flux in BWR fuel assemblies accurately using the APOLLO-2 lattice code, but it requires a very long calculation time, so we define a new methodology based on a two-level flux calculation. Void fraction variations in assemblies cause large spectral changes that must be considered in the core calculation; we suggest using a void history parameter to generate cross-section libraries for the core calculation. The core calculation code also has to compute the depletion of the main isotope concentrations. A core calculation coupling neutronics and thermal-hydraulics codes highlights points that still require study, the most important of which is taking the control blade into account at the different calculation stages. (author)

  13. Development and testing of the EDF-2 reactor fuel element; Essais et mise au point de l'element combustible pour le reacteur EDF-2

    Energy Technology Data Exchange (ETDEWEB)

    Delpeyroux, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires; Furhmann, R [Societe Industrielle de Combustible Nucleaire (France)

    1964-07-01

    This technical report reviews the work which was necessary for defining the EDF-2 fuel element. After briefly giving the EDF-2 reactor characteristics and the preliminary choice of parameters which made it possible to draw up a draft plan for the fuel element, the authors consider the research proper: - Uranium studies: tests on the passage into the β phase of an internal crown of a tube, bending of the tube under the effect of a localized force, welding of the end-pellets, and leak testing. The resistance of the tube to crushing and of the pellets to yielding under external pressure has been studied in detail in another CEA report. - Can studies: conditions of production and leak-proof testing of the can, resistance of the fins to creep under the effect of the gas flow. - Studies of the extremities of the element: creep under compression and welding of the plugs to the can. - Cartridge studies: determination of the characteristics of the can-fuel fixing grooves and of the canning conditions, verification of the resistance of the fuel element to thermal cycling, and determination of the temperature drop at the can-fuel interface, dealt with in more detail in another CEA report. - Studies of the whole assembly: this work, which concerns the graphite jacket, the support and the cartridge vibrations, was carried out by the Mechanical and Thermal Study Service (Mechanics Section). In this field the Fuel Element Study Section has investigated the behaviour of the centering devices in a gas current. The outcome of this research is the definition of the plan of the element, the production process and the production specifications. The validity of all these out-of-pile tests will be confirmed by the in-pile tests already under way and by irradiation of the elements in the EDF-2 reactor itself. In conclusion, the programme for improving the fuel element and for defining the fuel element for the second charge is given. (authors)

  14. Point Lepreau generating station

    International Nuclear Information System (INIS)

    Ganong, G.H.D.; Strang, A.E.; Gunter, G.E.; Thompson, T.S.

    Point Lepreau-1 reactor is a 600 MWe generating station expected to be in service by October 1979. New Brunswick is suffering a 'catch up' phenomenon in load growth and needs to decrease dependence on foreign oil. The site is on salt water and extensive study has gone into corrosion control. Project management, financing and scheduling have unique aspects. (E.C.B.)

  15. Reactor core cooling device

    International Nuclear Information System (INIS)

    Kobayashi, Masahiro.

    1986-01-01

    Purpose: To safely and effectively cool down the reactor core after it has been shut down but is still hot due to after-heat. Constitution: Since the coolant extraction nozzle is situated at a location higher than the coolant injection nozzle, the coolant sprayed from the injection nozzle is not sucked back immediately into the extraction nozzle and is therefore used effectively to cool the reactor core. As all portions of the reactor from top to bottom are cooled simultaneously, the efficiency of the reactor cooling process is increased. Since the coolant extraction nozzle can be installed at a point considerably higher than the coolant injection nozzle, the distance from the coolant surface to the tip of the extraction nozzle can be made large, preventing cavitation near the extraction nozzle. Therefore, without increasing the capacity of the heat exchanger, the reactor can be cooled down after a shutdown safely and efficiently. (Kawakami, Y.)

  16. Permeability Evolution and Rock Brittle Failure

    OpenAIRE

    Sun Qiang; Xue Lei; Zhu Shuyun

    2015-01-01

    This paper reports an experimental study of the evolution of permeability during rock brittle failure and a theoretical analysis of rock critical stress level. It is assumed that the rock is a strain-softening medium whose strength can be described by Weibull’s distribution. Based on the two-dimensional renormalization group theory, it is found that the stress level λ c (the ratio of the stress at the critical point to the peak stress) depends mainly on the homogeneity index or shape paramete...

  17. Radon and rock bursts in deep mines

    International Nuclear Information System (INIS)

    Bulashevich, Yu.P.; Utkin, V.I.; Yurkov, A.K.; Nikolaev, V.V.

    1996-01-01

    Time variations of the radon concentration field were studied to ascertain the stress-strain state of the North Ural bauxite mines. It is shown that dynamic changes in the stress-strain state of the rocks prior to a rock burst bring about variations in radon concentration in the observation wells. Depending on the mutual positioning of the observation points and the rock burst epicenter, these variations differ fundamentally: a reduction of radon concentration is observed in the near zone and an increase in the far zone [ru

  18. Application of rock mechanics in opencast mining

    Energy Technology Data Exchange (ETDEWEB)

    Desurmont, M; Feuga, B

    1979-07-01

    The significance of opencast mining in the world today is mentioned. With the exception of coal, opencast workings provide approximately 80% of output. The importance of opencast has continued to increase over the last ten years. Access to the mineral usually necessitates the removal of large quantities of rock. The aim is to reduce the quantity of the latter as much as possible in order to minimize the dirt/mineral ratio. For this purpose use has been made of the operating techniques of rock mechanics in order to determine the optimum dimensions of the access trench compatible with safety requirements. The author illustrates this technique by means of three examples: the Luzenac talc workings, the Mont-Roc fluorine workings and the Big Hole at Kimberley.

  19. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Phosphine from rocks: mechanically driven phosphate reduction?

    Science.gov (United States)

    Glindemann, Dietmar; Edwards, Marc; Morgenstern, Peter

    2005-11-01

    Natural rock and mineral samples released trace amounts of phosphine during dissolution in mineral acid. An order of magnitude more phosphine (average 1982 ng PH3/kg rock, maximum 6673 ng PH3/kg rock) is released from pulverized rock samples (basalt, gneiss, granite, clay, quartzitic pebbles, or marble). Phosphine release was correlated with the hardness and mechanical pulverization energy of the rocks. The yield of PH3 ranged from 0 to 0.01% of the total P content of the dissolved rock. Strong circumstantial evidence was gathered for reduction of phosphate in the rock via mechanochemical or "tribochemical" weathering at quartz and calcite/marble inclusions. Artificial reproduction of this mechanism, by rubbing quartz rods coated with apatite phosphate to the point of visible triboluminescence, led to the detection of more than 70 000 ng/kg PH3 in the apatite. This reaction pathway may be considered a mechanochemical analogue of phosphate reduction by lightning or electrical discharges and may contribute to phosphine production via tectonic forces and the processing of rocks.

  1. Measuring Public Acceptance of Nuclear Technology with Big data

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seugkook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying the cheap electricity they produce, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool for measuring the public acceptance of nuclear technology efficiently (i.e., saving the cost, time, and effort of measurement and analysis), and this research provides a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows target marketing when policy is executed.

  2. Measuring Public Acceptance of Nuclear Technology with Big data

    International Nuclear Information System (INIS)

    Roh, Seugkook

    2015-01-01

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying the cheap electricity they produce, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool for measuring the public acceptance of nuclear technology efficiently (i.e., saving the cost, time, and effort of measurement and analysis), and this research provides a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows target marketing when policy is executed

  3. Nuclear reactor PBMR and cogeneration

    International Nuclear Information System (INIS)

    Ramirez S, J. R.; Alonso V, G.

    2013-10-01

    In recent years the cost of nuclear reactor designs for electricity generation has increased, with current costs around 5000 USD per installed kW, so a big nuclear plant requires investments on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big reactor by dividing the power into parts and dividing the components into modules to lower production costs; one module can be built and completed before the next is begun, deferring the long-term investment and thus reducing investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, since the thermal energy of the reactor can feed other processes such as water desalination, steam generation for process industries like petrochemicals, or even hydrogen production for use as fuel. In this work the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor is described. (Author)

  4. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education as well as critically explore the perils of applying a data driven approach to education. Despite the claimed value of the...

  5. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  6. Big Data as a Source for Official Statistics

    Directory of Open Access Journals (Sweden)

    Daas Piet J.H.

    2015-06-01

    Full Text Available More and more data are being produced by an increasing number of electronic devices physically surrounding us and on the internet. The large amount of data and the high frequency at which they are produced have resulted in the introduction of the term ‘Big Data’. Because these data reflect many different aspects of our daily lives and because of their abundance and availability, Big Data sources are very interesting from an official statistics point of view. This article discusses the exploration of both opportunities and challenges for official statistics associated with the application of Big Data. Experiences gained with analyses of large amounts of Dutch traffic loop detection records and Dutch social media messages are described to illustrate the topics characteristic of the statistical analysis and use of Big Data.

  7. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences, and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Also, the methodology and working of a system that will use this data are briefly described.

  8. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data is data that exceeds both the storage capacity and the processing power of conventional systems. The term is used for data sets so large or complex that traditional tools cannot handle them. Big data size is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; on social networking sites, for example, the amount of data produced by people grows rapidly every year. Big data is not only data; it has become a complete subject encompassing various tools, techniques, and frameworks. It covers the explosive growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from large datasets that are diverse, complex, and of massive scale. Such datasets are difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. A big data environment is used to acquire, organize, and analyze the various types of data. In this paper we describe applications, problems, and tools of big data and give an overview of the field.

  9. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, the relevant evidence is examined, including discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, aspects of the general theory of relativity such as the change of the gravitational constant with time, and quantum theory considerations. It is argued that these considerations show that the big-bang picture is not as soundly established, either theoretically or observationally, as is usually claimed, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  10. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through the online learning environment of the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm their ideas and solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
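    The geometry the students reproduce can be sketched in a few lines of Python. The shadow-angle difference between two sites on the same meridian equals the central angle between them, so the full circle scales up proportionally; the figures in the comment are the classic round-number values attributed to Eratosthenes, used purely for illustration:

    ```python
    def circumference_from_shadows(angle_a_deg, angle_b_deg, distance_km):
        """Eratosthenes' method: the difference between the noon shadow
        angles at two sites on the same meridian subtends the same
        central angle at Earth's centre, so
        circumference = 360 / delta_angle * north-south distance."""
        delta = abs(angle_a_deg - angle_b_deg)
        return 360.0 / delta * distance_km

    # Classic round-number figures: ~7.2 deg shadow angle at Alexandria,
    # ~0 deg at Syene, ~800 km apart, giving ~40,000 km.
    print(circumference_from_shadows(7.2, 0.0, 800))  # → 40000.0
    ```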

  11. [Three applications and the challenge of the big data in otology].

    Science.gov (United States)

    Lei, Guanxiong; Li, Jianan; Shen, Weidong; Yang, Shiming

    2016-03-01

With the expansion of human practical activities, more and more areas are confronted with big data problems. The emergence of big data requires people to update the research paradigm and develop new technical methods. This review discusses how big data might bring opportunities and challenges in the areas of auditory implantation, the deafness genome, and auditory pathophysiology, and points out that we need to find appropriate theories and methods to turn this expectation into reality.

  12. The big data phenomenon: The business and public impact

    Directory of Open Access Journals (Sweden)

    Chroneos-Krasavac Biljana

    2016-01-01

Full Text Available The subject of the research in this paper is the emergence of the big data phenomenon and the application of big data technologies for business needs, with specific emphasis on marketing and trade. The purpose of the research is to make a comprehensive overview of the different discussions about the characteristics, application possibilities, achievements, constraints and future of big data development. Based on the relevant literature, the concept of big data is presented and the potential for a large impact of big data on business activities is discussed. One of the key findings indicates that the most prominent change that big data brings to the business arena is the appearance of new business models, as well as revisions of existing ones. A substantial part of the paper is devoted to marketing and marketing research, which are under the strong impact of big data. The most exciting outcomes of the research in this domain concern the new abilities in profiling customers. In addition to the vast amount of structured data which has been used in marketing for a long time, big data initiatives suggest the inclusion of semi-structured and unstructured data, opening up room for substantial improvements in customer profile analysis. Considering the usage of information communication technologies (ICT) as a prerequisite for big data project success, the concept of the Networked Readiness Index (NRI) is presented and the position of Serbia and the regional countries in the NRI framework is analyzed. The main outcome of the analysis points out that Serbia, with its NRI score, took the lowest position in the region, excluding Albania. Also, Serbia is lagging behind the appropriate EU mean values regarding all observed composite indicators - pillars. Further on, this analysis reveals the domains of ICT usage in Serbia which could be targeted for improvement and where incentives can be made. These domains are: political and regulatory environment, business and

  13. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  14. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  15. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

Full Text Available The amount of data that travels across the internet today is not only large but also complex. Companies, institutions, healthcare systems, etc. all use piles of data, which are further used to create reports in order to ensure continuity of the services they offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  16. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than at the data itself, realizing that there may be only a few generic processes involved, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  17. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  18. The source rock characters of U-rich granite

    Energy Technology Data Exchange (ETDEWEB)

    Mingyue, Feng; Debao, He [CNNC Key Laboratory of Uranium Resources Exploration and Evaluation Technology, Beijing Research Institute of Uranium Geology (China)

    2012-03-15

This paper discusses the stratum composition, lithological association, uranium content of the crust, and the activation, migration and concentration of uranium during each tectonic cycle in South China. The authors point out that the source rock of U-rich granite is U-rich continental crust rich in Si, Al and K. The lithological association is mainly composed of a terrestrial clastic formation of mudstone and sandstone, mingled with intermediate-acidic and mafic pyroclastic rocks and a carbonate formation. During tectonic movements, the rocks underwent regional metamorphism, migmatitization and granitization, finally forming the U-rich granites. (authors)

  19. The source rock characters of U-rich granite

    International Nuclear Information System (INIS)

    Feng Mingyue; He Debao

    2012-01-01

This paper discusses the stratum composition, lithological association, uranium content of the crust, and the activation, migration and concentration of uranium during each tectonic cycle in South China. The authors point out that the source rock of U-rich granite is U-rich continental crust rich in Si, Al and K. The lithological association is mainly composed of a terrestrial clastic formation of mudstone and sandstone, mingled with intermediate-acidic and mafic pyroclastic rocks and a carbonate formation. During tectonic movements, the rocks underwent regional metamorphism, migmatitization and granitization, finally forming the U-rich granites. (authors)

  20. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  1. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  2. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  3. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, ³He, ⁴He, and ⁷Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and ⁴He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, ⁷Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed.

  4. Critical point predication device

    International Nuclear Information System (INIS)

    Matsumura, Kazuhiko; Kariyama, Koji.

    1996-01-01

Predicting the critical point with the existing inverse multiplication method has been a complicated operation, and since the effective multiplication factor could not be plotted directly, the accuracy of the prediction was degraded. The present invention comprises a detector-counting memory section that stores the counts sent from a power detector monitoring the reactor power, an inverse-multiplication-factor calculation section that calculates the inverse multiplication factor from the initial and current counts of the power detector, and a critical-point prediction section that predicts criticality by the inverse multiplication method relative to effective multiplication factors determined in advance for the relevant reactor core states. In addition, a reactor-core characteristic calculation section is added for analyzing the effective multiplication factor depending on the state of the reactor core. Then, if the margin to criticality falls below a predetermined value during the approach to critical, an alarm is generated and the operation is stopped when a reactor period exceeding a predetermined value is predicted for the succeeding critical operation. With these procedures, the critical point can be predicted easily during the approach to critical, greatly reducing the operator's burden and improving the handling of the operation. (N.H.)
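The inverse multiplication method referred to above is the standard 1/M approach-to-critical technique: 1/M is approximated by the ratio of the source-only detector count to the current count, and the critical point is where 1/M extrapolates to zero. A minimal sketch under those standard assumptions (the function names and the count data are hypothetical, not from the patent):

```python
def inverse_multiplication(initial_count, current_count):
    # 1/M: ratio of the source-only (initial) detector count to the current count.
    return initial_count / current_count

def predict_critical_point(x, inv_m):
    # Linearly extrapolate the last two (x, 1/M) points to 1/M = 0,
    # where x is the control parameter (e.g. fuel loading step).
    (x0, y0), (x1, y1) = (x[-2], inv_m[-2]), (x[-1], inv_m[-1])
    slope = (y1 - y0) / (x1 - x0)
    return x1 - y1 / slope

# Hypothetical counts recorded at successive loading steps.
counts = [100.0, 125.0, 200.0, 500.0]
loading = [0.0, 1.0, 2.0, 3.0]
inv_m = [inverse_multiplication(counts[0], c) for c in counts]
print(predict_critical_point(loading, inv_m))  # ≈ 3.67: criticality expected near step 3.67
```

In practice the extrapolation is repeated after every step, and conservative operation requires stopping well before the predicted zero crossing, which is the margin check the invention automates.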

  5. H Reactor

    Data.gov (United States)

Federal Laboratory Consortium — The H Reactor was the first reactor to be built at Hanford after World War II. It became operational in October of 1949, and represented the fourth nuclear reactor on...

  6. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data are leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  7. The Concept of the Use of the Marine Reactor Plant in Small Electric Grids

    International Nuclear Information System (INIS)

    Khlopkin, N.; Makarov, V.; Pologikh, B.

    2002-01-01

This report considers some aspects of using marine nuclear reactors to meet the needs of small, non-interconnected power systems, as well as of separate settlements and mining enterprises located in regions with an undeveloped infrastructure. Recently, small modular nuclear power plants have been proposed for these purposes. The plant power required for small electric grids lies between 1 and several tens of MWe. A module can be assembled and tested at a machine-building plant and then delivered ready-made to its working place by some means of transport, for instance a barge. After a certain time the module can be transported to a repair shop, and at the end of its operation to a storage site. Marine nuclear reactors, by their power, compactness, mass and size, are ideal prototypes for such modules. For instance, the floating power unit currently under construction, intended for operation in the Russian North, is based on the reactor plants of nuclear icebreakers. The reliability and safety of ship reactors are confirmed by their trouble-free operation over approximately 180 reactor-years. Unlike a big stationary nuclear plant working in base-load mode, a power unit with a marine reactor is fully capable of working in load-following mode. In contrast with a nuclear icebreaker reactor, it is advisable to increase the core lifetime and to reduce the enrichment of the uranium. This requires fuel compositions with a higher uranium capacity and a corresponding core design; in particular, a transition from the channel-type core traditional for ship reactors to a cassette design is possible. Other directions of evolution of ship reactors are possible that do not touch the basic, practice-proven constructive decisions but promote the self-protection properties of the plant; among them is a reduction of the volumetric power density of the core. (author)

  8. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and to search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  9. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

Big data has not only led to challenging technical questions; it is also accompanied by a variety of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  10. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

Agriculture's external conditions and competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized and more concentrated. Big business will be a dominant development in Danish agriculture - but not the only one...

  11. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information and used to make 'smart' decisions.

  12. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang, inaugurated at a major public event. Poster and Programme.

  13. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  14. Radon exhalation from granitic rocks

    Energy Technology Data Exchange (ETDEWEB)

    Del Claro, Flávia; Paschuk, Sergei A.; Corrêa, Janine N.; Mazer, Wellington; Narloch, Danielle Cristine; Martin, Aline Cristina [Universidade Tecnológica Federal do Paraná (UTFPR), Curitiba, PR (Brazil); Denyak, Valeriy, E-mail: flaviadelclaro@gmail.com, E-mail: spaschuk@gmail.com, E-mail: janine_nicolosi@hotmail.com, E-mail: denyak@gmail.com [Instituto de Pesquisa Pelé Pequeno Príncipe (IPPP), Curitiba, PR (Brazil)

    2017-07-01

Naturally occurring radionuclides such as radon ({sup 222}Rn), its decay products and other elements from the radioactive series of uranium ({sup 238}U and {sup 235}U) and thorium ({sup 232}Th) are an important source of human exposure to natural radioactivity. The worldwide evaluation of health radiobiological effects and risks from population exposure to natural radionuclides is a growing concern. About 50% of the personal annual radiation dose is related to radionuclides such as radon ({sup 222}Rn), thoron ({sup 220}Rn), radium ({sup 226}Ra), thorium ({sup 232}Th) and potassium ({sup 40}K), which are present in modern materials commonly used in the construction of dwellings and buildings. The radioactivity of marbles and granites is of great concern, since under certain conditions the radioactivity levels of these materials can be hazardous to the population and require the implementation of mitigation procedures. The present survey of the {sup 222}Rn and {sup 220}Rn activity concentrations liberated in the air was performed using Brazilian granite rocks commercialized on the national market as well as exported to other countries. The {sup 222}Rn and {sup 220}Rn measurements were performed using the AlphaGUARD instant monitor and the RAD7 detector, respectively. This study was performed at the Applied Nuclear Physics Laboratory of the Federal University of Technology - Paraná (UTFPR). The obtained radon activity concentrations in the air exhaled by the studied granite samples varied from 3±1 Bq/m{sup 3} to 2087±19 Bq/m{sup 3}, which shows that some samples of granitic rocks represent a rather elevated health risk to the population. (author)

  15. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges), which ultimately seems to be of greater importance than the sheer data volume.

  16. To What Extent Can the Big Five and Learning Styles Predict Academic Achievement

    Science.gov (United States)

    Köseoglu, Yaman

    2016-01-01

    Personality traits and learning styles play defining roles in shaping academic achievement. 202 university students completed the Big Five personality traits questionnaire and the Inventory of Learning Processes Scale and self-reported their grade point averages. Conscientiousness and agreeableness, two of the Big Five personality traits, related…

  17. Big data analytics for mitigating carbon emissions in smart cities : opportunities and challenges

    NARCIS (Netherlands)

    Giest, S.N.

    2017-01-01

The paper addresses the growing scepticism around big data use in the context of smart cities. Big data is said to make city governments more efficient, effective and evidence-based. However, critics point towards the limited capacity of government to overcome the siloed structure of

  18. A Review of Rock Bolt Monitoring Using Smart Sensors

    Directory of Open Access Journals (Sweden)

    Gangbing Song

    2017-04-01

    Full Text Available Rock bolts have been widely used as rock reinforcing members in underground coal mine roadways and tunnels. Failures of rock bolts occur as a result of overloading, corrosion, seismic burst and bad grouting, leading to catastrophic economic and personnel losses. Monitoring the health condition of the rock bolts plays an important role in ensuring the safe operation of underground mines. This work presents a brief introduction on the types of rock bolts followed by a comprehensive review of rock bolt monitoring using smart sensors. Smart sensors that are used to assess rock bolt integrity are reviewed to provide a firm perception of the application of smart sensors for enhanced performance and reliability of rock bolts. The most widely used smart sensors for rock bolt monitoring are the piezoelectric sensors and the fiber optic sensors. The methodologies and principles of these smart sensors are reviewed from the point of view of rock bolt integrity monitoring. The applications of smart sensors in monitoring the critical status of rock bolts, such as the axial force, corrosion occurrence, grout quality and resin delamination, are highlighted. In addition, several prototypes or commercially available smart rock bolt devices are also introduced.

  19. A Review of Rock Bolt Monitoring Using Smart Sensors.

    Science.gov (United States)

    Song, Gangbing; Li, Weijie; Wang, Bo; Ho, Siu Chun Michael

    2017-04-05

    Rock bolts have been widely used as rock reinforcing members in underground coal mine roadways and tunnels. Failures of rock bolts occur as a result of overloading, corrosion, seismic burst and bad grouting, leading to catastrophic economic and personnel losses. Monitoring the health condition of the rock bolts plays an important role in ensuring the safe operation of underground mines. This work presents a brief introduction on the types of rock bolts followed by a comprehensive review of rock bolt monitoring using smart sensors. Smart sensors that are used to assess rock bolt integrity are reviewed to provide a firm perception of the application of smart sensors for enhanced performance and reliability of rock bolts. The most widely used smart sensors for rock bolt monitoring are the piezoelectric sensors and the fiber optic sensors. The methodologies and principles of these smart sensors are reviewed from the point of view of rock bolt integrity monitoring. The applications of smart sensors in monitoring the critical status of rock bolts, such as the axial force, corrosion occurrence, grout quality and resin delamination, are highlighted. In addition, several prototypes or commercially available smart rock bolt devices are also introduced.

  20. The big data telescope

    International Nuclear Information System (INIS)

    Finkel, Elizabeth

    2017-01-01

    On a flat, red mulga plain in the outback of Western Australia, preparations are under way to build the most audacious telescope astronomers have ever dreamed of - the Square Kilometre Array (SKA). Next-generation telescopes usually aim to double the performance of their predecessors. The Australian arm of SKA will deliver a 168-fold leap on the best technology available today, to show us the universe as never before. It will tune into signals emitted just a million years after the Big Bang, when the universe was a sea of hydrogen gas, slowly percolating with the first galaxies. Their starlight illuminated the fledgling universe in what is referred to as the “cosmic dawn”.

  1. The Big Optical Array

    International Nuclear Information System (INIS)

    Mozurkewich, D.; Johnston, K.J.; Simon, R.S.

    1990-01-01

This paper describes the design and capabilities of the Naval Research Laboratory Big Optical Array (BOA), an interferometric optical array for high-resolution imaging of stars, stellar systems, and other celestial objects. There are four important differences between the BOA design and the design of the Mark III Optical Interferometer on Mount Wilson (California): a long passive delay line will be used in BOA to do most of the delay compensation, so that the fast delay line will have a very short travel; beam combination in BOA will be done in triplets, to allow measurement of the closure phase; the same light will be used for both star and fringe tracking; and the fringe tracker will use several wavelength channels.
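The reason triplet beam combination matters is the closure phase: summing the measured phases around a triangle of telescopes cancels any phase error that is attached to an individual telescope (atmospheric piston, instrumental drift), leaving an observable that depends only on the source. A minimal numerical sketch of this cancellation, with invented phase values:

```python
import math
import random

def closure_phase(phi_12, phi_23, phi_31):
    # Sum of the measured phases around a closed telescope triangle.
    return phi_12 + phi_23 + phi_31

# Hypothetical true (atmosphere-free) phases on the three baselines.
true_12, true_23, true_31 = 0.4, -1.1, 0.9

# Independent atmospheric/instrumental phase errors at each telescope.
random.seed(0)
e1, e2, e3 = (random.uniform(-math.pi, math.pi) for _ in range(3))

# A telescope-based error e_i corrupts every baseline phase i->j as e_i - e_j.
obs_12 = true_12 + e1 - e2
obs_23 = true_23 + e2 - e3
obs_31 = true_31 + e3 - e1

# The per-telescope errors cancel in the closure sum, leaving the true value.
print(closure_phase(obs_12, obs_23, obs_31))  # ≈ 0.2 == 0.4 - 1.1 + 0.9
```

With N telescopes there are (N-1)(N-2)/2 independent closure phases, which is why arrays that combine beams in triplets can recover image phase information despite uncalibrated atmospheric errors.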

  2. Nonstandard big bang models

    International Nuclear Information System (INIS)

    Calvao, M.O.; Lima, J.A.S.

    1989-01-01

The usual FRW hot big-bang cosmologies have been generalized by considering the equation of state ρ = Anm + (γ-1)⁻¹p, where m is the rest mass of the fluid particles and A is a dimensionless constant. Explicit analytic solutions are given for the flat case (ε=0). For large cosmological times these extended models behave as the standard Einstein-de Sitter universes regardless of the values of A and γ. Unlike the usual flat FRW case, the deceleration parameter q is a time-dependent function, and its present value, q ≅ 1, obtained from the luminosity distance versus redshift relation, may be fitted by taking, for instance, A=1 and γ = 5/3 (monatomic relativistic gas with >> k_B T). In all cases the universe cools obeying the same temperature law as the FRW models, and it is shown that the age of the universe is only slightly modified. (author) [pt

  3. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  4. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  5. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  6. DPF Big One

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark

  7. DPF Big One

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1993-01-15

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  8. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  9. Exploring Data in Human Resources Big Data

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2016-01-01

    Nowadays, social networks and information technologies and infrastructures are constantly developing and affecting each other. In this context, the HR recruitment process has become complex, and many multinational organizations have encountered selection issues. The objective of the paper is to develop a prototype system for assisting the selection of candidates for intelligent management of human resources. Such a system can be a starting point for the efficient organization of semi-structured and unstructured data on recruitment activities. The article extends the research presented at the 14th International Conference on Informatics in Economy (IE 2015) in the scientific paper "Big Data challenges for human resources management".

  10. Point cloud data management (extended abstract)

    NARCIS (Netherlands)

    Van Oosterom, P.J.M.; Ravada, S.; Horhammer, M.; Martinez Rubi, O.; Ivanova, M.; Kodde, M.; Tijssen, T.P.M.

    2014-01-01

    Point cloud data are important sources for 3D geo-information. The point cloud data sets are growing in popularity and in size. Modern Big Data acquisition and processing technologies, such as laser scanning from airborne, mobile, or static platforms, dense image matching from photos, multi-beam

  11. Roles of plasma neutron source reactor in development of fusion reactor engineering: Comparison with fission reactor engineering

    International Nuclear Information System (INIS)

    Hirayama, Shoichi; Kawabe, Takaya

    1995-01-01

    The history of fusion power reactor development has come to a turning point, where the main research target is shifting from plasma heating and confinement physics toward burning plasma physics and reactor engineering. Although developing a fusion reactor system is a first for human beings, engineers have experience in developing fission power reactors. The common feature between them is that both are plants that generate nuclear reactions to produce energy, nucleons, and radiation on an industrial scale. Studying the history of fission reactor development reveals the existence of experimental neutron reactors, including irradiation facilities for fission reactor materials. These research neutron reactors played very important roles in the development of fission power reactors. When one considers the strategy for developing fusion power reactors from the standpoint of fusion reactor engineering, one finds that the fusion neutron source corresponds to the research neutron reactor in fission reactor development. In this paper, the authors discuss the roles of plasma-based neutron source reactors in the development of fusion reactor engineering by comparing them with the neutron reactors in the history of fission power development, and make proposals for a fusion reactor development strategy. 21 refs., 6 figs

  12. Rock Cycle Roulette.

    Science.gov (United States)

    Schmidt, Stan M.; Palmer, Courtney

    2000-01-01

    Introduces an activity on the rock cycle. Sets 11 stages representing the transitions of an earth material in the rock cycle. Builds six-sided die for each station, and students move to the stations depending on the rolling side of the die. Evaluates students by discussing several questions in the classroom. Provides instructional information for…

  13. Rock engineering in Finland

    Energy Technology Data Exchange (ETDEWEB)

    1986-01-01

    Contains a large collection of short articles concerned with tunnels and underground caverns and their construction and use. The articles are grouped under the following headings: use of the subsurface space; water supply; waste water services; energy management (includes articles on power stations, district heating and oil storage and an article on coal storage); multipurpose tunnels; waste disposal; transport; shelters; sporting and recreational amenities located in rock caverns; storage facilities; industrial, laboratory, and service facilities; rock foundations; tourism and culture; utilization of rock masses; research on the disposal of nuclear waste; training and research in the field of rock engineering; site investigation techniques; design of structures in rock; construction; the environment and occupational safety; modern equipment technology; underground space in Helsinki.

  14. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  15. Oscillation characteristics of the reactor 'A'; Oscilatorne karakteristike reaktora 'A'

    Energy Technology Data Exchange (ETDEWEB)

    Zecevic, V; Lolic, B [The Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Yugoslavia)

    1961-07-01

    In addition to good knowledge of the reactor's physical properties, design of the reactor oscillator demands determining the oscillator operating points as well as the reactor's oscillation properties. This paper studies the RA reactor power changes due to oscillations in one of the vertical experimental channels. It is concluded that optimum reactor operating conditions are attained when the oscillator operates at its optimum points, with the other parameters determined by the sensitivity of the method and reactor stability.

  16. Rock-fall Hazard In The Yosemite Valley, California

    Science.gov (United States)

    Guzzetti, F.; Reichenbach, P.; Wieczorek, G. F.

    Rock slides and rock falls are the most frequent slope movements in Yosemite National Park, California. In historical time (1851-2001), more than 400 rock falls and rock slides have been documented in the valley, and some of them have been mapped in detail. We present the preliminary results of an attempt to assess rock-fall hazard in the Yosemite Valley using STONE, a 3-dimensional rock-fall simulation computer program. The software computes 3-dimensional rock-fall trajectories starting from a digital terrain model (DTM), the location of rock-fall release points (source areas), and maps of the dynamic rolling coefficient and of the coefficients of normal and tangential energy restitution. For each DTM cell the software also calculates the number of rock falls passing through the cell, the maximum rock-fall velocity and the maximum flying height. For the Yosemite Valley, a DTM with a ground resolution of 10 x 10 m was prepared using topographic contour lines from USGS 1:24,000-scale maps. Rock-fall release points were identified as DTM cells having a slope steeper than 60 degrees, an assumption based on the location of historical rock falls. Maps of the normal and tangential energy restitution coefficients and of the rolling friction coefficient were produced from a surficial geologic map. The availability of historical rock falls mapped in detail allowed us to check the computer program performance and to calibrate the model parameters. Visual and statistical comparison of the model results with the mapped rock falls confirmed the accuracy of the model. The model results are also compared with a geomorphic assessment of rock-fall hazard based on potential energy, referred to as a "shadow angle" approach, recently completed for the Yosemite Valley.
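STONE itself is not shown in the record, but the release-point criterion it describes (DTM cells with slope steeper than 60 degrees) is easy to sketch. A minimal NumPy version, assuming a regular elevation grid; the function name and defaults are illustrative, not STONE's API:

```python
import numpy as np

def rockfall_source_cells(dtm, cell_size=10.0, slope_threshold_deg=60.0):
    """Flag DTM cells steeper than a threshold as rock-fall release points.

    dtm: 2-D array of elevations in metres; cell_size: grid spacing in
    metres (10 m matches the Yosemite DTM described above). Returns a
    boolean mask of candidate source cells.
    """
    # Per-axis elevation gradients, scaled by the grid spacing.
    dz_dy, dz_dx = np.gradient(dtm, cell_size)
    # Slope angle of the steepest-descent direction at each cell.
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return slope_deg > slope_threshold_deg
```

Because the gradient is divided by the grid spacing, the threshold applies to the true slope angle rather than to raw per-cell elevation differences.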

  17. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  18. Reactor safety protection system

    International Nuclear Information System (INIS)

    Nishi, Hiroshi; Yokoyama, Tsuguo.

    1989-01-01

    A plurality of neutron detectors are disposed around the reactor core, and the detection signals from any two neutron detectors are input to a ratio calculation device. If the ratio between the two neutron flux level signals exceeds a predetermined value, a reactor trip signal is generated by an alarm setting device. Further, the detection signals from all of the neutron detectors are input to an averaging device, and the reactor trip signal is also generated when the average value exceeds a predetermined set value. That is, when the reactor core power increases locally, the detection signal from a neutron detector near the point of power increase rises faster than the overall reactor core power, while the detection signal from a detector remote from that point rises more slowly. Thus, a local power increase in the FBR reactor core can be detected efficiently by calculating the ratio of the neutron flux level signals from two detectors, enabling the local power increase in the reactor core to be recognized exactly. (N.H.)
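The record describes two trip criteria: a pairwise ratio check for local power rises and an average check for overall power rises. A minimal sketch of that logic, where the setpoint values, the chosen detector pair, and the order-independent ratio convention are illustrative assumptions rather than the patented design:

```python
def reactor_trip(levels, pair=(0, 1), ratio_limit=1.5, avg_limit=100.0):
    """Return True if a reactor trip signal should be generated.

    levels: neutron-flux level signals from the detectors around the core.
    Trips when the ratio between the two signals of the chosen detector
    pair exceeds ratio_limit (a local power rise), or when the average of
    all signals exceeds avg_limit (an overall power rise).
    """
    a, b = levels[pair[0]], levels[pair[1]]
    ratio = max(a, b) / min(a, b)        # order-independent pair ratio
    average = sum(levels) / len(levels)
    return ratio > ratio_limit or average > avg_limit
```

In a real protection system every detector pair (or every adjacent pair) would be monitored in parallel; a single pair is shown here only to keep the two criteria visible.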

  19. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  20. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    Del Barrio et al. (2004) attempted more than 10 years ago to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of the neuroticism factor. They found correlations with empathy in the areas of openness, agreeableness, conscientiousness and extraversion. In our data, women have significantly higher values on both empathy and the Big Five...

  1. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  2. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    The article analyzes 'Big Data', which has been discussed over the last 10 years. The reasons for and factors behind the issue are identified. It is shown that the factors creating the 'Big Data' issue have existed for quite a long time and have, from time to time, caused informational barriers. Such barriers were successfully overcome through science and technology. The analysis classifies the 'Big Data' issue as a form of information barrier. As such, it can be solved correctly, and it encourages the development of scientific and computational methods.

  3. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  4. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  5. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  6. Big bang in a universe with infinite extension

    Energy Technology Data Exchange (ETDEWEB)

    Groen, Oeyvind [Oslo College, Department of Engineering, PO Box 4, St Olavs Pl, 0130 Oslo (Norway); Institute of Physics, University of Oslo, PO Box 1048 Blindern, 0316 Oslo (Norway)

    2006-05-01

    How can a universe coming from a point-like big bang event have infinite spatial extension? It is shown that the relativity of simultaneity is essential in answering this question. Space is finite as defined by the simultaneity of one observer, but it may be infinite as defined by the simultaneity of all the clocks participating in the Hubble flow.

  7. Big bang in a universe with infinite extension

    International Nuclear Information System (INIS)

    Groen, Oeyvind

    2006-01-01

    How can a universe coming from a point-like big bang event have infinite spatial extension? It is shown that the relativity of simultaneity is essential in answering this question. Space is finite as defined by the simultaneity of one observer, but it may be infinite as defined by the simultaneity of all the clocks participating in the Hubble flow

  8. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  9. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  10. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on basis of existing knowledge and experience with such types of estimation problems. 
The second task requires the computing power: explore the hyperspace to

  11. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

    Business-to-business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just

  12. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

    Business-to-business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples.
All in

  13. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  14. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  15. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  16. Evaluation of seismic stability of nuclear power plants on weathered soft rocks

    International Nuclear Information System (INIS)

    Ogata, Nobuhide; Nishi, Koichi; Honsho, Shizumitsu

    1991-01-01

    Soft rocks, such as weathered rocks or weakly cemented sedimentary rocks, are widespread across the country. If nuclear power plants can be constructed on such soft rocks, more sites will become available for them. The investigation covered the following research items: (1) geological survey and the application of test methods to soft rocks; (2) methods and application of laboratory and in-situ tests on soft rocks; (3) response analysis of a reactor building and foundation ground during an earthquake; (4) stability analysis of soft rock ground as the foundation of a nuclear power plant, with regard to both earthquakes and long-term settlement. From the results of the investigation, it became evident that the seismic stability of a nuclear power plant on weathered soft rocks can be adequately assured. (author)

  17. Eos Chaos Rocks

    Science.gov (United States)

    2006-01-01

    11 January 2006 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows light-toned, layered rock outcrops in Eos Chaos, located near the east end of the Valles Marineris trough system. The outcrops occur in the form of a distinct, circular butte (upper half of image) and a high slope (lower half of image). The rocks might be sedimentary rocks, similar to those found elsewhere exposed in the Valles Marineris system and the chaotic terrain to the east of the region. Location near: 12.9°S, 49.5°W Image width: 3 km (1.9 mi) Illumination from: lower left Season: Southern Summer

  18. ESR dating of the fault rocks

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Kwon [Kangwon National Univ., Chuncheon (Korea, Republic of)]

    2004-01-15

    Past movement on faults can be dated by measuring the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from the surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs. grain size shows a plateau for grains below a critical size: these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected near the Ulzin nuclear reactor. ESR signals of quartz grains separated from fault rocks collected from the E-W trend fault are saturated. This indicates that the last movement of these faults occurred before the Quaternary period. ESR dates from the NW trend faults range from 300 ka to 700 ka. On the other hand, the ESR date of the NS trend fault is about 50 ka. The results of this research suggest that long-term cyclic fault activity near the Ulzin nuclear reactor continued into the Pleistocene.
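    The age relation described in this abstract (age = equivalent dose / dose rate, with a plateau over the finest grain fractions) can be sketched as follows; the numeric values and the plateau-detection helper are illustrative assumptions, not data from the study:

    ```python
    # ESR age (ka) = equivalent dose (Gy) / dose rate (Gy/ka).
    # All numbers below are hypothetical, for illustration only.

    def esr_age_ka(equivalent_dose_gy, dose_rate_gy_per_ka):
        """Return the ESR age in thousands of years (ka)."""
        return equivalent_dose_gy / dose_rate_gy_per_ka

    def plateau_age(ages_by_grain_size, tolerance=0.05):
        """Check for an age plateau over the finest grain fractions.

        ages_by_grain_size: list of (grain_size_um, age_ka) tuples,
        ordered coarse to fine. Returns the mean age of the three finest
        fractions if they agree within the tolerance (i.e. a plateau,
        implying complete resetting by the last fault movement), else None.
        """
        fine = [age for _, age in ages_by_grain_size[-3:]]
        mean = sum(fine) / len(fine)
        if all(abs(a - mean) / mean <= tolerance for a in fine):
            return mean
        return None  # no plateau: incomplete resetting of coarser grains

    age = esr_age_ka(150.0, 3.0)  # 150 Gy / 3 Gy/ka = 50 ka
    ```

    The plateau test mirrors the reasoning in the abstract: coarse grains retain an inherited signal and give spuriously old ages, while grains below the critical size converge on the age of the last movement.
    
    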

  19. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  20. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  1. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)]

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8)×10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  2. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  3. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  4. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  5. Reactor Physics

    International Nuclear Information System (INIS)

    Ait Abderrahim, A.

    2002-01-01

    SCK-CEN's Reactor Physics and MYRRHA Department offers expertise in various areas of reactor physics, in particular in neutron and gamma calculations, reactor dosimetry, reactor operation and control, reactor code benchmarking and reactor safety calculations. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 materials testing reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2001 are summarised

  6. Reactor Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ait Abderrahim, A

    2001-04-01

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised.

  7. Reactor Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ait Abderrahim, A

    2002-04-01

    SCK-CEN's Reactor Physics and MYRRHA Department offers expertise in various areas of reactor physics, in particular in neutron and gamma calculations, reactor dosimetry, reactor operation and control, reactor code benchmarking and reactor safety calculations. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 materials testing reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2001 are summarised.

  8. Reactor Physics

    International Nuclear Information System (INIS)

    Ait Abderrahim, A.

    2001-01-01

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised

  9. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. Existing definitions of the term "big data" are analyzed, and the elements of a generalized formal model of big data are proposed and described. The peculiarities of applying the proposed model's components are analyzed, and the fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  10. Reactor operation

    CERN Document Server

    Shaw, J

    2013-01-01

    Reactor Operation covers the theoretical aspects and design information of nuclear reactors. This book is composed of nine chapters that also consider their control, calibration, and experimentation.The opening chapters present the general problems of reactor operation and the principles of reactor control and operation. The succeeding chapters deal with the instrumentation, start-up, pre-commissioning, and physical experiments of nuclear reactors. The remaining chapters are devoted to the control rod calibrations and temperature coefficient measurements in the reactor. These chapters also exp

  11. Reactor safeguards

    CERN Document Server

    Russell, Charles R

    1962-01-01

    Reactor Safeguards provides information for all who are interested in the subject of reactor safeguards. Much of the material is descriptive although some sections are written for the engineer or physicist directly concerned with hazards analysis or site selection problems. The book opens with an introductory chapter on radiation hazards, the construction of nuclear reactors, safety issues, and the operation of nuclear reactors. This is followed by separate chapters that discuss radioactive materials, reactor kinetics, control and safety systems, containment, safety features for water reactor

  12. Nuclear reactors theory

    International Nuclear Information System (INIS)

    Naudan, G.; Nigon, J.L.

    1993-01-01

    After presenting the principles of the chain reaction and the notion of criticality, a descriptive model of neutron behaviour is set out from a local point of view (the so-called four-factor model). The use of averaged values for calculating the spatial distribution in the reactor is justified by substituting an equivalent homogeneous medium for the locally heterogeneous medium (fuel, moderator, can or cladding, and so on). The time dependence and dynamical behaviour of the reactor are then studied. The long-term effects of the evolution of the constituent elements of the core under irradiation, and the means of compensating for this evolution, are covered in the last section. 18 refs., 26 figs
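    The four-factor model mentioned in this abstract expresses the infinite-medium multiplication factor as the product of four factors, k∞ = η·ε·p·f; a minimal sketch with illustrative values (not figures from the book):

    ```python
    def k_infinity(eta, epsilon, p, f):
        """Infinite-medium multiplication factor from the four-factor formula.

        eta:     neutrons produced per thermal neutron absorbed in fuel
        epsilon: fast-fission factor
        p:       resonance-escape probability
        f:       thermal utilization factor
        """
        return eta * epsilon * p * f

    # Illustrative values only; in an infinite medium the chain reaction
    # is self-sustaining (critical) when k_infinity equals 1.
    k = k_infinity(eta=2.0, epsilon=1.03, p=0.75, f=0.71)  # about 1.097
    ```

    A finite reactor multiplies this by leakage probabilities, which is where the spatial distribution calculations described in the abstract come in.
    
    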

  13. Rock properties data base

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, R.; Gorski, B.; Gyenge, M.

    1991-03-01

    As mining companies proceed deeper and into areas whose stability is threatened by high and complex stress fields, the science of rock mechanics becomes invaluable in designing underground mine strata control programs. CANMET's Mining Research Laboratories division has compiled a summary of pre- and post-failure mechanical properties of rock types which were tested to provide design data. The 'Rock Properties Data Base' presents the results of these tests, and includes many rock types typical of Canadian mine environments. The data base also contains 'm' and 's' values determined using Hoek and Brown's failure criteria for both pre- and post-failure conditions. 7 refs., 3 tabs., 9 figs., 1 append.
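    The 'm' and 's' values mentioned in this abstract are the material constants of the original Hoek and Brown failure criterion, σ₁ = σ₃ + σ_c·√(m·σ₃/σ_c + s); a minimal sketch with hypothetical numbers:

    ```python
    import math

    def hoek_brown_sigma1(sigma3, sigma_c, m, s):
        """Major principal stress at failure (original Hoek-Brown criterion).

        sigma1 = sigma3 + sigma_c * sqrt(m * sigma3 / sigma_c + s)

        sigma3:  minor principal (confining) stress, MPa
        sigma_c: uniaxial compressive strength of the intact rock, MPa
        m, s:    Hoek-Brown material constants (s = 1 for intact rock;
                 both decrease for broken or post-failure rock)
        """
        return sigma3 + sigma_c * math.sqrt(m * sigma3 / sigma_c + s)

    # Unconfined intact rock (sigma3 = 0, s = 1) fails at sigma_c:
    strength = hoek_brown_sigma1(0.0, 100.0, 10.0, 1.0)  # 100.0 MPa
    ```

    Fitting separate (m, s) pairs to pre- and post-failure test data, as in the data base described above, gives the peak and residual strength envelopes used in mine design.
    
    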

  14. Eclogite facies rocks

    National Research Council Canada - National Science Library

    Carswell, D. A

    1990-01-01

    ... of eclogite evolution and genesis. The authors present a thorough treatment of the stability relations and geochemistry of these rocks, their intimate association with continental plate collision zones and suture zones...

  15. Rock kinoekraanil / Katrin Rajasaare

    Index Scriptorium Estoniae

    Rajasaare, Katrin

    2008-01-01

    On the films portraying rock musicians screened during the film week "Rock On Screen" at the Sõprus cinema, July 7-11: "Lou Reed's Berlin", "The Future Is Unwritten: Joe Strummer", "Control: Joy Division", "Hurriganes", "Shlaager"

  16. Eclogite facies rocks

    National Research Council Canada - National Science Library

    Carswell, D. A

    1990-01-01

    .... This is the first volume to provide a coherent and comprehensive review of the conditions necessary for the formation of eclogites and eclogite facies rocks and assemblages, and a detailed account...

  17. Solid as a rock

    International Nuclear Information System (INIS)

    Pincus, H.J.

    1984-01-01

    Recent technologic developments have required a more comprehensive approach to the behavior of rock mass or rock substance plus discontinuities than was adequate previously. This work considers the inherent problems in such operations as the storage of hot or cold fluids in caverns and aquifers, underground storage of nuclear waste, underground recovery of heat from hydrocarbon fuels, tertiary recovery of oil by thermal methods, rapid excavation of large openings at shallow to great depths and in hostile environments, and retrofitting of large structures built on or in rock. The standardization of methods for determining rock properties is essential to all of the activities described, for use not only in design and construction but also in site selection and post-construction monitoring. Development of such standards is seen as a multidisciplinary effort

  18. Rock Equity Holdings, LLC

    Science.gov (United States)

    The EPA is providing notice of an Administrative Penalty Assessment in the form of an Expedited Storm Water Settlement Agreement against Rock Equity Holdings, LLC, for alleged violations at The Cove at Kettlestone/98th Street Reconstruction located at 3015

  19. Pop & rock / Berk Vaher

    Index Scriptorium Estoniae

    Vaher, Berk, 1975-

    2001-01-01

    Brief reviews of the new albums Redman "Malpractice", Brian Eno & Peter Schwalm "Popstars", Clawfinger "A Whole Lot of Nothing", Dario G "In Full Color", and MLTR (Michael Learns To Rock) "Blue Night"

  20. Digital Rock Studies of Tight Porous Media

    Energy Technology Data Exchange (ETDEWEB)

    Silin, Dmitriy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-08-07

    This technical report summarizes some recently developed approaches to the study of rock properties at the pore scale. The digital rock approach is complementary to laboratory and field studies, and it can be especially helpful in situations where experimental data are uncertain, or are difficult or impossible to obtain. Digitized binary images of the pore geometries of natural rocks, obtained by different imaging techniques, are the input data. Computer-generated models of natural rocks can be used instead of images in cases where microtomography data are unavailable, or where the resolution of the tools is insufficient to adequately characterize the features of interest. Simulations of creeping viscous flow in pores produce estimates of Darcy permeability. Maximal Inscribed Spheres calculations estimate the two-phase fluid distribution in capillary equilibrium; a combination of both produces relative permeability curves. Computer-generated rock models were employed to study the two-phase properties of fractured rocks, or tight sands with slit-like pores too narrow to be characterized with micro-tomography. Various scenarios can simulate different fluid-displacement mechanisms, from piston-like drainage to liquid dropout at the dew point. A finite-difference discretization of the Stokes equation was developed to simulate flow in the pore space of natural rocks; the numerical schemes are capable of handling both no-slip and slippage flows. An upscaling procedure estimates the permeability by subsampling a large data set. Capillary equilibrium and capillary pressure curves are efficiently estimated with the method of maximal inscribed spheres with an arbitrary contact angle. The algorithms can handle gigabytes of data on a desktop workstation. Customized QuickHull algorithms are used to model natural rocks. Capillary pressure curves evaluated from computer-generated images mimic those obtained from microtomography data.
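    The permeability estimate from creeping-flow simulation described in this abstract reduces to Darcy's law; a back-of-the-envelope sketch with hypothetical simulation numbers (not values from the report):

    ```python
    def darcy_permeability(flow_rate, viscosity, length, area, delta_p):
        """Permeability k from Darcy's law: k = Q * mu * L / (A * dP).

        Consistent SI units: Q in m^3/s, mu in Pa*s, L in m,
        A in m^2, dP in Pa. Returns k in m^2 (1 darcy is about 9.87e-13 m^2).
        """
        return flow_rate * viscosity * length / (area * delta_p)

    # Hypothetical creeping-flow result on a 1 mm digital rock cube:
    # total flux 1e-12 m^3/s of water (mu = 1e-3 Pa*s) across a
    # 1 mm^2 face under a 1 kPa pressure drop.
    k = darcy_permeability(flow_rate=1.0e-12, viscosity=1.0e-3,
                           length=1.0e-3, area=1.0e-6, delta_p=1.0e3)
    # k = 1e-15 m^2, roughly 1 millidarcy, i.e. a tight rock
    ```

    In a digital rock workflow the flow rate comes from integrating the simulated Stokes velocity field over an outlet face of the voxel grid; everything else in the formula is a property of the simulation setup.
    
    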