WorldWideScience

Sample records for big ten reactor

  1. Reactor dosimetry calibrations in the Big Ten critical assembly

    International Nuclear Information System (INIS)

    Barr, D.W.; Hansen, G.E.

    1977-01-01

    Eleven irradiations of foil packs located in the central region of Big Ten were made for the Interlaboratory Reaction Rate Program. Each irradiation was at a nominal 10¹⁵ fluence, and the principal fluence monitor was the National Bureau of Standards' double fission chamber containing ²³⁵U and ²³⁸U deposits, located at the center of Big Ten. Secondary monitors consisted of three external fission chambers and two internal foil sets containing Au, In, and Al. Activities of one set were counted at the LASL and the other at the Hanford Engineering Development Laboratory. The uncertainty in relative fluence for each irradiation was ±0.3%

  2. How the Big Ten West Was Won: Football Recruiting

    Directory of Open Access Journals (Sweden)

    Diehl Kevin A.

    2017-06-01

    Full Text Available The paper analyses recruiting in the 2017 Big Ten West Division football cycle using a scoring model. The top-ten-recruit scores [(Rivals.com rating × 100) + (100 if from outside the surrounding states; 75 if from surrounding states) + extra credit for walk-ons (100; 75)] yield the ranking: Iowa, Nebraska, Wisconsin, Northwestern, Illinois, Minnesota, and Purdue.
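The bracketed formula above can be sketched as a small Python function. Only the weightings (rating × 100, the 100/75 location bonus, the 100/75 walk-on credit) come from the abstract; the function name and boolean flags are illustrative assumptions.

```python
def recruit_score(rivals_rating: float, outside_surrounding: bool,
                  walk_on: bool = False) -> float:
    """Score one recruit under the abstract's weighting (illustrative)."""
    # 100 points if the recruit comes from outside the surrounding states,
    # 75 if from a surrounding state
    location_bonus = 100 if outside_surrounding else 75
    # Walk-ons earn the same extra credit on top (100; 75), per the abstract
    walk_on_credit = (100 if outside_surrounding else 75) if walk_on else 0
    return rivals_rating * 100 + location_bonus + walk_on_credit

# A 5.8-rated scholarship recruit from outside the surrounding states:
print(recruit_score(5.8, outside_surrounding=True))  # 680.0
```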

  3. Dosimetry results for Big Ten and related benchmarks

    International Nuclear Information System (INIS)

    Hansen, G.E.; Gilliam, D.M.

    1980-01-01

    Measured average reaction cross sections for the Big Ten central flux spectrum are given together with calculated values based on the U.S. Evaluated Nuclear Data File ENDF/B-IV. Central reactivity coefficients for ²³³U, ²³⁵U, ²³⁹Pu, ⁶Li and ¹⁰B are given to check consistency of bias between measured and calculated reaction cross sections for these isotopes. Spectral indexes for the Los Alamos ²³³U, ²³⁵U and ²³⁹Pu metal critical assemblies are updated, utilizing the Big Ten measurements and interassembly calibrations, and their implications for inelastic scattering are reiterated

  4. Measurement of the neutron spectrum of the Big Ten critical assembly by lithium-6 spectrometry

    International Nuclear Information System (INIS)

    De Leeuw-Gierts, G.; De Leeuw, S.; Hansen, G.E.; Helmick, H.H.

    1979-01-01

    The central neutron-flux spectrum of the Los Alamos Scientific Laboratory's critical assembly, Big Ten, was measured with a ⁶Li spectrometer and techniques developed at the Centre d'Etude de l'Energie Nucleaire, Mol, as part of an experimental program to establish the characteristics of Big Ten

  5. Measurement of the neutron spectrum of the Big Ten critical assembly by lithium-6 spectrometry

    International Nuclear Information System (INIS)

    Leeuw-Gierts, G. de; Leeuw, S. de

    1980-01-01

    The central neutron-flux spectrum of the Los Alamos Scientific Laboratory's critical assembly, Big Ten, was measured with a ⁶Li spectrometer and techniques developed at the Centre d'Etude de l'Energie Nucleaire, Mol, as part of an experimental program to establish the characteristics of Big Ten

  6. An improved benchmark model for the Big Ten critical assembly - 021

    International Nuclear Information System (INIS)

    Mosteller, R.D.

    2010-01-01

    A new benchmark specification is developed for the BIG TEN uranium critical assembly. The assembly has a fast spectrum, and its core contains approximately 10 wt.% enriched uranium. Detailed specifications for the benchmark are provided, and results from the MCNP5 Monte Carlo code using a variety of nuclear-data libraries are given for this benchmark and two others. (authors)

  7. Nuclear technology and reactor safety engineering. The situation ten years after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Birkhofer, A.

    1996-01-01

    Ten years ago, on April 26, 1986, the most serious accident in the history of nuclear technology worldwide happened in unit 4 of the nuclear power plant at Chernobyl in the Ukraine, an accident that revealed to the world at large that the Soviet reactor design lines carried previously unsuspected safety engineering deficits. The dimensions of this reactor accident on site, and the radioactive fallout spreading far and wide to many countries in Europe, vividly nourished the concern of great parts of the population in the Western world about the safety of nuclear technology, and re-instigated debates about the risks involved and their justification. Now that ten years have elapsed since the accident, it is appropriate to strike a balance and analyse the situation today. The number of nuclear power plants operating worldwide has been growing in the last few years, and this trend will continue, primarily due to developments in Asia. The Chernobyl reactor accident has pushed the international dimension of reactor safety to the foreground. Thus the Western world had reason enough to commit itself to enhancing the engineered safety of reactors in Eastern Europe. The article analyses some of the major developments and activities to date and shows future perspectives. (orig.)

  8. Flexibility in faculty work-life policies at medical schools in the Big Ten conference.

    Science.gov (United States)

    Welch, Julie L; Wiehe, Sarah E; Palmer-Smith, Victoria; Dankoski, Mary E

    2011-05-01

    Women lag behind men in several key academic indicators, such as advancement, retention, and securing leadership positions. Although reasons for these disparities are multifactorial, policies that do not support work-life integration contribute to the problem. The objective of this descriptive study was to compare the faculty work-life policies among medical schools in the Big Ten conference. Each institution's website was accessed in order to assess its work-life policies in the following areas: maternity leave, paternity leave, adoption leave, extension of probationary period, part-time appointments, part-time benefits (specifically health insurance), child care options, and lactation policy. Institutions were sent requests to validate the online data and supply additional information if needed. Each institution received an overall score and subscale scores for family leave policies and part-time issues. Data were verified by the human resources office at 8 of the 10 schools. Work-life policies varied among Big Ten schools, with total scores between 9.25 and 13.5 (possible score: 0-21; higher scores indicate greater flexibility). Subscores were not consistently high or low within schools. Comparing the flexibility of faculty work-life policies in relation to other schools will help raise awareness of these issues and promote more progressive policies among less progressive schools. Ultimately, flexible policies will lead to greater equity and institutional cultures that are conducive to recruiting, retaining, and advancing diverse faculty.
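The scoring scheme described above (per-area flexibility points aggregated into an overall 0-21 total plus subscales for family leave and part-time issues) might be sketched as follows. The policy-area names come from the abstract; the per-area point values and the grouping of areas into subscales are invented for illustration, not the study's actual rubric.

```python
# Policy areas named in the abstract; point values below are illustrative.
FAMILY_LEAVE_AREAS = ("maternity leave", "paternity leave", "adoption leave")
PART_TIME_AREAS = ("part-time appointments", "part-time benefits")

def summarize(scores):
    """Return the overall score plus the two subscale scores."""
    return {
        "total": sum(scores.values()),
        "family_leave": sum(scores[a] for a in FAMILY_LEAVE_AREAS),
        "part_time": sum(scores[a] for a in PART_TIME_AREAS),
    }

# Hypothetical per-area flexibility points for one school:
example = {
    "maternity leave": 3, "paternity leave": 1, "adoption leave": 2,
    "part-time appointments": 2, "part-time benefits": 1,
    "probationary extension": 2, "child care": 1, "lactation": 1,
}
print(summarize(example))  # {'total': 13, 'family_leave': 6, 'part_time': 3}
```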

  9. Gender Differences in Personality across the Ten Aspects of the Big Five.

    Science.gov (United States)

    Weisberg, Yanna J; Deyoung, Colin G; Hirsh, Jacob B

    2011-01-01

    This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness, and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscientiousness, the gender differences were found to diverge at the aspect level, rendering them either small or undetectable at the Big Five level. These findings clarify the nature of gender differences in personality and highlight the utility of measuring personality at the aspect level.

  10. Ten aspects of the Big Five in the Personality Inventory for DSM-5.

    Science.gov (United States)

    DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R

    2016-04-01

    Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. (c) 2016 APA, all rights reserved.
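The structural hypothesis in the abstract (each BFAS scale is the highest-loading BFAS marker on one and only one factor) can be expressed as a simple check on a factor-loadings matrix. The matrix below is a synthetic simple-structure example, not the study's data, and the function name is an assumption.

```python
import numpy as np

def one_to_one_markers(loadings: np.ndarray) -> bool:
    """loadings: rows = 10 BFAS scales, columns = 10 factors.
    True iff the top-loading scale differs across all factors, i.e. each
    scale marks one and only one factor (the paper's hypothesis)."""
    top_scale_per_factor = np.abs(loadings).argmax(axis=0)
    return len(set(top_scale_per_factor.tolist())) == loadings.shape[1]

# Synthetic simple-structure loadings: each scale loads 0.8 on "its"
# factor and 0.1 elsewhere, so the hypothesis holds by construction.
clean = np.full((10, 10), 0.1) + np.eye(10) * 0.7
print(one_to_one_markers(clean))  # True
```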

  11. Usability Analysis of the Big Ten Academic Alliance Geoportal: Findings and Recommendations for Improvement of the User Experience

    Directory of Open Access Journals (Sweden)

    Mara Blake

    2017-10-01

    Full Text Available The Big Ten Academic Alliance (BTAA) Geospatial Data Project is a collaboration among twelve member institutions of the consortium that works toward providing discoverability of and access to geospatial data, scanned maps, and web mapping services. Usability tests and heuristic evaluations were chosen as evaluation methods, as they have a long standing in measuring and managing website engagement and are essential to iterative design. The BTAA project hopes to give back to the community by publishing the results of its usability findings in the hope that they will benefit other portals built with GeoBlacklight.

  12. Research reactor put Canada in the nuclear big time

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    The history of the NRX reactor is briefly recounted. When NRX started up in 1947, it was the most powerful neutron source in the world. It is now the oldest research reactor still operating. NRX had to be rebuilt after an accident in 1952, and its calandria was changed again in 1970. Loops in NRX were used to test fuel for the Nautilus submarine, and the first zircaloy pressure tube in the world. At the present time, NRX is in a 'hot standby' condition as a backup to the NRU reactor, which is used mainly for isotope production. NRX will be decommissioned after completion and startup of the new MAPLE-X reactor

  13. A 'big-mac' high converting water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Ronen, Y; Dali, Y [Ben-Gurion Univ. of the Negev, Beersheba (Israel). Dept. of Nuclear Engineering

    1996-12-01

    Currently an effort is being made to get rid of plutonium. Therefore, at this time, a scientific study of a high converting reactor may seem out of place. However, it is our opinion that the future of nuclear energy lies, among other things, in the clever utilization of plutonium. It is also our opinion that one of the best ways to utilize plutonium is in high converting water reactors. (authors)

  14. Researchers solve big mysteries of pebble bed reactor

    Energy Technology Data Exchange (ETDEWEB)

    Shams, Afaque; Roelofs, Ferry; Komen, E.M.J. [Nuclear Research and Consultancy Group (NRG), Petten (Netherlands); Baglietto, Emilio [Massachusetts Institute of Technology, Cambridge, MA (United States). Dept. of Nuclear Science and Engineering; Sgro, Titus [CD-adapco, London (United Kingdom). Technical Marketing

    2014-03-15

    The PBR is one type of High Temperature Reactor; it permits high-temperature operation while preventing the fuel from melting (bringing huge safety margins to the reactor) and offers high electricity efficiency. The design is also highly scalable; a plant could be designed to be as large or small as needed, and can even be made mobile, allowing it to be used onboard a ship. In a PBR, small particles of nuclear fuel, embedded in moderating graphite pebbles, are dropped into the reactor as needed. At the bottom, the pebbles can be removed simply by opening a small hatch and letting gravity pull them down. To cool the reactor and create electricity, helium gas is pumped through the reactor to pull heat out, which is then run through generators. One of the most difficult problems to deal with has been the possible appearance of local temperature hotspots within the pebble bed, heating to the point of melting the graphite moderators surrounding the fuel. Obviously, constructing a reactor and experimenting to investigate this possibility is out of the question. Instead, nuclear engineers have been attempting to simulate a PBR with various CFD codes. The thermodynamic analysis to simulate realistic conditions in a pebble bed is described and the results are shown. (orig.)

  15. Ten-year utilization of the Oregon State University TRIGA Reactor (OSTR)

    International Nuclear Information System (INIS)

    Ringle, John C.; Anderson, Terrance V.; Johnson, Arthur G.

    1978-01-01

    The Oregon State University TRIGA Reactor (OSTR) has been used heavily throughout the past ten years to accommodate exclusively university research, teaching, and training efforts. Averages for the past nine years show that OSTR use time has been as follows: 14% for academic and special training courses; 44% for OSU research projects; 6% for non-OSU research projects; 2% for demonstrations for tours; and 34% for reactor maintenance, calibrations, inspections, etc. The OSTR has operated an average of 25.4 hours per week during this nine-year period. Each year, about 20 academic courses and 30 different research projects use the OSTR. Visitors to the facility average about 1,500 per year. No commercial irradiations or services have been performed at the OSTR during this period. Special operator training courses are given at the OSTR at the rate of at least one per year. (author)

  16. Innovations and Enhancements for a Consortium of Big-10 University Research and Training Reactors. Final Report

    International Nuclear Information System (INIS)

    Brenizer, Jack

    2011-01-01

    The Consortium of Big-10 University Research and Training Reactors was by design a strategic partnership of seven leading institutions. We received the support of both our industry and DOE laboratory partners. Investments in reactor, laboratory and program infrastructure, allowed us to lead the national effort to expand and improve the education of engineers in nuclear science and engineering, to provide outreach and education to pre-college educators and students and to become a key resource of ideas and trained personnel for our U.S. industrial and DOE laboratory collaborators.

  17. TenBig Bangs” in Theory and Practice that Have Made a Difference to Australian Policing in the Last Three Decades

    Directory of Open Access Journals (Sweden)

    Rick Sarre

    2016-05-01

    Full Text Available This paper discusses what could be considered the top ten innovations that have occurred in policing in the last thirty years. The intent is to focus attention on how practice could be further inspired by additional innovation. The innovations are discussed here as “Big Bangs” as a way of drawing attention to the significant impact they have had on policing, in the same way that the cosmological Big Bang was an important watershed event in the universe’s existence. These ten policing innovations ushered in, it is argued, a new mindset, pattern or trend, and they affected Australian policing profoundly; although many had their roots in other settings long before Australian policy-makers implemented them.

  18. French experience in operating pressurized water reactor power stations. Ten years' operation of the Ardennes power station

    International Nuclear Information System (INIS)

    Teste du Bailler, A.; Vedrinne, J.F.

    1978-01-01

    In the paper the experience gained over ten years' operation of the Ardennes (Chooz) nuclear power station is summarized from the point of view of monitoring and control equipment. The reactor was the first pressurized water reactor to be installed in France; it is operated jointly by France and Belgium. The equipment, which in many cases consists of prototypes, was developed for industrial use and with the experience that has now been gained it is possible to evaluate its qualities and defects, the constraints which it imposes and the action that has to be taken in the future. (author)

  19. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  20. Molten-Salt Reactors: Report for 1960 Ten-Year-Plan Evaluation

    International Nuclear Information System (INIS)

    MacPherson, H. G.

    1960-01-01

    For purposes of this evaluation, the molten-salt reactor is considered an advanced concept. Its technology is not yet considered adequate to allow the construction of large-scale power plants, since no power reactor has been built or even designed in detail. As a result there can be no estimate of the present cost of power, and the projection of power costs to later years is necessarily based on general arguments rather than detailed considerations.

  1. Ten years' reactor operation at the Technical University Zittau - operation report

    International Nuclear Information System (INIS)

    Konschak, K.

    1990-01-01

    The Zittau Training and Research Reactor ZLFR has been in use for 10 years for teaching the engineers who will operate the nuclear power plants in the GDR. Since commissioning it has been started up more than 1600 times, approximately two thirds of the start-ups being used for teaching. A number of teaching experiments were installed that demonstrate fundamental technological processes in nuclear reactors in a manner easy to understand. The high level of nuclear safety manifests itself, among other things, in extremely low radiation exposures of the operating personnel and the persons being trained. (author)

  2. Ten years of TRIGA reactor research at the University of Texas

    International Nuclear Information System (INIS)

    O'Kelly, Sean

    2002-01-01

    The 1 MW TRIGA Research Reactor at the Nuclear Engineering Teaching Laboratory is the second TRIGA at the University of Texas at Austin (UT). A small (10 kW in 1963, 250 kW in 1968) TRIGA Mark I was housed in the basement of the Engineering Building until it was shut down and decommissioned in 1989. The new TRIGA Mark II, with a licensed power of 1.1 MW, reached initial criticality in 1992. Prior to 1990, reactor research at UT usually consisted of projects requiring neutron activation analysis (NAA), but the step up to a much larger reactor with neutron beam capability required additional personnel to build the neutron research program. The TCNS is currently used to perform prompt gamma activation analysis to determine hydrogen and boron concentrations of various composite materials. The early 1990s were a very active period for neutron beam projects at the NETL. In addition to the TCNS, a real-time neutron radiography facility (NIF) and a high-resolution neutron depth profiling facility (NDP) were installed in two separate beam ports. The NDP facility was most recently used to investigate alpha damage on stainless steel in support of the U.S. Nuclear Weapons Stewardship programs. In 1999, a sapphire beam filter was installed in the NDP system to reduce the fast neutron flux at the sample location. A collaborative effort was started in 1997 between UT-Austin and the University of Texas at Arlington to build a reactor-based, low-energy positron beam (TIPS). Limited success in obtaining funding has placed the project on hold. The Nuclear and Radiation Engineering Program has grown rapidly and effectively doubled in size over the past 5 years, but years of low nuclear research funding, an overall stagnation in the U.S. nuclear power industry and a pervasive public distrust of nuclear energy have caused a precipitous decline in many programs. Recently, the U.S. DOE has encouraged University Research Reactors (URR) in the U.S. to collaborate closely together by forming URR

  3. Neutron-physical characteristics of the TVRM-100 reactor with ten ring fuel channels

    International Nuclear Information System (INIS)

    Mikhajlov, V.M.; Myrtsymova, L.A.

    1988-01-01

    Three-dimensional heterogeneous calculations of the TVRM-100 reactor, a research reactor using enriched fuel with heavy-water moderator, coolant and reflector, are conducted. Achievable burnup depths depending on the number of removable FAs are presented. The maximum unperturbed thermal neutron flux in the reflector is (2-1.8)×10¹⁵ cm⁻²s⁻¹; the mean flux on the fuel is 2.9×10¹⁴ cm⁻²s⁻¹. The radial non-uniformity of energy release is 0.67, and the maximum peaking per FA is ∼3.7. The temperature effect of reactivity is negative, equal to -0.9×10⁻⁴ grad⁻¹ without accounting for experimental channels. Control rod efficiency in the radial reflector is high, but locating the rods close to experimental devices in the high-neutron-flux area is undesirable. 4 refs.; 5 figs

  4. Ten years after the Chernobyl reactor accident: expected and detected health effects in the CIS

    International Nuclear Information System (INIS)

    Kellerer, A.M.

    1996-01-01

    The author explains the essential aspects of the actual or possible health effects of the reactor accident in the immediately affected areas. Radiation-induced injury to health primarily manifested itself in thyroid tumors induced by the short-lived radio-iodine. It is possible that the long-lived fission products in the fallout will in the long run increase, or have already increased, the incidence rate of cancer, especially leukemias, in the population; to date, this possible increase is of an order of magnitude not yet observable in the available statistical data. (orig.)

  5. Ten years of IAEA cooperation with the Russian research reactor fuel return programme

    Energy Technology Data Exchange (ETDEWEB)

    Tozser, S.; Adelfang, P.; Bradley, E. [International Atomic Energy Agency, Vienna (Austria)

    2013-01-15

    The Russian Research Reactor Fuel Return (RRRFR) Programme was launched in 2001. Over its duration, the programme has successfully completed 43 safe shipments totalling 1.6 tons of fresh and spent HEU fuel from countries operating Russian-fuelled research reactors back to the country of origin. The IAEA has been a very active supporter of the RRRFR Programme since its inception. Under the auspices of the RRRFR Programme, the Agency has provided a broad range of technical advisory and organizational support to the HEU fuel repatriation, as well as training and advisory assistance supporting research reactor conversion from HEU to LEU. The presentation gives an overview of the RRRFR Programme's achievements with special consideration of the IAEA contribution. These include an overview of the history of fresh and spent fuel shipments, as well as a summary of experience gained during the shipments' preparation and completion. The presentation focuses on the technical advisory support given by the IAEA during programme implementation, captures the consolidated knowledge of this unique international programme and shares the most important lessons learned. (orig.)

  6. RETRAN operational transient analysis of the Big Rock Point plant boiling water reactor

    International Nuclear Information System (INIS)

    Sawtelle, G.R.; Atchison, J.D.; Farman, R.F.; VandeWalle, D.J.; Bazydlo, H.G.

    1983-01-01

    Energy Incorporated used the RETRAN computer code to model and calculate nine transients at the Consumers Power Company Big Rock Point Nuclear Power Plant. RETRAN, a best-estimate, one-dimensional, homogeneous-flow thermal-equilibrium code, is applicable to FSAR Chapter 15 transients for Conditions I through IV. The BWR analyses were performed in accordance with USNRC Standard Review Plan criteria and in response to the USNRC Systematic Evaluation Program. The RETRAN Big Rock Point model was verified by comparison with plant startup test data. This paper discusses the unique modeling techniques used in RETRAN to model this steam-drum-type BWR. Transient analysis results are also presented

  7. Innovations and enhancements in neutronic analysis of the Big-10 university research and training reactors based on the AGENT code system

    International Nuclear Information System (INIS)

    Hursin, M.; Shanjie, X.; Burns, A.; Hopkins, J.; Satvat, N.; Gert, G.; Tsoukalas, L. H.; Jevremovic, T.

    2006-01-01

    This paper summarizes salient aspects of the 'virtual' reactor system developed at Purdue University, emphasizing efficient neutronic modeling through AGENT (Arbitrary Geometry Neutron Transport), a deterministic neutron transport code. DOE's Big-10 Innovations in Nuclear Infrastructure and Education (INIE) Consortium was launched in 2002 to enhance scholarship activities pertaining to university research and training reactors (URTRs). Existing and next-generation URTRs are powerful campus tools for nuclear engineering as well as a number of disciplines that include, but are not limited to, medicine, biology, material science, and food science. Advancing new computational environments for the analysis and configuration of URTRs is an important Big-10 INIE aim. Specifically, Big-10 INIE has pursued development of a 'virtual' reactor, an advanced computational environment to serve as a platform on which to build operations, utilization (research and education), and systemic analysis of URTR physics. The 'virtual' reactor computational system will integrate computational tools addressing the URTR core and near-core physics (transport, dynamics, fuel management and fuel configuration); thermal-hydraulics; beam line, in-core and near-core experiments; instrumentation and controls; and confinement/containment and security issues. No such integrated computational environment currently exists. The 'virtual' reactor is designed to allow researchers and educators to configure and analyze their systems to optimize experiments, fuel locations for flux shaping, and detector selection and configuration. (authors)

  8. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

    Exarhos, C.A.; Van Swam, L.F.; Wahlquist, F.P.

    1981-12-01

    This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurement of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics which might restrict the continued irradiation of the fuel

  9. The Chernobyl reactor accident and the situation ten years after: expected and observed health effects in the CIS

    International Nuclear Information System (INIS)

    Kellerer, A.M.

    1996-01-01

    Essential aspects of the observed or possible health effects of the reactor accident in the regions directly affected are shown and discussed. It was not possible, within the framework of this article, to address all important aspects; the article is primarily intended to draw a picture of the general situation. Radiation-induced disease among the population is observed, such as the thyroid tumors caused by the short-lived radioiodine. It can by no means be ruled out that the long-lived radioisotopes will also cause, or have been inducing, an elevated rate of cancer incidence, in particular of leukemias. Any such elevated incidence is so far building up at a rate not yet detectable by statistics. The facts collected to date clearly do not confirm publications speaking of a drastically enhanced general incidence of cancer in the affected regions, or of 125,000 deaths in the Ukraine caused by radiation. (orig./MG)

  10. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with special reference to any future inquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors, and an attempt is made to tackle the problem that participation in a public inquiry comes far too late to be objective. The author feels that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries, but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  11. Chernobyl ten years after

    International Nuclear Information System (INIS)

    1996-01-01

    The accident in the fourth reactor unit at Chernobyl in Ukraine, which occurred ten years ago, caused the death of 31 people, while the health consequences have turned out to be difficult to assess. This review describes the accident, its consequences and health effects, studies carried out to date, as well as a comparison with other accidents and disasters. (author)

  12. Comparison of wastewater plant of Nova Pampulha, with an UASB reactor, with another ten brazilian stations; Comparacion de la EDAR de Nova Pampulha, dotada de reactor UASB, con otras diez plantas brasilenas

    Energy Technology Data Exchange (ETDEWEB)

    Barrosa Correa, S. M. B.; Ruiz, E.; Romero, F.

    2004-07-01

    This work is based on data from the wastewater plant of Nova Pampulha, equipped with a UASB reactor. The research aimed to compare this plant with ten other Brazilian stations provided with different depuration techniques. First, graphical comparison of average operational data suggests analogies between influents (fewer suspended solids at Nova Pampulha), effluents (more suspended solids and bacteria at the same station) and removals (smaller for suspended solids and bacteria at Nova Pampulha, where there is also an increase in alkalinity). Cluster analysis, performed on the percentages of constituent elimination in the eleven stations and shown as dendrograms, was chosen as the second comparative method. A third comparison was carried out by multiple linear regression to obtain mathematical models of constituent elimination, with statistical significance at the 95% confidence level, using the flows and influent concentrations as candidate independent variables. The calculated equations explain between 46% and 91% of the variance in the data. As a general conclusion, a well-operated UASB reactor can be a satisfactory technique for wastewater treatment, well adapted to Brazilian climatological conditions. (Author) 14 refs.
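The two comparative methods named in the abstract (hierarchical cluster analysis over elimination percentages, shown as dendrograms, and multiple linear regression on flows and influent concentrations) might be sketched like this. All numbers are random placeholders, not the eleven plants' actual data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)

# 11 plants x elimination percentages for three constituents (placeholder data)
eliminations = rng.uniform(40.0, 95.0, size=(11, 3))

# Second method in the abstract: hierarchical clustering of the plants
Z = linkage(eliminations, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into 3 groups

# Third method: multiple linear regression, elimination ~ flow + influent conc.
flow = rng.uniform(10.0, 100.0, size=11)
influent = rng.uniform(200.0, 600.0, size=11)
X = np.column_stack([np.ones(11), flow, influent])  # intercept + 2 predictors
coef, *_ = np.linalg.lstsq(X, eliminations[:, 0], rcond=None)
fitted = X @ coef
ss_res = ((eliminations[:, 0] - fitted) ** 2).sum()
ss_tot = ((eliminations[:, 0] - eliminations[:, 0].mean()) ** 2).sum()
r_squared = 1.0 - ss_res / ss_tot  # variance explained (46-91% in the paper)
print(labels, round(r_squared, 3))
```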

  13. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    … modern astronomy requires big data know-how; in particular, it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with … and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications.

  14. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  15. Reactor

    International Nuclear Information System (INIS)

    Toyama, Masahiro; Kasai, Shigeo.

    1978-01-01

    Purpose: To provide an LMFBR-type reactor in which effusion of coolant through a loop contact portion is reduced even when fuel assemblies float up, and misloading of reactor core constituent elements is prevented, thereby improving reactor safety. Constitution: The reactor core constituents are secured in the reactor by utilizing the differential pressure between the high-pressure cooling chamber and the low-pressure cooling chamber. A resistance part is formed at the upper part of a connecting pipe which connects the low-pressure cooling chamber to the lower surface of the reactor core constituent. This resistance part is formed such that the internal sectional area of the connecting pipe is enlarged stepwise toward the upper part, and the cylinder is formed larger so that it profiles the inner surface of the connecting pipe. (Aizawa, K.)

  16. Reactor

    International Nuclear Information System (INIS)

    Ikeda, Masaomi; Kashimura, Kazuo; Inoue, Kazuyuki; Nishioka, Kazuya.

    1979-01-01

    Purpose: To facilitate the construction of a reactor containment building, whereby inspections of the outer wall of the reactor container after completion of construction can be easily carried out. Constitution: In a reactor accommodated in a container encircled by a building wall, a space is provided between the container and the building wall encircling it, and a metal wall is provided in the space so that it is fitted in the building wall in an attachable and detachable manner. (Aizawa, K.)

  17. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  18. Update on the debate about the existence and utility of the Big Five: a ten-year follow-up on Carroll's "the Five-Factor Personality Model: how complete and satisfactory is it?"

    Science.gov (United States)

    Merenda, Peter F

    2008-12-01

    This paper is a follow-up comment on John B. Carroll's critique of the Big Five Model and his suggestion years ago on how to design and conduct research properly on the structure of personality and its assessment. The status of research on personality factor models is discussed, and conclusions are reached regarding the likely consequences and further prospects of the failure of personality theorists and practitioners to follow through on Carroll's poignant suggestion for required effort.

  19. Integrated plant-safety assessment, Systematic Evaluation Program: Big Rock Point Plant (Docket No. 50-155)

    International Nuclear Information System (INIS)

    1983-09-01

    The Systematic Evaluation Program was initiated in February 1977 by the US Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  20. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Big data refers to data beyond the storage capacity and processing power of conventional systems. The term is used for data sets so large or complex that traditional data-processing applications cannot handle them. The size of big data is a constantly moving target, growing year by year from a few dozen terabytes to many petabytes; the amount of data produced by people, for example on social networking sites, grows rapidly every year. Big data is not only data; it has become a complete subject encompassing various tools, techniques and frameworks. It covers the growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of massive scale. It is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to acquire, organize and resolve the various types of data. In this paper we describe applications, problems and tools of big data and give an overview of big data.

  1. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  2. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation before Big Data, it was most recently referred to as the "information explosion". In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  3. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data…

  4. Reactors

    DEFF Research Database (Denmark)

    Shah, Vivek; Vaz Salles, Marcos António

    2018-01-01

    The requirements for OLTP database systems are becoming ever more demanding. Domains such as finance and computer games increasingly mandate that developers be able to encode complex application logic and control transaction latencies in in-memory databases. At the same time, infrastructure engineers in these domains need to experiment with and deploy OLTP database architectures that ensure application scalability and maximize resource utilization in modern machines. In this paper, we propose a relational actor programming model for in-memory databases as a novel, holistic approach towards … -level function calls. In contrast to classic transactional models, however, reactors allow developers to take advantage of intra-transaction parallelism and state encapsulation in their applications to reduce latency and improve locality. Moreover, reactors enable a new degree of flexibility in database…

  5. Evaluation of the integrity of reactor vessels designed to ASME Code, Sections I and/or VIII

    International Nuclear Information System (INIS)

    Hoge, K.G.

    1976-01-01

    A documented review of nuclear reactor pressure vessels designed to ASME Code, Sections I and/or VIII is made. The review is primarily concerned with the design specifications and quality assurance programs utilized for the reactor vessel construction and the status of power plant material surveillance programs, pressure-temperature operating limits, and inservice inspection programs. The following ten reactor vessels for light-water power reactors are covered in the report: Indian Point Unit No. 1, Dresden Unit No. 1, Yankee Rowe, Humboldt Bay Unit No. 3, Big Rock Point, San Onofre Unit No. 1, Connecticut Yankee, Oyster Creek, Nine Mile Point Unit No. 1, and La Crosse

  6. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  7. Reactor physics verification of the MCNP6 unstructured mesh capability

    International Nuclear Information System (INIS)

    Burke, T. P.; Kiedrowski, B. C.; Martz, R. L.; Martin, W. R.

    2013-01-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  8. Reactor physics verification of the MCNP6 unstructured mesh capability

    Energy Technology Data Exchange (ETDEWEB)

    Burke, T. P. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States); Kiedrowski, B. C.; Martz, R. L. [X-Computational Physics Division, Monte Carlo Codes Group, Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Martin, W. R. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States)

    2013-07-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  9. Reactor

    International Nuclear Information System (INIS)

    Fujibayashi, Toru.

    1976-01-01

    Object: To provide a boiling water reactor with enhanced earthquake resistance and flattened power distribution. Structure: At least four fuel bundles, in each of which a plurality of fuel rods are arranged in lattice fashion with their upper and lower portions supported by tie-plates, are bundled together and then covered by a square channel box. The control rod is movably arranged within the space formed by adjoining channel boxes. A spacer of trapezoidal section is disposed in the central portion on the side of the channel box over substantially its full height, and a neutron instrumentation tube is disposed in the central portion inside the channel box. Thus, where a horizontal load is exerted by an earthquake or the like, the spacers come into contact with each other to support the channel boxes and prevent abnormal vibration. (Furukawa, Y.)

  10. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Big data is the term for any collection of data sets so large and complex that it becomes hard to process them using conventional data-handling applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization and privacy violations. To spot business trends, anticipate diseases, combat crime, etc., we require bigger data sets than the smaller ones used before. Big data is hard to work with using most relational database management systems and desktop statistics and visualization packages, needing instead massively parallel software running on tens, hundreds or even thousands of servers. In this paper there is an observation on Hadoop architecture, different tools used for big data and its security issues.

  11. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billington, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)
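The 'loss of life expectancy' presentation recommended above converts a very small probability of a large accident into an average figure that can be compared with everyday risks. A minimal sketch of that conversion, with invented numbers that are not from the report:

```python
# Hedged illustration of presenting rare-accident risk as average
# loss of life expectancy. All parameter values are invented.

def loss_of_life_expectancy(annual_probability, years_lost_if_it_happens,
                            years_exposed=70):
    """Expected life-years lost per person over a lifetime of exposure."""
    return annual_probability * years_exposed * years_lost_if_it_happens

# e.g. a one-in-a-million-per-year chance of an accident costing 20 years
# of life, experienced over a 70-year lifetime
years = loss_of_life_expectancy(1e-6, 20)
print(f"{years * 365.25:.2f} days of life expectancy lost")
```

The point of the presentation is that such a figure, a fraction of a day, can be set beside the much larger losses associated with familiar risks such as smoking.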

  12. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  13. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  14. Real-Time Pathogen Detection in the Era of Whole-Genome Sequencing and Big Data: Comparison of k-mer and Site-Based Methods for Inferring the Genetic Distances among Tens of Thousands of Salmonella Samples.

    Directory of Open Access Journals (Sweden)

    James B Pettengill

    The adoption of whole-genome sequencing within the public health realm for molecular characterization of bacterial pathogens has been followed by an increased emphasis on real-time detection of emerging outbreaks (e.g., food-borne Salmonellosis). In turn, large databases of whole-genome sequence data are being populated. These databases currently contain tens of thousands of samples and are expected to grow to hundreds of thousands within a few years. For these databases to be of optimal use one must be able to quickly interrogate them to accurately determine the genetic distances among a set of samples. Being able to do so is challenging due to both biological (evolutionary diverse samples) and computational (petabytes of sequence data) issues. We evaluated seven measures of genetic distance, which were estimated from either k-mer profiles (Jaccard, Euclidean, Manhattan, Mash Jaccard, and Mash distances) or nucleotide sites (NUCmer and an extended multi-locus sequence typing (MLST) scheme). When analyzing empirical data (whole-genome sequence data from 18,997 Salmonella isolates) there are features (e.g., genomic, assembly, and contamination) that cause distances inferred from k-mer profiles, which treat absent data as informative, to fail to accurately capture the distance between samples when compared to distances inferred from differences in nucleotide sites. Thus, site-based distances, like NUCmer and extended MLST, are superior in performance, but accessing the computing resources necessary to perform them may be challenging when analyzing large databases.

  15. Real-Time Pathogen Detection in the Era of Whole-Genome Sequencing and Big Data: Comparison of k-mer and Site-Based Methods for Inferring the Genetic Distances among Tens of Thousands of Salmonella Samples.

    Science.gov (United States)

    Pettengill, James B; Pightling, Arthur W; Baugher, Joseph D; Rand, Hugh; Strain, Errol

    2016-01-01

    The adoption of whole-genome sequencing within the public health realm for molecular characterization of bacterial pathogens has been followed by an increased emphasis on real-time detection of emerging outbreaks (e.g., food-borne Salmonellosis). In turn, large databases of whole-genome sequence data are being populated. These databases currently contain tens of thousands of samples and are expected to grow to hundreds of thousands within a few years. For these databases to be of optimal use one must be able to quickly interrogate them to accurately determine the genetic distances among a set of samples. Being able to do so is challenging due to both biological (evolutionary diverse samples) and computational (petabytes of sequence data) issues. We evaluated seven measures of genetic distance, which were estimated from either k-mer profiles (Jaccard, Euclidean, Manhattan, Mash Jaccard, and Mash distances) or nucleotide sites (NUCmer and an extended multi-locus sequence typing (MLST) scheme). When analyzing empirical data (whole-genome sequence data from 18,997 Salmonella isolates) there are features (e.g., genomic, assembly, and contamination) that cause distances inferred from k-mer profiles, which treat absent data as informative, to fail to accurately capture the distance between samples when compared to distances inferred from differences in nucleotide sites. Thus, site-based distances, like NUCmer and extended MLST, are superior in performance, but accessing the computing resources necessary to perform them may be challenging when analyzing large databases.
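The simplest of the k-mer measures compared above, the Jaccard distance between k-mer profiles, can be sketched in a few lines. This is an illustration only, not the paper's pipeline: the sequences are toy strings, not Salmonella genomes, and real tools work on assemblies or sketches of millions of k-mers.

```python
# Hedged sketch of a k-mer Jaccard distance. Note how the measure treats
# a k-mer absent from either set as informative, which is exactly the
# property the paper identifies as a weakness on incomplete or
# contaminated assemblies.

def kmer_set(seq, k=4):
    """All overlapping k-mers of a sequence, as a set (presence/absence profile)."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard_distance(a, b, k=4):
    ka, kb = kmer_set(a, k), kmer_set(b, k)
    return 1.0 - len(ka & kb) / len(ka | kb)

# Toy sequences: s1 and s2 differ only near the end; s3 is unrelated.
s1 = "ACGTACGTGACCTT"
s2 = "ACGTACGTGACCAA"
s3 = "TTTTGGGGCCCCAA"
print(jaccard_distance(s1, s2))  # closely related pair: small distance
print(jaccard_distance(s1, s3))  # unrelated pair: distance 1.0
```

Site-based measures such as NUCmer alignments instead compare only the positions shared by both samples, which is why the paper finds them more robust when assemblies are incomplete.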

  16. Neutron behavior, reactor control, and reactor heat transfer. Volume four

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    Volume four covers neutron behavior (neutron absorption, how big are nuclei, neutron slowing down, neutron losses, the self-sustaining reactor), reactor control (what is controlled in a reactor, controlling neutron population, is it easy to control a reactor, range of reactor control, what happens when the fuel burns up, controlling a PWR, controlling a BWR, inherent safety of reactors), and reactor heat transfer (heat generation in a nuclear reactor, how is heat removed from a reactor core, heat transfer rate, heat transfer properties of the reactor coolant)

  17. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  18. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  19. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  20. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan provides instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of "Big Pete," which was used in the removal of "spacers" from the N-Reactor. Prior to performing surveys on the rear end portion of "Big Pete," it shall be cleaned (i.e., free of oil, grease, caked soil, and heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  1. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions…

  2. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  3. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of this bachelor thesis is to describe the Big Data field and the OLAP aggregation operations for decision support that are applied to it using the Apache Hadoop technology. The major part of the thesis is devoted to a description of this technology. The last chapter deals with the way the aggregation operations are applied and with the issues involved in implementing them. An overall evaluation of the work follows, together with possibilities for future use of the resulting system.

  4. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative methods and technologies to capture, store, distribute, manage and analyse petabyte- or larger-sized data sets with high velocity and diverse structures. Big data can be structured, semi-structured or unstructured, rendering conventional data management techniques inadequate. Data is generated from various different sources and can arrive in the system at various rates. In order to handle this...

  5. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  6. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  7. VINKA, ten years on. Main scientific results

    International Nuclear Information System (INIS)

    1979-01-01

    The VINKA facility in the TRITON swimming-pool reactor at Fontenay-aux-Roses allows the irradiation of solids at low temperatures in order to study crystalline defects. After ten years of operation the main scientific results obtained in the fields of creep and growth (chapter I), point defects (chapter II), amorphisation (chapter III) and dechanneling of particles (chapter IV) are summarised [fr]

  8. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic pronouncements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data with due regard for public values. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (DAMD). The analysis shows that (re)use of data in new contexts is a many-sided balancing act, not only between economic rationales and quality considerations, but also concerning control over sensitive personal data and the ethical implications for citizens. In the DAMD case, data is used on the one hand 'in the service of a good cause' to…

  9. HIPs at Ten

    Science.gov (United States)

    Kuh, George; O'Donnell, Ken; Schneider, Carol Geary

    2017-01-01

    2017 is the anniversary of the introduction of what are now commonly known as high-impact practices (HIPs). Many of the specific activities pursued under the HIPs acronym have been around in some form for decades, such as study abroad, internships, and student-faculty research. It was about ten years ago that, after conferring HIPs at Ten with…

  10. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns for your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportunities.

  11. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  12. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  13. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  14. Online stress corrosion crack and fatigue usages factor monitoring and prognostics in light water reactor components: Probabilistic modeling, system identification and data fusion based big data analytics approach

    Energy Technology Data Exchange (ETDEWEB)

    Mohanty, Subhasish M. [Argonne National Lab. (ANL), Argonne, IL (United States); Jagielo, Bryan J. [Argonne National Lab. (ANL), Argonne, IL (United States); Oakland Univ., Rochester, MI (United States); Iverson, William I. [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Illinois at Urbana-Champaign, Champaign, IL (United States); Bhan, Chi Bum [Argonne National Lab. (ANL), Argonne, IL (United States); Pusan National Univ., Busan (Korea, Republic of); Soppet, William S. [Argonne National Lab. (ANL), Argonne, IL (United States); Majumdar, Saurin M. [Argonne National Lab. (ANL), Argonne, IL (United States); Natesan, Ken N. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-10

    Nuclear reactors in the United States account for roughly 20% of the nation's total electric energy generation, and maintaining their safety with regard to key component structural integrity is critical not only for the long-term use of such plants but also for the safety of personnel and the public living around the plant. Early detection of damage signatures, such as stress corrosion cracking and thermal-mechanical loading related material degradation in safety-critical components, is a necessary requirement for long-term and safe operation of nuclear power plant systems.

  15. TENS during childbirth

    NARCIS (Netherlands)

    Tuin-Nuis, F.D.F.

    2000-01-01

    TENS (Transcutaneous Electrical Nerve Stimulation) is a pain-relief method based on the Gate Control Theory of Melzack and Wall. Electrical pulses delivered through the skin are thought to influence the conduction of nociceptive signals (pain stimuli) and to prompt the body to produce endorphins:

  16. Affordances: Ten Years On

    Science.gov (United States)

    Brown, Jill P.; Stillman, Gloria

    2014-01-01

    Ten years ago the construct, affordance, was rising in prominence in scholarly literature. A proliferation of different uses and meanings was evident. Beginning with its origin in the work of Gibson, we traced its development and use in various scholarly fields. This paper revisits our original question with respect to its utility in mathematics…

  17. Powers of ten

    CERN Document Server

    1979-01-01

    Powers of Ten is a 1977 short documentary film written and directed by Charles Eames and his wife, Ray. The film depicts the relative scale of the Universe in factors of ten (see also logarithmic scale and order of magnitude). The idea for the film appears to have come from the 1957 book Cosmic View by Kees Boeke. The film begins with an aerial image of a man reclining on a blanket; the view is that of one meter across. The viewpoint, accompanied by expository voiceover, then slowly zooms out to a view ten meters across (or 10^1 m in standard form), revealing that the man is picnicking in a park with a female companion. The zoom-out continues, to a view of 100 meters (10^2 m), then 1 kilometer (10^3 m), and so on, increasing the perspective—the picnic is revealed to be taking place near Soldier Field on Chicago's waterfront—and continuing to zoom out to a field of view of 10^24 meters, or the size of the observable universe. The camera then zooms back in to the picnic, and then to views of negative pow...
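    The factor-of-ten zoom the film is built around can be sketched in a few lines; the step values chosen below are for illustration only and are not taken from the film itself:

    ```python
    # Sketch of the "Powers of Ten" zoom: each step widens the field of
    # view by a factor of ten, i.e. one unit on a logarithmic scale.

    def field_of_view(step: int) -> float:
        """Width of the view in meters after `step` zoom-out steps,
        starting from the one-meter picnic-blanket view (10^0 m)."""
        return 10.0 ** step

    # Zoom out from 1 m toward the ~10^24 m scale of the observable universe.
    for step in (0, 1, 2, 3, 24):
        print(f"10^{step} m = {field_of_view(step):.0e} m across")
    ```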

  18. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and, specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  19. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  20. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  1. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  2. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  3. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  4. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  5. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  6. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension that involves in a technological and business approach the aspects of big data management.

  7. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply nonexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  8. The ten thousand Kims

    Science.gov (United States)

    Baek, Seung Ki; Minnhagen, Petter; Kim, Beom Jun

    2011-07-01

    In Korean culture, the names of family members are recorded in special family books. This makes it possible to follow the distribution of Korean family names far back in history. It is shown here that these name distributions are well described by a simple null model, the random group formation (RGF) model. This model makes it possible to predict how the name distributions change and these predictions are shown to be borne out. In particular, the RGF model predicts that for married women entering a collection of family books in a certain year, the occurrence of the most common family name 'Kim' should be directly proportional to the total number of married women with the same proportionality constant for all the years. This prediction is also borne out to a high degree. We speculate that it reflects some inherent social stability in the Korean culture. In addition, we obtain an estimate of the total population of the Korean culture down to the year 500 AD, based on the RGF model, and find about ten thousand Kims.
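    The proportionality prediction described above can be illustrated with a toy computation; the yearly totals and the constant below are invented for demonstration and are not data from the family books:

    ```python
    # Toy illustration of the RGF-model prediction: the number of married
    # women named "Kim" entering the family books in a year is proportional
    # to that year's total, with one constant shared across all years.

    yearly_totals = {1700: 12_000, 1750: 15_000, 1800: 21_000}  # hypothetical
    c = 0.22  # hypothetical proportionality constant

    predicted_kims = {year: c * total for year, total in yearly_totals.items()}

    # Under the prediction, Kim-count / total is the same constant every year.
    ratios = {year: kims / yearly_totals[year]
              for year, kims in predicted_kims.items()}
    assert all(abs(r - c) < 1e-12 for r in ratios.values())
    ```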

  9. The ten thousand Kims

    International Nuclear Information System (INIS)

    Baek, Seung Ki; Minnhagen, Petter; Kim, Beom Jun

    2011-01-01

    In Korean culture, the names of family members are recorded in special family books. This makes it possible to follow the distribution of Korean family names far back in history. It is shown here that these name distributions are well described by a simple null model, the random group formation (RGF) model. This model makes it possible to predict how the name distributions change and these predictions are shown to be borne out. In particular, the RGF model predicts that for married women entering a collection of family books in a certain year, the occurrence of the most common family name 'Kim' should be directly proportional to the total number of married women with the same proportionality constant for all the years. This prediction is also borne out to a high degree. We speculate that it reflects some inherent social stability in the Korean culture. In addition, we obtain an estimate of the total population of the Korean culture down to the year 500 AD, based on the RGF model, and find about ten thousand Kims.

  10. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence of an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second after the big bang), and the fate of the Universe are all discussed. (U.K.)

  11. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on the fact that we identify patterns in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  12. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  13. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  14. Reactor technology. Progress report, January--March 1978

    International Nuclear Information System (INIS)

    Warren, J.L.

    1978-07-01

    Progress is reported in eight program areas. The nuclear Space Electric Power Supply Program examined safety questions in the aftermath of the COSMOS 954 incident, examined the use of thermoelectric converters, examined the neutronic effectiveness of various reflecting materials, examined ways of connecting heat pipes to one another, studied the consequences of the failure of one heat pipe in the reactor core, and did conceptual design work on heat radiators for various power supplies. The Heat Pipe Program reported progress in the design of ceramic heat pipes, new applications of heat pipes to solar collectors, and final performance tests of two pipes for HEDL applications. Under the Nuclear Process Heat Program, work continues on computer codes to model a pebble-bed high-temperature gas-cooled reactor, adaptation of a set of German reactor calculation codes for use on U.S. computers, and a parametric study of a certain resonance integral required in reactor studies. Under the Nonproliferation Alternative Sources Assessment Program, LASL has undertaken an evaluation of a study of gaseous core reactors by Southern Science Applications, Inc. Independently, LASL has developed a proposal for a comprehensive study of gaseous uranium-fueled reactor technology. The Plasma Core Reactor Program has concentrated on restacking the beryllium reflector and redesigning the nuclear control system. The status of, and experiments on, four critical assemblies, SKUA, Godiva IV, Big Ten, and Flattop, are reported. The Nuclear Criticality Safety Program carried out several tasks, including conducting a course, performing several annual safety reviews, and evaluating the safety of two Nevada test devices. During the quarter, one of the groups involved in reactor technology acquired responsibility for the operation of a Cockcroft-Walton accelerator. The present report contains information on the use of the machine and on improvements being made in its operation.

  15. Ten years after Chernobyl

    International Nuclear Information System (INIS)

    Becker, K.

    1996-01-01

    As was amply demonstrated during the EU/IAEA/WHO Summing-up Conference in Vienna, Austria, April 8-12, 1996, the radiological consequences of the Chernobyl accident were, fortunately, not as serious as frequently presented in the media: 28 people died from acute radiation syndrome in 1986, and 14 more of possibly radiation-related causes since. Of the <1000 thyroid cancers in children, 90 to 95% are curable. There have so far been no other demonstrable increases, in the former Soviet Union or in Western Europe, of leukemias, solid cancers, or genetic defects, nor are any to be expected in the future. Even among the 'liquidators' with doses ∼100 mSv, of the ∼150 additional leukemias expected during the 10 yr after the accident, none have been observed. The economic, social, and political consequences, however, both in the former Soviet Union and in Western Europe, have been very substantial. Whole countries developed a hysterical 'radiation sickness.' As A. Merkel, the German Minister of Environment and Reactor Safety, who chaired the conference, pointed out, 'the radiation sensitivity of societies far exceeds that of individuals.' It is obvious that important groups in Ukraine, Belarus, and Russia try to blame a large fraction of all the economic, social, and health problems of the last decade, which are substantial (∼6 yr less life expectancy, twice the homicides and traffic deaths, increased alcoholism, and so forth), on radiation from the Chernobyl accident in an effort to attract more support. Western scientists refute such claims but admit to large non-radiation-related problems caused by the accident.

  16. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers that make a leading position possible, but only if companies ready themselves for the next big data wave.

  17. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  18. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  19. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  20. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  1. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  2. Stress corrosion cracking of stainless steel AISI 316L HAZ in PWR nuclear reactor environment

    Directory of Open Access Journals (Sweden)

    Mônica Maria de Abreu Mendonça Schvartzman

    2009-09-01

    Low-alloy carbon steels and stainless steels are widely used in the primary circuits of PWR (Pressurized Water Reactor) nuclear plants. Nickel alloys are employed in the welding of these materials because of characteristics such as high mechanical strength and corrosion resistance, a suitable coefficient of thermal expansion, etc. Over the last 30 years, stress corrosion cracking (SCC) has been observed mainly in the regions of the dissimilar-metal welds present in these reactors. The aim of this work was to assess, by comparison, the susceptibility to stress corrosion cracking of the heat-affected zone (HAZ) of the austenitic stainless steel AISI 316L when exposed to an environment similar to that of the primary circuit of a PWR nuclear reactor at temperatures of 303 °C and 325 °C. The slow strain rate test (SSRT) was used for this assessment. The results indicated that SCC is thermally activated and that at 325 °C a more significant presence of brittle fracture resulting from the stress corrosion process can be observed.
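    In SSRT work, susceptibility is commonly quantified by comparing ductility measured in the aggressive environment against a test in an inert reference environment. A minimal sketch of such a ductility-loss index follows; the formula choice and the sample elongation values are assumptions for illustration, not results from this study:

    ```python
    # Sketch of an SCC susceptibility index from slow-strain-rate test
    # (SSRT) data: the fractional loss of ductility relative to an
    # inert-environment test. Near 0: little susceptibility; near 1: severe.

    def scc_susceptibility(elong_env: float, elong_inert: float) -> float:
        """Ductility-loss index: 1 - (elongation in environment / inert)."""
        return 1.0 - elong_env / elong_inert

    # Hypothetical elongations (%) -- not the paper's measurements.
    elong_inert = 40.0
    index_low_t = scc_susceptibility(30.0, elong_inert)   # e.g. 303 degC test
    index_high_t = scc_susceptibility(18.0, elong_inert)  # e.g. 325 degC test
    print(index_low_t, index_high_t)  # a larger index at the higher temperature
    ```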

  3. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles. However, Big Data is also recognized in the business world and, increasingly, in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  4. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  5. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, so often mentioned in relation to big data, stand for? By way of introduction to

  6. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  7. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  8. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  9. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is and will increasingly be used as a tool for everything that happens both online and offline. Online use, of course, is a real habit: Big Data is found throughout this medium, offering many advantages and real help for all consumers. In this paper we discuss Big Data as a plus in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architecture principles of this technology. The most important contribution of this paper is presented in the cloud section.

  10. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  11. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  12. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Ariel Hamlin, Nabil... Chapter 1, "Cryptography for Big Data Security", opens: "With the amount

  13. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  14. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects … shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact…

  15. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution … and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  16. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; we therefore compare and contrast the two geometries throughout.

  17. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  18. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. This value is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner, using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on worldwide processes.

  19. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  20. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  1. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two…

  2. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this study set out to examine the state of the art of Big Data exhaustively; as a second objective, it analysed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so as to cover everything concerning the central topic of the research. The methodology consisted of reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; presenting some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and showing data models and the technologies for analysing them, finishing with some benefits of Big Data. The methodological design of the research was non-experimental, since no variables are manipulated, and exploratory, since this research is a first step into the Big Data field.

  3. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  4. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  5. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.
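The abstract's claim that data-driven fits "fail in circumstances beyond the range of the data used to train them" can be shown with a minimal, self-contained illustration (not from the paper): a least-squares line fitted to samples of y = x² on [0, 2] predicts tolerably inside that interval but is wildly wrong at x = 10.

```python
# Least-squares line fitted to y = x**2 sampled on the interval [0, 2].
xs = [i / 10 for i in range(21)]  # training range 0.0 .. 2.0
ys = [x * x for x in xs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def predict(x):
    return intercept + slope * x

print(abs(predict(1.0) - 1.0))     # in-range error, about 0.37
print(abs(predict(10.0) - 100.0))  # extrapolation error, about 80.6
```

The model "fits the curve" well where data exists, but nothing in it encodes the quadratic structure of the underlying system, which is the paper's point about purely data-driven methods.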

  6. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  7. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.

  8. Farewell to a Big and Rich Nuclear Power Club?

    International Nuclear Information System (INIS)

    Takeda, A.

    2001-01-01

    For the last few decades of the 20th century, we have seen a large number of big nuclear power plants built and operated in a few rich countries such as the United States, France, Germany, the United Kingdom, and Japan. They have standardized the 1000 MWe-type light water reactors, which have an actual generating capacity of more than 1100 MW. (author)

  9. ABACC ten years applying safeguards

    International Nuclear Information System (INIS)

    Palacios, Elias

    2001-01-01

    The Argentinian-Brazilian Agency for Accounting and Control of nuclear special materials has been in operation for ten years. The rationale behind the creation of the Agency and the work it has performed during the last decade are described. (author)

  10. The Concept of the Use of the Marine Reactor Plant in Small Electric Grids

    International Nuclear Information System (INIS)

    Khlopkin, N.; Makarov, V.; Pologikh, B.

    2002-01-01

    This report considers aspects of using marine nuclear reactors to supply small, non-interconnected power systems, as well as isolated settlements and mining enterprises located in regions with an undeveloped infrastructure. Small modular nuclear power plants have recently been proposed for these purposes. The plant power required for small electric grids ranges from one to several tens of MWe. A module can be assembled and tested at a machine-building plant and then delivered ready-made to its working site by some means of transport, for instance a barge. After a certain time the module can be transported to a repair shop, and, at the end of its operation, to a storage site. Marine nuclear reactors, in their power, compactness, mass and size, are ideal prototypes for such modules. For instance, the floating power unit currently under construction, intended for operation in the Russian North, is based on the reactor plants of nuclear icebreakers. The reliability and safety of ship reactors are confirmed by their trouble-free operation over approximately 180 reactor-years. Unlike a big stationary nuclear plant working in base-load mode, a power unit with a marine reactor is fully capable of working in load-following mode. In contrast with a nuclear icebreaker reactor, it is advisable to increase the core lifetime and to reduce the enrichment of the uranium. This requires fuel compositions with higher uranium capacity and a corresponding core design; in particular, a transition from the channel-type core traditional for ship reactors to a cassette design is possible. Other directions of evolution of ship reactors are possible that do not touch the basic, practice-proven design decisions but promote the self-protection properties of the plant; among them is a reduction of the volumetric power density of the core. (author)

  11. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general-purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
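A typical declarative data quality rule of the kind such systems accept is a functional dependency, e.g. zipcode → city. The sketch below is hypothetical Python, sequential and in-memory rather than distributed, and does not use BigDansing's actual interface; it only illustrates the group-wise violation detection that the system scales up:

```python
from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Find groups of rows that violate the functional dependency lhs -> rhs:
    rows agreeing on all lhs attributes but disagreeing on the rhs attribute
    (e.g. the same zipcode mapped to two different cities)."""
    values = defaultdict(set)   # lhs key -> distinct rhs values seen
    members = defaultdict(list) # lhs key -> indices of rows in the group
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in lhs)
        values[key].add(row[rhs])
        members[key].append(i)
    # A key with more than one rhs value marks a violating group of rows.
    return [members[k] for k, vals in values.items() if len(vals) > 1]

rows = [
    {"zipcode": "10001", "city": "New York"},
    {"zipcode": "10001", "city": "NYC"},      # violates zipcode -> city
    {"zipcode": "60601", "city": "Chicago"},
]
print(fd_violations(rows, lhs=["zipcode"], rhs="city"))  # [[0, 1]]
```

On big datasets this grouping becomes a costly distributed computation (shuffles, joins), which is exactly the part BigDansing optimizes with shared scans and specialized join operators.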

  12. The origin of the future ten questions for the next ten years

    CERN Document Server

    Gribbin, John

    2006-01-01

    How did the universe begin? Where do galaxies come from? How do stars and planets form? Where do the material particles we are made of come from? How did life begin? Today we have only provisional answers to such questions. But scientific progress will improve these answers dramatically over the next ten years, predicts John Gribbin in this riveting book. He focuses on what we know—or think we know—about ten controversial, unanswered issues in the physical sciences and explains how current cutting-edge research may yield solutions in the very near future. With his trademark facility for engaging readers with or without a scientific background, the author explores ideas concerning the creation of the universe, the possibility of other forms of life, and the fate of the expanding cosmos. He examines “theories of everything,” including grand unified theories and string theory, and he discusses the Big Bang theory, the origin of structure and patterns of matter in the galaxies, and dark mass and dark ene...

  13. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only … framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  14. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurement data – gathered by media industries for profit-making purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption … communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according to a commercial logic (boyd & Crawford 2011) and are as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973), scholars need to question how ethnographic fieldwork might map the ‘data not seen…

  15. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms, with "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
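The supervised/unsupervised distinction the review introduces can be sketched in a few lines. The code below is an illustration in plain Python (not one of the R packages or webservers the review covers): a 1-nearest-neighbour classifier learns from labelled examples, while a tiny one-dimensional k-means finds two clusters without any labels.

```python
import math

# Supervised: 1-nearest-neighbour classification from labelled examples.
def knn_predict(train, point):
    """train: list of (features, label) pairs; returns the label of the
    training example closest to `point` in Euclidean distance."""
    return min(train, key=lambda t: math.dist(t[0], point))[1]

# Unsupervised: one-dimensional k-means with k=2; no labels needed.
def kmeans2(values, iters=10):
    c1, c2 = min(values), max(values)  # initialize centroids at the extremes
    for _ in range(iters):
        a = [v for v in values if abs(v - c1) <= abs(v - c2)]
        b = [v for v in values if abs(v - c1) > abs(v - c2)]
        c1, c2 = sum(a) / len(a), sum(b) / len(b)
    return sorted([c1, c2])

train = [((1.0, 1.0), "low"), ((1.2, 0.8), "low"), ((8.0, 9.0), "high")]
print(knn_predict(train, (7.5, 8.5)))           # "high"
print(kmeans2([1.0, 1.5, 0.5, 9.0, 9.5, 8.5]))  # [1.0, 9.0]
```

The supervised model needs labelled training data; the clustering step discovers the two groups from the raw values alone, mirroring the division the review draws.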

  17. Ten Lessons from Ten Years PPP Experience in Belgium

    NARCIS (Netherlands)

    Willems, T.; Verhoest, K.; Voets, J.; Coppens, T.; van Dooren, W.; van den Hurk, M.

    2017-01-01

    In 2004 Flanders, the northern region of Belgium launched a range of large public–private partnership (PPP) projects for a total value of 6 billion euros. Ten years later, PPP has become a well-embedded procurement method for long-term public infrastructure projects. This article makes a critical

  18. Recommendations for a restart of Molten Salt Reactor development

    International Nuclear Information System (INIS)

    Moir, R. W.

    2007-01-01

    The concept of the molten salt reactor (MSR) refuses to go away. The Generation-IV process lists the MSR as one of the six concepts to be considered for extending fuel resources. Good fuel utilization and good economics are required to meet the often cited goal of 10 TWe globally and 1 TWe for the US from non-carbon energy sources in this century by nuclear fission. A strong incentive for the molten salt reactor design is its good fuel utilization, good economics, amazing flexibility and promised large benefits. It can: - use thorium or uranium; - be designed with lots of graphite to have a fairly thermal neutron spectrum, or without graphite moderator to have a fast neutron spectrum; - fission uranium isotopes and plutonium isotopes; - operate with non-weapon-grade fissile fuel or, in suitable sites, with enrichment between reactor-grade and weapon-grade fissile fuel; - be a breeder or near-breeder; - operate at temperatures >1100 °C if carbon composites are successfully employed. Enhancing the 232U content of the uranium to over 500 ppm makes the fuel undesirable for weapons but should not detract from its economic use in liquid-fuel reactors: a big advantage in nonproliferation. The economics of the MSR are enhanced by operating at low pressure and high temperature, and may even lead to the preferred route to hydrogen production. The cost of the electricity produced from low-enriched fuel, averaged over the life of the entire process, has been predicted to be about 10% lower than that from LWRs, and 20% lower for high-enriched fuel, with uncertainties of about 10%. The development cost has been estimated at about 1 B$ (e.g., a 100 M$/y base program for ten years), not including construction of a series of reactors leading up to the deployment of multiple commercial units at an assumed cost of 9 B$ (450 M$/y over 20 years).
A benefit of liquid fuel is that smaller power reactors can faithfully test features of larger reactors, thereby reducing the

  19. Passport to the Big Bang moves across the road

    CERN Document Server

    Corinne Pralavorio

    2015-01-01

    The ATLAS platform of the Passport to the Big Bang circuit has been relocated in front of the CERN Reception.   The ATLAS platform of the Passport to the Big Bang, outside the CERN Reception building. The Passport to the Big Bang platform of the ATLAS Experiment has been moved in front of the CERN Reception to make it more visible and accessible. It had to be dismantled and moved from its previous location in the garden of the Globe of Science and Innovation due to the major refurbishment work in progress on the Globe, and is now fully operational in its new location on the other side of the road, in the Main Reception car-park. The Passport to the Big Bang circuit, inaugurated in 2013, comprises ten platforms installed in front of ten CERN sites and aims to help local residents and visitors to the region understand CERN's research. Dedicated Passport to the Big Bang flyers, containing all necessary information and riddles for you to solve, are available at the CERN Rec...

  20. Stability analysis for the Big Dee upgrade of the Doublet III tokamak

    International Nuclear Information System (INIS)

    Helton, F.J.; Luxon, J.L.

    1987-01-01

    Ideal magnetohydrodynamic stability analysis has been carried out for configurations expected in the Big Dee tokamak, an upgrade of the Doublet III tokamak into a non-circular cross-section device which began operation early in 1986. The results of this analysis support theoretical predictions as follows. Since the maximum value of beta stable to ballooning and Mercier modes, which we denote βc, increases with inverse aspect ratio, elongation and triangularity, the Big Dee is particularly suited to obtaining high values of βc, and high-βc Big Dee equilibria exist for large variations in all relevant plasma parameters. The beta limits for the Big Dee are consistent with established theory as summarized in present scaling laws. High-beta Big Dee equilibria are continuously accessible when approached through changes in all relevant input parameters and are structurally stable with respect to variations of input plasma parameters. Big Dee beta limits have a smooth dependence on plasma parameters such as βp and elongation. These calculations indicate that in the actual running of the device the Big Dee high-beta equilibria should be smoothly accessible. Theory predicts that the limiting plasma parameters, such as beta, total plasma current and plasma pressure, which can be obtained within the operating limits of the Big Dee are reactor relevant. Thus the Big Dee should be able to use its favourable ideal MHD scaling and controlled plasma shaping to attain reactor-relevant parameters in a moderate-sized device. (author)

  1. Reactor feedwater device

    International Nuclear Information System (INIS)

    Igarashi, Noboru.

    1986-01-01

    Purpose: To suppress soluble radioactive corrosion products in a feedwater device. Method: In a light-water-cooled nuclear reactor, an iron injection system is connected to the feedwater pipeways and the iron concentration in the feedwater or reactor coolant is adjusted to between two and ten times the nickel concentration. When the nickel/iron ratio in the reactor coolant or feedwater approaches 1/2, iron ions are injected together with iron particles into the reactor coolant to suppress the leaching of stainless steels, decrease the nickel in the water and increase the iron concentration. As a result, it is possible to suppress the intrusion of nickel, one of the parent nuclides of the radioactive nuclides. Further, since the iron particles introduced into the reactor constitute nuclei for capturing the radioactive nuclides, reducing the soluble radioactive corrosion products, the radioactive nuclides deposited uniformly on the inside of the pipeways in each of the coolant circuits can be reduced. (Kawakami, Y.)
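    The two-to-ten-times band described in the abstract can be expressed as a small numeric check. A minimal sketch, assuming ppb units and a simple "inject the shortfall" rule; the function name and semantics are illustrative, not part of the patent abstract:

    ```python
    def iron_injection_needed(fe_ppb, ni_ppb, low=2.0, high=10.0):
        """Return the extra iron (ppb) to inject so the iron concentration
        reaches at least `low` times the nickel concentration, per the
        abstract's two-to-ten-times band. Returns 0.0 if iron is already
        inside the band; raises if iron already exceeds the upper bound,
        since injection cannot lower it."""
        target_min = low * ni_ppb
        if fe_ppb < target_min:
            # nickel/iron ratio has drifted above 1/2: inject the shortfall
            return target_min - fe_ppb
        if fe_ppb > high * ni_ppb:
            raise ValueError("iron already above the upper band")
        return 0.0
    ```

    For example, with 5 ppb iron against 4 ppb nickel the ratio is below the 2x floor, so 3 ppb of injected iron restores the band.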

  2. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  3. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  4. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  5. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  6. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  7. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  8. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  9. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they make the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions

  10. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  11. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  12. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  13. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  14. A Ten-Year Reflection

    Science.gov (United States)

    Phillip, Cyndi

    2016-01-01

    Five initiatives launched during Cyndi Phillip's term as American Association of School Librarians (AASL) President (2006-2007) continue to have an impact on school librarians ten years later. They include the rewriting of AASL's learning standards, introduction of the SKILLS Act, the presentation of the Crystal Apple Award to Scholastic Library…

  15. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated, especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines by up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
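    The full IEJoin algorithm handles pairs of inequality predicates using permutation and bit-arrays; as a much-simplified illustration of the underlying idea only (shrinking the search space of a single `<` predicate by sorting plus binary search instead of an all-pairs scan), one might sketch:

    ```python
    from bisect import bisect_right

    def inequality_join(left, right):
        """All index pairs (i, j) with left[i] < right[j].
        Instead of comparing every pair, sort the right side once;
        for each left value, one binary search locates the suffix of
        sorted right values that satisfies the predicate.
        This is a didactic simplification, not the IEJoin algorithm."""
        order = sorted(range(len(right)), key=lambda j: right[j])
        keys = [right[j] for j in order]
        pairs = []
        for i, l in enumerate(left):
            # every right value strictly greater than l joins with l
            k = bisect_right(keys, l)
            for j in order[k:]:
                pairs.append((i, j))
        return pairs
    ```

    The sort and searches cost O(n log n) before output is enumerated, versus O(n²) comparisons for the naive scan.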

  16. New research reactor proposed for Australia

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    A new research reactor has been proposed for construction within the next ten years, to replace the HIFAR reactor, whose operating capabilities have been overtaken by later designs. This paper outlines the main research applications of the new reactor design and briefly examines issues related to its cost, economic benefits, safety and location

  17. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  18. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  19. Integrated plant safety assessment. Systematic evaluation program, Big Rock Point Plant (Docket No. 50-155). Final report

    International Nuclear Information System (INIS)

    1984-05-01

    The Systematic Evaluation Program was initiated in February 1977 by the U.S. Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. The review provides (1) an assessment of how these plants compare with current licensing safety requirements relating to selected issues, (2) a basis for deciding how these differences should be resolved in an integrated plant review, and (3) a documented evaluation of plant safety when the supplement to the Final Integrated Plant Safety Assessment Report has been issued. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  20. Ten per cent more grain

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1967-08-15

    At a low estimate, ten per cent of stored grain is lost every year to insect pests. In this article, based on a lecture given earlier this year in Switzerland, Dr. Harry E. Goresline, Food Radiation Specialist of the Food and Agriculture Organisation, now assisting the Joint FAO/IAEA Division of Atomic Energy in Food and Agriculture, explains how use of radiation can help to prevent losses and the research which has taken place to ensure its safety

  1. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  2. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  3. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  4. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  5. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  6. Super critical water reactors

    International Nuclear Information System (INIS)

    Dumaz, P.; Antoni, O; Arnoux, P.; Bergeron, A; Renault, C.; Rimpault, G.

    2005-01-01

    Water is used as coolant and moderator in most of the major nuclear power plants currently in operation. In pressurized water reactors (PWR) and boiling water reactors (BWR), water is maintained below its critical point (221 bar, 374 °C), which limits the efficiency of the thermodynamic cycle of energy conversion (about 33%). Crossing the critical point, one can then use "supercritical water": the attainable pressures and temperatures allow significant efficiency gains. In addition, supercritical water has important properties. In particular, vapor and liquid can no longer coexist, so there is no boiling crisis, one of the phenomena which limits the specific power of PWRs and BWRs. Since the 1950s, supercritical water reactors have been the subject of more or less detailed studies, but were then set aside. From the early 1990s, this type of concept has attracted renewed interest, and within the international "Generation IV" framework, supercritical water reactors have been retained as one of the main options for study as Generation IV reactors. The CEA has engaged in activity in this field through participation in a European program, the HPLWR (High Performance Light Water Reactor). In this context, the R and D studies are focused on the fields of neutronics, thermodynamics and materials. The CEA intends to pursue a limited R and D effort in this field, in the framework of international cooperation, with preference for the study of fast-spectrum versions. (author)

  7. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  8. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented and narrative. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  9. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented and narrative. Volume 8 conveys, in an understandable way, the theory of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology and bionics.

  10. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  11. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented and narrative. In Volume 7, besides an introduction, many current aspects of quantum mechanics (e.g. quantum teleportation) and electrodynamics (e.g. electrosmog) are treated, as well as climate issues and chaos theory.

  12. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short-range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above Λ_QCD, which generica...

  13. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  14. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 × 10^12 /cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed to a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed to a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  15. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  16. Flow of catalyst particles in a flue gas desulfurization plant; mass transfer in the domain of a detached flow - two examples (desulfurization, HTGR type reactor) for the application of big computers solving technical problems

    International Nuclear Information System (INIS)

    Achenbach, E.

    1988-01-01

    The research work of the Institute for Reactor Components is mainly experimental in character. Where possible, the experiments are accompanied by numerical calculations. This has the advantage of rendering parameter studies faster and more economical than is the case with experiments, so that physical contexts can become more apparent. However, these calculations are no substitute for experiments. The application of numerical calculations in connection with experimental results can now be demonstrated with two examples. The examples have been selected with the aim of making the presentation of the results sufficiently interesting for all those participating at the colloquium. The theoretical and experimental results are presented in the form of short films. (orig.) [de

  17. SETI as a part of Big History

    Science.gov (United States)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then extrapolated even that into the future up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we show that the Statistical Drake Equation (namely the statistical extension of the classical Drake Equation typical of SETI) can be regarded as the “frozen in time” part of GBM. This makes SETI a subset of our Big History theory based on GBMs: just as the GBM is the “movie” unfolding in time, so the Statistical Drake Equation is its “still picture”, static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may be easily described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50 million living species or more exist, each
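    The GBM the abstract invokes, N(t) = N₀ exp((μ − σ²/2)t + σW(t)), can be simulated with its exact log-normal update. A minimal sketch under illustrative assumptions: the drift μ, volatility σ and the species-count reading of N(t) are placeholders, not values from the paper:

    ```python
    import math
    import random

    def gbm_path(n0, mu, sigma, t_max, steps, seed=0):
        """One Geometric Brownian Motion path on a uniform time grid,
        using the exact update N(t+dt) = N(t)*exp((mu - sigma^2/2)dt
        + sigma*sqrt(dt)*Z) with Z ~ N(0, 1)."""
        rng = random.Random(seed)
        dt = t_max / steps
        n = n0
        path = [n]
        for _ in range(steps):
            z = rng.gauss(0.0, 1.0)
            n *= math.exp((mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
            path.append(n)
        return path
    ```

    A useful sanity check is the degenerate case σ = 0, where the path collapses to deterministic exponential growth n0·exp(μ·t), the "increasing" trend the abstract attributes to the number of living species.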

  18. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  19. Nuclear reactors

    International Nuclear Information System (INIS)

    Barre, Bertrand

    2015-10-01

    After some remarks on the nuclear fuel, on the chain reaction control, on fuel loading and unloading, this article proposes descriptions of the design, principles and operations of different types of nuclear reactors as well as comments on their presence and use in different countries: pressurized water reactors (design of the primary and secondary circuits, volume and chemistry control, backup injection circuits), boiling water reactors, heavy water reactors, graphite and boiling water reactors, graphite-gas reactors, fast breeder reactors, and fourth generation reactors (definition, fast breeding). For these last ones, six concepts are presented: sodium-cooled fast reactor, lead-cooled fast reactor, gas-cooled fast reactor, high temperature gas-cooled reactor, supercritical water-cooled reactor, and molten salt reactor

  20. Nuclear reactor PBMR and cogeneration; Reactor nuclear PBMR y cogeneracion

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez S, J. R.; Alonso V, G., E-mail: ramon.ramirez@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    In recent years the costs of nuclear reactor designs for electricity generation have increased; current costs run around 5000 USD per installed kW, which means a big nuclear plant requires investments on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big plant by dividing the power into parts and dividing the components into modules to lower production costs: one module can be built and finished before construction of the next begins, deferring the long-term investment and therefore reducing investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, since the thermal energy of the reactor can be used to feed other processes such as water desalination, steam generation for process industries like petrochemicals, or even the possible production of hydrogen for use as fuel. In this work, the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor is described. (Author)

  1. Ten questions about systems biology

    DEFF Research Database (Denmark)

    Joyner, Michael J; Pedersen, Bente K

    2011-01-01

    In this paper we raise 'ten questions' broadly related to 'omics', the term systems biology, and why the new biology has failed to deliver major therapeutic advances for many common diseases, especially diabetes and cardiovascular disease. We argue that a fundamentally narrow and reductionist...... to understand how whole animals adapt to the real world. We argue that a lack of fluency in these concepts is a major stumbling block for what has been narrowly defined as 'systems biology' by some of its leading advocates. We also point out that it is a failure of regulation at multiple levels that causes many...

  2. Ten questions about systems biology

    DEFF Research Database (Denmark)

    Joyner, Michael J; Pedersen, Bente K

    2011-01-01

    to understand how whole animals adapt to the real world. We argue that a lack of fluency in these concepts is a major stumbling block for what has been narrowly defined as 'systems biology' by some of its leading advocates. We also point out that it is a failure of regulation at multiple levels that causes many......In this paper we raise 'ten questions' broadly related to 'omics', the term systems biology, and why the new biology has failed to deliver major therapeutic advances for many common diseases, especially diabetes and cardiovascular disease. We argue that a fundamentally narrow and reductionist...

  3. Ten Thousand Years of Solitude?

    International Nuclear Information System (INIS)

    Benford, G.; Pasqualetti, M.J.

    1991-03-01

    This report documents the authors' work as an expert team advising the US Department of Energy on modes of inadvertent intrusion over the next 10,000 years into the Waste Isolation Pilot Project (WIPP) nuclear waste repository. Credible types of potential future accidental intrusion into the WIPP are estimated as a basis for creating warning markers to prevent inadvertent intrusion. A six-step process is used to structure possible scenarios for such intrusion, and it is concluded that the probability of inadvertent intrusion into the WIPP repository over the next ten thousand years lies between one and twenty-five percent. 3 figs., 5 tabs

  4. Ten Thousand Years of Solitude

    Energy Technology Data Exchange (ETDEWEB)

    Benford, G. (Los Alamos National Lab., NM (USA) California Univ., Irvine, CA (USA). Dept. of Physics); Kirkwood, C.W. (Los Alamos National Lab., NM (USA) Arizona State Univ., Tempe, AZ (USA). Coll. of Business Administration); Harry, O. (Los Alamos National Lab., NM (USA)); Pasqualetti, M.J. (Los Alamos National Lab., NM (USA) Arizona State Univ., Tempe, AZ (USA))

    1991-03-01

    This report documents the authors' work as an expert team advising the US Department of Energy on modes of inadvertent intrusion over the next 10,000 years into the Waste Isolation Pilot Project (WIPP) nuclear waste repository. Credible types of potential future accidental intrusion into the WIPP are estimated as a basis for creating warning markers to prevent inadvertent intrusion. A six-step process is used to structure possible scenarios for such intrusion, and it is concluded that the probability of inadvertent intrusion into the WIPP repository over the next ten thousand years lies between one and twenty-five percent. 3 figs., 5 tabs.

  5. Ten questions on nuclear wastes

    International Nuclear Information System (INIS)

    Guillaumont, R.; Bacher, P.

    2004-01-01

    The authors give explanations and answers to ten issues related to nuclear wastes: when a radioactive material becomes a waste, how radioactive wastes are classified and particularly nuclear wastes in France, what are the risks associated with radioactive wastes, whether the present management of radioactive wastes is well controlled in France, which wastes are raising actual problems and what are the solutions, whether amounts and radio-toxicity of wastes can be reduced, whether all long life radionuclides or part of them can be transmuted, whether geologic storage of final wastes is inescapable, whether radioactive material can be warehoused over long durations, and how the information on radioactive waste management is organised

  6. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Ten years of nuclear power

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1964-08-15

    Ten years have elapsed since the world's first nuclear power station began to supply electricity in Russia, and this in turn marked the end of a twelve year stage following the first controlled nuclear chain reaction at Chicago. These periods mark major stages in the development of atomic energy from the realm of abstract ideas to that of everyday industrial application. They followed a period of fundamental research and laboratory work, culminating in Enrico Fermi's demonstration of a system whereby the forces of the atom could be brought under control. Then it was necessary to find ways and means of using the chain reaction for practical purposes and on an industrial scale. And after this had been shown in 1954 to be technically possible, it had still to be developed into an economic process. The nuclear power station has proved itself from the technical and engineering standpoint. The third phase of development has been to bring it to the stage of being economically competitive with alternative sources of energy, and it would appear that we are now reaching that goal - though more slowly than had been envisaged ten years ago

  8. Measuring Public Acceptance of Nuclear Technology with Big data

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seugkook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions on nuclear generation of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying cheap electricity produced by them, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows targeted marketing when policy is executed.

  9. Measuring Public Acceptance of Nuclear Technology with Big data

    International Nuclear Information System (INIS)

    Roh, Seugkook

    2015-01-01

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions on nuclear generation of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying cheap electricity produced by them, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows targeted marketing when policy is executed

  10. Nuclear reactor PBMR and cogeneration

    International Nuclear Information System (INIS)

    Ramirez S, J. R.; Alonso V, G.

    2013-10-01

    In recent years the costs of nuclear reactor designs for electricity generation have increased; current costs run around 5000 USD per installed kW, which means a big nuclear plant requires investments on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big plant by dividing the power into parts and dividing the components into modules to lower production costs: one module can be built and finished before construction of the next begins, deferring the long-term investment and therefore reducing investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, since the thermal energy of the reactor can be used to feed other processes such as water desalination, steam generation for process industries like petrochemicals, or even the possible production of hydrogen for use as fuel. In this work, the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor is described. (Author)

  11. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and advance education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education, and critically explores the perils of applying a data-driven approach to education. Despite the claimed value of the...

  12. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  13. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Also, the methodology and working of a system that will use this data is briefly described.

  14. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, the relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, aspects of the general theory of relativity such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  15. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through the online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas, brainstorming solutions in a discussion forum. There is an ongoing database of student measurements and another database collecting data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
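    The Eratosthenes calculation described in the abstract is a one-line proportion: the difference between the two noon shadow angles is the fraction of a full 360-degree circle that the north-south distance between the sites spans. A minimal sketch (the function name and the classic Syene/Alexandria figures are illustrative, not from the project):

```python
def circumference_from_shadows(angle1_deg, angle2_deg, distance_km):
    """Estimate Earth's circumference from two local-noon shadow angles
    measured at sites separated north-south by distance_km.
    The angular difference over 360 degrees equals distance over circumference."""
    dtheta = abs(angle1_deg - angle2_deg)
    return 360.0 / dtheta * distance_km

# Classic approximate values: ~0 deg at Syene, ~7.2 deg at Alexandria,
# with the two cities roughly 800 km apart.
print(circumference_from_shadows(0.0, 7.2, 800.0))  # ≈ 40,000 km
```

    With 7.2 degrees being 1/50 of a circle, the 800 km baseline scales to about 40,000 km, close to Earth's true circumference.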

  16. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  17. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

    Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  18. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems, etc. all use piles of data which are further used to create reports in order to ensure continuity of the services they have to offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  19. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than the data itself, realizing that there may be only a few generic processes involved, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  20. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  1. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  2. Big Rock Point: 35 years of electrical generation

    International Nuclear Information System (INIS)

    Petrosky, T.D.

    1998-01-01

    On September 27, 1962, the 75 MWe boiling water reactor, designed and built by General Electric, of the Big Rock Point Nuclear Power Station went critical for the first time. The US Atomic Energy Commission (AEC) and the plant operator, Consumers Power, had designed the plant also as a research reactor. The first studies were devoted to fuel behavior, higher burnup, and materials research. The reactor was also used for medical technology: Co-60 radiation sources were produced for the treatment of more than 120,000 cancer patients. After the accident at the Three Mile Island-2 nuclear generating unit in 1979, Big Rock Point went through an extensive backfitting phase. Personnel from numerous other American nuclear power plants were trained at the simulator of Big Rock Point. The plant was decommissioned permanently on August 29, 1997 after more than 35 years of operation and a cumulated electric power production of 13,291 GWh. A period of five to seven years is estimated for decommissioning and demolition work up to the 'green field' stage. (orig.)

  3. Classical propagation of strings across a big crunch/big bang singularity

    International Nuclear Information System (INIS)

    Niz, Gustavo; Turok, Neil

    2007-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z2, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001).][P. J. Steinhardt and N. Turok, Science 296, 1436 (2002).][N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004).]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example

  4. The Big Build

    Science.gov (United States)

    Haigh, Sarah; Bell, Christopher; Ruta, Chris

    2017-01-01

    This article provides details of a successful educational engineering project run in partnership between a group of ten schools and an international engineering, construction and technical services company. It covers the history and evolution of the project and highlights how the project has significant impact not only on the students involved but…

  5. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  6. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, 3He, 4He, and 7Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and 4He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, 7Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed

  7. H Reactor

    Data.gov (United States)

    Federal Laboratory Consortium — The H Reactor was the first reactor to be built at Hanford after World War II. It became operational in October of 1949, and represented the fourth nuclear reactor on...

  8. Ten out of ten for LHC decapole magnets

    CERN Multimedia

    2001-01-01

    CERN's Albert Ijspeert (left) and Avinash Puntambekar of the Indian CAT laboratory with the ten Indian decapole magnets on the test bench. Tests will be carried out by the LHC-MTA group. A batch of 10 superconducting decapole magnets for the LHC has just arrived at CERN from India. These will be used to correct for slight imperfections in the dipole magnets that will steer proton beams around CERN's new accelerator. All magnets have slight imperfections in the fields they produce, and in the LHC dipoles these will be corrected for using sextupoles and decapoles. The sextupoles were the first LHC magnets to be given the production green-light following successful tests of pre-series magnets last year (Bulletin 21/2000, 22 May 2000). Now it is the turn of pre-series decapoles to go on trial at CERN. Of the LHC's 1232 dipole magnets, half will use sextupole correctors only and the other half will use both sextupoles and decapoles. That means that a total of 616 pairs of decapoles are needed. Like the sextupole...

  9. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Data science methods are emerging to ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  10. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and to search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  11. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it also brings with it a range of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  12. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external conditions and competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  13. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I am involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for the industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed to information and be used to make 'smart' decisions.

  14. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launches the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, at a major public event. Poster and programme.

  15. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  16. Resolution 1540, ten years on

    International Nuclear Information System (INIS)

    Hautecouverture, Benjamin

    2014-06-01

    Adopted on 28 April 2004 by the United Nations Security Council under Chapter VII of the UN Charter, Resolution 1540 is a composite tool that was hitherto unprecedented. To recap, States are bound to 'refrain from providing any form of support to non-State actors that attempt to develop, acquire, manufacture, possess, transport, transfer or use nuclear, chemical or biological weapons and their means of delivery' (par. 1), and to prohibit and prevent non-State actors from the aforementioned through 'appropriate and effective' (par. 2, 3) legal, judicial, and administrative means. A Committee was established to which States had to submit a first report outlining the steps 'they have taken or intend to take to implement this resolution' (par. 4). This Committee was initially established for two years and has been regularly renewed since, and its mandate was extended in 2011 for ten years. It is not a surveillance mechanism. Finally, with the aim of remedying difficulties that certain States may experience in implementing the Resolution, 'States in a position to do so' are invited to offer assistance (par. 7). The level of application of Resolution 1540 was originally based on a delicate three-pronged balance of obligation, good will, and partnership. The point is not to single out certain States before the rest of the international community; at the same time, the exercise should not be limited to the submission of national reports, but should instead aim to initiate a dynamic. The wager was a risky one. Ten years on, 90% of UN member States have submitted one or several implementation reports. 170 States and 50 international and regional organisations have taken part in outreach and implementation support events. Whatever quantitative or qualitative conclusions can be reached, we should continue to promote the Resolution's universal adoption, and to ensure that the implementation of its provisions is undertaken in a lasting manner, taking account of the national

  17. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data imply also specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges, which ultimately seems to be of greater importance than the sheer data volume.

  18. Safe havens in Europe: Switzerland and the ten dwarfs

    Directory of Open Access Journals (Sweden)

    Martin Paldam

    2013-12-01

    Full Text Available Eleven safe havens exist in Europe, providing offshore banking and low taxes. Ten of these states are very small, while Switzerland is moderately small. All 11 countries are richer than their large neighbors. It is shown that causality runs from small size to safe-haven status to wealth, and that equilibria are theoretically likely to exist in which a certain regulation is substantially lower in a small country than in its big neighbor. This generates a large capital inflow to the safe havens. The pool of funds that may reach the safe havens is shown to be huge. It is far in excess of the absorptive capacity of the safe havens, but it still explains why they are rich. Microstates offer a veil of anonymity to funds passing through, and Switzerland offers safe storage of funds.

  19. The big data telescope

    International Nuclear Information System (INIS)

    Finkel, Elizabeth

    2017-01-01

    On a flat, red mulga plain in the outback of Western Australia, preparations are under way to build the most audacious telescope astronomers have ever dreamed of - the Square Kilometre Array (SKA). Next-generation telescopes usually aim to double the performance of their predecessors. The Australian arm of SKA will deliver a 168-fold leap on the best technology available today, to show us the universe as never before. It will tune into signals emitted just a million years after the Big Bang, when the universe was a sea of hydrogen gas, slowly percolating with the first galaxies. Their starlight illuminated the fledgling universe in what is referred to as the “cosmic dawn”.

  20. The Big Optical Array

    International Nuclear Information System (INIS)

    Mozurkewich, D.; Johnston, K.J.; Simon, R.S.

    1990-01-01

    This paper describes the design and the capabilities of the Naval Research Laboratory Big Optical Array (BOA), an interferometric optical array for high-resolution imaging of stars, stellar systems, and other celestial objects. There are four important differences between the BOA design and the design of the Mark III Optical Interferometer on Mount Wilson (California): a long passive delay line will be used in BOA to do most of the delay compensation, so that the fast delay line will have a very short travel; the beam combination in BOA will be done in triplets, to allow measurement of closure phase; the same light will be used for both star and fringe tracking; and the fringe tracker will use several wavelength channels
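
    The closure-phase measurement mentioned in the abstract (beam combination in triplets) works because atmospheric and instrumental phase errors enter per telescope and cancel in a cyclic sum. As a standard textbook relation (not taken from the paper itself), with θ_i the phase error at telescope i:

    ```latex
    \phi_{ij}^{\mathrm{meas}} = \phi_{ij}^{\mathrm{true}} + \theta_i - \theta_j, \qquad
    \Phi_{123} \equiv \phi_{12}^{\mathrm{meas}} + \phi_{23}^{\mathrm{meas}} + \phi_{31}^{\mathrm{meas}}
    = \phi_{12}^{\mathrm{true}} + \phi_{23}^{\mathrm{true}} + \phi_{31}^{\mathrm{true}}
    ```

    The per-telescope terms θ_i cancel around the triangle, leaving an observable that depends only on the source structure.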

  1. Nonstandard big bang models

    International Nuclear Information System (INIS)

    Calvao, M.O.; Lima, J.A.S.

    1989-01-01

    The usual FRW hot big-bang cosmologies have been generalized by considering the equation of state ρ = Anm + (γ-1)⁻¹p, where m is the rest mass of the fluid particles and A is a dimensionless constant. Explicit analytic solutions are given for the flat case (ε = 0). For large cosmological times these extended models behave as the standard Einstein-de Sitter universes regardless of the values of A and γ. Unlike in the usual flat FRW case, the deceleration parameter q is a time-dependent function, and its present value, q ≅ 1, obtained from the luminosity distance versus redshift relation, may be fitted by taking, for instance, A = 1 and γ = 5/3 (a monatomic relativistic gas with mc² >> k_BT). In all cases the universe cools obeying the same temperature law as the FRW models, and it is shown that the age of the universe is only slightly modified. (author) [pt
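
    For reference, the abstract's equation of state and the deceleration parameter it invokes can be set out in standard notation (definitions only; the explicit solutions are in the paper itself):

    ```latex
    \rho = A n m + \frac{p}{\gamma - 1}, \qquad
    q \equiv -\frac{\ddot{a}\,a}{\dot{a}^{2}}
    ```

    with n the particle number density and a(t) the scale factor; setting A = 0 recovers the usual γ-law fluid p = (γ-1)ρ of the standard FRW models.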

  2. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  3. DPF Big One

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallel sessions sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark

  4. DPF Big One

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1993-01-15

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallel sessions sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  5. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  6. Compactified vacuum in ten dimensions

    International Nuclear Information System (INIS)

    Wurmser, D.

    1987-01-01

    Since the 1920s, theories which unify gravity with the other fundamental forces have called for more than the four observed dimensions of space-time. According to such a theory, the vacuum consists of flat four-dimensional space-time, described by the Minkowski metric M⁴, and a compactified space B. The dimensions of B are small, and the space can only be observed at distance scales smaller than the present experimental limit. These theories have had serious difficulties. The equations of gravity severely restrict the possible choices for the space B. The allowed spaces are complicated and difficult to study. The vacuum is furthermore unstable in the sense that a small perturbation causes the compactified dimensions to expand indefinitely. There is in addition a semi-classical argument which implies that the compactified vacuum is annihilated by virtual black holes. It follows that a universe with compactified extra dimensions could not have survived to the present. These results were derived by applying the equations of general relativity to spaces of more than four dimensions. The form of these equations was assumed to be unchanged by an increase in the number of dimensions. The author illustrates the effect of such terms by considering the example B = S⁶, where S⁶ is the six-dimensional sphere. Only when the extra terms are included is this choice of the compactified space allowed. He explores the effect of a small perturbation on such a vacuum. The ten-dimensional spherically symmetric potential is examined, and conditions are determined under which the formation of virtual black holes is forbidden. The example M⁴ × S⁶ is still plagued by the semi-classical instability, but this result does not hold in general. The requirement that virtual black holes be forbidden provides a test for any theory which predicts a compactified vacuum

  7. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  8. NCAA Money for Student Assistance Lands in Many Pockets, Big Ten Document Shows

    Science.gov (United States)

    Wolverton, Brad

    2013-01-01

    Amid a national debate about paying college athletes, the NCAA likes to tout its often-overlooked Student Assistance Fund, whose goal is to provide direct financial support to players. The fund--which draws from the association's multibillion-dollar media-rights deals--will distribute some $75-million this year to Division I athletes. The money…

  9. Transgender People at Four Big Ten Campuses: A Policy Discourse Analysis

    Science.gov (United States)

    Dirks, Doris Andrea

    2016-01-01

    This article examines the language used to discuss transgender people on university campuses. This study asks how, despite seemingly benefitting transgender people, the discourses carried by the documents that discuss trans people may actually undermine the intended goals of policy initiatives. For example, a report on the status of transgender…

  10. Backfitting of the FRG reactors

    International Nuclear Information System (INIS)

    Krull, W.

    1990-01-01

    The FRG research reactors: The GKSS research centre operates two pool-type research reactors fuelled with MTR-type fuel elements. The research reactors FRG-1 and FRG-2, with power levels of 5 MW and 15 MW, have been in operation for 31 years and 27 years respectively, making them comparably old to other research reactors. The reactors at present operate approximately 180 days (FRG-1) and between 210 and 250 days (FRG-2) per year. Both reactors are located in the same reactor hall in a connecting pool system. Backfitting measures are needed for these and other research reactors to ensure a high level of safety and availability. The main backfitting activities during the last ten years were concerned with: comparison of the existing design with today's demands (criteria, guidelines, standards, etc.); a probabilistic approach for external events such as aeroplane crashes and earthquakes; re-analysis of the main accidents, such as startup from low and full power, loss of coolant flow, loss of heat sink, loss of coolant and fuel plate melting; installation of a new reactor protection system, following today's demands; and installation of a new crane in the reactor hall. A cold neutron source has been installed to increase the flux of cold neutrons by a factor of 14. The FRG-1 is being converted from 93% enriched U with UAlx fuel to 20% enriched U with U₃Si₂ fuel. Both cooling towers were repaired. Replacement of instrumentation is planned

  11. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  12. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  13. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of Neuroticism. They found associations between empathy and the factors Openness, Agreeableness, Conscientiousness, and Extraversion. In our data, women have significantly higher scores both on empathy and on the Big Five...

  14. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  15. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article presents an analysis of 'Big Data', which has been discussed over the last 10 years. The reasons and factors behind the issue are revealed. It is shown that the factors creating the 'Big Data' issue have existed for quite a long time and, from time to time, have caused informational barriers. Such barriers were successfully overcome through science and technology. The analysis classifies the 'Big Data' issue as a form of information barrier. Framed this way, the issue may be solved correctly, and it encourages the development of scientific and computational methods.

  16. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.
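
    In the homogeneous limit that the abstract contrasts against, the approach to a big rip can be made concrete with the standard phantom-fluid result (a textbook relation, not taken from the paper itself): for a flat FRW universe with p = wρ and constant w < -1, the scale factor diverges at a finite time t_rip,

    ```latex
    a(t) \propto \left(t_{\mathrm{rip}} - t\right)^{\frac{2}{3(1+w)}}, \qquad w < -1
    ```

    Since 1 + w < 0, the exponent 2/[3(1+w)] is negative and a → ∞ as t → t_rip; the paper's question is how spatial gradients and anisotropic expansion behave as this limit is approached.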

  17. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  18. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  19. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in fields such as social networks, the internet of things, and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  20. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  1. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on basis of existing knowledge and experience with such types of estimation problems. 
The second task requires the computing power: explore the hyperspace to
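
    The bootstrap methods central to the book cited above can be sketched in a few lines. The following is a generic percentile bootstrap for a confidence interval of the mean, not code from the book; the function name, parameters, and toy series are all illustrative:

    ```python
    import random
    import statistics

    def bootstrap_ci(data, stat=statistics.mean, n_resamples=2000, alpha=0.05, seed=42):
        """Percentile bootstrap confidence interval for a statistic.

        Draws n_resamples resamples of the data with replacement, recomputes
        the statistic on each, and returns the empirical (alpha/2, 1 - alpha/2)
        quantiles of the replicate distribution.
        """
        rng = random.Random(seed)
        n = len(data)
        replicates = sorted(
            stat([data[rng.randrange(n)] for _ in range(n)])
            for _ in range(n_resamples)
        )
        lo = replicates[int(n_resamples * alpha / 2)]
        hi = replicates[int(n_resamples * (1 - alpha / 2)) - 1]
        return lo, hi

    # Toy 'proxy record': 200 noisy observations around a true mean of 10.0
    random.seed(0)
    series = [10.0 + random.gauss(0, 1) for _ in range(200)]
    lo, hi = bootstrap_ci(series)
    print(f"95% bootstrap CI for the mean: [{lo:.2f}, {hi:.2f}]")
    ```

    The percentile method makes no distributional assumption about the data, which is why bootstrap approaches suit climate proxy series with non-Gaussian noise; for serially correlated data, block resampling would be needed instead of the simple resampling shown here.
    
    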

  2. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), which currently spend about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just

  3. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), which currently spend about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples.
All in

  4. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization are investigated. Challenges faced by big data analysis and visualization are identified, and technologies for big data analysis are discussed. A review of methods and tools for big data visualization is given, and the functionalities of the tools are demonstrated by examples in order to highlight their advantages and disadvantages.

  5. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  6. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; et al.

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  7. Ten years of effective multilateralism?

    International Nuclear Information System (INIS)

    Hautecouverture, Benjamin

    2013-12-01

    successes of multilateralism and international law provide security assurances or do they mask the flaws of an insufficiently intrusive system? While the U.S. signaled their defiance under the guidance of John Bolton, European thinking settled on a conclusion with the air of a slogan: yes to multilateralism, providing it is effective. This was to be the cornerstone of the policies to undertake and the instruments to implement to forge a global role for the Union. Its originality is based simultaneously in realism and in a refusal to resort to force as the foremost means of action on the international stage. Another way of defining the 2003 Strategy is a skillful compromise between differing positions to banish the Iraqi 'cacophony' as quickly as possible. Ten years on, it is now time for 'impact assessments' and other 'scorecards' of public policies. In the midst of the Eurozone economic crisis, evaluating a strategy of 'effective multilateralism' is not easy. The Union's bi-annual reports on its implementation have recently begun to try to provide a quantitative analysis that is not entirely convincing (for instance, the number of ratifications of such-and-such an instrument with regard to the budget allocated by the EU to facilitate its universalization, over such-and-such a period). Fundamentally, the 2003 Strategy is essentially beyond this kind of analysis even if it can prove useful. The strength of the European approach consists in establishing a long-term willingness to maintain and strengthen collective security tools approved by the greatest possible number of States. Its weakness is to occasionally confront a strategic reality that is as contradictory as it is stubborn. Effective multilateralism is linked to voluntarism and vows. (author)

  8. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  9. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
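
    The tail-probability inflation this abstract describes can be illustrated with a short simulation. This is a hedged sketch, not the authors' method: the sample size, replication count, and the exponential (right-skewed) data model are hypothetical choices for illustration.

```python
# Illustrative sketch: how a skewed sampling distribution inflates small tail
# probabilities of the one-sample t-test. All sizes below are hypothetical.
import numpy as np
from scipy import stats

def lower_tail_rejection_rate(samples, mu0, alpha):
    """One-sided (lower-tail) t-test rejection rate across rows of `samples`."""
    n = samples.shape[1]
    t = (samples.mean(axis=1) - mu0) / (samples.std(axis=1, ddof=1) / np.sqrt(n))
    p = stats.t.cdf(t, df=n - 1)       # P(T <= t) under the t reference law
    return float((p < alpha).mean())

rng = np.random.default_rng(0)
m, n, alpha = 200_000, 10, 1e-3        # many tests, small samples, tiny level

# Under normal data the t reference distribution is exact, so the observed
# rate should sit near alpha; under a right-skewed null (exponential, mean 1)
# the lower tail of the t-statistic is inflated, so the rate exceeds alpha.
normal_rate = lower_tail_rejection_rate(rng.normal(1.0, 1.0, (m, n)), 1.0, alpha)
skewed_rate = lower_tail_rejection_rate(rng.exponential(1.0, (m, n)), 1.0, alpha)
print(f"nominal {alpha:.4f}  normal data {normal_rate:.4f}  skewed data {skewed_rate:.4f}")
```

    The gap between the two empirical rates at the same nominal level is the kind of departure the Edgeworth-expansion analysis quantifies.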

  10. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  11. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8)×10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  12. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  13. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  14. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  15. Reactor Physics

    International Nuclear Information System (INIS)

    Ait Abderrahim, A.

    2002-01-01

    SCK-CEN's Reactor Physics and MYRRHA Department offers expertise in various areas of reactor physics, in particular in neutron and gamma calculations, reactor dosimetry, reactor operation and control, reactor code benchmarking and reactor safety calculations. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 materials testing reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2001 are summarised

  16. Reactor Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ait Abderrahim, A

    2001-04-01

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised.

  17. Reactor Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ait Abderrahim, A

    2002-04-01

    SCK-CEN's Reactor Physics and MYRRHA Department offers expertise in various areas of reactor physics, in particular in neutron and gamma calculations, reactor dosimetry, reactor operation and control, reactor code benchmarking and reactor safety calculations. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 materials testing reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2001 are summarised.

  18. Reactor Physics

    International Nuclear Information System (INIS)

    Ait Abderrahim, A.

    2001-01-01

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised

  19. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
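
    The "spurious correlation" challenge named in this abstract admits a minimal simulation: with many independent noise features, the best absolute correlation with an unrelated response grows with dimensionality. The sketch below is illustrative only; the sample size and feature counts are hypothetical.

```python
# Spurious correlation from high dimensionality: pure-noise features can look
# strongly correlated with an unrelated response when there are many of them.
import numpy as np

def max_abs_corr(n, p, rng):
    """Largest |Pearson correlation| between y and p independent noise columns."""
    y = rng.standard_normal(n)
    X = rng.standard_normal((n, p))
    yc = (y - y.mean()) / y.std()
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    corr = yc @ Xc / n          # Pearson correlation, one value per column
    return float(np.abs(corr).max())

rng = np.random.default_rng(1)
low = max_abs_corr(n=50, p=10, rng=rng)      # few candidate features
high = max_abs_corr(n=50, p=5000, rng=rng)   # many candidate features
print(f"max |corr|, p=10: {low:.2f}   p=5000: {high:.2f}")
```

    Even though every feature is independent of y by construction, the best-looking correlation rises sharply as p grows, which is why variable-selection results on wide data need careful error-rate control.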

  20. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. It analyses existing definitions of the term "big data". The article proposes and describes the elements of a generalized formal model of big data and analyses the peculiarities of applying the proposed model's components. It describes the fundamental differences between Big Data technology and business analytics. Big Data is supported by the distributed file system Google File System ...

  1. Reactor operation

    CERN Document Server

    Shaw, J

    2013-01-01

    Reactor Operation covers the theoretical aspects and design information of nuclear reactors. This book is composed of nine chapters that also consider their control, calibration, and experimentation.The opening chapters present the general problems of reactor operation and the principles of reactor control and operation. The succeeding chapters deal with the instrumentation, start-up, pre-commissioning, and physical experiments of nuclear reactors. The remaining chapters are devoted to the control rod calibrations and temperature coefficient measurements in the reactor. These chapters also exp

  2. Reactor safeguards

    CERN Document Server

    Russell, Charles R

    1962-01-01

    Reactor Safeguards provides information for all who are interested in the subject of reactor safeguards. Much of the material is descriptive although some sections are written for the engineer or physicist directly concerned with hazards analysis or site selection problems. The book opens with an introductory chapter on radiation hazards, the construction of nuclear reactors, safety issues, and the operation of nuclear reactors. This is followed by separate chapters that discuss radioactive materials, reactor kinetics, control and safety systems, containment, safety features for water reactor

  3. Nuclear reactors

    International Nuclear Information System (INIS)

    Middleton, J.E.

    1977-01-01

    Reference is made to water cooled reactors and in particular to the cooling system of steam generating heavy water reactors (SGHWR). A two-coolant circuit is described for the latter. Full constructional details are given. (U.K.)

  4. Reactor decommissioning

    International Nuclear Information System (INIS)

    Lawton, H.

    1984-01-01

    A pioneering project on the decommissioning of the Windscale Advanced Gas-cooled Reactor, by the UKAEA, is described. Reactor data; policy; waste management; remote handling equipment; development; and recording and timescales, are all briefly discussed. (U.K.)

  5. Change of neutron flow sensors effectiveness in the course of reactor experiments

    International Nuclear Information System (INIS)

    Kurpesheva, A.M.; Kotov, V.M.; Zhotabaev, Zh.R.

    2007-01-01

    Full text: The IGR reactor is a thermal-capacity type reactor. During operation, the uranium-graphite core can be heated up to 1500 deg. C and the reactivity can change considerably. The core dimensions are comparatively small, and the number of control rods providing the required reactivity is likewise small. An increase of core temperature raises the neutron path length in its basic material, graphite. The temperature change is not uniform. All this causes the ratio of the neutron flux in the irradiated sample to that at the location of the reactor power sensors not to be conserved. Deviations in this ratio were registered during a number of reactor experiments. Empirical corrections can be introduced to reduce the influence of the change of neutron flux sensor effectiveness on meeting the required load parameters of the investigated materials. However, the dependence of these corrections on many factors can increase the instability of process control. Previous experimental and computational studies showed the non-uniformity of the neutron field at the sensor locations (up to tens of percent) and the low effectiveness of experimental work carried out without access to individual elements of the reactor stack. A shortcoming during the experiment was the assumption that the distribution of the out-of-reactor neutron flux could be tied to the control rod positions. Subsequent analysis showed that a representative model of the phenomenon must take into account the reactor operation dynamics, subject to the uneven heating of individual stack parts. Elementary calculations showed that the temperature effects of the stack on the change of the external neutron field are large. The calculation algorithm for the change of the external field and the field at the investigated specimen includes neutron-physics calculations of the reactor interleaved with thermophysics calculations, which supply the temperature fields for the neutron-physics calculations.
In the course of such
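
    The interleaved neutronics/thermal calculation described in this record is, structurally, a fixed-point iteration between two coupled solvers. The sketch below shows only that structure; the two scalar "solvers" are hypothetical stand-ins, not IGR physics.

```python
# Generic sketch of an interleaved coupled calculation: alternate a neutronics
# solve (flux given temperature) and a thermal solve (temperature given flux)
# until the coupled fields stop changing. Both models are hypothetical.
def neutronics(temp):
    # hypothetical: flux falls as graphite heats (longer neutron path length)
    return 100.0 / (1.0 + 0.001 * temp)

def thermal(flux):
    # hypothetical: equilibrium temperature proportional to deposited power
    return 5.0 * flux

temp, flux = 300.0, 0.0
for iteration in range(100):
    new_flux = neutronics(temp)
    new_temp = thermal(new_flux)
    if abs(new_temp - temp) < 1e-9 and abs(new_flux - flux) < 1e-9:
        break                      # coupled fields are self-consistent
    temp, flux = new_temp, new_flux
print(f"converged after {iteration} sweeps: T = {temp:.2f}, flux = {flux:.2f}")
```

    Real implementations iterate whole temperature and flux fields rather than scalars, but the convergence loop has the same shape.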

  6. RA Reactor

    International Nuclear Information System (INIS)

    1978-02-01

    In addition to basic characteristics of the RA reactor, the organizational scheme and financial incentives, this document describes the state of the reactor components after 18 years of operation, problems concerned with obtaining the licence for operation with 80% fuel, problems of spent fuel storage in the storage pool of the reactor building and the need for renewal of reactor equipment, first of all instrumentation [sr

  7. Multiregion reactors

    International Nuclear Information System (INIS)

    Moura Neto, C. de; Nair, R.P.K.

    1979-08-01

    The study of reflected reactors can be carried out using the multigroup diffusion method. The neutron conservation equations, inside the intervals, can be written in terms of fluxes and group constants. A reflected reactor (one and two groups) in slab geometry is studied, applying the continuity of flux and current at the interface. At the end, the appropriate solutions for an infinite cylindrical reactor and for a spherical reactor are presented. (Author) [pt
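
    In the simplest one-group case, the flux/current matching described above reduces to a transcendental criticality condition, D_c·B·tan(B·a) = D_r·κ·coth(κ·b), for a slab core of half-thickness a with a reflector of thickness b. The sketch below solves it numerically; it is not the paper's two-group treatment, and every material constant is a hypothetical illustration value.

```python
# One-group reflected-slab criticality: continuity of flux and current at the
# core/reflector interface gives D_c*B*tan(B*a) = D_r*k*coth(k*b). Solve for
# the critical core half-thickness a by bisection. Constants are hypothetical.
import math

D_c, B = 0.9, 0.12            # core: diffusion coefficient (cm), buckling (1/cm)
D_r, k, b = 1.1, 0.05, 40.0   # reflector: diffusion coeff., 1/L (1/cm), thickness (cm)

def criticality_residual(a):
    return D_c * B * math.tan(B * a) - D_r * k / math.tanh(k * b)

# The residual is monotone increasing on (0, pi/(2B)): negative at a -> 0 and
# +inf as tan diverges, so bisection brackets exactly one root.
lo, hi = 1e-6, math.pi / (2 * B) - 1e-6
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if criticality_residual(mid) < 0:
        lo = mid
    else:
        hi = mid
a_crit = 0.5 * (lo + hi)
bare = math.pi / (2 * B)      # bare-core half-thickness for comparison
print(f"critical half-thickness: {a_crit:.2f} cm  (bare core: {bare:.2f} cm)")
```

    The critical half-thickness comes out smaller than the bare-core value, which is the familiar reflector savings.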

  8. REACTOR GROUT THERMAL PROPERTIES

    Energy Technology Data Exchange (ETDEWEB)

    Steimke, J.; Qureshi, Z.; Restivo, M.; Guerrero, H.

    2011-01-28

    Savannah River Site has five dormant nuclear production reactors. Long term disposition will require filling some reactor buildings with grout up to ground level. Portland cement based grout will be used to fill the buildings with the exception of some reactor tanks. Some reactor tanks contain significant quantities of aluminum which could react with Portland cement based grout to form hydrogen. Hydrogen production is a safety concern and gas generation could also compromise the structural integrity of the grout pour. Therefore, it was necessary to develop a non-Portland cement grout to fill reactors that contain significant quantities of aluminum. Grouts generate heat when they set, so the potential exists for large temperature increases in a large pour, which could compromise the integrity of the pour. The primary purpose of the testing reported here was to measure heat of hydration, specific heat, thermal conductivity and density of various reactor grouts under consideration so that these properties could be used to model transient heat transfer for different pouring strategies. A secondary purpose was to make qualitative judgments of grout pourability and hardened strength. Some reactor grout formulations were unacceptable because they generated too much heat, or started setting too fast, or required too long to harden or were too weak. The formulation called 102H had the best combination of characteristics. It is a Calcium Alumino-Sulfate grout that contains Ciment Fondu (calcium aluminate cement), Plaster of Paris (calcium sulfate hemihydrate), sand, Class F fly ash, boric acid and small quantities of additives. This composition afforded about ten hours of working time. Heat release began at 12 hours and was complete by 24 hours. The adiabatic temperature rise was 54 C which was within specification. The final product was hard and displayed no visible segregation. The density and maximum particle size were within specification.
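
    The adiabatic temperature rise quoted above follows from a simple energy balance: with no heat loss, ΔT = H/cp, where H is the heat of hydration released per kilogram of grout and cp its specific heat. The values below are hypothetical placeholders chosen so the sketch reproduces the reported 54 C rise; they are not the measured 102H properties.

```python
# Adiabatic temperature rise of a setting grout from an energy balance:
# all released hydration heat goes into warming the grout mass itself.
H = 86.4e3    # heat of hydration, J per kg of grout (hypothetical value)
cp = 1.6e3    # specific heat of the grout, J/(kg*K)  (hypothetical value)

delta_T = H / cp    # dT = H / cp when no heat escapes the pour
print(f"adiabatic temperature rise: {delta_T:.0f} K")
```

    Transient models of real pours add conduction losses to the surroundings, so the actual peak rise is below this adiabatic bound.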

  9. Big Data in Health: a Literature Review from the Year 2005.

    Science.gov (United States)

    de la Torre Díez, Isabel; Cosgaya, Héctor Merino; Garcia-Zapirain, Begoña; López-Coronado, Miguel

    2016-09-01

    The information stored in healthcare systems has increased over the last ten years, leading it to be considered Big Data. There is a wealth of health information ready to be analysed. However, the sheer volume raises a challenge for traditional methods. The aim of this article is to conduct a cutting-edge study on Big Data in healthcare from 2005 to the present. This literature review will help researchers to know how Big Data has developed in the health industry and open up new avenues for research. Information searches have been made on various scientific databases such as Pubmed, Science Direct, Scopus and Web of Science for Big Data in healthcare. The search criteria were "Big Data" and "health" with a date range from 2005 to the present. A total of 9724 articles were found on the databases. 9515 articles were discarded as duplicates or for not having a title of interest to the study. 209 articles were read, with the resulting decision that 46 were useful for this study. 52.6 % of the articles used were found in Science Direct, 23.7 % in Pubmed, 22.1 % through Scopus and the remaining 2.6 % through the Web of Science. Big Data has undergone extremely high growth since 2011 and its use is becoming compulsory in developed nations and in an increasing number of developing nations. Big Data is a step forward and a cost reducer for public and private healthcare.

  10. Ten steps to successful software process improvement

    Science.gov (United States)

    Kandt, R. K.

    2003-01-01

    This paper identifies ten steps for managing change that address organizational and cultural issues. Four of these steps are critical; if they are not done, failure is almost guaranteed. This ten-step program emphasizes the alignment of business goals, change process goals, and the work performed by the employees of an organization.

  11. Clinical experience with TENS and TENS combined with nitrous oxide-oxygen. Report of 371 patients.

    OpenAIRE

    Quarnstrom, F. C.; Milgrom, P.

    1989-01-01

    Transcutaneous electrical nerve stimulation (TENS) alone or TENS combined with nitrous oxide-oxygen (N2O) was administered for restorative dentistry without local anesthesia to 371 adult patients. A total of 55% of TENS alone and 84% of TENS/N2O visits were rated successful. A total of 53% of TENS alone and 82% of TENS/N2O patients reported slight or no pain. In multivariable analyses, pain reports were related to the anesthesia technique and patient fear and unrelated to sex, race, age, toot...

  12. Nuclear reactor

    International Nuclear Information System (INIS)

    Hattori, Sadao; Sato, Morihiko.

    1994-01-01

    Liquid metal, such as sodium, fills the reactor container as the primary coolant. A plurality of reactor core containers are disposed in a row in the circumferential direction along the inner circumferential wall of the reactor container. One or more intermediate coolers are disposed inside the annular row of reactor core containers. A reactor core comprising fuel rods and control rods (a module reactor core) is contained inside each of the reactor core containers. Each intermediate cooler comprises a cylindrical intermediate cooling vessel, which contains an intermediate heat exchanger for heat exchange between the primary and secondary coolants and recycling pumps for forced recirculation of the primary coolant. Since a plurality of reactor core containers are thus assembled, a large reactor power can be attained. Further, the module reactor core contained in each reactor core vessel can be small, which facilitates control of the reactor core operation. (I.N.)

  13. Triennial technical report - 1986, 1987, 1988 - Instituto de Engenharia Nuclear (IEN) -Dept. of Reactors (DERE)

    International Nuclear Information System (INIS)

    1989-01-01

    The research activities developed during 1986, 1987 and 1988 by the Reactor Department (DERE) of the Brazilian Nuclear Energy Commission (CNEN) are summarized. The principal aim of the Department of Reactors is the study and development of fast reactors and thermal research reactors. The DERE also assists the CNEN in areas related to the analysis of power reactor structures, teaches Reactor Physics and Engineering at the University, and provides professional training at the Nuclear Engineering Institute. To carry out its research activities the DERE has three big facilities: the Argonauta reactor, the CTS-1 sodium circuit, and a water circuit. (M.I.)

  14. Optimal reactor strategy for commercializing fast breeder reactors

    International Nuclear Information System (INIS)

    Yamaji, Kenji; Nagano, Koji

    1988-01-01

    In this paper, a fuel cycle optimization model developed for analyzing the conditions for selecting fast breeder reactors in the optimal reactor strategy is described. By dividing the planning period, 1966-2055, into nine ten-year periods, the model was formulated as a compact linear programming model. With the model, the best mix of reactor types as well as the optimal timing of reprocessing spent fuel from LWRs were found so as to minimize the total cost. The results of the analysis are summarized as follows. Fast breeder reactors could be introduced in the optimal strategy when they can economically compete with LWRs with 30-year storage of spent fuel. In order that fast breeder reactors monopolize the new reactor market after the achievement of their technical availability, their capital cost should be less than 0.9 times that of LWRs. When a certain amount of reprocessing commitment is assumed, the condition for employing fast breeder reactors in the optimal strategy is relaxed. In the optimal strategy, reprocessing is done just to meet plutonium demand, and the storage of spent fuel is selected to adjust the mismatch of plutonium production and utilization. A price hike of uranium ore facilitates the commercial adoption of fast breeder reactors. (Kako, I.)
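
    The structure of such a model can be conveyed by a toy single-period linear program: choose LWR and FBR capacity to meet demand at minimum cost, with FBR deployment limited by plutonium from reprocessed LWR fuel. This is a sketch, not the paper's nine-period model; the demand, the plutonium ratio, and the costs are hypothetical (the 0.9x FBR capital cost merely echoes the abstract's competitiveness threshold).

```python
# Toy single-period reactor-mix LP: minimize capital cost subject to a demand
# constraint and a plutonium-availability constraint. Numbers are hypothetical.
from scipy.optimize import linprog

demand = 60.0      # GWe of capacity to be met                 (hypothetical)
pu_ratio = 0.5     # GWe of FBR supportable per GWe of LWR     (hypothetical)
cost = [1.0, 0.9]  # relative capital cost per GWe: [LWR, FBR] (hypothetical)

# Inequalities in linprog's A_ub @ x <= b_ub form, with x = [LWR, FBR]:
A_ub = [[-1.0, -1.0],        # LWR + FBR >= demand
        [-pu_ratio, 1.0]]    # FBR <= pu_ratio * LWR
b_ub = [-demand, 0.0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
lwr, fbr = res.x
print(f"LWR: {lwr:.1f} GWe, FBR: {fbr:.1f} GWe, total cost: {res.fun:.1f}")
```

    Because the FBR is cheaper here, the optimum deploys as much FBR capacity as the plutonium constraint allows; the full model repeats this trade-off across periods and adds spent-fuel storage as a buffer.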

  15. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The task of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Before big data can be used correctly in official statistics, many questions need to be answered and problems solved: data quality, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  16. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors, including new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to address global change. In this area there are many expectations and concerns about Big Data, and vendors have attempted to appropriate the term for commercial purposes. It is necessary to understand whether Big Data represents a radical shift or an incremental change for existing digital infrastructures. This presentation explores and discusses the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS), and particularly on its common digital infrastructure, the GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desks. The impact on GEOSS of the Big Data dimensions (commonly known as the 'V' axes: volume, variety, velocity, veracity, visualization) is discussed, and the main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience with Big Data offers many valuable lessons.

  17. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  18. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relate to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  19. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  20. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  1. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete, with equally spaced levels defining a quantum of volume. This discreteness may imply a foamy structure of spacetime at the semiclassical level, which might be detected in astrophysical and cosmological observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  2. Nuclear power reactors

    International Nuclear Information System (INIS)

    1982-11-01

    After an introduction and general explanation of nuclear power, the following reactor types are described: magnox thermal reactor; advanced gas-cooled reactor (AGR); pressurised water reactor (PWR); fast reactors (sodium cooled); boiling water reactor (BWR); CANDU thermal reactor; steam generating heavy water reactor (SGHWR); high temperature reactor (HTR); Leningrad (RBMK) type water-cooled graphite-moderated reactor. (U.K.)

  3. Research reactors

    International Nuclear Information System (INIS)

    Merchie, Francois

    2015-10-01

    This article proposes an overview of research reactors, i.e. nuclear reactors of less than 100 MW. Generally, these reactors are used as neutron generators for basic research in the material sciences and for technological research in support of power reactors. The author gives an overview of the general design of research reactors in terms of core size, number of fissions, neutron flux, and spatial neutron distribution, and notes that this design is a compromise between a sufficiently compact core, a sufficient experimental volume, and high enough power densities, without compromising neutron performance or experimental use. The author evokes the safety framework (the same regulations as for power reactors, more constraining measures after Fukushima, international bodies). He presents the main characteristics and operation of the two families that account for almost all research reactors: first, heavy water reactors (illustrated with photos, drawings and figures); and second, light-water moderated and cooled reactors, distinguishing open-core pool reactors like Melusine and Triton, pool reactors with containment, and experimental fast breeder reactors (Rapsodie, the Russian BOR-60, the Chinese CEFR). The author describes the main uses of research reactors: basic research, applied and technological research, safety tests, production of radioisotopes for medicine and industry, analysis of trace elements at very low concentrations, non-destructive testing, and doping of monocrystalline silicon ingots. The author then discusses the relationship between research reactors and non-proliferation, and finally evokes perspectives (a decrease in the number of research reactors worldwide, the Jules Horowitz project)

  4. Reactor physics and reactor computations

    International Nuclear Information System (INIS)

    Ronen, Y.; Elias, E.

    1994-01-01

    Mathematical methods and computer calculations for nuclear and thermonuclear reactor kinetics, reactor physics, neutron transport theory, core lattice parameters, waste treatment by transmutation, breeding, nuclear and thermonuclear fuels are the main interests of the conference

  5. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various -omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  6. Prospect of realizing nuclear fusion reactors

    International Nuclear Information System (INIS)

    1989-01-01

    This Report describes the results of the research work on nuclear fusion which CRIEPI has carried out for about ten years from the standpoint of electric power utilities, potential users of its energy. The principal points are: (a) economic analysis (calculation of costs) of commercial fusion reactors, including fusion-fission hybrid reactors, based on Japanese analysis procedures and databases; and (b) conceptual design of two types of hybrid reactor, the fission-fuel-producing DMHR (Demonstration Molten-Salt Hybrid Reactor) and the electric-power-producing THPR (Tokamak Hybrid Power Reactor). The Report consists of the following chapters: 1. Introduction. 2. Conceptual Design of Hybrid Reactors. 3. Economic Analysis of Commercial Fusion Reactors. 4. Basic Studies Also Applicable to Nuclear Fusion Technology. 5. List of Published Reports and Papers. 6. Conclusion. Appendices. (author)

  7. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  8. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics has sparked a feeding frenzy, fueled by the production of many next-generation-sequencing-based data sets that seek to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  9. Research reactors

    International Nuclear Information System (INIS)

    Kowarski, L.

    1955-01-01

    This report brings together the technical data involved in deciding whether a research institute should acquire a reactor for research purposes. Such decisions are often taken by non-specialists, who may need a brief presentation of a research reactor and its research possibilities before seeking expert advice. The first part draws up a list of the research programmes that acquiring a research reactor makes possible. First of all is the study of reactor behaviour and kinetics (reproduction factor, exploration of neutron density, effect of reactor structure, effect of material irradiation, etc.). Physics studies include the behaviour of the control system, neutron resonance phenomena and the fission process, for example. Chemistry studies involve the handling and control of hot material, characterisation of nuclear species produced in the reactor, and the effects of irradiation on chemical properties and reactions. Biology and medicine research involves studies of irradiation of humans and animals, genetics research, sterilization of food or medical tools, and the effect of neutron beams on tumours, for example. Many other subjects can be studied in a research reactor, such as reactor construction materials, fabrication of radioactive sources for radiographic techniques, or applied research in agriculture or electronics. The second part discusses the technological considerations in choosing the reactor type. The technological factors to be considered are the power of the reactor, the nature of the fuel, the type of moderator (water, heavy water, graphite or BeO) and reflector, the type of coolant, the protective shielding and the control systems. The third part describes the characteristics (site, fuel type and comments) and performance (power, neutron flux) of already existing

  10. ALGORITHMS FOR TETRAHEDRAL NETWORK (TEN) GENERATION

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The Tetrahedral Network (TEN) is a powerful 3-D vector structure in GIS, with many advantages such as a simple structure, fast topological relation processing and rapid visualization. The difficulty in applying TEN lies in automatically creating the data structure. Although a raster algorithm has been introduced by some authors, problems of accuracy, memory requirement, speed and integrity remain. In this paper, after a 3-D data model and the structure of TEN have been introduced, the raster algorithm is completed and a vector algorithm is presented. Finally, experiments, conclusions and future work are discussed.
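
    The "fast topological relation processing" the abstract credits to TEN can be illustrated with a minimal sketch: tetrahedra stored as 4-tuples of vertex indices plus a face-to-tetrahedra index that makes neighbour queries cheap. This is an illustrative toy, not the paper's algorithm; a production GIS TEN would also carry attributes and spatial indexes.

    ```python
    from collections import defaultdict
    from itertools import combinations

    class TEN:
        """Toy Tetrahedral Network: vertices, tetrahedra, face adjacency."""

        def __init__(self):
            self.vertices = []                    # (x, y, z) coordinates
            self.tetrahedra = []                  # 4-tuples of vertex indices
            self._face_index = defaultdict(list)  # sorted face -> tet ids

        def add_vertex(self, x, y, z):
            self.vertices.append((x, y, z))
            return len(self.vertices) - 1

        def add_tetrahedron(self, a, b, c, d):
            tid = len(self.tetrahedra)
            self.tetrahedra.append((a, b, c, d))
            # Register all four triangular faces for adjacency lookup.
            for face in combinations(sorted((a, b, c, d)), 3):
                self._face_index[face].append(tid)
            return tid

        def neighbours(self, tid):
            """Tetrahedra sharing a triangular face with tetrahedron tid."""
            result = set()
            for face in combinations(sorted(self.tetrahedra[tid]), 3):
                result.update(t for t in self._face_index[face] if t != tid)
            return result

    # Two tetrahedra sharing the face (0, 1, 2)
    ten = TEN()
    for p in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)]:
        ten.add_vertex(*p)
    t0 = ten.add_tetrahedron(0, 1, 2, 3)
    t1 = ten.add_tetrahedron(0, 1, 2, 4)
    print(ten.neighbours(t0))
    ```

    Because each face key is a sorted vertex triple, two tetrahedra that share a face hash to the same entry, so adjacency queries avoid scanning the whole mesh.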

  11. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  12. Social customer relationship management and big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term, unfamiliar to many. The research is motivated by the scarcity of prior work on the topic, the complete absence of Finnish-language research, and the potentially essential role social customer relationship management may play in companies' operations in the future. Studies of big data often focus on its technical side, and not the applica...

  13. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press, 2013). The book is exciting but problematic with regard to causality, atheism and stereotypes about hunter-gatherers.

  14. Big Data and Social Media

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    A critical analysis of the "keep everything" Big Data era and of the impact on our lives of the information, at first glance "convenient for future use", that we make known about ourselves on the network. NB! The lecture will be recorded like all Academic Training lectures. Lecturer's biography: Father of the Internet, see https://internethalloffame.org/inductees/vint-cerf or https://en.wikipedia.org/wiki/Vint_Cerf The video on slide number 9 is from page https://www.gapminder.org/tools/#$state$time$value=2018&value;;&chart-type=bubbles Keywords: Big Data, Internet, History, Applications, tools, privacy, technology, preservation, surveillance, google, Arpanet, CERN, Web

  15. Baryon symmetric big bang cosmology

    International Nuclear Information System (INIS)

    Stecker, F.W.

    1978-01-01

    It is stated that the framework of baryon symmetric big bang (BSBB) cosmology offers the greatest potential for deducing the evolution of the Universe, because its physical laws and processes involve the minimum number of arbitrary assumptions about initial conditions in the big bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the Universe and how galaxies and galaxy clusters are formed. BSBB cosmology also provides the only acceptable explanation at present for the origin of the cosmic γ-ray background radiation. (author)

  16. Small quarks make big nuggets

    International Nuclear Information System (INIS)

    Deligeorges, S.

    1985-01-01

    After a brief review of the classification of subatomic particles, this paper deals with quark nuggets: particles with more than three quarks in one big bag, called a "nuclearite". Neutron stars are, in fact, big sacks of quarks, gigantic nuggets. Physicists are now trying to calculate which type of nugget of strange quark matter is stable, and what influence quark nuggets had on primordial nucleosynthesis. At present, it is thought that if these nuggets exist, and in large numbers, they may be candidates for the missing mass. [fr]

  17. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created in recent years. New conclusions can be drawn and new services developed by connecting, processing and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current steps toward solutions are also presented.

  18. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  19. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  20. Big Cities, Big Problems: Reason for the Elderly to Move?

    NARCIS (Netherlands)

    Fokkema, T.; de Jong-Gierveld, J.; Nijkamp, P.

    1996-01-01

    In many European countries, data on geographical patterns of internal elderly migration show that the elderly (55+) are more likely to leave than to move to the big cities. Besides emphasising the attractive features of the destination areas (pull factors), it is often assumed that this negative

  1. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  2. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data .... Concept drifting: concept drifting means the classifier .... transactions generated by a prefix tree structure. EstDec ...

  3. Study of future reactors

    International Nuclear Information System (INIS)

    Bouchard, J.

    1992-01-01

    Today, more than 420 large reactors with a gross output of close to 350 GWe supply 20 percent of world electricity needs, accounting for less than 5 percent of primary energy consumption. These figures are not expected to change in the near future, due to suspended reactor construction in many countries. Nevertheless, world energy needs continue to grow: the planet's population already exceeds five billion and is forecast to reach ten billion by the middle of the next century. Most less developed countries have a very low rate of energy consumption and, even though some savings can be made in industrialized countries, it will become increasingly difficult to satisfy needs using fossil fuels only. Furthermore, there has been no recent breakthrough in the energy landscape. The physical feasibility of the other great hope of nuclear energy, fusion, has yet to be proved; once this has been done, it will be necessary to solve technological problems and to assess economic viability. Although it is more necessary than ever to pursue fusion programs, there is little likelihood of industrial applications being achieved in the coming decades. Coal and fission are the only ways to produce massive amounts of energy for the next century. Coal must overcome the pollution problems inherent in its use; fission nuclear power has to gain better public acceptance, which is obviously colored by safety and waste concerns. Most existing reactors were commissioned in the 1970s; reactor lifetime is a parameter that has not been clearly established. It will certainly be possible to refurbish some to extend their operation beyond the initial target of 30 or 40 years. But normal advances in technology and safety requirements will make the operation of the oldest reactors increasingly difficult. It becomes necessary to develop new generations of nuclear reactors, both to replace older ones and to revive plant construction in countries that are not yet equipped or that have halted their

  4. Vadose Zone Journal: The first ten years

    NARCIS (Netherlands)

    Vrugt, J.A.; Or, D.; Young, M.H.

    2013-01-01

    Celebrating ten years of publication, the authors introduce a special section commemorating the anniversary of Vadose Zone Journal and reviewing the journal’s role in an evolving understanding of vadose zone science.

  5. Ten Leading Causes of Death and Injury

    Science.gov (United States)

    CDC page on the ten leading causes of death and injury, with links to Leading Causes of Death charts, Causes of Death by Age Group (2016), and injuries treated in hospital emergency departments, United States, 2014; related topics include overdose, traumatic brain injury and violence prevention.

  6. Ten key issues in modern flow chemistry.

    Science.gov (United States)

    Wegner, Jens; Ceylan, Sascha; Kirschning, Andreas

    2011-04-28

    Ten essentials of synthesis in the flow mode, a new enabling technology in organic chemistry, are highlighted, providing insight into current and future issues and developments in this field. © The Royal Society of Chemistry 2011

  7. Ten into four won't go

    International Nuclear Information System (INIS)

    Freedman, D.Z.; West, P.C.

    1989-01-01

    It is shown that nontrivial spontaneous compactification of ten-dimensional N = 1 supergravity with or without Yang-Mills matter is not possible unless maximal symmetry (i.e. Lorentz invariance) is violated in the four-dimensional spacetime

  8. The Supersymmetric Top-Ten Lists

    OpenAIRE

    Haber, Howard E.

    1993-01-01

    Ten reasons are given why supersymmetry is the leading candidate for physics beyond the Standard Model. Ultimately, the experimental discovery of supersymmetric particles at future colliders will determine whether supersymmetry is relevant for TeV scale physics. The grand hope of supersymmetry enthusiasts is to connect TeV scale supersymmetry with Planck scale physics. The ten most pressing theoretical problems standing in the way of this goal are briefly described.

  9. BR2 Reactor: Introduction

    International Nuclear Information System (INIS)

    Moons, F.

    2007-01-01

    Irradiations in the BR2 reactor are performed in collaboration with or at the request of third parties such as the European Commission, the IAEA, research centres and utilities, reactor vendors and fuel manufacturers. The reactor also contributes significantly to the production of radioisotopes for medical and industrial applications, to neutron doping of silicon for the semiconductor industry, and to scientific irradiations for universities. Alongside the ongoing programmes on fuel and materials development, several new irradiation devices are in use or in design: among others, a loop providing enhanced cooling for novel materials-testing-reactor fuel, a device for high-temperature gas-cooled fuel, and a rig for the irradiation of metallurgical samples in a Pb-Bi environment. A full-scale 3-D heterogeneous model of BR2 is available. The model describes the real hyperbolic arrangement of the reactor and includes the detailed 3-D space-dependent distribution of the isotopic fuel depletion in the fuel elements. The model is validated against the reactivity measurements of several tens of BR2 operation cycles. The accurate calculations of the axial and radial distributions of the poisoning of the beryllium matrix by 3He, 6Li and 3H (tritium) are verified against the measured reactivity losses and used to predict the reactivity behaviour for the coming decades. The model calculates the main functionals of reactor physics: conventional thermal and equivalent fission neutron fluxes, number of displacements per atom, fission rate, thermal power characteristics such as heat flux and linear power density, neutron/gamma heating, the fission energy deposited in fuel plates/rods, the neutron multiplication factor and fuel burn-up. For each reactor irradiation project, a detailed geometry model of the experimental device and its neighbourhood is developed. Neutron fluxes are predicted to within approximately 10 percent of the dosimetry measurements. Fission rate, heat flux and

  10. The Chooz power station: ten years of operation

    International Nuclear Information System (INIS)

    Teste du Bailler, Andre

    1977-01-01

    The switching into actual service of the Chooz plant, the first pressurized water reactor ever built in France, occurred on 3 April 1967. Ten years later, one can draw a highly positive balance of the plant's operation, whose availability has been satisfactory apart from the mechanical failure that occurred during startup. The behavior of the equipment, in particular the components of the primary loop, was satisfactory as a whole, since it allowed a gradual increase in capacity of 15% with respect to the initial design. It also allowed noticeable progress in the design of equipment intended for the new power stations. Interesting results have also been obtained in the fields of radioprotection, working conditions of the staff and environmental protection. Finally, the training of the operating teams has been closely followed, whether for the operators directly assigned to plant operation or for the trainees gathered in a school specially organized for this purpose and since transferred to a training centre [fr]

  11. Inspection of Chooz power plant after ten years operation

    International Nuclear Information System (INIS)

    Saglio, Robert.

    1978-01-01

    This report discusses the results of the complete technical inspection of the vessel carried out in 1976 at the Ardennes reactor (CNA, 305 MWe). This inspection had a special character insofar as the plant had never been inspected before: start-up took place in 1967, prior to the development of French regulations (and even to the first version of the ASME code, Section XI). At that time no in-service inspection was foreseen; it nevertheless proved possible to carry out a complete inspection in ten days. The automated focused ultrasonic testing thus appeared to have reached the required reliability and a good sensitivity [fr]

  12. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  13. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  14. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  15. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which present challenges as to how the city’s functions are seen and

  16. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  17. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first review the main basic concepts of equal treatment and discrimination (Section 6.2). We then look at the Dutch and European legal framework on non-discrimination (Sections 6.3-6.5) and at how those rules should be applied to big

  18. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  19. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presented the basic concepts of Big Data and the new field to which they gave rise, Data Science. Within Data Science, the notion of reducing the dimensionality of data was discussed and exemplified.

  20. The Case for "Big History."

    Science.gov (United States)

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  1. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  2. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    Sampling Operations on Big Data (Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller, Lincoln...) ...process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and ... categories. These include edge sampling methods, where edges are selected by a predetermined criterion; snowball sampling methods, where algorithms start...
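The snippet above mentions edge sampling and snowball sampling of graph data. A minimal illustrative snowball-sampling sketch is given below; the function name, parameters, and adjacency-list representation are my own choices for illustration, not taken from the report.

```python
import random
from collections import deque

def snowball_sample(graph, seeds, max_nodes, fanout=2, rng=None):
    """Breadth-first snowball sample of a graph.

    Starting from `seeds`, follow up to `fanout` randomly chosen neighbours
    of each visited node until `max_nodes` nodes have been collected.
    `graph` is a dict mapping node -> list of neighbour nodes.
    """
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    visited = set(seeds)
    queue = deque(seeds)
    while queue and len(visited) < max_nodes:
        node = queue.popleft()
        neighbours = graph.get(node, [])
        # Sample without replacement; take all neighbours if fewer than fanout.
        for nb in rng.sample(neighbours, min(fanout, len(neighbours))):
            if nb not in visited and len(visited) < max_nodes:
                visited.add(nb)
                queue.append(nb)
    return visited
```

The fixed random seed makes a run repeatable; in practice one would pass a fresh `random.Random()` per trial to estimate sampling variance.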

  3. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  4. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group with the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the TF including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  5. Big Math for Little Kids

    Science.gov (United States)

    Greenes, Carole; Ginsburg, Herbert P.; Balfanz, Robert

    2004-01-01

    "Big Math for Little Kids," a comprehensive program for 4- and 5-year-olds, develops and expands on the mathematics that children know and are capable of doing. The program uses activities and stories to develop ideas about number, shape, pattern, logical reasoning, measurement, operations on numbers, and space. The activities introduce the…

  6. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large amounts of data originating from social media sites and mobile communications, along with data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and thereby generate market advantages. Companies that turn to Big Data therefore have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  7. From Big Bang to Eternity?

    Indian Academy of Sciences (India)

    at different distances (that is, at different epochs in the past) to come to this ... that the expansion started billions of years ago from an explosive Big Bang. Recent research sheds new light on the key cosmological question about the distant ...

  8. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  9. Reactor container

    International Nuclear Information System (INIS)

    Naruse, Yoshihiro.

    1990-01-01

    The thickness of the steel shell plates of a reactor container embedded in sand cushions is monitored to detect corrosion of the plates. That is, the reactor pressure vessel is contained in a reactor container shell, and sand cushions are disposed on the lower outside of the shell to support it elastically. A pit is disposed at a position opposite the sand cushions for measuring the thickness of the container shell plates; it is normally closed by a closing member. In a reactor container thus constituted, the closing member can be removed during periodic inspection to measure the plate thickness. Accordingly, corrosion of the steel shell plates can be recognized from the change in plate thickness. (I.S.)

  10. Hybrid reactors

    International Nuclear Information System (INIS)

    Moir, R.W.

    1980-01-01

    The rationale for hybrid fusion-fission reactors is the production of fissile fuel for fission reactors. A new class of reactor, the fission-suppressed hybrid, promises unusually good safety features as well as the ability to support 25 light-water reactors of the same nuclear power rating, or even more high-conversion-ratio reactors such as the heavy-water type. One 4000-MW nuclear hybrid can produce 7200 kg of 233U per year. To obtain good economics, the injector efficiency times plasma gain (η_i Q) should be greater than 2, the wall load should be greater than 1 MW/m², and the hybrid should cost less than 6 times the cost of a light-water reactor. Introduction rates for the fission-suppressed hybrid are usually rapid

  11. Nuclear reactor

    International Nuclear Information System (INIS)

    Garabedian, G.

    1988-01-01

    A liquid metal reactor is described comprising: (a) a reactor vessel having a core; (b) one or more satellite tanks; (c) pump means in the satellite tank; (d) heat exchanger means in the satellite tank; (e) an upper liquid metal conduit extending between the reactor vessel and the satellite tank; (f) a lower liquid metal duct extending between the reactor vessel and satellite tanks, the upper liquid metal conduit and the lower liquid metal duct being arranged to permit free circulation of liquid metal between the reactor vessel core and the satellite tank by convective flow of liquid metal; (g) a separate sealed common containment vessel around the reactor vessel, conduits and satellite tanks; (h) the satellite tank having space for a volume of liquid metal sufficient to dampen temperature transients resulting from abnormal operating conditions

  12. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  13. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is subject of many publications. Some pay attention to the many possibilities of big data, others warn us for their consequences. This special

  14. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  15. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that passed Cloud Computing in mid-2013 (according to Google Trends). Policy makers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  16. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  17. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
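The quoted resolving power R = λ/Δλ maps directly to a smallest resolvable wavelength interval. A small sketch using the endpoint values quoted above; pairing 340 nm with R ≈ 3000 and 1060 nm with R ≈ 4800 is my own illustrative assumption, since the record does not state which R applies at which wavelength.

```python
def delta_lambda_nm(wavelength_nm, resolving_power):
    """Smallest resolvable wavelength interval for R = lambda / delta_lambda."""
    return wavelength_nm / resolving_power

# Endpoint illustration (pairing is an assumption, see above):
blue = delta_lambda_nm(340.0, 3000)   # ~0.113 nm at the blue end
red = delta_lambda_nm(1060.0, 4800)  # ~0.221 nm at the red end
```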

  18. Nuclear reactor

    International Nuclear Information System (INIS)

    Batheja, P.; Huber, R.; Rau, P.

    1985-01-01

    Particularly for nuclear reactors of small output, the reactor pressure vessel contains at least two heat exchangers, which have coolant flowing through them in a circuit through the reactor core. The circuit of at least one heat exchanger is controlled by a slide valve, so that even for low drive forces, particularly in natural circulation, the required even loading of the heat exchanger is possible. (orig./HP) [de

  19. Accident analysis for PRC-II reactor

    International Nuclear Information System (INIS)

    Wei Yongren; Tang Gang; Wu Qing; Lu Yili; Liu Zhifeng

    1997-12-01

    The computer codes, calculation models, transient results, sensitivity research, design improvements, and safety evaluation used in the accident analysis for PRC-II Reactor (the Second Pulsed Reactor in China) are introduced. PRC-II Reactor is built in a big, populous city, so the public pays close attention to its safety. Consequently, some hypothetical accidents are analyzed, including an uncontrolled control-rod withdrawal at rated power, a pulse-rod ejection at rated power, and a loss-of-coolant accident. A calculation model that fully depicts the principle and process of each accident is established, and the relevant analysis code is developed. The work also includes comprehensive computation and analysis of the transients for each accident of PRC-II Reactor, the influence of various sensitive parameters on reactor safety, and evaluation of the engineered safety features. Measures to mitigate accident consequences are suggested and adopted in the construction design of PRC-II Reactor, and its safety characteristics are comprehensively evaluated. A new advanced calculation model (True Core Uncovered Model) for the LOCA of PRC-II Reactor and the relevant code (MCRLOCA) are put forward for the first time

  20. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  1. Heterogeneous reactors

    International Nuclear Information System (INIS)

    Moura Neto, C. de; Nair, R.P.K.

    1979-08-01

    The microscopic study of a cell aims at determining the infinite multiplication factor of the cell, which is given by the four-factor formula: k∞ = η ε p f. The analysis of a homogeneous reactor is similar to that of a heterogeneous reactor, but each factor of the four-factor formula cannot be calculated with the formulas developed for the homogeneous case. A great number of methods have been developed for the calculation of heterogeneous reactors, and some of them are discussed. (Author) [pt]
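The four-factor formula is a simple product and can be evaluated directly. A minimal numeric illustration follows; the factor values are textbook-style assumptions for a thermal lattice, not figures from this record.

```python
def k_infinity(eta, epsilon, p, f):
    """Four-factor formula: k_inf = eta * epsilon * p * f.

    eta:     reproduction factor (neutrons per thermal absorption in fuel)
    epsilon: fast-fission factor
    p:       resonance escape probability
    f:       thermal utilization factor
    """
    return eta * epsilon * p * f

# Illustrative textbook-style values (assumptions, not from the report):
k_inf = k_infinity(eta=2.02, epsilon=1.03, p=0.75, f=0.71)  # ~1.108
```

For a heterogeneous lattice, as the abstract notes, each of the four factors must itself be computed with cell-level methods rather than the homogeneous-medium formulas.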

  2. Ideal MHD β limits in the BIG DEE tokamak

    International Nuclear Information System (INIS)

    Helton, F.J.; Bernard, L.C.; Greene, J.M.

    1983-01-01

    Using D-D reactions, tokamak reactors become economically attractive when β (the ratio of the volume-averaged pressure to the magnetic pressure) exceeds 5 percent. Ideal MHD instabilities are of great concern because they have the potential to limit β below this range, and so extensive studies have been done to determine ideal MHD β limits. As β increases with inverse aspect ratio, elongation and triangularity, the Doublet III upgrade machine, the BIG DEE, is particularly suited to studying the possibility of very high β. The authors have done computations to determine ideal MHD β limits for various plasma shapes and elongations in the BIG DEE. They have determined that for q at the plasma surface greater than 2, β is limited by the ballooning mode if the wall is reasonably close to the plasma surface (d/a < 1.5, where d and a are the wall and plasma radii respectively). On the other hand, for q at the plasma surface less than 2, the n=1 external kink is unstable even with a wall close by. Thus, relevant values of limiting β can be obtained by assuming that the external kink limits the value of q at the limiter to a value greater than 2 and that the ballooning modes limit β. Under this assumption, a relevant β limit for the BIG DEE would be over 18%. For such an equilibrium, the wall position necessary to stabilize the n=1 and n=2 modes is 2a, and the equilibrium is stable for n=3
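The limiting quantity here is the plasma beta, β = ⟨p⟩ / (B²/2μ₀). A hedged numeric sketch of that definition follows; the pressure and field values are illustrative assumptions, not BIG DEE parameters.

```python
MU0 = 4e-7 * 3.141592653589793  # vacuum permeability, H/m

def plasma_beta(avg_pressure_pa, b_field_t):
    """Volume-averaged plasma beta: <p> divided by the magnetic
    pressure B^2 / (2 * mu0). Dimensionless; often quoted in percent."""
    return 2.0 * MU0 * avg_pressure_pa / b_field_t**2

# Illustrative values (assumptions): <p> = 2e5 Pa, B = 2.0 T
example = plasma_beta(2e5, 2.0)  # ~0.126, i.e. beta of roughly 12.6%
```

Scaling the field or pressure shows why high-β shaping matters: halving B at fixed ⟨p⟩ quadruples β.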

  3. Backfitting of the FRG reactors

    Energy Technology Data Exchange (ETDEWEB)

    Krull, W [GKSS-Forschungszentrum Geesthacht GmbH, Geesthacht (Germany)

    1990-05-01

    The FRG research reactors: the GKSS research centre operates two pool-type research reactors fuelled with MTR-type fuel elements. The research reactors FRG-1 and FRG-2, with power levels of 5 MW and 15 MW, have been in operation for 31 and 27 years respectively, making them comparable in age to other research reactors. They currently operate approximately 180 days (FRG-1) and between 210 and 250 days (FRG-2) per year. Both reactors are located in the same reactor hall, in a connecting pool system. Backfitting measures are needed for these and other research reactors to ensure a high level of safety and availability. The main backfitting activities during the last ten years concerned: comparison of the existing design with today's requirements (criteria, guidelines, standards, etc.); a probabilistic approach for external events such as aeroplane crashes and earthquakes; re-analysis of the main accidents, such as startup from low and full power, loss of coolant flow, loss of heat sink, loss of coolant and fuel plate melting; installation of a new reactor protection system meeting today's requirements; and installation of a new crane in the reactor hall. A cold neutron source has been installed, increasing the flux of cold neutrons by a factor of 14. The FRG-1 is being converted from 93% enriched uranium with UAlx fuel to 20% enriched uranium with U3Si2 fuel. Both cooling towers were repaired. Replacement of instrumentation is planned.

  4. Ten years of KRB Gundremmingen demonstration power station

    International Nuclear Information System (INIS)

    Facius, H. von; Ettemeyer, R.

    1976-01-01

    In August 1976 the first large nuclear power station in the Federal Republic, the KRB Gundremmingen plant with a net power of 237 MWe, completed ten years of operation. The construction of KRB as a demonstration plant was a major step forward on the way to the economic utilization of nuclear power by German utilities. Design and operation of the plant have decisively influenced the further development of light water reactor technology in the Federal Republic. Unlike the Kahl Experimental Nuclear Power Station (VAK), which was a test facility designed to generate experience and to train personnel, the decision to build KRB was from the outset conditional upon the fulfillment of economic criteria. Here are some of the aspects in which KRB has greatly influenced the development of nuclear power station technology: first application of internal steam-water separation instead of a steam drum, with a water content of the steam of less than 1%; construction of a reactor building with all the necessary safety factors; solution of the corrosion and erosion problems linked with the use of a saturated steam turbine; and special measures taken to prevent the turbine from speeding up due to post-evaporation effects after shutdown. Detailed comments are devoted to the subjects of availability, causes of failure and repair work. (orig.) [de]

  5. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species ofWild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  6. The Thermos process heat reactor

    International Nuclear Information System (INIS)

    Lerouge, Bernard

    1979-01-01

    The THERMOS process heat reactor was born from the following idea: since hot water is widely used as an energy vector for heating in cities, why not save on traditional fossil fuels by simply substituting a nuclear boiler of comparable power for the classical boiler, installed in the same place? The French Atomic Energy Commission thus has a heating technique for the big French cities that provides better guarantees of national independence and environmental protection. The THERMOS technique would result in a saving of 40,000 to 80,000 tons of oil per year [fr]

  7. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  8. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  9. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  10. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  11. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  12. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  13. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  14. Nowcasting using news topics Big Data versus big bank

    OpenAIRE

    Thorsrud, Leif Anders

    2016-01-01

    The agents in the economy use a plethora of high frequency information, including news media, to guide their actions and thereby shape aggregate economic fluctuations. Traditional nowcasting approaches have made relatively little use of such information. In this paper, I show how unstructured textual information in a business newspaper can be decomposed into daily news topics and used to nowcast quarterly GDP growth. Compared with a big bank of experts, here represented by official c...

  15. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Objectives: To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  16. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  17. Slurry reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kuerten, H; Zehner, P [BASF A.G., Ludwigshafen am Rhein (Germany, F.R.)

    1979-08-01

    Slurry reactors are designed on the basis of empirical data and model investigations. It is as yet not possible to calculate the flow behavior of such reactors. The swarm of gas bubbles and cluster formations of solid particles and their interaction in industrial reactors are not known. These effects control to a large extent the gas hold-up, the gas-liquid interface and, similarly as in bubble columns, the back-mixing of liquids and solids. These hydrodynamic problems are illustrated in slurry reactors which constructionally may be bubble columns, stirred tanks or jet loop reactors. The expected effects are predicted by means of tests with model systems modified to represent the conditions in industrial hydrogenation reactors. In his book 'Mass Transfer in Heterogeneous Catalysis' (1970) Satterfield complained of the lack of knowledge about the design of slurry reactors and hence of the impossible task of the engineer who has to design a plant according to accepted rules. There have been no fundamental changes since then. This paper presents the problems facing the engineer in designing slurry reactors, and shows new development trends.

  18. Reactor safety

    International Nuclear Information System (INIS)

    Butz, H.P.; Heuser, F.W.; May, H.

    1985-01-01

    The paper comprises an introduction into nuclear physics bases, the safety concept generally speaking, safety devices of pwr type reactors, accident analysis, external influences, probabilistic safety assessment and risk studies. It further describes operational experience, licensing procedures under the Atomic Energy Law, research in reactor safety and the nuclear fuel cycle. (DG) [de

  19. Nuclear reactor

    International Nuclear Information System (INIS)

    Mysels, K.J.; Shenoy, A.S.

    1976-01-01

    A nuclear reactor is described in which the core consists of a number of fuel regions through each of which regulated coolant flows. The coolant from neighbouring fuel regions is combined in a manner which results in an averaging of the coolant temperature at the outlet of the core. By this method the presence of hot streaks in the reactor is reduced. (UK)
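The averaging described in this record can be sketched numerically. The following toy model is our own illustration (not from the patent itself): it shows how combining coolant streams from neighbouring fuel regions yields a flow-weighted mean outlet temperature, damping a localized "hot streak". All temperatures and flow rates are hypothetical.

```python
# Toy illustration (not from the patent): mixing coolant from neighbouring
# fuel regions averages the outlet temperature and damps "hot streaks".

def mixed_outlet_temperature(temps_c, flows_kg_s):
    """Flow-weighted mean temperature of the combined coolant streams."""
    total_flow = sum(flows_kg_s)
    return sum(t * f for t, f in zip(temps_c, flows_kg_s)) / total_flow

# Four neighbouring regions, one running hot (a "hot streak"); values hypothetical.
temps = [290.0, 295.0, 320.0, 288.0]   # degC
flows = [100.0, 100.0, 100.0, 100.0]   # kg/s

mixed = mixed_outlet_temperature(temps, flows)
print(mixed)               # 298.25 -- the flow-weighted average
print(max(temps) - mixed)  # 21.75  -- the hot streak's excursion above the mix
```

The combined outlet sits well below the hottest stream, which is the effect the record attributes to the region-pairing scheme.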

  20. Reactor container

    International Nuclear Information System (INIS)

    Kato, Masami; Nishio, Masahide.

    1987-01-01

    Purpose: To prevent the rupture of the dry well even when the melted reactor core drops into the reactor pedestal cavity. Constitution: In a reactor container in which a dry well, disposed above the reactor pedestal cavity for containing the reactor pressure vessel, and a torus type suppression chamber for containing pressure suppression water are connected with each other, the pedestal cavity and the suppression chamber are disposed such that the floor level of the pedestal cavity is lower than the level of the pressure suppression water. Further, a pressure suppression water introduction pipeway for introducing the pressure suppression water into the reactor pedestal cavity is provided by way of an ON-OFF valve. If the melted reactor core should fall into the pedestal cavity, the ON-OFF valve of the pressure suppression water introduction pipeway is opened to introduce the pressure suppression water from the suppression chamber into the pedestal cavity to cool the melted reactor core. (Ikeda, J.)

  1. RA Reactor

    International Nuclear Information System (INIS)

    1989-01-01

    This chapter includes the following: General description of the RA reactor, organization of work, responsibilities of leadership and operators team, regulations concerning operation and behaviour in the reactor building, regulations for performing experiments, regulations and instructions for inserting samples into experimental channels [sr

  2. Reactor physics

    International Nuclear Information System (INIS)

    Ait Abderrahim, H.

    1998-01-01

    Progress in research on reactor physics in 1997 at the Belgian Nuclear Research Centre SCK/CEN is described. Activities in the following four domains are discussed: core physics, ex-core neutron transport, experiments in Materials Testing Reactors, international benchmarks

  3. Political-social reactor problems at Berkeley

    International Nuclear Information System (INIS)

    Little, G.A.

    1980-01-01

    For more than ten years there was little public notice of the TRIGA reactor at UC-Berkeley. Then: a) A non-student persuaded the Student Senate to pass a resolution requesting the Campus Administration to stop operation of the reactor and remove it from campus. b) The presence of the reactor became a campaign issue in a city mayoral election. c) Two local residents reported adverse physical reactions before, during, and after a routine tour of the reactor facility. d) The Berkeley City Council began a study of problems associated with radioactive material within the city. e) Friends Of The Earth formally petitioned the NRC to terminate the reactor's license. Campus personnel have expended many man-hours and many pounds of paper in responding to these happenings. Some of the details are of interest, and may be of use to other reactor facilities. (author)

  4. Reactor core

    International Nuclear Information System (INIS)

    Azekura, Kazuo; Kurihara, Kunitoshi.

    1992-01-01

    In a BWR type reactor, a great number of pipes (spectral shift pipes) are disposed in the reactor core. Moderators having a small moderating cross section (heavy water) are circulated in the spectral shift pipes to suppress the excess reactivity while increasing the conversion ratio at the initial stage of the operation cycle. After the intermediate stage of the operation cycle, in which the reactor core reactivity is lowered, reactivity is increased by circulating moderators having a great moderating cross section (light water) to extend the attainable burnup. Further, neutron absorbers such as boron are mixed into the moderator in the spectral shift pipes to control the concentration thereof. With such a constitution, control rods and driving mechanisms are no longer necessary, which simplifies the structure of the reactor core. This can increase the fuel conversion ratio and control great excess reactivity. Accordingly, a nuclear reactor core of high conversion and high burnup can be attained. (I.N.)

  5. Reactor container

    International Nuclear Information System (INIS)

    Fukazawa, Masanori.

    1991-01-01

    In existing systems for controlling combustible gases, the gases are controlled by exhausting them to the wet well of the reactor container. In a reactor container having plenums in addition to the wet well and the dry well, however, such a system cannot control the combustible gases in those plenums. In view of the above, in the present invention, suction ports and exhaust ports of the combustible gas control system are disposed in the wet well, the dry well and the plenums to control the combustible gases in the reactor container. Since this can control the combustible gases in the entire reactor container, the integrity of the reactor container can be ensured. (T.M.)

  6. Reactor container

    International Nuclear Information System (INIS)

    Kojima, Yoshihiro; Hosomi, Kenji; Otonari, Jun-ichiro.

    1997-01-01

    In the present invention, a catalyst for oxidizing the hydrogen released in the reactor container upon rupture of pipelines of the reactor primary coolant system is protected from deposition of the water droplets formed by the reactor container spray, to suppress elevation of the hydrogen concentration in the reactor container. Namely, a catalytic combustion gas concentration control system comprises a catalyst for oxidizing hydrogen and a support thereof. In addition, a water droplet deposition-preventing means is disposed to keep water droplets in the reactor pressure vessel from depositing on the catalyst. Then, the effect of the catalyst upon the catalytic oxidation reaction of hydrogen can be kept high. Local elevation of the hydrogen concentration can be prevented even upon occurrence of such a phenomenon that various kinds of motive forces in the container, such as the dry well cooling system, are lost. (I.S.)

  7. Nuclear reactor

    International Nuclear Information System (INIS)

    Tilliette, Z.

    1975-01-01

    A description is given of a nuclear reactor and especially a high-temperature reactor in which provision is made within a pressure vessel for a main cavity containing the reactor core and a series of vertical cylindrical pods arranged in spaced relation around the main cavity and each adapted to communicate with the cavity through two collector ducts or headers for the primary fluid which flows downwards through the reactor core. Each pod contains two superposed steam-generator and circulator sets disposed in substantially symmetrical relation on each side of the hot primary-fluid header which conveys the primary fluid from the reactor cavity to the pod, the circulators of both sets being mounted respectively at the bottom and top ends of the pod

  8. 10 years Institute for Reactor Development

    International Nuclear Information System (INIS)

    1975-05-01

    Ten years ago the Institute of Reactor Development was founded. This report contains a review about the research work of the institute in these past ten years. The work was mainly performed within the framework of the Fast Breeder Project, the Nuclear Safety Project and Computer Aided Design. Especially the following topics are discussed: design studies for different fast breeder reactors, development works for fast breeders, investigations of central safety problems of sodium cooled breeder reactors (such as local and integral coolant disturbances and hypothetical accident analysis), special questions of light water reactor safety (such as dynamic stresses in pressure suppression systems and fuel rod behaviour under loss of coolant conditions), and finally computer application in various engineering fields. (orig.) [de

  9. ANALGESIC EFFECT OF TRANSCUTANEOUS ELECTRICAL NERVE STIMULATION (TENS)

    OpenAIRE

    ĆURKOVIĆ, B.

    1984-01-01

    Transcutaneous electrical nerve stimulation (TENS) is today a widely accepted therapeutic procedure for pain relief. Its mode of action has not yet been clearly defined, although most authors favour a central mechanism of pain reduction. Results reported in the literature vary from an effect not significantly better than placebo to 95% good analgesic action. At the Department for Rheumatic Diseases and Rehabilitation of the Clinical Hospital Centre, Zagreb, the effect of TENS was evaluated in patients with križ...

  10. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  11. Big Data hvor N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the application of 'big data' in health has only just begun, and may in time become a great help in organizing a more personal and holistic health effort for patients with multiple conditions. Personal health technology, which is briefly presented in this chapter, holds great potential for carrying out 'big data' analyses for the individual person, that is, where N=1. There are major technological challenges in building technologies and methods to collect and handle personal data that can be shared across systems in a standardized, responsible, robust, secure and non...

  12. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  13. Did the Big Bang begin?

    International Nuclear Information System (INIS)

    Levy-Leblond, J.

    1990-01-01

    It is argued that the age of the universe may well be numerically finite (20 billion years or so) and conceptually infinite. A new and natural time scale is defined on a physical basis using group-theoretical arguments. An additive notion of time is obtained according to which the age of the universe is indeed infinite. In other words, never did the Big Bang begin. This new time scale is not supposed to replace the ordinary cosmic time scale, but to supplement it (in the same way as rapidity has taken a place by the side of velocity in Einsteinian relativity). The question is discussed within the framework of conventional (big-bang) and classical (nonquantum) cosmology, but could easily be extended to more elaborate views, as the purpose is not so much to modify present theories as to reach a deeper understanding of their meaning
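The claim in this record can be made concrete with a small sketch. The logarithmic reparametrization below is our illustration of how an additive time scale can render the age infinite; the symbols T and t_0 are ours, and the paper's actual group-theoretical construction may differ.

```latex
% Sketch (our illustration, not necessarily the paper's construction):
% map cosmic time t > 0 to a new additive time \tau via a logarithm,
% with t_0 an arbitrary reference epoch and T a constant time scale:
\tau = T \ln\!\left(\frac{t}{t_0}\right)
% The Big Bang t \to 0^+ is then pushed to \tau \to -\infty:
\lim_{t \to 0^+} \tau = -\infty
% so the age of the universe measured in \tau is infinite, even though
% its age in t is finite (20 billion years or so in the abstract).
```

On such a scale there is no first instant, which is one way to read the statement that "never did the Big Bang begin".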

  14. Big Data in Drug Discovery.

    Science.gov (United States)

    Brown, Nathan; Cambruzzi, Jean; Cox, Peter J; Davies, Mark; Dunbar, James; Plumbley, Dean; Sellwood, Matthew A; Sim, Aaron; Williams-Jones, Bryn I; Zwierzyna, Magdalena; Sheppard, David W

    2018-01-01

    Interpretation of Big Data in the drug discovery community should enhance project timelines and reduce clinical attrition through improved early decision making. The issues we encounter start with the sheer volume of data and how we first ingest it before building an infrastructure to house it to make use of the data in an efficient and productive way. There are many problems associated with the data itself including general reproducibility, but often, it is the context surrounding an experiment that is critical to success. Help, in the form of artificial intelligence (AI), is required to understand and translate the context. On the back of natural language processing pipelines, AI is also used to prospectively generate new hypotheses by linking data together. We explain Big Data from the context of biology, chemistry and clinical trials, showcasing some of the impressive public domain sources and initiatives now available for interrogation. © 2018 Elsevier B.V. All rights reserved.

  15. Comparison between TRU burning reactors and commercial fast reactor

    International Nuclear Information System (INIS)

    Fujimura, Koji; Sanda, Toshio; Ogawa, Takashi

    2001-03-01

    Research and development on stabilizing or shortening the lifetime of the radioactive wastes included in spent nuclear fuel are widely conducted from the viewpoint of reducing the environmental impact. In particular, irradiating and transmuting long-lived TRU in fast reactors is an effective approach. Two loading schemes were previously proposed: the former loads a relatively small amount of TRU in all commercial fast reactors, and the latter loads a large amount of TRU in a few dedicated TRU burning reactors. This study is intended to contribute to the feasibility studies on a commercialized fast reactor cycle system. The transmutation and nuclear characteristics of TRU burning reactors were evaluated and compared with those of a conventional transmutation system using commercial type fast reactors, based upon an investigation of technical information about TRU burning reactors. Major results are summarized as follows. (1) Investigation of technical information about TRU burning reactors. Based on published reports and papers, technical information about TRU burning reactor concepts and transmutation systems using conventional commercial type fast reactors was investigated. Transmutation and nuclear characteristics and R and D issues were examined based on these results. Homogeneously loading about 5 wt% MAs on core fuels in a conventional commercial type fast reactor may not cause a significant impact on the nuclear core characteristics. Transmutation of the MAs produced in about five fast reactors generating the same output is feasible. The helium cooled MA burning fast reactor core concept proposed by JAERI attains criticality using particle type nitride fuels which contain more than 60 wt% MA. This reactor could transmute the MAs produced in more than ten 1000 MWe-LWRs. Ultra-long life core concepts attaining more than 30 years of operation without refueling, by utilizing the nuclear characteristics of MAs as burnable absorber and fertile nuclides, were also proposed. It was pointed out that

  16. Water desalination using different capacity reactors options

    International Nuclear Information System (INIS)

    Alonso, G.; Vargas, S.; Del Valle, E.; Ramirez, R.

    2010-01-01

    The Northwest region of Mexico has a deficit of potable water; alongside this need comes the region's growth, which requires additional energy capacity. Cogeneration of potable water and nuclear electricity is an option to be assessed. In this paper we perform an economic comparison for cogeneration using a big reactor, the AP1000, and a medium size reactor, the IRIS; both are PWR type reactors and are coupled to the desalination plant using the same method. For this cogeneration case we assess the best reactor option that can cover both needs using the maximum potable water production for two different desalination methods: multistage flash distillation and multi-effect distillation. (authors)

  17. Fast reactor database. 2006 update

    International Nuclear Information System (INIS)

    2006-12-01

    Liquid metal cooled fast reactors (LMFRs) have been under development for about 50 years. Ten experimental fast reactors and six prototype and commercial size fast reactor plants have been constructed and operated. In many cases, the overall experience with LMFRs has been rather good, with the reactors themselves and also the various components showing remarkable performance, well in accordance with the design expectations. The fast reactor system has also been shown to have very attractive safety characteristics, resulting to a large extent from the fact that the fast reactor is a low pressure system with large thermal inertia and negative power and temperature coefficients. In addition to the LMFRs that have been constructed and operated, more than ten advanced LMFR projects have been developed, and the latest designs are now close to achieving economic competitiveness with other reactor types. In the current world economic climate, the introduction of a new nuclear energy system based on the LMFR may not be considered by utilities as a near-future option when compared to other potential power plants. However, there is strong agreement among experts in the nuclear energy field that, for sustainability reasons, long term development of nuclear power as part of the world's future energy mix will require fast reactor technology, and that, given the decline in fast reactor development projects, data retrieval and knowledge preservation efforts in this area are of particular importance. This publication contains detailed design data and main operational data on experimental, prototype, demonstration, and commercial size LMFRs. Each LMFR plant is characterized by about 500 parameters: physics, thermohydraulics, thermomechanics, design and technical data, and relevant sketches. The focus is on practical issues, providing engineers, scientists, managers, university students and professors with complete technical information on a total of 37 LMFRs
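A record-per-plant database of the kind described above might be organized as in the following sketch. The field names and grouping are our hypothetical illustration only, not the IAEA database schema; the EBR-II figures are approximate published values.

```python
# Hypothetical sketch of a plant record for an LMFR design database.
# Field names are illustrative only; the actual IAEA schema differs.
from dataclasses import dataclass, field

@dataclass
class LMFRPlantRecord:
    name: str
    country: str
    reactor_type: str                 # e.g. "experimental", "prototype", "commercial"
    thermal_power_mw: float
    electric_power_mw: float
    coolant: str                      # e.g. "sodium"
    physics: dict = field(default_factory=dict)          # ~hundreds of parameters
    thermohydraulics: dict = field(default_factory=dict) # spread across groups
    design_data: dict = field(default_factory=dict)

# One record (approximate published EBR-II values); a full database would
# hold 37 such records, each carrying about 500 parameters in the dicts.
records = [
    LMFRPlantRecord("EBR-II", "USA", "experimental", 62.5, 20.0, "sodium"),
]

experimental = [r for r in records if r.reactor_type == "experimental"]
print(len(experimental))
```

Grouping the bulk of the ~500 parameters into per-category dicts keeps the record type small while still allowing simple filtering and comparison across plants.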

  18. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  19. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the "energy dominance" of the energy density of vacuum fluctuations in curved space-time and the presence of singularity is discussed. It is pointed out that a de Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedman solution may describe the history of the Universe before the Big Bang. (P.L.)

  20. Big bang is not needed

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.D.

    1976-02-01

    Recent computer simulations indicate that a system of n gravitating masses breaks up, even when the total energy is negative. As a result, almost any initial phase-space distribution results in a universe that eventually expands under the Hubble law. Hence Hubble expansion implies little regarding an initial cosmic state. In particular, it does not imply the singularly dense superpositioned state used in the big bang model.

  1. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  2. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting both the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment, and the challenges involved. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  3. Reactor building

    International Nuclear Information System (INIS)

    Maruyama, Toru; Murata, Ritsuko.

    1996-01-01

    In the present invention, the spent fuel storage pool of a BWR type reactor is formed at an upper portion of the building and enlarged in size to use the space of the building effectively. Namely, a reactor chamber houses reactor facilities including a reactor pressure vessel and a reactor container, and a spent fuel storage pool is formed above it. A second spent fuel storage pool is formed above the auxiliary reactor chamber at the periphery of the reactor chamber. The spent fuel storage pool and the second spent fuel storage pool are disposed adjacent to each other, and the wall between them is vertically movable. With such a constitution, the storage capacity for spent fuels is increased, enabling storage of all the spent fuels generated during the operation period of the plant. Further, since the storage requirement for spent fuels increases stepwise with each periodical fuel exchange, the enlarged portion can be used for other purposes during the period when it is not yet needed. (I.S.)

  4. Reactor container

    International Nuclear Information System (INIS)

    Shibata, Satoru; Kawashima, Hiroaki

    1984-01-01

    Purpose: To optimize the temperature distribution of the reactor container so as to moderate the thermal stress distribution on the vessel wall of an LMFBR type reactor. Constitution: A good heat conductor (made of Al or Cu) is attached to the outer side of the reactor container wall, from below the liquid level to the lower face of a deck plate. Further, heat insulators are disposed outside the good heat conductor. Furthermore, a gas-cooling duct is disposed circumferentially around the reactor container at the contact portion between the good heat conductor and the deck plate. This allows heat to flow rapidly from the liquid metal through the good heat conductor to the cooling duct, keeping the temperature distribution on the vessel wall substantially linear even under abrupt temperature changes in the liquid metal. The effect can be improved further by attaching a good heat conductor covered with inactive metal to the inside of the vessel wall as well, so that heat near the liquid level is conducted upward and escapes to the cooling layer below the roof slab. (Ikeda, J.)

  5. Ten recommendations for software engineering in research.

    Science.gov (United States)

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  6. Kaluza-Klein supergravity in ten dimensions

    International Nuclear Information System (INIS)

    Huq, M.; Namazie, M.A.

    1983-11-01

    We construct a massive version of N=2 supergravity in ten dimensions by compactification of the eleven dimensional, N=1 theory. This theory describes the usual N=2 massless super-multiplet, in addition to which there is an infinite tower of massive, charged N=2 supermultiplets. (author)
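
    The construction summarized above follows the standard Kaluza-Klein pattern. As a textbook illustration (quoted for context, not taken from this record; dilaton weights depend on conventions), the eleven-dimensional metric reduced on a circle can be written as:

    ```latex
    % Standard Kaluza-Klein ansatz for 11D supergravity on a circle
    % (one common convention; \phi is the dilaton, A_\mu the KK vector)
    ds_{11}^{2} = e^{-2\phi/3}\, g_{\mu\nu}\, dx^{\mu} dx^{\nu}
                + e^{4\phi/3}\left(dy + A_{\mu}\, dx^{\mu}\right)^{2}
    ```

    The zero mode on the circle yields the massless N=2 multiplet, while the higher Fourier modes in y carry Kaluza-Klein charge and mass, giving the infinite tower of massive, charged supermultiplets mentioned in the abstract.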

  7. Top-Ten IT Issues: 2009

    Science.gov (United States)

    Agee, Anne Scrivener; Yang, Catherine

    2009-01-01

    This article presents the top-ten IT-related issues in terms of strategic importance to the institution, as revealed by the tenth annual EDUCAUSE Current Issues Survey. These IT-related issues include: (1) Funding IT; (2) Administrative/ERP Information Systems; (3) Security; (4) Infrastructure/Cyberinfrastructure; (5) Teaching and Learning with…

  8. Ten years after the Jali Commission

    African Journals Online (AJOL)

    Ten years have lapsed since the Jali Commission's final report became publicly available, and it is therefore an .... as the 'core business' of the department. This was seen as .... the 2009/10–2013/14 DCS Strategic Plan did the department ...

  9. Czech, Slovak science ten years after split

    CERN Multimedia

    2003-01-01

    Ten years after the split of Czechoslovakia Czech and Slovak science are facing the same difficulties: shortage of money for research, poor salaries, obsolete equipment and brain drain, especially of the young, according to a feature in the Daily Lidove Noviny (1 page).

  10. Ten themes of viscous liquid dynamics

    DEFF Research Database (Denmark)

    Dyre, J. C.

    2007-01-01

    Ten ‘themes' of viscous liquid physics are discussed with a focus on how they point to a general description of equilibrium viscous liquid dynamics (i.e., fluctuations) at a given temperature. This description is based on standard time-dependent Ginzburg-Landau equations for the density fields...

  11. Top Ten Concerns for Trustees in 1988.

    Science.gov (United States)

    Meyerson, Joel W.

    1988-01-01

    Ten issues most likely to influence institutions this year include tuition policy and financing, capital renewal and replacement, charitable giving, scientific equipment and laboratories, endowment management and spending policy, research funding, corporate contributions, minority enrollment and hiring, debt financing and debt capacity, and cost…

  12. Giant multipole resonances: perspectives after ten years

    International Nuclear Information System (INIS)

    Bertrand, F.E.

    1980-01-01

    Nearly ten years ago evidence was published for the first of the so-called giant multipole resonances, the giant quadrupole resonance. During the ensuing years research in this field has spread to many nuclear physics laboratories throughout the world. The present status of electric giant multipole resonances is reviewed. 24 figures, 1 table

  13. Players Off the Field. How Jim Delany and Roy Kramer Took over Big-Time College Sports.

    Science.gov (United States)

    Suggs, Welch

    2000-01-01

    Traces the history of the college football bowl system and describes the movement toward replacing the bowl game system with a national championship playoff system. Focuses on the roles of J. Delany, commissioner of the Big Ten Conference, and R. Kramer, commissioner of the Southeastern Conference, in perpetuating the current college football bowl…

  14. Nuclear reactor

    International Nuclear Information System (INIS)

    Rau, P.

    1980-01-01

    The reactor core of a nuclear reactor is usually composed of individual elongated fuel elements that may be vertically arranged and through which coolant flows in the axial direction, preferably from bottom to top. With their lower ends the fuel elements engage in openings of a lower support grid forming part of the core structure. According to the invention a locking device is provided there, part of which is a control element that is movable along the fuel element axis. The corresponding locking element engages behind a lateral projection in the opening of the support grid. The invention is particularly suitable for breeder or converter reactors. (orig.) [de

  15. Thermionic nuclear reactor systems

    International Nuclear Information System (INIS)

    Kennel, E.B.

    1986-01-01

    Thermionic nuclear reactors can be expected to be candidate space power supplies for power demands ranging from about ten kilowatts to several megawatts. The conventional ''ignited mode'' thermionic fuel element (TFE) is the basis for most reactor designs to date. Laboratory converters have been built and tested with efficiencies in the range of 7-12% for over 10,000 hours, and even longer lifetimes are projected. More advanced capabilities are potentially achievable in other modes of operation, such as the self-pulsed or unignited diode. Coupled with modest improvements in fuel and emitter material performance, the efficiency of an advanced thermionic conversion system can be extended to the 15-20% range. Advanced thermionic power systems are expected to be compatible with other advanced features such as: (1) intrinsic subcriticality under accident conditions, ensuring 100% safety upon launch abort; (2) intrinsically low radiation levels during reactor shutdown, allowing manned servicing and/or rendezvous; (3) DC to DC power conditioning using lightweight power MOSFETs; and (4) AC output using pulsed converters
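
    The converter efficiencies quoted in the record above are rooted in basic thermionic emission physics. The emitter current density is governed by the Richardson-Dushman law (standard device physics, quoted for context rather than stated in this record):

    ```latex
    % Richardson-Dushman thermionic emission law
    J = A\, T^{2} \exp\!\left(-\frac{\phi}{k_{B} T}\right),
    \qquad A \approx 1.2 \times 10^{6}\ \mathrm{A\,m^{-2}\,K^{-2}}
    ```

    where T is the emitter temperature and φ the emitter work function; lowering φ or raising T increases the emitted current, which is why emitter material performance limits converter efficiency.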

  16. Big science transformed science, politics and organization in Europe and the United States

    CERN Document Server

    Hallonsten, Olof

    2016-01-01

    This book analyses the emergence of a transformed Big Science in Europe and the United States, using both historical and sociological perspectives. It shows how technology-intensive natural sciences grew to a prominent position in Western societies during the post-World War II era, and how their development cohered with both technological and social developments. At the helm of post-war science are large-scale projects, primarily in physics, which receive substantial funds from the public purse. Big Science Transformed shows how these projects, popularly called 'Big Science', have become symbols of progress. It analyses changes to the political and sociological frameworks surrounding publicly-funded science, and their impact on a number of new accelerator and reactor-based facilities that have come to prominence in materials science and the life sciences. Interdisciplinary in scope, this book will be of great interest to historians, sociologists and philosophers of science.

  17. Turning big bang into big bounce. I. Classical dynamics

    Science.gov (United States)

    Dzierżak, Piotr; Małkiewicz, Przemysław; Piechocki, Włodzimierz

    2009-11-01

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.
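
    The big bounce mechanism summarized above is usually expressed through a loop-modified Friedmann equation. The standard effective form from the loop quantum cosmology literature (quoted for context, not taken from this record) is:

    ```latex
    % Effective (loop-modified) Friedmann equation of loop cosmology
    H^{2} = \left(\frac{\dot{a}}{a}\right)^{2}
          = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_{c}}\right)
    ```

    The Hubble rate H vanishes at ρ = ρ_c, so the contraction turns into expansion at a finite critical density rather than reaching a singularity; ρ_c plays the role of the undetermined energy scale mentioned at the end of the abstract.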

  18. Nuclear reactors

    International Nuclear Information System (INIS)

    Prescott, R.F.

    1976-01-01

    A nuclear reactor containment vessel faced internally with a metal liner is provided with thermal insulation for the liner. The insulation comprises one or more layers of compressible material such as ceramic fiber, as would be conventional in an advanced gas-cooled reactor, together with a superposed layer of ceramic bricks or tiles and retention means therefor. The retention means (studs projecting from the liner, and bolts or nuts in threaded engagement with the studs) are themselves insulated from the vessel interior so that the coolant temperatures achieved in a High-Temperature Reactor or a Fast Reactor can be tolerated within the vessel. The layer(s) of compressible material is held under a degree of compression either by the ceramic bricks or tiles themselves or by cover plates held on the studs, in which case the bricks or tiles are preferably bedded on a yielding layer (for example of carbon fibers) rather than directly on the cover plates

  19. Nuclear reactor

    International Nuclear Information System (INIS)

    Miyashita, Akio.

    1981-01-01

    Purpose: To facilitate and accelerate the leakage test of the valves of a main steam pipe by adding a partition valve for leakage testing. Constitution: A partition valve for leakage testing is provided between the reactor pressure vessel and the most upstream of the valves to be tested for leakage, a testing branch pipe communicates with the downstream side of the partition valve, and the water used for the leakage test is introduced through the branch pipe. Since the main steam pipe can be isolated simply by closing the partition valve, the leakage test can be conducted without raising or lowering the water level in the pressure vessel, and since interference with other work in the reactor is eliminated, the test can readily be conducted in parallel with other work in a short time. Clean water can be used as the test water instead of reactor water. (Yoshihara, H.)

  20. Reactor container

    International Nuclear Information System (INIS)

    Abe, Yoshihito; Sano, Tamotsu; Ueda, Sabuo; Tanaka, Kazuhisa.

    1987-01-01

    Purpose: To reduce the liquid surface disturbance in LMFBR type reactors. Constitution: A horizontal flow suppressing mechanism, mainly comprising vertical members, is suspended near the free liquid surface of the coolant in the upper plenum. The horizontal flow of coolant near the free liquid surface is reduced by the suppressing mechanism, effectively reducing the surface disturbance. The reduction in disturbance extends over the entire surface region, without marked vertical variation relative to the free liquid surface, so that the suppression of liquid surface disturbance is remarkably improved. Accordingly, advantageous effects are also obtained, such as prevention of thermal fatigue in the reactor vessel walls and upper reactor structures, and prevention of burning damage to the reactor core owing to the reduced entrainment of Ar gas. (Kamimura, M.)

  1. REACTOR SHIELD

    Science.gov (United States)

    Wigner, E.P.; Ohlinger, L.E.; Young, G.J.; Weinberg, A.M.

    1959-02-17

    Radiation shield construction is described for a nuclear reactor. The shield is comprised of a plurality of steel plates arranged in parallel spaced relationship within a peripheral shell. Reactor coolant inlet tubes extend at right angles through the plates, and baffles are arranged between the plates at right angles thereto and extend between the tubes to create a series of zigzag channels between the plates for the circulation of coolant fluid through the shield. The shield may be divided into two main sections: an inner section adjacent the reactor container and an outer section spaced therefrom. Coolant through the inner section may be circulated at a faster rate than coolant circulated through the outer section, since the area closest to the reactor container is at a higher temperature and is more radioactive. The two sections may have separate cooling systems to prevent the coolant in the outer section from mixing with the more contaminated coolant in the inner section.

  2. NUCLEAR REACTOR

    Science.gov (United States)

    Miller, H.I.; Smith, R.C.

    1958-01-21

    This patent relates to nuclear reactors of the type which use a liquid fuel, such as a solution of uranyl sulfate in ordinary water, which acts as the moderator. The reactor is comprised of a spherical vessel having a diameter of about 12 inches substantially surrounded by a reflector of beryllium oxide. Conventional control rods and safety rods are operated in slots in the reflector outside the vessel to control the operation of the reactor. An additional means for increasing the safety factor of the reactor, by raising the ratio of delayed neutrons to prompt neutrons, is provided; it consists of a soluble sulfate salt of beryllium dissolved in the liquid fuel in the proper proportion to obtain the result desired.

  3. Electrical cabling system associated at a nuclear reactor

    International Nuclear Information System (INIS)

    Dejeux, P.; Desfontaines, G.

    1988-01-01

    This cabling system for an electrical device in a nuclear reactor comprises at least a first cable issuing from the device and a second cable comprising a first portion, a second portion, and a third portion joined to the second by a multiple quick-fitting connector capable of connecting at least ten second portions to ten corresponding third portions of the second cable [fr

  4. Description of the Triton reactor

    International Nuclear Information System (INIS)

    1967-09-01

    The Triton reactor is an enriched uranium pool type reactor. It began operation in 1959, after first criticality (divergence) on 30 June of that year. Devoted to radiation protection studies, its core can be displaced in the longitudinal direction. The pool can be separated into two unequal compartments by a wall: the Triton core is placed in the small compartment, the Nereide core in the big one. A third compartment without water, called Naiade II, is separated by a concrete wall in which a window closed by an aluminium plate (2.50 m x 2.70 m) is provided. The Naiade II cavity is used for protection experiments with the Nereide core. After a complete refit, the power of the Triton reactor, raised progressively from 1.2 MW to 2 MW and then 3 MW, reached 6.5 MW in August 1965. The reactor was then specialized in irradiations in a fixed position: its core became fixed, while the Nereide core was made mobile. Since then it has been used for irradiation of structural materials, for radioelement production and for fundamental research. The following descriptions are valid for the period after August 1965 [fr

  5. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  6. Burnable absorber-integrated Guide Thimble (BigT) - 1. Design concepts and neutronic characterization on the fuel assembly benchmarks

    International Nuclear Information System (INIS)

    Yahya, Mohd-Syukri; Yu, Hwanyeal; Kim, Yonghee

    2016-01-01

    This paper presents the conceptual designs of a new burnable absorber (BA) for the pressurized water reactor (PWR), named 'Burnable absorber-integrated Guide Thimble' (BigT). The BigT integrates BA materials into a standard guide thimble of a PWR fuel assembly. Neutronic sensitivities and practical design considerations of the BigT concept are highlighted in the first half of the paper. Specifically, the BigT concepts are characterized with respect to BA material and spatial self-shielding variations. In addition, the BigT replaceability requirement, bottom-end design specifications and thermal-hydraulic considerations are discussed. Much of the second half of the paper is devoted to demonstrating the practical viability of the BigT absorbers via comparative evaluations against conventional BA technologies in representative 17x17 and 16x16 fuel assembly lattices. For the 17x17 lattice evaluations, all three BigT variants are benchmarked against Westinghouse's existing BA technologies, while in the 16x16 assembly analyses the BigT designs are compared against the traditional integral gadolinia-urania rod design. All analyses clearly show that the BigT absorbers perform as well as the commercial BA technologies in terms of reactivity and power peaking management. In addition, it has been shown that sufficiently high control rod worth can be obtained with the BigT absorbers in place. All neutronic simulations were completed using the Monte Carlo Serpent code with the ENDF/B-VII.0 library. (author)

  7. Breeder reactors

    International Nuclear Information System (INIS)

    Gollion, H.

    1977-01-01

    The reasons for the development of fast reactors are briefly reviewed (a propitious neutron balance oriented towards a maximum uranium burnup) and its special requirements (cooling, fissile material density and reprocessing) discussed. The three stages in the French program of fast reactor development are outlined with Rapsodie at Cadarache, Phenix at Marcoule, and Super Phenix at Creys-Malville. The more specific features of the program of research and development are emphasized: kinetics and the core, the fuel and the components [fr

  8. Nuclear reactor

    International Nuclear Information System (INIS)

    Schulze, I.; Gutscher, E.

    1980-01-01

    The core contains a critical mass of UN or U 2 N 3 in the form of a noncritical solution in molten Sn kept under an N atmosphere. The lining of the reactor core consists of graphite. As fission progresses, part of the molten metal solution is removed and cleaned of fission products. The reactor temperatures lie in the range of 300 to 2000°C. (Examples and tables). (RW) [de

  9. Reactor technology

    International Nuclear Information System (INIS)

    Erdoes, P.

    1977-01-01

    This is one of a series of articles discussing aspects of nuclear engineering, ranging from a survey of various reactor types for static and mobile use to mention of atomic thermo-electric batteries for cardiac pacemakers. Various statistics are presented on power generation in Europe and the U.S.A., and economics are discussed in some detail. Molten salt reactors and research machines are also described. (G.M.E.)

  10. Reactor containment

    International Nuclear Information System (INIS)

    Kawabe, Ryuhei; Yamaki, Rika.

    1990-01-01

    A water vessel is provided, and its gas phase portion is connected to the reactor container by a pipeline having a valve disposed midway. A pipe communicating with the external air extends upward from the liquid phase portion to a considerable height, so as to resist the back pressure of the water head in the pipeline. Accordingly, when the pressure in the container falls to a negative level, air passes through the pipeline, rises through the liquid phase portion of the water vessel in the form of bubbles, and flows into the reactor container. When the pressure inside the reactor rises, the liquid surface in the water vessel is forced down and water is pushed up into the pipeline. As long as the head of the water column in the pipeline balances the pressure of the reactor container, gases in the reactor container are not leaked to the outside. Further, if a large positive pressure is formed in the reactor container, the inner pressure overcomes the head of the water column, so that the gases containing radioactive aerosol rise through the pipeline. Since the water and the gases flow in contact with each other, an aerosol removal effect is also obtained. (T.M.)

  11. Fast reactors

    International Nuclear Information System (INIS)

    Vasile, A.

    2001-01-01

    Fast reactors can conserve natural uranium resources through their breeding property and offer solutions for the management of radioactive wastes by limiting the inventory of heavy nuclei. This article highlights the role that fast reactors could play in reducing the radiotoxicity of wastes. The conversion of 238 U into 239 Pu by neutron capture is more efficient in fast reactors than in light water reactors. In fast reactors, multi-recycling of U + Pu allows fissioning of up to 95% of the initial fuel ( 238 U + 235 U). Two strategies have been studied to burn actinides: - multi-recycling of heavy nuclei within the fuel element itself (homogeneous option); - a single recycling in special irradiation targets placed inside the core or at its periphery (heterogeneous option). Simulations have shown that, for the same amount of energy produced (400 TWhe), the mass of transuranium elements (Pu + Np + Am + Cm) sent to waste disposal is 60.9 kg in the homogeneous option and 204.4 kg in the heterogeneous option. Experimental programs are carried out in the Phenix and BOR60 reactors in order to study the feasibility of such strategies. (A.C.)

  12. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. Analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents new methods to reverse this trend and improve profitability. The benefits of Big Data and ...

  13. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints on the use of big data include the cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership, and technical obstacles to the effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  14. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches

  15. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the background needed to work with big data, by introducing novel optimization algorithms and codes capable of working in the big data setting, together with applications of big data optimization, for interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  16. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are distinct not only from the big data of other disciplines, but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
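
    The record above cites propensity score analysis as a way to control confounding in observational medical big data. As a minimal, self-contained sketch (illustrative only; real analyses use dedicated statistical packages, and all data below are hypothetical), a logistic propensity model with 1:1 nearest-neighbour matching can be written as:

    ```python
    import math

    def fit_propensity(X, t, lr=0.1, epochs=500):
        """Fit a logistic-regression propensity model P(treated | x) by batch gradient descent."""
        n, d = len(X), len(X[0])
        w, b = [0.0] * d, 0.0
        for _ in range(epochs):
            gw, gb = [0.0] * d, 0.0
            for xi, ti in zip(X, t):
                p = 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
                err = p - ti                       # gradient of the log-loss
                for j in range(d):
                    gw[j] += err * xi[j]
                gb += err
            w = [wj - lr * gwj / n for wj, gwj in zip(w, gw)]
            b -= lr * gb / n
        return w, b

    def propensity(x, w, b):
        return 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))

    def matched_ate(X, t, y, w, b):
        """Average treatment effect via 1:1 nearest-neighbour matching on the propensity score."""
        s = [propensity(x, w, b) for x in X]
        treated = [i for i, ti in enumerate(t) if ti == 1]
        control = [i for i, ti in enumerate(t) if ti == 0]
        diffs = [y[i] - y[min(control, key=lambda k: abs(s[k] - s[i]))] for i in treated]
        return sum(diffs) / len(diffs)

    # Toy confounded data (hypothetical): covariate x raises both treatment odds and outcome.
    # Outcome model y = x + 2*t, so the true treatment effect is +2.
    X = [[0], [0], [0], [1], [1], [1]]
    t = [0, 0, 1, 0, 1, 1]
    y = [0, 0, 2, 1, 3, 3]
    w, b = fit_propensity(X, t)
    ate = matched_ate(X, t, y, w, b)   # recovers 2.0 on this toy data
    ```

    Matching each treated subject to the control with the nearest propensity score compares like with like, removing the confounding bias that a naive treated-minus-control mean would carry; the inability to rule out residual confounding from unmeasured covariates remains, as the record notes.
    
    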

  17. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information service, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  18. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis generating rather than hypothesis testing. Big data focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are distinct not only from big data of other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
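
    The propensity score analysis mentioned in the abstract can be illustrated with a minimal sketch of inverse-probability-of-treatment weighting on synthetic observational data; the coefficients, variable names, and data-generating process below are all hypothetical, chosen only to show the mechanics:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic observational data: one confounder x affects both
    # treatment assignment and outcome (all values are illustrative).
    n = 5000
    x = rng.normal(size=n)                         # confounder
    p_treat = 1 / (1 + np.exp(-(0.8 * x)))         # treatment depends on x
    t = rng.binomial(1, p_treat)                   # treatment indicator
    y = 2.0 * t + 1.5 * x + rng.normal(size=n)     # true effect of t is 2.0

    # Naive group comparison is confounded by x.
    naive = y[t == 1].mean() - y[t == 0].mean()

    # Step 1: estimate propensity scores e(x) = P(T=1 | x) with a simple
    # logistic regression fit by gradient ascent on the log-likelihood.
    X = np.column_stack([np.ones(n), x])
    w = np.zeros(2)                                # intercept, slope
    for _ in range(2000):
        e = 1 / (1 + np.exp(-X @ w))
        w += 0.1 * X.T @ (t - e) / n

    e = 1 / (1 + np.exp(-X @ w))

    # Step 2: inverse-probability-of-treatment weighting (IPTW) estimate
    # of the average treatment effect.
    ate = np.mean(t * y / e) - np.mean((1 - t) * y / (1 - e))

    print(f"naive difference: {naive:.2f}")   # inflated by confounding
    print(f"IPTW estimate:    {ate:.2f}")     # close to the true 2.0
    ```

    On this toy data the naive difference overstates the true effect of 2.0 because treated subjects differ systematically in the confounder, while the weighted estimate approximately recovers it; a real analysis would also check covariate balance and the overlap of the estimated scores.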

  19. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the characteristics of big data and of traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users

  20. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  1. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have basic knowledge of R.

  2. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unattainable. Big data is generally characterized by three factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  3. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  4. Data processing potential ten years from now

    International Nuclear Information System (INIS)

    Zajde, C.

    1982-01-01

    What today is still experimental will tomorrow be routine. Use of emission tomography, automatic determination of regions of interest, generalisation of Fourier phase and amplitude analysis, parametric images: these will become routine medical protocols. The next ten years will see a saturation in the field of research and a stabilisation in system development. Equipment renewal will be seen from the same angle as in the automotive industry, more for reasons of wear and tear than for reasons of technical advancement

  5. School Psychologists' Job Satisfaction: Ten Years Later

    OpenAIRE

    Worrell, Travis G.

    2004-01-01

    School Psychologists' Job Satisfaction: Ten Years Later (ABSTRACT) This study was designed to replicate nationwide surveys completed in 1982 and 1992. The purpose was to examine and describe the levels of job satisfaction and the relationship between the variables in a national sample of school psychologists belonging to the National Association of School Psychologists (NASP). The sample for this study consisted of respondents who reported being full-time school practitioners. ...

  6. Simplification of the helical TEN2 laser

    Science.gov (United States)

    Krahn, K.-H.

    1980-04-01

    The observation that the helical TEN2 laser can effectively be simplified by giving up the use of decoupling elements as well as by abolishing the segmentation of the electrode structure is examined. Although, as a consequence of this simplification, the operating pressure range was slightly decreased, the output power could be improved by roughly 30%, a result which is attributed to the new electrode geometry exhibiting lower inductance and lower damping losses.

  7. Modification of the 'ten-day rule'

    International Nuclear Information System (INIS)

    Klempfner, G.

    1985-01-01

    In a 1964 decision the National Health and Medical Research Council recommended that radiological examinations of the lower abdomen and pelvis of women of childbearing age should be confined to the 10-day interval following the onset of menstruation. Recent evidence suggests that the first four weeks from the first day of the last menstrual period is not a critically radiosensitive period and consequently strict adherence to the ten-day rule is no longer indicated

  8. Antimicrobial activity of Verbascum macrurum Ten. (Scrophulariaceae).

    Science.gov (United States)

    Guarino, C

    2002-01-01

    The author presents results on the antibacterial action of extracts of Verbascum macrurum Ten. The leaves of this species, gathered on the slopes of Mt. Matese, were ground and four extracts were prepared: with dichloromethane, ethanol and water (70:30 v/v), water, and methanol. The antibacterial activity of each of the samples was tested, and the ethanol/water extract was shown to be the most active.

  9. EEG Correlates of Ten Positive Emotions.

    Science.gov (United States)

    Hu, Xin; Yu, Jianwen; Song, Mengdi; Yu, Chun; Wang, Fei; Sun, Pei; Wang, Daifa; Zhang, Dan

    2017-01-01

    Compared with the well documented neurophysiological findings on negative emotions, much less is known about positive emotions. In the present study, we explored the EEG correlates of ten different positive emotions (joy, gratitude, serenity, interest, hope, pride, amusement, inspiration, awe, and love). A group of 20 participants were invited to watch 30 short film clips with their EEGs simultaneously recorded. Distinct topographical patterns for different positive emotions were found for the correlation coefficients between the subjective ratings on the ten positive emotions per film clip and the corresponding EEG spectral powers in different frequency bands. Based on the similarities of the participants' ratings on the ten positive emotions, these emotions were further clustered into three representative clusters, as 'encouragement' for awe, gratitude, hope, inspiration, pride, 'playfulness' for amusement, joy, interest, and 'harmony' for love, serenity. Using the EEG spectral powers as features, both the binary classification on the higher and lower ratings on these positive emotions and the binary classification between the three positive emotion clusters, achieved accuracies of approximately 80% and above. To our knowledge, our study provides the first piece of evidence on the EEG correlates of different positive emotions.
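
    The binary classification described above, with spectral powers as features and high versus low emotion ratings as labels, can be sketched with a simple nearest-centroid classifier on synthetic band-power data. The feature counts, effect sizes, and split below are illustrative assumptions, not the study's actual pipeline:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical setup: per trial, EEG spectral power in 5 bands x 8
    # channels (40 features), labelled by high (1) vs low (0) ratings of
    # one emotion. All numbers are synthetic.
    n_per_class, n_features = 100, 40
    high = rng.normal(loc=0.4, scale=1.0, size=(n_per_class, n_features))
    low = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_features))
    X = np.vstack([high, low])
    y = np.array([1] * n_per_class + [0] * n_per_class)

    # Shuffle, then split into train and test halves.
    idx = rng.permutation(len(y))
    X, y = X[idx], y[idx]
    split = len(y) // 2
    Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]

    # z-score features with training statistics only (avoids leakage).
    mu, sd = Xtr.mean(axis=0), Xtr.std(axis=0)
    Xtr, Xte = (Xtr - mu) / sd, (Xte - mu) / sd

    # Nearest-centroid classifier on the band-power features.
    c1 = Xtr[ytr == 1].mean(axis=0)
    c0 = Xtr[ytr == 0].mean(axis=0)
    pred = (np.linalg.norm(Xte - c1, axis=1)
            < np.linalg.norm(Xte - c0, axis=1)).astype(int)

    accuracy = (pred == yte).mean()
    print(f"test accuracy: {accuracy:.2f}")
    ```

    With a modest per-feature effect spread over 40 features, even this very simple classifier lands in the accuracy range the abstract reports; the study itself could of course use a different classifier and cross-validation scheme.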

  10. The faces of Big Science.

    Science.gov (United States)

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  11. Big Data and central banks

    Directory of Open Access Journals (Sweden)

    David Bholat

    2015-04-01

    Full Text Available This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  12. Inhomogeneous Big Bang Nucleosynthesis Revisited

    OpenAIRE

    Lara, J. F.; Kajino, T.; Mathews, G. J.

    2006-01-01

    We reanalyze the allowed parameters for inhomogeneous big bang nucleosynthesis in light of the WMAP constraints on the baryon-to-photon ratio and a recent measurement which has set the neutron lifetime to be 878.5 +/- 0.7 +/- 0.3 seconds. For a set baryon-to-photon ratio the new lifetime reduces the mass fraction of He4 by 0.0015 but does not significantly change the abundances of other isotopes. This enlarges the region of concordance between He4 and deuterium in the parameter space of the b...

  13. Generation IV reactors: reactor concepts

    International Nuclear Information System (INIS)

    Cardonnier, J.L.; Dumaz, P.; Antoni, O.; Arnoux, P.; Bergeron, A.; Renault, C.; Rimpault, G.; Delpech, M.; Garnier, J.C.; Anzieu, P.; Francois, G.; Lecomte, M.

    2003-01-01

    The liquid metal reactor concept looks promising because of its hard neutron spectrum. Sodium reactors benefit from a large base of operating experience in Japan and in France. Lead reactors have serious assets concerning safety, but they require a great effort in technological research to overcome the corrosion issue, and they lack a leader country to develop this innovative technology. In the molten salt reactor concept, the salt is both the nuclear fuel and the coolant fluid. The high exit temperature of the primary salt (700 degrees Celsius) allows a high energy efficiency (44%). Furthermore, molten salts have interesting specificities concerning the transmutation of actinides: they are almost insensitive to irradiation damage, some salts can dissolve large quantities of actinides, and they are compatible with most reprocessing processes based on pyro-chemistry. The supercritical water reactor concept is based on operating temperature and pressure conditions that take water beyond its critical point. In this range water acquires some useful characteristics: - boiling crisis is no longer possible because the liquid and vapour phases cannot coexist, - a high heat transfer coefficient, and - a high global energy efficiency due to the high temperature of the water. Gas-cooled fast reactors, combining a hard neutron spectrum and a closed fuel cycle, open the way to a high valorization of natural uranium while minimizing ultimate radioactive wastes and proliferation risks. The very high temperature gas-cooled reactor concept is developed with the prospect of producing hydrogen from non-fossil fuels on a large scale. This use implies a reactor producing helium at over 1000 degrees Celsius. (A.C.)

  14. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  15. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  16. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go into the theme somewhat more deeply, covering definitions and the various questions related to big data. In this first part I will try to set out various matters concerning Big Data theory and

  17. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  18. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  19. Research reactors - an overview

    International Nuclear Information System (INIS)

    West, C.D.

    1997-01-01

    A broad overview of different types of research and test reactors is provided in this paper. Reactor designs and operating conditions are briefly described for four reactors. The reactor types described include swimming pool reactors, the High Flux Isotope Reactor, the Mark I TRIGA reactor, and the Advanced Neutron Source reactor. Emphasis in the descriptions is placed on safety-related features of the reactors. 7 refs., 7 figs., 2 tabs

  20. Decommissioning of the Neuherberg Research Reactor (FRN)

    International Nuclear Information System (INIS)

    Demmeler, M.; Rau, G.; Strube, D.

    1982-01-01

    The Neuherberg Research Reactor is of type TRIGA MARK III, with 1 MW steady-state power and pulsable up to 2000 MW. During more than ten years of operation, 12000 MWh and 6000 reactor pulses were delivered. In spite of its good technical condition and its permanently safe operation without any failures, the decommissioning of the Neuherberg research reactor was decided by the GSF board of directors to save maintenance and personnel costs. Safe enclosure was chosen as the mode of decommissioning, which means that the fuel elements will be transferred back to the USA. All other radioactive reactor components will be enclosed in the reactor block. Procedures for licensing the decommissioning, dismantling procedures, and timetables are presented

  1. Forget the hype or reality. Big data presents new opportunities in Earth Science.

    Science.gov (United States)

    Lee, T. J.

    2015-12-01

    Earth science is arguably one of the most mature science disciplines, constantly acquiring, curating, and utilizing a large volume of data with diverse variety. We dealt with big data before there was big data. For example, while developing the EOS program in the 1980s, the EOS data and information system (EOSDIS) was developed to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has continued to be a shining example of modern science data systems over the past two decades. With the explosion of the internet, the use of social media, and the deployment of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system. The open source communities quickly followed and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day. There is a significant shortage of algorithms and knowledgeable manpower to mine the data. In response, the federal government developed big data programs that fund research and development projects and training programs to tackle these new challenges. Meanwhile, compared to the internet data explosion, the Earth science big data problem looks quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline to solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiry.
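
    The MapReduce model referenced in the talk can be illustrated in-process: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. This is a toy sketch of the programming model only, not of Hadoop itself, which distributes exactly these three phases across many machines:

    ```python
    from collections import defaultdict
    from itertools import chain

    def map_phase(document):
        """Emit (word, 1) for every word in one input split."""
        return [(word.lower(), 1) for word in document.split()]

    def shuffle(pairs):
        """Group all emitted values by key."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Aggregate each key's values; here, a simple sum."""
        return {key: sum(values) for key, values in groups.items()}

    documents = [
        "big data needs big systems",
        "Earth science data systems",
    ]
    pairs = chain.from_iterable(map_phase(d) for d in documents)
    counts = reduce_phase(shuffle(pairs))
    print(counts["data"])   # 2
    print(counts["big"])    # 2
    ```

    Because the map and reduce functions are stateless per key, the same code shape parallelizes naturally, which is what makes the model attractive for petabyte-scale archives like EOSDIS.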

  2. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  3. High Flux Materials Testing Reactor (HFR), Petten

    International Nuclear Information System (INIS)

    1975-09-01

    After conversion to burnable poison fuel elements, the High Flux Materials Testing Reactor (HFR) Petten (Netherlands), operated through 1974 for 280 days at 45 MW. Equipment for irradiation experiments has been replaced and extended. The average annual occupation by experiments was 55% as compared to 38% in 1973. Work continued on thirty irradiation projects and ten development activities

  4. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number, i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  5. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their supercomputer, and to the Large Hadron Collider created by Eric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is afoot. Worse, all of scientific research is in peril! Swept up in incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, featuring the latest theories of Stephen Hawking and today's greatest scientists.

  6. Astronomical Surveys and Big Data

    Directory of Open Access Journals (Sweden)

    Mickaelian Areg M.

    2016-03-01

    Full Text Available Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-rays; ROSAT, XMM and Chandra in X-rays; GALEX in the UV; SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range; 2MASS in the NIR; WISE and AKARI IRC in the MIR; IRAS and AKARI FIS in the FIR; NVSS and FIRST in the radio range; and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS) and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  7. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2017-06-01

    Cancer is a complex disease and unfortunately understanding how the components of the cancer system work does not help understand the behavior of the system as a whole. In the words of the Greek philosopher Aristotle "the whole is greater than the sum of parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand and a large part of the data are in the form of medical images. The opportunity is now to draw insight on the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is at the beginning but several useful applications can be envisaged including development of imaging biomarkers to predict disease outcome, assessing the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimizing patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact on the diagnostic pathway of the oncologic patient.

  8. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  9. Nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, R F; George, B V; Baglin, C J

    1978-05-10

    Reference is made to thermal insulation on the inner surfaces of containment vessels of fluid cooled nuclear reactors and particularly in situations where the thermal insulation must also serve a structural function and transmit substantial load forces to the surface which it covers. An arrangement is described that meets this requirement and also provides for core support means that favourably influences the flow of hot coolant from the lower end of the core into a plenum space in the hearth of the reactor. The arrangement comprises a course of thermally insulating bricks arranged as a mosaic covering a wall of the reactor and a course of thermally insulating tiles arranged as a mosaic covering the course of bricks. Full constructional details are given.

  10. Nuclear reactors

    International Nuclear Information System (INIS)

    Prescott, R.F.; George, B.V.; Baglin, C.J.

    1978-01-01

    Reference is made to thermal insulation on the inner surfaces of containment vessels of fluid cooled nuclear reactors and particularly in situations where the thermal insulation must also serve a structural function and transmit substantial load forces to the surface which it covers. An arrangement is described that meets this requirement and also provides for core support means that favourably influences the flow of hot coolant from the lower end of the core into a plenum space in the hearth of the reactor. The arrangement comprises a course of thermally insulating bricks arranged as a mosaic covering a wall of the reactor and a course of thermally insulating tiles arranged as a mosaic covering the course of bricks. Full constructional details are given. (UK)

  11. Bioconversion reactor

    Science.gov (United States)

    McCarty, Perry L.; Bachmann, Andre

    1992-01-01

    A bioconversion reactor for the anaerobic fermentation of organic material. The bioconversion reactor comprises a shell enclosing a predetermined volume, an inlet port through which a liquid stream containing organic materials enters the shell, and an outlet port through which the stream exits the shell. A series of vertical and spaced-apart baffles are positioned within the shell to force the stream to flow under and over them as it passes from the inlet to the outlet port. The baffles present a barrier to the microorganisms within the shell causing them to rise and fall within the reactor but to move horizontally at a very slow rate. Treatment detention times of one day or less are possible.

  12. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  13. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  14. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.
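
    The minimax criterion described above, minimum multiple correlation with Big Five marker factors together with maximum reliability, can be made concrete with a small numeric sketch. All data below are simulated, and the four-item "cluster" is a hypothetical stand-in for a scale such as Religiousness:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated respondents: five Big Five factor scores and a 4-item
    # candidate cluster built to be internally consistent but nearly
    # independent of the factors.
    n = 500
    big_five = rng.normal(size=(n, 5))
    latent = rng.normal(size=n)
    items = latent[:, None] + 0.8 * rng.normal(size=(n, 4))
    cluster = items.mean(axis=1)

    # Multiple correlation of the cluster score with the Big Five factors:
    # R is the correlation between the cluster and its least-squares fit.
    X = np.column_stack([np.ones(n), big_five])
    beta, *_ = np.linalg.lstsq(X, cluster, rcond=None)
    R = np.corrcoef(cluster, X @ beta)[0, 1]

    # Cronbach's alpha as the reliability of the 4-item cluster.
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_vars / total_var)

    print(f"multiple R with Big Five: {R:.2f}")     # low: "beyond" the Big Five
    print(f"Cronbach's alpha:         {alpha:.2f}")  # high: reliable cluster
    ```

    A cluster like this one, with low multiple R and high alpha, is exactly what the minimax criterion selects for; a cluster whose R climbed toward the .30-.45 range reported in the abstract would sit on the boundary.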

  15. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    Full Text Available The term Big Data was coined to refer to the extensive volume of data that cannot be managed by traditional data-handling methods or techniques. Big Data plays an indispensable role in many fields, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care, and stocks. Big data analytics is the process of examining big data to reveal hidden patterns, previously unseen relationships, and other useful information that can be exploited to make better decisions. Demand for big data has expanded steadily because of its rapid growth and the breadth of its applications. This work used the open-source Apache Hadoop technology, which is written in Java and runs on Linux. Its primary contribution is to present an effective and free solution for big data applications in a distributed environment, with its advantages, and to demonstrate its ease of use. Looking forward, an analytical review of new developments in big data technology will be needed. Healthcare is one of the foremost concerns of the world. Big data in healthcare refers to electronic health data sets related to patient health and well-being. Data in the healthcare sector is growing beyond the managing capacity of healthcare organizations and is expected to increase significantly in the coming years.
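
    The Apache Hadoop framework mentioned in the abstract is built around the map/reduce pattern. As a language-neutral illustration (plain Python here, not Hadoop's Java API), the pattern can be simulated in a few lines:

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    # Reducer: sum the counts emitted for each distinct key.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big analytics", "big data"]
counts = reduce_phase(map_phase(lines))
print(counts["big"])  # → 3
```

    In Hadoop proper, the mapper and reducer run as separate tasks across a cluster and the framework handles the shuffle between them; the toy version above only shows the data contract.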

  16. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

    Big Data Analytics and Software Defined Networking (SDN) are helping to manage the extraordinary growth in data usage and in the computer processing power provided by Cloud Data Centres (CDCs). This new book investigates areas where Big Data and SDN can help each other deliver more efficient services.

  17. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the Atlas detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  18. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  19. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  20. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Veneziano, G.

    2000-01-01

    Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era

  1. Starting Small, Thinking Big - Continuum Magazine | NREL

    Science.gov (United States)

    Starting Small, Thinking Big: NREL helps agencies target new federal sustainability goals, including solar power in the territory (photo by Don Buchanan, VIEO), and supports this work across the organization.

  2. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  3. Practice variation in Big-4 transparency reports

    NARCIS (Netherlands)

    Girdhar, Sakshi; Jeppesen, K.K.

    2018-01-01

    Purpose The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach The study draws on a

  4. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

    In this report we describe results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a show case. This includes both

  5. Reactor operations at SAFARI-1

    International Nuclear Information System (INIS)

    Vlok, J.W.H.

    2003-01-01

    A vigorous commercial programme of isotope production and other radiation services has been followed by the SAFARI-1 research reactor over the past ten years - superimposed on the original purpose of the reactor to provide a basic tool for nuclear research, development and education to the country at an institutional level. A combination of the binding nature of the resulting contractual obligations and tighter regulatory control has demanded an equally vigorous programme of upgrading, replacement and renovation of many systems in order to improve the safety and reliability of the reactor. Not least among these changes is the more effective training and deployment of operations personnel that has been necessitated as the operational demands on the reactor evolved from five days per week to twenty-four hours per day, seven days per week, with more than 300 days per year at full power. This paper briefly sketches the operational history of SAFARI-1 and then focuses on the training and structuring currently in place to meet the operational needs. There is a detailed step-by-step look at the operator's career plan and pre-defined milestones. Shift work, especially the shift cycle, has a negative influence on the operator's career path development, particularly owing to his unavailability for training. Methods utilised to minimise this influence are presented. The increase of responsibilities regarding the operation of the reactor, ancillaries and experimental facilities as the operator progresses with his career is discussed. (author)

  6. Nuclear reactor

    International Nuclear Information System (INIS)

    Scholz, M.

    1976-01-01

    An improvement of the accessibility of that part of a nuclear reactor serving for biological shield is proposed. It is intended to provide within the biological shield, distributed around the circumference of the reactor pressure vessel, several shielding chambers filled with shielding material, which are isolated gastight from the outside by means of glass panes with a given bursting strength. It is advantageous that, on the one hand, inspection and maintenance will be possible without great effort and, on the other, a large relief cross section will be at desposal if required. (UWI) [de

  7. NEUTRONIC REACTOR

    Science.gov (United States)

    Wigner, E.P.; Weinberg, A.W.; Young, G.J.

    1958-04-15

    A nuclear reactor which uses uranium in the form of elongated tubes as fuel elements and liquid as a coolant is described. Elongated tubular uranium bodies are vertically disposed in an efficient neutron slowing agent, such as graphite, for example, to form a lattice structure which is disposed between upper and lower coolant tanks. Fluid coolant tubes extend through the uranium bodies and communicate with the upper and lower tanks and serve to convey the coolant through the uranium body. The reactor is also provided with means for circulating the cooling fluid through the coolant tanks and coolant tubes, suitable neutron and gamma ray shields, and control means.

  8. Ten steps to successful poster presentation.

    Science.gov (United States)

    Hardicre, Jayne; Devitt, Patric; Coad, Jane

    Receiving a letter confirming acceptance for you to present a poster at a conference can evoke mixed emotions. Joy, panic, fear and dread are among the many possible emotions, and this is not exclusive to first-time presenters. Developing an effective poster presentation is a skill that you can learn, and it can provide a rewarding way to present your work in a manner less intimidating than oral presentation (Shelledy, 2004). The key to successful poster presentation is meticulous, timely, well-informed preparation. This article outlines ten steps to help guide you through the process to maximize your success.

  9. The Top Ten Algorithms in Data Mining

    CERN Document Server

    Wu, Xindong

    2009-01-01

    From classification and clustering to statistical learning, association analysis, and link mining, this book covers the most important topics in data mining research. It presents the ten most influential algorithms used in the data mining community today. Each chapter provides a detailed description of the algorithm, a discussion of available software implementation, advanced topics, and exercises. With a simple data set, examples illustrate how each algorithm works and highlight the overall performance of each algorithm in a real-world application. Featuring contributions from leading researc
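
    k-means, one of the ten algorithms the book covers, illustrates how compact these methods can be. A minimal one-dimensional sketch in Python (illustrative only, not taken from the book, which presents the algorithms in a language-neutral way):

```python
def kmeans_1d(points, centers, iters=10):
    # Lloyd's algorithm: assign each point to its nearest center,
    # then move each center to the mean of its assigned points.
    for _ in range(iters):
        clusters = {c: [] for c in centers}
        for p in points:
            nearest = min(centers, key=lambda c: abs(p - c))
            clusters[nearest].append(p)
        centers = [sum(ps) / len(ps) for c, ps in clusters.items() if ps]
    return sorted(centers)

points = [1, 2, 3, 10, 11, 12]
print(kmeans_1d(points, centers=[0.0, 8.0]))  # → [2.0, 11.0]
```

    The same assign-then-update loop generalizes to higher dimensions by replacing the absolute difference with a Euclidean distance and the mean with a per-coordinate mean.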

  10. Novitäten im Breslauer Stadttheater

    OpenAIRE

    Zduniak, Maria

    2017-01-01

    At the beginning of the 20th century, the Breslau Stadttheater, although equipped with only a small orchestra, ranked among the notable European opera houses. It was open to musical novelties. Evidence of this includes, among others, the Breslau premieres of Salome (1906) by Richard Strauss and L'Orfeo (1913) by Claudio Monteverdi, and the world premiere of the opera Eros und Psyche (1917) by Ludomir Różycki.

  11. Ten financial management principles for survival.

    Science.gov (United States)

    Cleverley, W O

    1988-03-01

    Financial insolvency is the primary cause of hospital failure. Managers may analyze a hospital's financial statements to anticipate and prevent fiscal problems. Ten measures of fiscal status may be used to evaluate the following: operating profitability, nonoperating income, equity growth, liquidity, debt capacity, age of facilities, revenue generation, replacement funds, receivables, and survivability. Based on data from the Financial Analysis Service, Catholic hospitals are doing better than other U.S. hospitals in some areas of financial preparedness. In most areas, however, all hospitals suffer by comparison with manufacturers. The 10 principles of solvent and successful operations can help hospitals improve financial resiliency.
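
    Two of the ten measures named above, operating profitability and liquidity, reduce to simple ratios. A sketch with hypothetical figures (the thresholds and exact ratio definitions used by the Financial Analysis Service are not given in the abstract):

```python
def operating_margin(operating_revenue, operating_expenses):
    # Operating profitability: operating income as a share of revenue.
    return (operating_revenue - operating_expenses) / operating_revenue

def current_ratio(current_assets, current_liabilities):
    # A common liquidity measure: ability to cover short-term obligations.
    return current_assets / current_liabilities

# Hypothetical hospital figures, in thousands of dollars.
print(operating_margin(50_000, 48_000))  # → 0.04 (a 4% operating margin)
print(current_ratio(12_000, 6_000))      # → 2.0
```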

  12. Ten essential skills for electrical engineers

    CERN Document Server

    Dorr, Barry

    2014-01-01

    Engineers know that, as in any other discipline, getting a good job requires practical, up-to-date skills. An engineering degree provides a broad set of fundamentals. Ten Essential Skills applies those fundamentals to practical tasks required by employers. Written in a user-friendly, no-nonsense format, the book reviews practical skills using the latest tools and techniques, and features a companion website with interview practice problems and advanced material for readers wishing to pursue additional skills. With this book, aspiring and current engineers may approach job interviews confident

  13. Optical spectroscopy of ten extragalactic radiosources

    International Nuclear Information System (INIS)

    Rawlings, S.; Riley, J.M.; Saunders, R.

    1989-01-01

    We present optical spectroscopy of ten objects associated with extra-galactic radiosources, using the University of Hawaii 2.2-m telescope. Redshifts are measured for four radiogalaxies (B20217 + 36A + B, 3C73, 0648 + 19A, 0648 + 19B) and for a galaxy which is probably associated with a double radio-source with highly unusual properties (0951 + 37); existing redshifts are confirmed for two radiogalaxies (4C39.04, 4C40.08); and a tentative redshift of z=2.87 measured for the quasar 3C82. (author)

  14. Audits reveal ten common environmental problems

    International Nuclear Information System (INIS)

    Buys, M.W.

    1992-01-01

    The old saying that "an ounce of prevention is worth a pound of cure" rings particularly true in environmental matters in the 1990s. Environmental problems can potentially lead to expensive fines, costly cleanups, negative public relations, and even criminal sanctions against members of the corporation. A recurring pattern of problems has been noted during the performance of environmental disposition, acquisition, and compliance assessments of many different operators in most of the producing states. The ten most common problems found in oilfield audits are discussed here in an effort to enhance the awareness of operators

  15. Ten new withanolides from Physalis peruviana.

    Science.gov (United States)

    Fang, Sheng-Tao; Liu, Ji-Kai; Li, Bo

    2012-01-01

    Ten new withanolides, including four perulactone-type withanolides, perulactones E-H (1-4), three 28-hydroxy-withanolides, withaperuvins I-K (5-7), and three other withanolides, withaperuvins L-N (8-10), together with six known compounds (11-16) were isolated from the aerial parts of Physalis peruviana. The structures of these compounds were elucidated on the basis of extensive spectroscopic analyses (1D and 2D NMR, IR, HR-MS) and chemical methods. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  17. TENS (transcutaneous electrical nerve stimulation) for labour pain.

    Science.gov (United States)

    Francis, Richard

    2012-05-01

    The fact that TENS is applied inconsistently, and not always in line with optimal TENS application theory, may explain why TENS for labour pain appears to be effective in some individuals and not in others. This article reviews TENS theory, advises on optimal TENS application for labour pain and discusses some of the limitations of TENS research on labour pain. TENS application for labour pain may include TENS applied to either side of the lower spine, set to a 200 μs pulse duration and 100 pulses per second. As pain increases, TENS intensity should be increased, and as pain decreases, TENS intensity should be reduced, to maintain a strong but pain-free intensity of stimulation. This application may particularly reduce back pain during labour.
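
    The stimulation parameters quoted above (200 microsecond pulses delivered at 100 pulses per second) imply that the stimulator is active for only a small fraction of each second; the arithmetic:

```python
pulse_duration_s = 200e-6   # 200 microsecond pulse width
pulse_rate_hz = 100         # 100 pulses per second

# Fraction of each second during which current actually flows.
duty_cycle = pulse_duration_s * pulse_rate_hz
print(f"{duty_cycle:.0%}")  # → 2%
```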

  18. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  19. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; bench-marking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  20. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  1. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  2. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  3. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  4. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: The big bang is replaced by a quantum bounce; the `horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.
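
    The "quantum bounce" replacing the big bang can be summarized by the effective Friedmann equation of loop quantum cosmology, quoted here as standard background rather than from the abstract itself: the Hubble rate vanishes when the energy density reaches a critical value, so the universe bounces instead of hitting a singularity.

```latex
H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right),
\qquad H = 0 \ \text{when} \ \rho = \rho_c
```

    In the low-density limit \(\rho \ll \rho_c\) the correction term is negligible and the standard Friedmann equation of general relativity is recovered.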

  5. SparkText: Biomedical Text Mining on Big Data Framework

    Science.gov (United States)

    He, Karen Y.; Wang, Kai

    2016-01-01

    Background Many new biomedical research articles are published every day, accumulating rich information, such as genetic variants, genes, diseases, and treatments. Rapid yet accurate text mining on large-scale scientific literature can discover novel knowledge to better understand human diseases and to improve the quality of disease diagnosis, prevention, and treatment. Results In this study, we designed and developed an efficient text mining framework called SparkText on a Big Data infrastructure, which is composed of Apache Spark data streaming and machine learning methods, combined with a Cassandra NoSQL database. To demonstrate its performance for classifying cancer types, we extracted information (e.g., breast, prostate, and lung cancers) from tens of thousands of articles downloaded from PubMed, and then employed Naïve Bayes, Support Vector Machine (SVM), and Logistic Regression to build prediction models to mine the articles. The accuracy of predicting a cancer type by SVM using the 29,437 full-text articles was 93.81%. While competing text-mining tools took more than 11 hours, SparkText mined the dataset in approximately 6 minutes. Conclusions This study demonstrates the potential for mining large-scale scientific articles on a Big Data infrastructure, with real-time update from new articles published daily. SparkText can be extended to other areas of biomedical research. PMID:27685652
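
    Of the three classifiers the authors compare, Naïve Bayes is the simplest to sketch. The toy Python below (plain Python with made-up two-class training sentences, not the Spark-based implementation the paper uses) shows the core idea: classify a document by the class that maximizes the sum of smoothed log word likelihoods.

```python
import math
from collections import Counter

def train_nb(docs):
    # docs: list of (class_label, text). Count word occurrences per class.
    counts = {}
    for label, text in docs:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def classify_nb(counts, text, alpha=1.0):
    # Pick the class with the highest Laplace-smoothed log-likelihood.
    # (Class priors are omitted: this toy training set is balanced.)
    vocab = {w for c in counts.values() for w in c}
    def score(label):
        c = counts[label]
        total = sum(c.values()) + alpha * len(vocab)
        return sum(math.log((c[w] + alpha) / total)
                   for w in text.lower().split())
    return max(counts, key=score)

docs = [("breast", "brca1 mutation breast tumor"),
        ("lung",   "egfr mutation lung carcinoma smoking")]
model = train_nb(docs)
print(classify_nb(model, "brca1 breast tumor"))  # → breast
```

    The paper's pipeline applies the same idea at scale: features are extracted from tens of thousands of PubMed articles and the counting and scoring steps are distributed across a Spark cluster.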

  6. SparkText: Biomedical Text Mining on Big Data Framework.

    Directory of Open Access Journals (Sweden)

    Zhan Ye

    Full Text Available Many new biomedical research articles are published every day, accumulating rich information, such as genetic variants, genes, diseases, and treatments. Rapid yet accurate text mining on large-scale scientific literature can discover novel knowledge to better understand human diseases and to improve the quality of disease diagnosis, prevention, and treatment. In this study, we designed and developed an efficient text mining framework called SparkText on a Big Data infrastructure, which is composed of Apache Spark data streaming and machine learning methods, combined with a Cassandra NoSQL database. To demonstrate its performance for classifying cancer types, we extracted information (e.g., breast, prostate, and lung cancers) from tens of thousands of articles downloaded from PubMed, and then employed Naïve Bayes, Support Vector Machine (SVM), and Logistic Regression to build prediction models to mine the articles. The accuracy of predicting a cancer type by SVM using the 29,437 full-text articles was 93.81%. While competing text-mining tools took more than 11 hours, SparkText mined the dataset in approximately 6 minutes. This study demonstrates the potential for mining large-scale scientific articles on a Big Data infrastructure, with real-time update from new articles published daily. SparkText can be extended to other areas of biomedical research.

  7. SparkText: Biomedical Text Mining on Big Data Framework.

    Science.gov (United States)

    Ye, Zhan; Tafti, Ahmad P; He, Karen Y; Wang, Kai; He, Max M

    Many new biomedical research articles are published every day, accumulating rich information, such as genetic variants, genes, diseases, and treatments. Rapid yet accurate text mining on large-scale scientific literature can discover novel knowledge to better understand human diseases and to improve the quality of disease diagnosis, prevention, and treatment. In this study, we designed and developed an efficient text mining framework called SparkText on a Big Data infrastructure, which is composed of Apache Spark data streaming and machine learning methods, combined with a Cassandra NoSQL database. To demonstrate its performance for classifying cancer types, we extracted information (e.g., breast, prostate, and lung cancers) from tens of thousands of articles downloaded from PubMed, and then employed Naïve Bayes, Support Vector Machine (SVM), and Logistic Regression to build prediction models to mine the articles. The accuracy of predicting a cancer type by SVM using the 29,437 full-text articles was 93.81%. While competing text-mining tools took more than 11 hours, SparkText mined the dataset in approximately 6 minutes. This study demonstrates the potential for mining large-scale scientific articles on a Big Data infrastructure, with real-time update from new articles published daily. SparkText can be extended to other areas of biomedical research.

  8. Free Release Standards Utilized at Big Rock Point

    International Nuclear Information System (INIS)

    Robert P. Wills

    2000-01-01

    The decommissioning of Consumers Energy's Big Rock Point (BRP) site involves decommissioning its 75-MW boiling water reactor and all of the associated facilities. Consumers Energy is committed to restoring the site to greenfield conditions. This commitment means that when the decommissioning is complete, all former structures will have been removed, and the site will be available for future use without radiological restrictions. BRP's radiation protection management staff determined that the typical methods used to comply with U.S. Nuclear Regulatory Commission (NRC) regulations for analyzing volumetric material for radionuclides would not fulfill the demands of a facility undergoing decommissioning. The challenge at hand is to comply with regulatory requirements and to put a large-scale bulk-release program into production. This report describes Consumers Energy's planned approach to the regulatory aspects of free release

  9. Ten principles of good interdisciplinary team work.

    Science.gov (United States)

    Nancarrow, Susan A; Booth, Andrew; Ariss, Steven; Smith, Tony; Enderby, Pam; Roots, Alison

    2013-05-10

    Interdisciplinary team work is increasingly prevalent, supported by policies and practices that bring care closer to the patient and challenge traditional professional boundaries. To date, there has been a great deal of emphasis on the processes of team work, and in some cases, outcomes. This study draws on two sources of knowledge to identify the attributes of a good interdisciplinary team; a published systematic review of the literature on interdisciplinary team work, and the perceptions of over 253 staff from 11 community rehabilitation and intermediate care teams in the UK. These data sources were merged using qualitative content analysis to arrive at a framework that identifies characteristics and proposes ten competencies that support effective interdisciplinary team work. Ten characteristics underpinning effective interdisciplinary team work were identified: positive leadership and management attributes; communication strategies and structures; personal rewards, training and development; appropriate resources and procedures; appropriate skill mix; supportive team climate; individual characteristics that support interdisciplinary team work; clarity of vision; quality and outcomes of care; and respecting and understanding roles. We propose competency statements that an effective interdisciplinary team functioning at a high level should demonstrate.

  10. Neutronic reactor

    International Nuclear Information System (INIS)

    Wende, C.W.J.

    1976-01-01

    The method of operating a water-cooled neutronic reactor having a graphite moderator is described which comprises flowing a gaseous mixture of carbon dioxide and helium, in which the helium comprises 40--60 volume percent of the mixture, in contact with the graphite moderator. 2 claims, 4 figures

  11. Neutronic reactor

    International Nuclear Information System (INIS)

    Wende, C.W.J.

    1976-01-01

    A safety rod for a nuclear reactor has an inner end portion having a gamma absorption coefficient and neutron capture cross section approximately equal to those of the adjacent shield, a central portion containing materials of high neutron capture cross section and an outer end portion having a gamma absorption coefficient at least equal to that of the adjacent shield

  12. Reactor facility

    International Nuclear Information System (INIS)

    Suzuki, Hiroaki; Murase, Michio; Yokomizo, Osamu.

    1997-01-01

The present invention provides a BWR type reactor facility capable of suppressing the amount of steam generated by the interaction of a failed reactor core and coolant upon occurrence of a hypothetical accident, without requiring special countermeasures to enhance the pressure resistance of the container vessel. Namely, a means for supplying cooling water at a temperature no more than 30degC below the saturation temperature corresponding to the inner pressure of the containment vessel upon occurrence of an accident is disposed in the lower dry well below the pressure vessel. As a result, should an accident melt the reactor core and cause it to flow downward out of the pressure vessel, supplying cooling water near or above the saturation temperature (for example, cooling water at 100degC or higher) to the lower dry well causes far less abrupt steam generation from the interaction of the failed core and the cooling water than supplying cooling water more than 30degC below the saturation temperature. Accordingly, the amount of steam generated can be suppressed, and no special countermeasure is needed to enhance the pressure resistance of the container vessel. (I.S.)

  13. Nuclear reactor

    International Nuclear Information System (INIS)

    Gilroy, J.E.

    1980-01-01

    An improved cover structure for liquid metal cooled fast breeder type reactors is described which it is claimed reduces the temperature differential across the intermediate grid plate of the core cover structure and thereby reduces its subjection to thermal stresses. (UK)

  14. Reactor licensing

    International Nuclear Information System (INIS)

    Harvie, J.D.

    2002-01-01

This presentation discusses reactor licensing, including the legislative basis for licensing, other relevant legislation, the purpose of the Nuclear Safety and Control Act, and important regulations, regulatory documents, policies, and standards. It also discusses the role of the CNSC, its mandate, and its safety philosophy

  15. Nuclear reactor

    International Nuclear Information System (INIS)

    Hattori, Sadao; Sekine, Katsuhisa.

    1987-01-01

Purpose: To decrease the thickness of the reactor container and reduce the height and plate thickness of the roof slab without using mechanical vibration stoppers. Constitution: Earthquake resistance is improved by filling fluids such as liquid metal between the reactor container and a secondary container and connecting the outer surface of the reactor container with the inner surface of the secondary container by means of bellows. That is, for horizontal seismic vibrations, horizontal loads can be supported by the secondary container without providing mechanical vibration stoppers on the reactor container, and the wall thickness can be reduced, thereby simplifying the thermal insulation structure used to reduce thermal stresses. Further, for vertical seismic vibrations, vertical loads can be transmitted to the secondary container, likewise enabling a reduced wall thickness. By transferring the point of action of the container load applied to the roof slab to the outer circumferential portion, the intended purpose can be attained and, in addition, the radiation dose rate at the upper surface of the roof slab can be decreased. (Kamimura, M.)

  16. Reactor system

    International Nuclear Information System (INIS)

    Miyano, Hiroshi; Narabayashi, Naoshi.

    1990-01-01

The present invention concerns a reactor system with improved means for injecting water into the pressure vessel of a BWR type reactor. A steam pump is connected to the heat removal system pipeline, the high pressure water injection system pipeline, and the low pressure water injection system pipeline for injecting water into the pressure vessel. A pump actuation pipeline is branched from the main steam line or the steam relief pipeline system, through which steam is supplied to actuate the steam pump and deliver cooling water into the pressure vessel, thereby cooling the reactor core. The steam pump converts heat energy into kinetic energy and, by way of a pressure-elevating diffuser, raises the pressure of the water to a level higher than that of the supplied steam; this pressure elevation allows cooling water to be supplied to the pressure vessel. Cooling water can thus be reliably injected into the pressure vessel upon a loss-of-coolant accident, or whenever a reactor scram is necessary, without using an additional power source. (I.N.)

  17. Reactor core

    International Nuclear Information System (INIS)

    Matsuura, Tetsuaki; Nomura, Teiji; Tokunaga, Kensuke; Okuda, Shin-ichi

    1990-01-01

Fuel assemblies in the portions where the gradient of fast neutron fluxes between two opposing faces of a channel box is great are kept loaded at the outermost peripheral position of the reactor core in the second operation cycle as well, in order to prevent interference between a control rod and the channel box due to bending deformation of the channel box. Further, the fuel assemblies in the second row from the outermost periphery in the first operation cycle are also kept loaded in the second row in the second operation cycle. Since the gradient of fast neutrons in the reactor core is especially great at its outer circumference, the channel box at the outer circumference bends such that the surface facing the center of the reactor core becomes convex; because the channel box in the second row bends in the same direction, insertion of the control rod is not impeded. Further, if the positions of the fuels at the outermost periphery and the fuels in the second row are not altered in the second operation cycle, the gaps are not reduced, preventing interference between the control rod and the channel box. (N.H.)

  18. Intelligent search in Big Data

    Science.gov (United States)

    Birialtsev, E.; Bukharaev, N.; Gusenkov, A.

    2017-10-01

An approach to data integration, aimed at ontology-based intelligent search in Big Data, is considered for the case when information objects are represented in the form of relational databases (RDBs) structurally marked by their schemes. The source of information for constructing an ontology and, later on, organizing the search is text in natural language, treated as semi-structured data; for the RDBs, these are comments on the names of tables and their attributes. A formal definition of the RDB integration model in terms of ontologies is given. Within the framework of the model, a universal RDB representation ontology, an oil production subject domain ontology, and a linguistic thesaurus of the subject domain language are built. A technique for automatic SQL query generation for subject domain specialists is proposed. On its basis, an information system for the RDBs of the TATNEFT oil-producing company was implemented. Exploitation of the system showed good relevance for the majority of queries.
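The core idea in this record, translating subject-domain terms into SQL through a thesaurus built from schema comments, can be sketched as follows. This is a minimal illustration, not the described system: the thesaurus entries, table names, and columns are invented here, since the actual TATNEFT schema is not given in the abstract.

```python
# Hypothetical sketch of thesaurus-driven SQL generation. A linguistic
# thesaurus maps natural-language domain terms to (table, column) pairs
# harvested from RDB schema comments; a query over domain terms is then
# compiled into SQL. All names below are illustrative assumptions.

# Thesaurus: domain-language term -> (table, column)
THESAURUS = {
    "well": ("wells", "well_id"),
    "daily oil output": ("production", "oil_tonnes"),
    "date": ("production", "prod_date"),
}

def generate_sql(select_terms, filter_term=None, filter_value=None):
    """Compile a list of domain terms into a SQL SELECT statement."""
    cols = [THESAURUS[t] for t in select_terms]
    tables = sorted({tbl for tbl, _ in cols})
    sql = "SELECT " + ", ".join(f"{t}.{c}" for t, c in cols)
    sql += " FROM " + ", ".join(tables)
    if filter_term is not None:
        t, c = THESAURUS[filter_term]
        sql += f" WHERE {t}.{c} = '{filter_value}'"
    return sql

print(generate_sql(["well", "daily oil output"], "date", "2017-10-01"))
# SELECT wells.well_id, production.oil_tonnes
#   FROM production, wells WHERE production.prod_date = '2017-10-01'
```

A production system would of course also need join inference between the tables and parameter escaping; the point here is only the term-to-schema mapping that the ontology makes possible.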

  19. Big Data in Transport Geography

    DEFF Research Database (Denmark)

    Reinau, Kristian Hegner; Agerholm, Niels; Lahrmann, Harry Spaabæk

    for studies that explicitly compare the quality of this new type of data to traditional data sources. With the current focus on Big Data in the transport field, public transport planners are increasingly looking towards smart card data to analyze and optimize flows of passengers. However, in many cases...... it is not all public transport passengers in a city, region or country with a smart card system that uses the system, and in such cases, it is important to know what biases smart card data has in relation to giving a complete view upon passenger flows. This paper therefore analyses the quality and biases...... of smart card data in Denmark, where public transport passengers may use a smart card, may pay with cash for individual trips or may hold a season ticket for a certain route. By analyzing smart card data collected in Denmark in relation to data on sales of cash tickets, sales of season tickets, manual...

  20. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology, and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing, scientists deal with datasets, not individual files. As a result, a task (comprising many jobs) became the unit of petascale data processing on the Grid. Splitting a large data processing task into jobs enables fine-granularity checkpointing, analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...
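The retry mechanism described in this abstract, recovering transient job failures by automatic re-tries, can be illustrated with a small simulation. This is not ATLAS production code; the job count, failure rate, and retry budget below are invented for the sketch.

```python
import random

# Illustrative sketch: a large task is split into many jobs, and each job
# that fails transiently is automatically re-tried, much as TCP re-sends
# dropped packets. With a 20% transient failure rate and 3 retries, the
# per-job residual failure probability drops to 0.2**4 = 0.0016.

def run_job(fail_rate, rng):
    """Simulate one job attempt; True on success, False on transient failure."""
    return rng.random() > fail_rate

def run_task(n_jobs, max_retries=3, fail_rate=0.2, seed=42):
    """Run all jobs of a task, re-trying each failed job up to max_retries."""
    rng = random.Random(seed)
    completed = 0
    for _ in range(n_jobs):
        for _attempt in range(1 + max_retries):
            if run_job(fail_rate, rng):
                completed += 1
                break
    return completed

# With retries, effectively all jobs complete despite frequent transient failures;
# without retries, roughly one job in five is lost.
print(run_task(1000), run_task(1000, max_retries=0))
```

The same pattern, independent retriable units plus bookkeeping at the task level, is what turns unreliable Grid resources into a reliable processing pipeline.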