WorldWideScience

Sample records for big ten reactor

  1. Reactor dosimetry calibrations in the Big Ten critical assembly

    International Nuclear Information System (INIS)

    Barr, D.W.; Hansen, G.E.

    1977-01-01

    Eleven irradiations of foil packs located in the central region of Big Ten were made for the Interlaboratory Reaction Rate Program. Each irradiation was at a nominal fluence of 10¹⁵, and the principal fluence monitor was the National Bureau of Standards' double fission chamber, containing ²³⁵U and ²³⁸U deposits and located at the center of Big Ten. Secondary monitors consisted of three external fission chambers and two internal foil sets containing Au, In, and Al. Activities of one set were counted at LASL and the other at the Hanford Engineering Development Laboratory. The uncertainty in relative fluence for each irradiation was ±0.3%.
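
    The quoted ±0.3% is the kind of figure obtained by normalizing each irradiation to the monitor chamber and combining independent uncertainty components in quadrature; a schematic sketch with hypothetical numbers (not taken from the paper):

    ```python
    # Schematic only: fluence of irradiation i relative to a reference run,
    # with independent relative uncertainties combined in quadrature.
    # All numbers below are hypothetical.
    import math

    def relative_fluence(monitor_counts: float, reference_counts: float) -> float:
        """Monitor-normalized fluence of one irradiation."""
        return monitor_counts / reference_counts

    def combined_uncertainty(*components: float) -> float:
        """Quadrature sum of independent relative uncertainties."""
        return math.sqrt(sum(c * c for c in components))

    counts = 1.012e6      # hypothetical fission-chamber counts, irradiation i
    reference = 1.000e6   # hypothetical counts for the reference irradiation
    u_stat, u_pos = 0.0020, 0.0022   # hypothetical counting / positioning terms

    print(f"relative fluence:     {relative_fluence(counts, reference):.4f}")
    print(f"relative uncertainty: {combined_uncertainty(u_stat, u_pos):.2%}")
    # -> ~0.30 %, the level quoted in the abstract
    ```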

  2. Dosimetry results for Big Ten and related benchmarks

    International Nuclear Information System (INIS)

    Hansen, G.E.; Gilliam, D.M.

    1980-01-01

    Measured average reaction cross sections for the Big Ten central flux spectrum are given together with calculated values based on the U.S. Evaluated Nuclear Data File ENDF/B-IV. Central reactivity coefficients for ²³³U, ²³⁵U, ²³⁹Pu, ⁶Li and ¹⁰B are given to check consistency of bias between measured and calculated reaction cross sections for these isotopes. Spectral indexes for the Los Alamos ²³³U, ²³⁵U and ²³⁹Pu metal critical assemblies are updated, utilizing the Big Ten measurements and interassembly calibrations, and their implications for inelastic scattering are reiterated.

  3. How the Big Ten West Was Won: Football Recruiting

    Directory of Open Access Journals (Sweden)

    Diehl Kevin A.

    2017-06-01

    Full Text Available The paper analyses recruiting in the 2017 Big Ten West Division football cycle. The top ten recruit scores [(Rivals.com rating × 100) + (100 if from outside the surrounding states; 75 if from a surrounding state) + extra credit for walk-ons (100; 75)] yield the ranking: Iowa, Nebraska, Wisconsin, Northwestern, Illinois, Minnesota, and Purdue.
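
    As a reading aid, here is one hypothetical interpretation of that composite score in Python; the bracketing in the abstract is ambiguous, so the exact weighting used in the article may differ:

    ```python
    # Hypothetical reading of the composite recruit score; weights follow
    # the abstract's bracketed expression, which is ambiguous.
    def recruit_score(rivals_rating: float, outside_surrounding: bool,
                      walk_on: bool = False) -> float:
        score = rivals_rating * 100                      # Rivals.com rating x 100
        score += 100 if outside_surrounding else 75      # geography bonus
        if walk_on:                                      # walk-on extra credit
            score += 100 if outside_surrounding else 75
        return score

    print(recruit_score(5.8, outside_surrounding=True))  # 680.0
    ```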

  4. Measurement of the neutron spectrum of the Big Ten critical assembly by lithium-6 spectrometry

    International Nuclear Information System (INIS)

    De Leeuw-Gierts, G.; De Leeuw, S.; Hansen, G.E.; Helmick, H.H.

    1979-01-01

    The central neutron-flux spectrum of the Los Alamos Scientific Laboratory's critical assembly, Big Ten, was measured with a ⁶Li spectrometer and techniques developed at the Centre d'Etude de l'Energie Nucleaire, Mol, as part of an experimental program to establish the characteristics of Big Ten.

  5. Measurement of the neutron spectrum of the Big Ten critical assembly by lithium-6 spectrometry

    International Nuclear Information System (INIS)

    Leeuw-Gierts, G. de; Leeuw, S. de

    1980-01-01

    The central neutron-flux spectrum of the Los Alamos Scientific Laboratory's critical assembly, Big Ten, was measured with a ⁶Li spectrometer and techniques developed at the Centre d'Etude de l'Energie Nucleaire, Mol, as part of an experimental program to establish the characteristics of Big Ten.

  6. An improved benchmark model for the Big Ten critical assembly - 021

    International Nuclear Information System (INIS)

    Mosteller, R.D.

    2010-01-01

    A new benchmark specification is developed for the BIG TEN uranium critical assembly. The assembly has a fast spectrum, and its core contains approximately 10 wt.% enriched uranium. Detailed specifications for the benchmark are provided, and results from the MCNP5 Monte Carlo code using a variety of nuclear-data libraries are given for this benchmark and two others. (authors)

  7. Nuclear technology and reactor safety engineering. The situation ten years after the Chernobyl reactor accident

    International Nuclear Information System (INIS)

    Birkhofer, A.

    1996-01-01

    Ten years ago, on April 26, 1986, the most serious accident in the history of nuclear technology worldwide happened in unit 4 of the Chernobyl nuclear power plant in Ukraine, an accident that revealed to the world at large that the Soviet reactor design lines carried hitherto unsuspected safety engineering deficits. The dimensions of this reactor accident on site, and the radioactive fallout spreading far and wide over many countries in Europe, vividly nourished the concern of large parts of the population in the Western world about the safety of nuclear technology, and reignited debates about the risks involved and their justification. Now that ten years have elapsed since the accident, it is appropriate to take stock and analyse the situation today. The number of nuclear power plants operating worldwide has been growing in the last few years, and this trend will continue, primarily due to developments in Asia. The Chernobyl reactor accident has pushed the international dimension of reactor safety to the foreground. Thus the Western world had reason enough to commit itself to enhancing the engineered safety of reactors in Eastern Europe. The article analyses some of the major developments and activities to date and shows future perspectives. (orig.) [de

  8. Flexibility in faculty work-life policies at medical schools in the Big Ten conference.

    Science.gov (United States)

    Welch, Julie L; Wiehe, Sarah E; Palmer-Smith, Victoria; Dankoski, Mary E

    2011-05-01

    Women lag behind men in several key academic indicators, such as advancement, retention, and securing leadership positions. Although reasons for these disparities are multifactorial, policies that do not support work-life integration contribute to the problem. The objective of this descriptive study was to compare the faculty work-life policies among medical schools in the Big Ten conference. Each institution's website was accessed in order to assess its work-life policies in the following areas: maternity leave, paternity leave, adoption leave, extension of probationary period, part-time appointments, part-time benefits (specifically health insurance), child care options, and lactation policy. Institutions were sent requests to validate the online data and supply additional information if needed. Each institution received an overall score and subscale scores for family leave policies and part-time issues. Data were verified by the human resources office at 8 of the 10 schools. Work-life policies varied among Big Ten schools, with total scores between 9.25 and 13.5 (possible score: 0-21; higher scores indicate greater flexibility). Subscores were not consistently high or low within schools. Comparing the flexibility of faculty work-life policies in relation to other schools will help raise awareness of these issues and promote more progressive policies among less progressive schools. Ultimately, flexible policies will lead to greater equity and institutional cultures that are conducive to recruiting, retaining, and advancing diverse faculty.
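
    A scoring rubric of this kind can be sketched as a simple sum over policy areas; the per-area scale below is hypothetical (the study's actual rubric, with its 0-21 total, is not reproduced here):

    ```python
    # Hypothetical scoring sketch; the study's actual rubric and per-area
    # maxima are not reproduced here. Higher totals = more flexible policies.
    POLICY_AREAS = ["maternity_leave", "paternity_leave", "adoption_leave",
                    "probation_extension", "part_time_appointments",
                    "part_time_benefits", "child_care", "lactation"]

    def total_score(ratings: dict) -> float:
        """Sum per-area ratings (hypothetical 0-3 scale per area)."""
        return sum(ratings.get(area, 0) for area in POLICY_AREAS)

    example = {"maternity_leave": 3, "paternity_leave": 1, "adoption_leave": 2,
               "probation_extension": 2, "part_time_appointments": 2,
               "part_time_benefits": 1, "child_care": 1, "lactation": 1}
    print(total_score(example))  # 13 -- within the reported 9.25-13.5 range
    ```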

  9. TenBig Bangs” in Theory and Practice that Have Made a Difference to Australian Policing in the Last Three Decades

    Directory of Open Access Journals (Sweden)

    Rick Sarre

    2016-05-01

    Full Text Available This paper discusses what could be considered the top ten innovations that have occurred in policing in the last thirty years. The intent is to focus attention on how practice could be further inspired by additional innovation. The innovations are discussed here as “Big Bangs” as a way of drawing attention to the significant impact they have had on policing, in the same way that the cosmological Big Bang was an important watershed event in the universe’s existence. These ten policing innovations ushered in, it is argued, a new mindset, pattern or trend, and they affected Australian policing profoundly, although many had their roots in other settings long before Australian policy-makers implemented them.

  10. Innovations and enhancements in neutronic analysis of the Big-10 university research and training reactors based on the AGENT code system

    International Nuclear Information System (INIS)

    Hursin, M.; Shanjie, X.; Burns, A.; Hopkins, J.; Satvat, N.; Gert, G.; Tsoukalas, L. H.; Jevremovic, T.

    2006-01-01

    This paper summarizes salient aspects of the 'virtual' reactor system developed at Purdue University, emphasizing efficient neutronic modeling through AGENT (Arbitrary Geometry Neutron Transport), a deterministic neutron transport code. DOE's Big-10 Innovations in Nuclear Infrastructure and Education (INIE) Consortium was launched in 2002 to enhance scholarship activities pertaining to university research and training reactors (URTRs). Existing and next-generation URTRs are powerful campus tools for nuclear engineering as well as a number of disciplines that include, but are not limited to, medicine, biology, material science, and food science. Advancing new computational environments for the analysis and configuration of URTRs is an important Big-10 INIE aim. Specifically, Big-10 INIE has pursued development of a 'virtual' reactor, an advanced computational environment to serve as a platform on which to build operations, utilization (research and education), and systemic analysis of URTR physics. The 'virtual' reactor computational system will integrate computational tools addressing the URTR core and near-core physics (transport, dynamics, fuel management and fuel configuration); thermal-hydraulics; beam line, in-core and near-core experiments; instrumentation and controls; and confinement/containment and security issues. Such an integrated computational environment does not currently exist. The 'virtual' reactor is designed to allow researchers and educators to configure and analyze their systems to optimize experiments, fuel locations for flux shaping, and detector selection and configuration. (authors)

  11. Reactor physics verification of the MCNP6 unstructured mesh capability

    International Nuclear Information System (INIS)

    Burke, T. P.; Kiedrowski, B. C.; Martz, R. L.; Martin, W. R.

    2013-01-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and that the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)
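
    Verification of this kind typically reduces to comparing the calculated multiplication factor against a reference solution; the following sketch (generic bookkeeping, not MCNP6 output or the paper's data) expresses the deviation in pcm and tests statistical agreement:

    ```python
    # Generic verification bookkeeping with hypothetical values: deviation of
    # a calculated k-eff from a benchmark value in pcm, plus an agreement test
    # within the combined statistical uncertainty.
    def deviation_pcm(k_calc: float, k_ref: float) -> float:
        return (k_calc - k_ref) * 1.0e5

    def agrees(k_calc, sigma_calc, k_ref, sigma_ref, n_sigma=3.0) -> bool:
        combined = (sigma_calc**2 + sigma_ref**2) ** 0.5
        return abs(k_calc - k_ref) <= n_sigma * combined

    k_mesh, sigma_mesh = 0.99652, 0.00010   # hypothetical unstructured-mesh run
    k_ref,  sigma_ref  = 0.99680, 0.00012   # hypothetical reference solution
    print(f"{deviation_pcm(k_mesh, k_ref):+.0f} pcm")    # -28 pcm
    print(agrees(k_mesh, sigma_mesh, k_ref, sigma_ref))  # True
    ```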

  12. Reactor physics verification of the MCNP6 unstructured mesh capability

    Energy Technology Data Exchange (ETDEWEB)

    Burke, T. P. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States); Kiedrowski, B. C.; Martz, R. L. [X-Computational Physics Division, Monte Carlo Codes Group, Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Martin, W. R. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States)

    2013-07-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and that the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  13. Innovations and Enhancements for a Consortium of Big-10 University Research and Training Reactors. Final Report

    International Nuclear Information System (INIS)

    Brenizer, Jack

    2011-01-01

    The Consortium of Big-10 University Research and Training Reactors was by design a strategic partnership of seven leading institutions. We received the support of both our industry and DOE laboratory partners. Investments in reactor, laboratory and program infrastructure allowed us to lead the national effort to expand and improve the education of engineers in nuclear science and engineering, to provide outreach and education to pre-college educators and students, and to become a key resource of ideas and trained personnel for our U.S. industrial and DOE laboratory collaborators.

  14. Gender Differences in Personality across the Ten Aspects of the Big Five.

    Science.gov (United States)

    Weisberg, Yanna J; Deyoung, Colin G; Hirsh, Jacob B

    2011-01-01

    This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness, and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscientiousness, the gender differences were found to diverge at the aspect level, rendering them either small or undetectable at the Big Five level. These findings clarify the nature of gender differences in personality and highlight the utility of measuring personality at the aspect level.
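
    Gender differences of this sort are usually reported as standardized effect sizes; the sketch below computes Cohen's d with a pooled standard deviation on hypothetical aspect-scale scores (not the study's data):

    ```python
    # Minimal effect-size sketch: Cohen's d with pooled standard deviation.
    # The scores below are made up for illustration only.
    import statistics

    def cohens_d(group_a, group_b):
        na, nb = len(group_a), len(group_b)
        va, vb = statistics.variance(group_a), statistics.variance(group_b)
        pooled = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
        return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled

    women = [3.8, 4.1, 3.9, 4.3, 4.0]   # hypothetical aspect-scale scores
    men   = [3.4, 3.7, 3.5, 3.9, 3.6]
    print(f"d = {cohens_d(women, men):.2f}")  # ~2.1 for these made-up numbers
    ```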

  15. Ten-year utilization of the Oregon State University TRIGA Reactor (OSTR)

    International Nuclear Information System (INIS)

    Ringle, John C.; Anderson, Terrance V.; Johnson, Arthur G.

    1978-01-01

    The Oregon State University TRIGA Reactor (OSTR) has been used heavily throughout the past ten years to accommodate exclusively university research, teaching, and training efforts. Averages for the past nine years show that OSTR use time has been as follows: 14% for academic and special training courses; 44% for OSU research projects; 6% for non-OSU research projects; 2% for demonstrations and tours; and 34% for reactor maintenance, calibrations, inspections, etc. The OSTR has operated an average of 25.4 hours per week during this nine-year period. Each year, about 20 academic courses and 30 different research projects use the OSTR. Visitors to the facility average about 1,500 per year. No commercial irradiations or services have been performed at the OSTR during this period. Special operator training courses are given at the OSTR at the rate of at least one per year. (author)

  16. Integrated plant-safety assessment, Systematic Evaluation Program: Big Rock Point Plant (Docket No. 50-155)

    International Nuclear Information System (INIS)

    1983-09-01

    The Systematic Evaluation Program was initiated in February 1977 by the US Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review.

  17. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  18. Usability Analysis of the Big Ten Academic Alliance Geoportal: Findings and Recommendations for Improvement of the User Experience

    Directory of Open Access Journals (Sweden)

    Mara Blake

    2017-10-01

    Full Text Available The Big Ten Academic Alliance (BTAA) Geospatial Data Project is a collaboration between twelve member institutions of the consortium and works towards providing discoverability and access to geospatial data, scanned maps, and web mapping services. Usability tests and heuristic evaluations were chosen as methods of evaluation, as they have a long standing in measuring and managing website engagement and are essential to the process of iterative design. The BTAA project hopes to give back to the community by publishing the results of its usability findings, in the hope that they will benefit other portals built with GeoBlacklight.

  19. Evaluation of the integrity of reactor vessels designed to ASME Code, Sections I and/or VIII

    International Nuclear Information System (INIS)

    Hoge, K.G.

    1976-01-01

    A documented review of nuclear reactor pressure vessels designed to ASME Code, Sections I and/or VIII is made. The review is primarily concerned with the design specifications and quality assurance programs utilized for the reactor vessel construction and the status of power plant material surveillance programs, pressure-temperature operating limits, and inservice inspection programs. The following ten reactor vessels for light-water power reactors are covered in the report: Indian Point Unit No. 1, Dresden Unit No. 1, Yankee Rowe, Humboldt Bay Unit No. 3, Big Rock Point, San Onofre Unit No. 1, Connecticut Yankee, Oyster Creek, Nine Mile Point Unit No. 1, and La Crosse.

  20. French experience in operating pressurized water reactor power stations. Ten years' operation of the Ardennes power station

    International Nuclear Information System (INIS)

    Teste du Bailler, A.; Vedrinne, J.F.

    1978-01-01

    In the paper the experience gained over ten years' operation of the Ardennes (Chooz) nuclear power station is summarized from the point of view of monitoring and control equipment. The reactor was the first pressurized water reactor to be installed in France; it is operated jointly by France and Belgium. The equipment, which in many cases consists of prototypes, was developed for industrial use and with the experience that has now been gained it is possible to evaluate its qualities and defects, the constraints which it imposes and the action that has to be taken in the future. (author)

  1. Neutron behavior, reactor control, and reactor heat transfer. Volume four

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    Volume four covers neutron behavior (neutron absorption, how big are nuclei, neutron slowing down, neutron losses, the self-sustaining reactor), reactor control (what is controlled in a reactor, controlling neutron population, is it easy to control a reactor, range of reactor control, what happens when the fuel burns up, controlling a PWR, controlling a BWR, inherent safety of reactors), and reactor heat transfer (heat generation in a nuclear reactor, how is heat removed from a reactor core, heat transfer rate, heat transfer properties of the reactor coolant)

  2. Ten aspects of the Big Five in the Personality Inventory for DSM-5.

    Science.gov (United States)

    DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R

    2016-04-01

    Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher-order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest-loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. (c) 2016 APA, all rights reserved.
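
    As an illustration of the kind of analysis described, the following sketch extracts a varimax-rotated 10-factor solution from simulated scale scores using scikit-learn; the study's own estimator, rotation method, and data differ:

    ```python
    # Minimal sketch of a 10-factor solution over 35 scale scores
    # (10 BFAS + 25 PID-5). Data are simulated noise, for illustration only.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 35))   # 500 respondents x 35 scales

    fa = FactorAnalysis(n_components=10, rotation="varimax")
    fa.fit(X)
    loadings = fa.components_.T      # scales x factors loading matrix

    # The hypothesis predicts exactly one BFAS scale loads highest per factor.
    for j in range(10):
        print(f"factor {j}: top scale index = {abs(loadings[:, j]).argmax()}")
    ```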

  3. Passport to the Big Bang moves across the road

    CERN Document Server

    Corinne Pralavorio

    2015-01-01

    The ATLAS platform of the Passport to the Big Bang circuit has been relocated in front of the CERN Reception.   The ATLAS platform of the Passport to the Big Bang, outside the CERN Reception building. The Passport to the Big Bang platform of the ATLAS Experiment has been moved in front of the CERN Reception to make it more visible and accessible. It had to be dismantled and moved from its previous location in the garden of the Globe of Science and Innovation due to the major refurbishment work in progress on the Globe, and is now fully operational in its new location on the other side of the road, in the Main Reception car-park. The Passport to the Big Bang circuit, inaugurated in 2013, comprises ten platforms installed in front of ten CERN sites and aims to help local residents and visitors to the region understand CERN's research. Dedicated Passport to the Big Bang flyers, containing all necessary information and riddles for you to solve, are available at the CERN Rec...

  4. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data is data that exceeds conventional storage capacity and processing power. The term is used for data sets so large or complex that traditional data-processing applications cannot handle them. Big data size is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; on social networking sites, for example, the amount of data produced by people is growing rapidly every year. Big data is not only data; it has become a complete subject encompassing various tools, techniques and frameworks. It covers the explosive growth and evolution of data, both structured and unstructured. Big data denotes a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of massive scale. Such data are difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to capture, organize and resolve the various types of data. In this paper we describe the applications, problems and tools of big data and give an overview of the field.

  5. Big Rock Point: 35 years of electrical generation

    International Nuclear Information System (INIS)

    Petrosky, T.D.

    1998-01-01

    On September 27, 1962, the 75 MWe boiling water reactor of the Big Rock Point Nuclear Power Station, designed and built by General Electric, went critical for the first time. The US Atomic Energy Commission (AEC) and the plant operator, Consumers Power, had also designed the plant as a research reactor. The first studies were devoted to fuel behavior, higher burnup, and materials research. The reactor was also used for medical technology: Co-60 radiation sources were produced for the treatment of more than 120,000 cancer patients. After the accident at the Three Mile Island-2 nuclear generating unit in 1979, Big Rock Point went through an extensive backfitting phase. Personnel from numerous other American nuclear power plants were trained at the simulator of Big Rock Point. The plant was decommissioned permanently on August 29, 1997, after more than 35 years of operation and a cumulative electric power production of 13,291 GWh. A period of five to seven years is estimated for decommissioning and demolition work up to the 'green field' stage. (orig.) [de

  6. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with special reference to any future inquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors, and an attempt is made to tackle the problem that participation in a public inquiry comes far too late to be effective. The author feels that the CSS/OCPU report is a constructive contribution to the debate about big-technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  7. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes hard to process using conventional data-handling applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, prevent conflicts, and the like, we require bigger data sets rather than smaller ones. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. A big data environment is used to capture, organize and resolve the various types of data. In this paper we present an overview of the Hadoop architecture, the different tools used for big data, and its security issues.
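
    The MapReduce pattern that Hadoop parallelizes across many servers can be illustrated in a few lines of plain Python (a schematic word count, not Hadoop itself):

    ```python
    # Illustrative map/reduce word count in plain Python -- the programming
    # pattern that Hadoop distributes across servers. Not Hadoop code.
    from collections import Counter
    from functools import reduce

    documents = ["big data is big", "data about data"]

    def map_phase(doc):                 # emit per-document word counts
        return Counter(doc.split())

    def reduce_phase(a, b):             # merge partial counts
        a.update(b)
        return a

    counts = reduce(reduce_phase, map(map_phase, documents), Counter())
    print(counts.most_common(3))        # [('data', 3), ('big', 2), ...]
    ```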

  8. Nuclear reactor PBMR and cogeneration; Reactor nuclear PBMR y cogeneracion

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez S, J. R.; Alonso V, G., E-mail: ramon.ramirez@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    In recent years the costs of nuclear reactor designs for electricity generation have increased, currently running around 5000 USD per installed kW, so a big nuclear plant requires investments on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big reactor by dividing the power into parts and dividing the components into modules to lower production costs; one module can be built and completed before construction of the next begins, deferring the long-term investment and thus reducing the investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, making it possible to take advantage of the reactor's thermal energy to feed other processes such as water desalination or steam generation for the process industries, like petrochemicals, or even the possible production of hydrogen for use as fuel. In this work the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor is described. (Author)

  9. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan provides instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of "Big Pete," which was used in the removal of "Spacers" from the N-Reactor. Prior to performing surveys on the rear end portion of "Big Pete," it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions.

  10. Stability analysis for the Big Dee upgrade of the Doublet III tokamak

    International Nuclear Information System (INIS)

    Helton, F.J.; Luxon, J.L.

    1987-01-01

    Ideal magnetohydrodynamic stability analysis has been carried out for configurations expected in the Big Dee tokamak, an upgrade of the Doublet III tokamak into a non-circular cross-section device which began operation early in 1986. The results of this analysis support theoretical predictions as follows: since the maximum value of beta stable to ballooning and Mercier modes, which we denote β_c, increases with inverse aspect ratio, elongation and triangularity, the Big Dee is particularly suited to obtaining high values of β_c, and there exist high-β_c Big Dee equilibria for large variations in all relevant plasma parameters. The beta limits for the Big Dee are consistent with established theory as summarized in present scaling laws. High-beta Big Dee equilibria are continuously accessible when approached through changes in all relevant input parameters and are structurally stable with respect to variations of input plasma parameters. Big Dee beta limits have a smooth dependence on plasma parameters such as β_p and elongation. These calculations indicate that in the actual running of the device the Big Dee high-beta equilibria should be smoothly accessible. Theory predicts that the limiting plasma parameters, such as beta, total plasma current and plasma pressure, which can be obtained within the operating limits of the Big Dee are reactor relevant. Thus the Big Dee should be able to use its favourable ideal MHD scaling and controlled plasma shaping to attain reactor-relevant parameters in a moderate-sized device. (author)
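
    The "present scaling laws" of that era are commonly summarized by the Troyon beta limit; as a reference point (an assumption, since the abstract does not quote a formula), it reads:

    ```latex
    % Troyon beta limit (assumed reference form; not quoted in the abstract):
    % the maximum stable beta scales with plasma current over minor radius x field.
    \beta_{\mathrm{max}}\,[\%] = \beta_N\,\frac{I_p\,[\mathrm{MA}]}{a\,[\mathrm{m}]\,B_T\,[\mathrm{T}]},
    \qquad \beta_N \simeq 2.8
    % Stronger shaping (elongation, triangularity) raises the current I_p that
    % can be carried at fixed a and B_T, hence the growth of beta_c with shaping.
    ```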

  11. Integrated plant safety assessment. Systematic evaluation program, Big Rock Point Plant (Docket No. 50-155). Final report

    International Nuclear Information System (INIS)

    1984-05-01

    The Systematic Evaluation Program was initiated in February 1977 by the U.S. Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. The review provides (1) an assessment of how these plants compare with current licensing safety requirements relating to selected issues, (2) a basis for deciding how these differences should be resolved in an integrated plant review, and (3) a documented evaluation of plant safety when the supplement to the Final Integrated Plant Safety Assessment Report has been issued. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review.

  12. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  13. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

    Exarhos, C.A.; Van Swam, L.F.; Wahlquist, F.P.

    1981-12-01

    This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurements of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics which might restrict the continued irradiation of the fuel.

  14. Researchers solve big mysteries of pebble bed reactor

    Energy Technology Data Exchange (ETDEWEB)

    Shams, Afaque; Roelofs, Ferry; Komen, E.M.J. [Nuclear Research and Consultancy Group (NRG), Petten (Netherlands); Baglietto, Emilio [Massachusetts Institute of Technology, Cambridge, MA (United States). Dept. of Nuclear Science and Engineering; Sgro, Titus [CD-adapco, London (United Kingdom). Technical Marketing

    2014-03-15

    The PBR is one type of high-temperature reactor; it allows high-temperature operation while preventing the fuel from melting (bringing huge safety margins to the reactor) and offers high electricity efficiency. The design is also highly scalable; a plant could be designed to be as large or small as needed, and can even be made mobile, allowing it to be used on board a ship. In a PBR, small particles of nuclear fuel, embedded in a moderating graphite pebble, are dropped into the reactor as needed. At the bottom, the pebbles can be removed simply by opening a small hatch and letting gravity pull them down. To cool the reactor and create electricity, helium gas is pumped through the reactor to pull heat out, which is then run through generators. One of the most difficult problems to deal with has been the possible appearance of local temperature hotspots within the pebble bed, heating to the point of melting the graphite moderators surrounding the fuel. Obviously, constructing a reactor and experimenting to investigate this possibility is out of the question. Instead, nuclear engineers have been attempting to simulate a PBR with various CFD codes. The thermodynamic analysis to simulate realistic conditions in a pebble bed is described and the results are shown. (orig.)

  15. Classical propagation of strings across a big crunch/big bang singularity

    International Nuclear Information System (INIS)

    Niz, Gustavo; Turok, Neil

    2007-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z₂, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001); P. J. Steinhardt and N. Turok, Science 296, 1436 (2002); N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004)]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example.
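
    For reference, the 1+1-dimensional compactified Milne factor has the standard textbook line element (conventions assumed, not quoted from the paper):

    ```latex
    % Standard line element of 1+1-dimensional compactified Milne space-time
    % (textbook form; conventions assumed):
    ds^2 = -dt^2 + t^2\,d\theta^2, \qquad \theta \sim \theta + \theta_0
    % A further Z_2 identification theta -> -theta produces the two orbifold
    % planes; the metric is flat but degenerates at t = 0, where the big
    % crunch meets the big bang.
    ```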

  16. Reactor technology. Progress report, January--March 1978

    International Nuclear Information System (INIS)

    Warren, J.L.

    1978-07-01

    Progress is reported in eight program areas. The nuclear Space Electric Power Supply Program examined safety questions in the aftermath of the COSMOS 954 incident, examined the use of thermoelectric converters, examined the neutronic effectiveness of various reflecting materials, examined ways of connecting heat pipes to one another, studied the consequences of the failure of one heat pipe in the reactor core, and did conceptual design work on heat radiators for various power supplies. The Heat Pipe Program reported progress in the design of ceramic heat pipes, new applications of heat pipes to solar collectors, and final performance tests of two pipes for HEDL applications. Under the Nuclear Process Heat Program, work continues on computer codes to model a pebble-bed high-temperature gas-cooled reactor, adaptation of a set of German reactor calculation codes for use on U.S. computers, and a parametric study of a certain resonance integral required in reactor studies. Under the Nonproliferation Alternative Sources Assessment Program, LASL has undertaken an evaluation of a study of gaseous core reactors by Southern Science Applications, Inc. Independently, LASL has developed a proposal for a comprehensive study of gaseous uranium-fueled reactor technology. The Plasma Core Reactor Program has concentrated on restacking the beryllium reflector and redesigning the nuclear control system. The status of and experiments on four critical assemblies, SKUA, Godiva IV, Big Ten, and Flattop, are reported. The Nuclear Criticality Safety Program carried out several tasks, including conducting a course, doing several annual safety reviews, and evaluating the safety of two Nevada test devices. During the quarter, one of the groups involved in reactor technology acquired responsibility for the operation of a Cockcroft-Walton accelerator. The present report contains information on the use of the machine and improvements being made in its operation.

  17. A 'big-mac' high converting water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Ronen, Y; Dali, Y [Ben-Gurion Univ. of the Negev, Beersheba (Israel). Dept. of Nuclear Engineering

    1996-12-01

    Currently an effort is being made to get rid of plutonium. Therefore, at this time, a scientific study of a high converting reactor seems to be out of place. However, it is our opinion that the future of nuclear energy lies, among other things, in the clever utilization of plutonium. It is also our opinion that one of the best ways to utilize plutonium is in high converting water reactors. (authors)

  18. Chernobyl ten years after

    International Nuclear Information System (INIS)

    1996-01-01

    The accident in the fourth reactor unit in Chernobyl in Ukraine, which occurred ten years ago, caused the death of 31 people, while the health consequences have turned out to be difficult to assess. This review describes the accident, its consequences and health effects, and the present state of studies, as well as a comparison with other accidents and disasters. (author)

  19. The origin of the future ten questions for the next ten years

    CERN Document Server

    Gribbin, John

    2006-01-01

    How did the universe begin? Where do galaxies come from? How do stars and planets form? Where do the material particles we are made of come from? How did life begin? Today we have only provisional answers to such questions. But scientific progress will improve these answers dramatically over the next ten years, predicts John Gribbin in this riveting book. He focuses on what we know—or think we know—about ten controversial, unanswered issues in the physical sciences and explains how current cutting-edge research may yield solutions in the very near future. With his trademark facility for engaging readers with or without a scientific background, the author explores ideas concerning the creation of the universe, the possibility of other forms of life, and the fate of the expanding cosmos. He examines “theories of everything,” including grand unified theories and string theory, and he discusses the Big Bang theory, the origin of structure and patterns of matter in the galaxies, and dark matter and dark energy.

  20. Nuclear reactor PBMR and cogeneration

    International Nuclear Information System (INIS)

    Ramirez S, J. R.; Alonso V, G.

    2013-10-01

    In recent years the costs of nuclear reactor designs for electricity generation have increased, currently running around 5000 USD per installed kW, so a big nuclear plant requires investments on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big reactor by dividing the power into parts and dividing the components into modules to lower production costs; one module can be built and completed before construction of the next begins, deferring the long-term investment and thus reducing the investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, making it possible to take advantage of the reactor's thermal energy to feed other processes such as water desalination or steam generation for the process industries, like petrochemicals, or even the possible production of hydrogen for use as fuel. In this work the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor is described. (Author)

  1. Measuring Public Acceptance of Nuclear Technology with Big data

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seugkook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying cheap electricity produced by those reactors, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows targeted marketing when policy is executed.
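
    A minimal sketch of the keyword-and-opinion analysis described might look as follows; the posts and sentiment lexicon here are hypothetical stand-ins for Korean-language SNS data and a real NLP pipeline:

    ```python
    # Schematic keyword + sentiment tally over (hypothetical) SNS posts.
    # Real studies use proper Korean-language NLP; this is illustration only.
    from collections import Counter
    import re

    posts = [
        "nuclear export success, proud moment",
        "fukushima accident fear, nuclear risk",
        "nuclear scandal, distrust suppliers",
    ]
    LEXICON = {"success": +1, "proud": +1, "fear": -1, "risk": -1,
               "scandal": -1, "distrust": -1}   # hypothetical sentiment lexicon

    tokens = [w for p in posts for w in re.findall(r"[a-z]+", p.lower())]
    keywords = Counter(tokens).most_common(5)
    sentiment = sum(LEXICON.get(w, 0) for w in tokens)

    print("top keywords:", keywords)      # 'nuclear' dominates
    print("net sentiment:", sentiment)    # negative after accident/scandal posts
    ```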

  2. Measuring Public Acceptance of Nuclear Technology with Big data

    International Nuclear Information System (INIS)

    Roh, Seugkook

    2015-01-01

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying cheap electricity produced by those reactors, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows targeted marketing when policy is executed.

  3. Research reactor put Canada in the nuclear big time

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    The history of the NRX reactor is briefly recounted. When NRX started up in 1947, it was the most powerful neutron source in the world. It is now the oldest research reactor still operating. NRX had to be rebuilt after an accident in 1952, and its calandria was changed again in 1970. Loops in NRX were used to test fuel for the Nautilus submarine, and the first zircaloy pressure tube in the world. At the present time, NRX is in a 'hot standby' condition as a backup to the NRU reactor, which is used mainly for isotope production. NRX will be decommissioned after completion and startup of the new MAPLE-X reactor

  4. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  5. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns for your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values, and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits, and identify new business opportunities.

  6. Molten-Salt Reactors: Report for 1960 Ten-Year-Plan Evaluation

    International Nuclear Information System (INIS)

    MacPherson, H. G.

    1960-01-01

    For the purposes of this evaluation, the molten-salt reactor is considered as an advanced concept. It is considered not to have the status of a current technology adequate to allow the construction of large-scale power plants, since no power reactor has been built or even designed in detail. As a result there can be no estimate of the present cost of power, and the projection of power costs to later years is necessarily based on general arguments rather than detailed considerations.

  7. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  8. 10 years Institute for Reactor Development

    International Nuclear Information System (INIS)

    1975-05-01

    Ten years ago the Institute for Reactor Development was founded. This report contains a review of the research work of the institute in these past ten years. The work was mainly performed within the framework of the Fast Breeder Project, the Nuclear Safety Project and Computer Aided Design. In particular, the following topics are discussed: design studies for different fast breeder reactors, development work for fast breeders, investigations of central safety problems of sodium-cooled breeder reactors (such as local and integral coolant disturbances and hypothetical accident analysis), special questions of light water reactor safety (such as dynamic stresses in pressure suppression systems and fuel rod behaviour under loss-of-coolant conditions), and finally computer application in various engineering fields. (orig.) [de

  9. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made, including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  10. VINKA, ten years on. Main scientific results

    International Nuclear Information System (INIS)

    1979-01-01

    The VINKA facility in the TRITON swimming-pool reactor at Fontenay-aux-Roses allows the irradiation of solids at low temperatures in order to study crystalline defects. After ten years of operation, the main scientific results obtained in the fields of creep and growth (chapter I), point defects (chapter II), amorphisation (chapter III) and dechanneling of particles (chapter IV) are summarised. [fr

  11. Don’t miss the Passport to the Big Bang event this Sunday!

    CERN Multimedia

    CERN Bulletin

    2013-01-01

    Word has been going around for weeks now about the inauguration of the Passport to the Big Bang on Sunday 2 June. Ideal for a family day out or a day with friends, this is a CERN event not to be missed!   The Passport to the Big Bang is a 54-km scientific tourist trail comprising ten exhibition platforms in front of ten CERN sites in the Pays de Gex and the Canton of Geneva. Linked by cycle routes, these ten platforms will mark the same number of stages in the rally for competitive cyclists and the bicycle tour for families taking place this Sunday from 9 a.m. to 12 p.m. But that’s not all: from 2 p.m., you will also have the chance to take part in a huge range of activities provided by clubs and associations from CERN and the local region. Watch an oriental dance show, have a go at building detectors out of Kapla blocks and Lego, meet different reptile species, learn about wind instruments, try your hand at Nordic walking or Zumba fitness, get a better understanding of road safety...

  12. The Concept of the Use of the Marine Reactor Plant in Small Electric Grids

    International Nuclear Information System (INIS)

    Khlopkin, N.; Makarov, V.; Pologikh, B.

    2002-01-01

    In this report, some aspects of using marine nuclear reactors to supply small non-interconnected power systems, as well as separate settlements and mining enterprises located in regions with an undeveloped infrastructure, are considered. Recently, small modular nuclear power plants have been proposed for these purposes. The required plant power for small electric grids lies between 1 and several tens of MWe. A module can be assembled and tested at a machine-building plant and then delivered ready-made to its working place by some means of transport, for instance a barge. After a certain time it is possible to transport a module to a repair shop, and likewise to a storage site after the end of operation. Marine nuclear reactors, by their power, compactness, mass and size, are ideal prototypes for the creation of such modules. For instance, the floating power unit currently under construction, intended for operation in the Russian North, is based on the reactor plants of nuclear icebreakers. The reliability and safety of ship reactors are confirmed by their trouble-free operation over approximately 180 reactor-years. Unlike a big stationary nuclear plant working in base-load mode, a power unit with a marine reactor is fully capable of working in load-following mode. In contrast to a nuclear icebreaker reactor, it is advisable to increase the core lifetime and to reduce the enrichment of the uranium. This requires fuel compositions of higher uranium capacity and a modified core design. In particular, a transition from the channel-type core traditional for ship reactors to a cassette design is possible. Other directions of evolution of ship reactors are possible which do not touch the basic design decisions verified by practice but promote the development of the plant's self-protection properties. Among such directions is a reduction of the volumetric power density of the core. (author)

  13. Ten years of TRIGA reactor research at the University of Texas

    International Nuclear Information System (INIS)

    O'Kelly, Sean

    2002-01-01

    The 1 MW TRIGA Research Reactor at the Nuclear Engineering Teaching Laboratory is the second TRIGA at the University of Texas at Austin (UT). A small (10 kW, 1963; 250 kW, 1968) TRIGA Mark I was housed in the basement of the Engineering Building until it was shut down and decommissioned in 1989. The new TRIGA Mark II, with a licensed power of 1.1 MW, reached initial criticality in 1992. Prior to 1990, reactor research at UT usually consisted of projects requiring neutron activation analysis (NAA), but the step up to a much larger reactor with neutron beam capability required additional personnel to build the neutron research program. The TCNS is currently used to perform Prompt Gamma Activation Analysis to determine hydrogen and boron concentrations of various composite materials. The early 1990s were a very active period for neutron beam projects at the NETL. In addition to the TCNS, a real-time neutron radiography facility (NIF) and a high-resolution neutron depth profiling facility (NDP) were installed in two separate beam ports. The NDP facility was most recently used to investigate alpha damage on stainless steel in support of the U.S. Nuclear Weapons Stewardship programs. In 1999, a sapphire beam filter was installed in the NDP system to reduce the fast neutron flux at the sample location. A collaborative effort was started in 1997 between UT-Austin and the University of Texas at Arlington to build a reactor-based, low-energy positron beam (TIPS). Limited success in obtaining funding has placed the project on hold. The Nuclear and Radiation Engineering Program has grown rapidly and effectively doubled in size over the past 5 years, but years of low nuclear research funding, an overall stagnation in the U.S. nuclear power industry and a pervasive public distrust of nuclear energy have caused a precipitous decline in many programs. Recently, the U.S. DOE has encouraged University Research Reactors (URR) in the U.S. to collaborate closely together by forming URR

  14. The installation welding of pressurized water reactor coolant piping

    International Nuclear Information System (INIS)

    Deng Feng

    2010-01-01

    Large pressurized water reactor nuclear power plants are being constructed in our country. There are three symmetrical standard loops in the reactor coolant system. Each loop contains a steam generator and a primary pump, and one of the loops is equipped with a pressurizer. These components are connected to the reactor pressure vessel by the installation welding of the coolant piping. The integrity of the reactor coolant pressure boundary is the second barrier preventing the release of radioactive substances to the outside, so the safe operation of a nuclear power plant is closely related to the quality of the coolant piping installation welding. Heavy-walled pipe of very-low-carbon austenitic stainless steel is selected for the coolant piping. This material has good welding behavior, but its poor thermal conductivity, large linear expansion coefficient and large welding deformation cause higher welding stress. To reduce the welding deformation, control the dimensional precision, reduce the residual stress and ensure the welding quality, the installation sequence should be properly designed and the welding technology properly controlled. (authors)

  15. New research reactor proposed for Australia

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    A new research reactor has been proposed for construction within the next ten years, to replace the HIFAR reactor, whose operating capabilities have been overtaken by later designs. This paper outlines the main research applications of the new reactor design and briefly examines issues related to its cost, economic benefits, safety and location

  16. Farewell to a Big and Rich Nuclear Power Club?

    International Nuclear Information System (INIS)

    Takeda, A.

    2001-01-01

    For the last few decades of the 20th century, we have seen a large number of big nuclear power plants built and operated in a few rich countries such as the United States, France, Germany, the United Kingdom, and Japan. They have standardized on the 1000 MWe-class light water reactor, which has an actual generating capacity of more than 1100 MW. (author)

  17. Big data analysis of public acceptance of nuclear power in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seung Kook [Policy Research Division, Korea Atomic Energy Research Institute (KAERI), Daejeon (Korea, Republic of)

    2017-06-15

    Public acceptance of nuclear power is important for the government, the major stakeholder of the industry, because consensus is required to drive actions. It is therefore no coincidence that the governments of nations operating nuclear reactors are endeavoring to enhance public acceptance of nuclear power, as better acceptance allows stable power generation and peaceful processing of nuclear wastes produced from nuclear reactors. Past research, however, has been limited to epistemological measurements using methods such as the Likert scale. In this research, we propose big data analysis as an attractive alternative and attempt to identify the attitudes of the public on nuclear power. Specifically, we used common big data analyses to analyze consumer opinions via SNS (Social Networking Services), using keyword analysis and opinion analysis. The keyword analysis identified the attitudes of the public toward nuclear power. The public felt positive toward nuclear power when Korea successfully exported nuclear reactors to the United Arab Emirates. With the Fukushima accident in 2011 and certain supplier scandals in 2012, however, the image of nuclear power was degraded and the negative image continues. It is recommended that the government focus on developing useful businesses and use cases of nuclear power in order to improve public acceptance.

  18. Big data analysis of public acceptance of nuclear power in Korea

    International Nuclear Information System (INIS)

    Roh, Seung Kook

    2017-01-01

    Public acceptance of nuclear power is important for the government, the major stakeholder of the industry, because consensus is required to drive actions. It is therefore no coincidence that the governments of nations operating nuclear reactors are endeavoring to enhance public acceptance of nuclear power, as better acceptance allows stable power generation and peaceful processing of nuclear wastes produced from nuclear reactors. Past research, however, has been limited to epistemological measurements using methods such as the Likert scale. In this research, we propose big data analysis as an attractive alternative and attempt to identify the attitudes of the public on nuclear power. Specifically, we used common big data analyses to analyze consumer opinions via SNS (Social Networking Services), using keyword analysis and opinion analysis. The keyword analysis identified the attitudes of the public toward nuclear power. The public felt positive toward nuclear power when Korea successfully exported nuclear reactors to the United Arab Emirates. With the Fukushima accident in 2011 and certain supplier scandals in 2012, however, the image of nuclear power was degraded and the negative image continues. It is recommended that the government focus on developing useful businesses and use cases of nuclear power in order to improve public acceptance

  19. Big Data Analysis of Public Acceptance of Nuclear Power in Korea

    Directory of Open Access Journals (Sweden)

    Seungkook Roh

    2017-06-01

    Full Text Available Public acceptance of nuclear power is important for the government, the major stakeholder of the industry, because consensus is required to drive actions. It is therefore no coincidence that the governments of nations operating nuclear reactors are endeavoring to enhance public acceptance of nuclear power, as better acceptance allows stable power generation and peaceful processing of nuclear wastes produced from nuclear reactors. Past research, however, has been limited to epistemological measurements using methods such as the Likert scale. In this research, we propose big data analysis as an attractive alternative and attempt to identify the attitudes of the public on nuclear power. Specifically, we used common big data analyses to analyze consumer opinions via SNS (Social Networking Services), using keyword analysis and opinion analysis. The keyword analysis identified the attitudes of the public toward nuclear power. The public felt positive toward nuclear power when Korea successfully exported nuclear reactors to the United Arab Emirates. With the Fukushima accident in 2011 and certain supplier scandals in 2012, however, the image of nuclear power was degraded and the negative image continues. It is recommended that the government focus on developing useful businesses and use cases of nuclear power in order to improve public acceptance.
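
    A minimal sketch of the keyword-and-opinion tallying described in these records might look as follows; the keyword lists and sample posts are hypothetical placeholders, not the study's actual lexicon or SNS corpus:

        # Sketch of a keyword/opinion tally over SNS posts. POSITIVE and
        # NEGATIVE are assumed cue lists, not the lexicon used in the paper.
        from collections import Counter

        POSITIVE = {"export", "safe", "stable"}
        NEGATIVE = {"fukushima", "accident", "scandal", "waste"}

        def opinion_score(posts):
            """Return (keyword_counts, net_sentiment) for a list of posts."""
            counts, sentiment = Counter(), 0
            for post in posts:
                for token in post.lower().split():
                    counts[token] += 1
                    sentiment += (token in POSITIVE) - (token in NEGATIVE)
            return counts, sentiment

        posts_2010 = ["Korea export of reactors is safe and stable"]
        posts_2012 = ["Fukushima accident and supplier scandal raise waste fears"]
        print(opinion_score(posts_2010)[1])   # positive net score (+3)
        print(opinion_score(posts_2012)[1])   # negative net score (-4)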

  20. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  1. Change of neutron flow sensors effectiveness in the course of reactor experiments

    International Nuclear Information System (INIS)

    Kurpesheva, A.M.; Kotov, V.M.; Zhotabaev, Zh.R.

    2007-01-01

    Full text: The IGR reactor is a reactor of the thermal-capacity type. During operation, the uranium-graphite core can be heated up to 1500 deg. C and the reactivity can change considerably. The core dimensions are comparatively small, and the number of control rods providing the required reactivity is likewise not big. An increase of core temperature leads to a rise of the neutron path length in the core's basic material, graphite, and the temperature change is not uniform. All this causes non-conservation of the ratio between the neutron flux in the irradiated sample and the flux at the location of the reactor power sensors. Deviations in this ratio were registered during a number of reactor experiments. Empirical corrections can be introduced to reduce the influence of the changing sensor effectiveness on the required load parameters of the investigated samples; however, the dependence of these corrections on many factors can increase the instability of process control. Previous experimental and calculational studies showed non-uniformity of the neutron field at the sensor locations (up to tens of percent) and the low effectiveness of experimental work carried out without access to individual elements of the reactor stack. A shortcoming during the experiments was the assumption that the out-of-reactor neutron flux distribution could be tied to the control rod positions. Subsequent analysis showed that a representative model of the phenomenon must take into account the dynamics of reactor operation, including the uneven heating of individual parts of the stack. Elementary calculations showed that the temperature effects of the stack on the change of the external neutron field are large. The calculation algorithm for the change of the external field and the field at the investigated specimen interleaves neutron-physics calculations of the reactor with thermal-physics calculations, which provide the correlated temperature fields needed for the neutron-physics calculations. In the course of such
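
    The interleaved neutron-physics/thermal-physics scheme outlined in this record can be illustrated with a minimal fixed-point loop; the physics stubs below are toy placeholders under assumed relaxation behaviour, not IGR models:

        # Toy interleaved calculation: a neutronics solve and a thermal solve
        # exchange fields until the temperature map stops changing.
        import numpy as np

        def neutronics(temperature):
            # stand-in: flux shape falls off as the graphite heats up
            return 1.0 / (1.0 + 0.001 * temperature)

        def thermal(flux, t_prev):
            # stand-in: temperature relaxes toward a flux-dependent level
            return 0.5 * t_prev + 0.5 * (300.0 + 1200.0 * flux)

        temp = np.full(8, 300.0)                 # coarse 8-node map, kelvin
        for step in range(200):
            flux = neutronics(temp)
            new_temp = thermal(flux, temp)
            if np.max(np.abs(new_temp - temp)) < 1e-6:
                break                            # converged temperature field
            temp = new_temp
        print(step, temp.round(1))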

  2. Big Data in Health: a Literature Review from the Year 2005.

    Science.gov (United States)

    de la Torre Díez, Isabel; Cosgaya, Héctor Merino; Garcia-Zapirain, Begoña; López-Coronado, Miguel

    2016-09-01

    The information stored in healthcare systems has increased over the last ten years, leading it to be considered Big Data. There is a wealth of health information ready to be analysed. However, the sheer volume raises a challenge for traditional methods. The aim of this article is to conduct a cutting-edge study on Big Data in healthcare from 2005 to the present. This literature review will help researchers to know how Big Data has developed in the health industry and open up new avenues for research. Information searches have been made on various scientific databases such as Pubmed, Science Direct, Scopus and Web of Science for Big Data in healthcare. The search criteria were "Big Data" and "health" with a date range from 2005 to the present. A total of 9724 articles were found on the databases. 9515 articles were discarded as duplicates or for not having a title of interest to the study. 209 articles were read, with the resulting decision that 46 were useful for this study. 52.6 % of the articles used were found in Science Direct, 23.7 % in Pubmed, 22.1 % through Scopus and the remaining 2.6 % through the Web of Science. Big Data has undergone extremely high growth since 2011 and its use is becoming compulsory in developed nations and in an increasing number of developing nations. Big Data is a step forward and a cost reducer for public and private healthcare.

  13. Ten years' reactor operation at the Technical University Zittau - operation report

    International Nuclear Information System (INIS)

    Konschak, K.

    1990-01-01

    The Zittau Training and Research Reactor ZLFR has been in use for 10 years for teaching the engineers who will operate the nuclear power plants in the GDR. Since commissioning it has been started up more than 1600 times, approximately two thirds of the start-ups being for teaching purposes. A number of teaching experiments have been installed that demonstrate fundamental technological processes in nuclear reactors in an easily understandable manner. The high level of nuclear safety manifests itself, among other things, in extremely low radiation exposures of the operating personnel and the persons being trained. (author)

  4. Water desalination using different capacity reactors options

    International Nuclear Information System (INIS)

    Alonso, G.; Vargas, S.; Del Valle, E.; Ramirez, R.

    2010-01-01

    The Northwest region of Mexico has a deficit of potable water, and alongside this necessity comes the region's growth, which requires additional energy capacity; cogeneration of potable water and nuclear electricity is an option to be assessed. In this paper we perform an economic comparison of cogeneration using a big reactor, the AP1000, and a medium-size reactor, the IRIS; both are PWR-type reactors and are coupled to the desalination plant by the same method. For this cogeneration case we assess which reactor option best covers both needs at maximum potable water production for two different desalination methods: Multistage Flash Distillation and Multi-Effect Distillation. (authors)

  5. RETRAN operational transient analysis of the Big Rock Point plant boiling water reactor

    International Nuclear Information System (INIS)

    Sawtelle, G.R.; Atchison, J.D.; Farman, R.F.; VandeWalle, D.J.; Bazydlo, H.G.

    1983-01-01

    Energy Incorporated used the RETRAN computer code to model and calculate nine Consumers Power Company Big Rock Point Nuclear Power Plant transients. RETRAN, a best-estimate, one-dimensional, homogeneous-flow thermal-equilibrium code, is applicable to FSAR Chapter 15 transients for Conditions I through IV. The BWR analyses were performed in accordance with USNRC Standard Review Plan criteria and in response to the USNRC Systematic Evaluation Program. The RETRAN Big Rock Point model was verified by comparison to plant startup test data. This paper discusses the unique modeling techniques used in RETRAN to model this steam-drum-type BWR. Transient analysis results are also presented

  6. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care in particular. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  7. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  8. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  9. Studying the processes of sodium-water interaction in the BOR-60 reactor micromodule steam generator

    International Nuclear Information System (INIS)

    Tsykanov, V.A.; Antipin, G.K.; Borisov, V.V.

    1981-01-01

    The main results of experimental studies of emergency regimes of a micromodule steam generator (MSG) with small and big leaks of water into sodium, carried out on the 30 MW MSG operating in the BOR-60 reactor, are considered. The aims of the study are as follows: modelling of a macroleak in the 'Nadja' steam generator for the BN-350 reactor; testing the concepts of alarm signalling and MSG protection; testing new prospective leak-detection systems under real conditions; gaining experience in, and developing ways of, eliminating the consequences of an accident caused by a big water leak into sodium; accumulating knowledge on restoring the operability of the MSG after an accident; experimental verification of calculational techniques for big-leak accidents, for later use in calculational studies of similar situations at other reactors equipped with sodium-water steam generators; and refinement of the characteristics of the hydrodynamic and thermal effects in the interaction zone for a big leak in a real circuit during plant operation. A series of experiments was conducted in which a water leak into sodium was simulated by argon and steam supplied through injection devices located before the steam superheater module of one of the sections and between the evaporator modules of the same section. The range of steam flow rate was 0.02-0.45 g/s, and the duration of steam supply 100-400 s. It is concluded that the results obtained can be used for the steam generator of the BN-350 reactor [ru]

  10. Optimal reactor strategy for commercializing fast breeder reactors

    International Nuclear Information System (INIS)

    Yamaji, Kenji; Nagano, Koji

    1988-01-01

    In this paper, a fuel cycle optimization model developed for analyzing the conditions under which fast breeder reactors are selected in the optimal reactor strategy is described. By dividing the planning period, 1966-2055, into nine ten-year periods, the model was formulated as a compact linear programming model. With the model, the best mix of reactor types as well as the optimal timing of reprocessing spent fuel from LWRs were found so as to minimize the total cost. The results of the analysis are summarized as follows. Fast breeder reactors could be introduced in the optimal strategy when they can economically compete with LWRs with 30-year storage of spent fuel. In order for fast breeder reactors to monopolize the new-reactor market after the achievement of their technical availability, their capital cost should be less than 0.9 times that of LWRs. When a certain amount of reprocessing commitment is assumed, the condition for employing fast breeder reactors in the optimal strategy is relaxed. In the optimal strategy, reprocessing is done just to meet plutonium demand, and storage of spent fuel is selected to adjust the mismatch between plutonium production and utilization. A price hike of uranium ore facilitates the commercial adoption of fast breeder reactors. (Kako, I.)
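
    To make the formulation concrete, the following toy linear program (illustrative coefficients only, not the paper's data) shows how a best reactor mix over ten-year periods can be posed and solved:

        # Toy reactor-strategy LP: choose LWR and FBR capacity for three
        # ten-year periods to meet demand at minimum cost, with FBR deployment
        # limited by plutonium from earlier LWR operation. All numbers are
        # illustrative assumptions.
        from scipy.optimize import linprog

        T = 3
        demand = [10.0, 14.0, 18.0]        # GWe to be covered in each period
        c = [1.0] * T + [0.9] * T          # assumed relative costs: LWR, FBR

        A_ub, b_ub = [], []
        for t in range(T):                 # coverage: -(lwr_t + fbr_t) <= -demand_t
            row = [0.0] * (2 * T)
            row[t] = row[T + t] = -1.0
            A_ub.append(row)
            b_ub.append(-demand[t])
        for t in range(1, T):              # plutonium: fbr_t <= 0.3 * lwr_{t-1}
            row = [0.0] * (2 * T)
            row[T + t] = 1.0
            row[t - 1] = -0.3
            A_ub.append(row)
            b_ub.append(0.0)

        bounds = [(0, None)] * (2 * T)
        bounds[T] = (0, 0)                 # no FBR available in the first period
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        print(res.x.round(2))              # optimal LWR/FBR capacity per period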

  11. Accident analysis for PRC-II reactor

    International Nuclear Information System (INIS)

    Wei Yongren; Tang Gang; Wu Qing; Lu Yili; Liu Zhifeng

    1997-12-01

    The computer codes, calculation models, transient results, sensitivity research, design improvements, and safety evaluation used in the accident analysis for the PRC-II Reactor (the Second Pulsed Reactor in China) are introduced. The PRC-II Reactor is built in a big, populous city, so the public pays close attention to reactor safety. Consequently, some hypothetical accidents are analyzed, including an uncontrolled control rod withdrawal at rated power, a pulse rod ejection at rated power, and a loss-of-coolant accident. Calculation models which completely depict the principle and progression of each accident are established, and the relevant analysis codes are developed. This work also includes comprehensive computation and analysis of the transients for each accident of the PRC-II Reactor, the influence of all kinds of sensitive parameters on reactor safety, and evaluation of the function of the engineered safety features. Measures to alleviate the consequences of accidents are suggested and taken in the construction design of the PRC-II Reactor, and the reactor's safety properties are comprehensively evaluated. A new advanced calculation model (True Core Uncovered Model) for the LOCA of the PRC-II Reactor and the relevant code (MCRLOCA) are first put forward

  12. A basic plan of micro reactor for the promotion of nuclear literacy problems in realization of the reactor

    Energy Technology Data Exchange (ETDEWEB)

    Murata, Takashi [Graduate School of Energy Science, Kyoto University, Kyoto (Japan); Yoshiki, Nobuya [Central Research Inst. of Electric Power Industry, Tokyo (Japan); Nakagawa, Haruo [Hitachi Ltd., Tokyo (Japan)

    2001-04-01

    It is difficult for new sciences and big technologies such as life sciences, nuclear energy, biotechnology, etc. to achieve appropriate public understanding. One of the issues is the lack of scientific literacy among the public. We propose a new type of nuclear reactor to give the public an opportunity to become acquainted with the information and technology of nuclear science and engineering. This reactor for interpretation is a symbolic facility adjacent to a science and technology museum. The key features of the interpretative reactor must be safety and openness. The safety is ensured by an excess reactivity of less than 0.5% Δk/k and negligible burn-up, based on a thermal output of less than 1 W. (author)

  13. A basic plan of micro reactor for the promotion of nuclear literacy problems in realization of the reactor

    International Nuclear Information System (INIS)

    Murata, Takashi; Yoshiki, Nobuya; Nakagawa, Haruo

    2001-01-01

    It is difficult for new sciences and big technologies such as life sciences, nuclear energy, biotechnology, etc. to achieve appropriate public understanding. One of the issues is the lack of scientific literacy among the public. We propose a new type of nuclear reactor to give the public an opportunity to become acquainted with the information and technology of nuclear science and engineering. This reactor for interpretation is a symbolic facility adjacent to a science and technology museum. The key features of the interpretative reactor must be safety and openness. The safety is ensured by an excess reactivity of less than 0.5% Δk/k and negligible burn-up, based on a thermal output of less than 1 W. (author)

  14. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  15. Big rock point restoration project BWR major component removal, packaging and shipping - planning and experience

    International Nuclear Information System (INIS)

    Milner, T.; Dam, S.; Papp, M.; Slade, J.; Slimp, B.; Nurden, P.

    2001-01-01

    The Big Rock Point boiling water reactor (BWR) at Charlevoix, MI was permanently shut down on August 29th 1997. In 1999 BNFL Inc.'s Reactor Decommissioning Group (RDG) was awarded a contract by Consumers Energy (CECo) for the Big Rock Point (BRP) Major Component Removal (MCR) project. BNFL Inc. RDG has teamed with MOTA, Sargent and Lundy, and MDM Services to plan and execute MCR in support of the facility restoration project, which will be completed by 2005. Key to the success of the project has been the integration of best available demonstrated technology into a robust and responsive project management approach, which places emphasis on safety and quality assurance in achieving project milestones linked to time and cost. To support the BRP MCR activities, a reactor vessel (RV) shipping container is required. Discussed in this paper are the design and fabrication of the 10 CFR Part 71 Type B container necessary to ship the BRP RV. The container to be used for transportation of the RV to the burial site was designed as an Exclusive Use Type B package for shipment and burial at the Barnwell, South Carolina (SC) disposal facility. (author)

  16. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  17. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  18. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  19. Reactor feedwater device

    International Nuclear Information System (INIS)

    Igarashi, Noboru.

    1986-01-01

    Purpose: To suppress soluble radioactive corrosion products in a feedwater device. Method: In a light water cooled nuclear reactor, an iron injection system is connected to the feedwater pipeways and the iron concentration in the feedwater or reactor coolant is adjusted to between two and ten times the nickel concentration. When the nickel/iron ratio in the reactor coolant or feedwater approaches 1/2, iron ions are injected together with iron particles into the reactor coolant to suppress the leaching of stainless steels, decrease the nickel in the water and increase the iron concentration. As a result, it is possible to suppress the intrusion of nickel, one of the parent nuclides of the radioactive nuclides. Further, since the iron particles introduced into the reactor constitute nuclei for capturing the radioactive nuclides and thus reduce the soluble radioactive corrosion products, the radioactive nuclides deposited uniformly on the inside of the pipeways in each of the coolant circuits can be reduced. (Kawakami, Y.)

  20. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data are leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  1. Triennial technical report - 1986, 1987, 1988 - Instituto de Engenharia Nuclear (IEN) -Dept. of Reactors (DERE)

    International Nuclear Information System (INIS)

    1989-01-01

    The research activities developed during 1986, 1987 and 1988 by the Reactor Department (DERE) of the Brazilian Nuclear Energy Commission (CNEN) are summarized. The principal aim of the Department of Reactors is the study and development of fast reactors and thermal research reactors. The DERE also assists the CNEN in areas related to the analysis of power reactor structures, teaches Reactor Physics and Engineering at the University, and provides professional training at the Nuclear Engineering Institute. To develop its research activities the DERE has three big facilities: the Argonauta reactor, the CTS-1 sodium circuit, and a water circuit. (M.I.)

  2. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  3. Big science transformed science, politics and organization in Europe and the United States

    CERN Document Server

    Hallonsten, Olof

    2016-01-01

    This book analyses the emergence of a transformed Big Science in Europe and the United States, using both historical and sociological perspectives. It shows how technology-intensive natural sciences grew to a prominent position in Western societies during the post-World War II era, and how their development cohered with both technological and social developments. At the helm of post-war science are large-scale projects, primarily in physics, which receive substantial funds from the public purse. Big Science Transformed shows how these projects, popularly called 'Big Science', have become symbols of progress. It analyses changes to the political and sociological frameworks surrounding publicly-funded science, and their impact on a number of new accelerator- and reactor-based facilities that have come to prominence in materials science and the life sciences. Interdisciplinary in scope, this book will be of great interest to historians, sociologists and philosophers of science.

  4. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  5. Recommendations for a restart of Molten Salt Reactor development

    International Nuclear Information System (INIS)

    Moir, R. W.

    2007-01-01

    The concept of the molten salt reactor (MSR) refuses to go away. The Generation-IV process lists the MSR as one of the six concepts to be considered for extending fuel resources. Good fuel utilization and good economics are required to meet the often-cited goal of 10 TWe globally, and 1 TWe for the US, supplied by non-carbon energy sources in this century by nuclear fission. A strong incentive for the molten salt reactor design is its good fuel utilization, good economics, amazing flexibility and promised large benefits. It can: - use thorium or uranium; - be designed with lots of graphite to have a fairly thermal neutron spectrum, or without graphite moderator to be a fast-neutron-spectrum reactor; - fission uranium isotopes and plutonium isotopes; - operate with non-weapon-grade fissile fuel, or in suitable sites operate with enrichment between reactor-grade and weapon-grade fissile fuel; - be a breeder or near-breeder; - operate at temperatures >1100 degree C if carbon composites are successfully employed. Enhancing the 232U content of the uranium to over 500 ppm makes the fuel undesirable for weapons, but it should not detract from its economic use in liquid fuel reactors: a big advantage in nonproliferation. The economics of the MSR is enhanced by operating at low pressure and high temperature, and may even lead to the preferred route to hydrogen production. The cost of the electricity produced from low-enriched fuel, averaged over the life of the entire process, has been predicted to be about 10% lower than that from LWRs, and 20% lower for highly enriched fuel, with uncertainties of about 10%. The development cost has been estimated at about 1 B$ (e.g., a 100 M$/y base program for ten years), not including construction of a series of reactors leading up to the deployment of multiple commercial units at an assumed cost of 9 B$ (450 M$/y over 20 years). A benefit of liquid fuel is that smaller power reactors can faithfully test features of larger reactors, thereby reducing the

  6. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  7. Prospect of realizing nuclear fusion reactors

    International Nuclear Information System (INIS)

    1989-01-01

    This report describes the results of the research work on nuclear fusion which CRIEPI has carried out for about ten years from the standpoint of electric power utilities, the potential users of its energy. The principal points are: (a) economic analysis (calculation of costs) of commercial fusion reactors, including fusion-fission hybrid reactors, based on Japanese analysis procedures and databases; and (b) conceptual design of two types of hybrid reactors, that is, the fission-fuel-producing DMHR (Demonstration Molten-Salt Hybrid Reactor) and the electric-power-producing THPR (Tokamak Hybrid Power Reactor). The report consists of the following chapters: 1. Introduction. 2. Conceptual Design of Hybrid Reactors. 3. Economic Analysis of Commercial Fusion Reactors. 4. Basic Studies Applicable Also to Nuclear Fusion Technology. 5. List of Published Reports and Papers. 6. Conclusion. Appendices. (author)

  8. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Full Text Available Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation current before Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  9. Comparison of wastewater plant of Nova Pampulha, with an UASB reactor, with another ten brazilian stations; Comparacion de la EDAR de Nova Pampulha, dotada de reactor UASB, con otras diez plantas brasilenas

    Energy Technology Data Exchange (ETDEWEB)

    Barrosa Correa, S. M. B.; Ruiz, E.; Romero, F.

    2004-07-01

    This work is based on data from the wastewater plant of Nova Pampulha, equipped with a UASB reactor. The objective of this research was the comparison of this plant with another ten Brazilian stations provided with different depuration techniques. First, a graphical comparison of average operational data suggests analogies between the influents (fewer suspended solids in Nova Pampulha), the effluents (more suspended solids and bacteria in the same station) and the eliminations (smaller for suspended solids and bacteria in Nova Pampulha, where there is also an increase in alkalinity). Cluster analysis, performed on the percentages of elimination of constituents in the eleven stations and shown as dendrograms, was chosen as the second comparative method. A third comparison was effected by multiple linear regression, to obtain mathematical models of the elimination of constituents that are statistically significant at the 95% confidence level, using the flows and the concentrations of the influents as possible independent variables. The calculated equations explain between 46% and 91% of the variance of the data. As a general conclusion, it can be said that a well-operated UASB reactor may be a satisfactory technique for wastewater treatment, well adapted to Brazilian climatological conditions. (Author) 14 refs.
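
    The two statistical comparisons described here can be sketched as follows; the arrays are random placeholders standing in for the plants' operational data, not the measured values:

        # Hierarchical clustering of per-plant removal percentages (dendrogram)
        # and a multiple linear regression of removal on influent variables.
        import numpy as np
        from scipy.cluster.hierarchy import dendrogram, linkage
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        removals = rng.uniform(40, 95, size=(11, 4))   # 11 plants x 4 constituents
        Z = linkage(removals, method="average")        # cluster the 11 stations
        dendrogram(Z, labels=[f"plant{i}" for i in range(11)], no_plot=True)

        X = rng.uniform(0.0, 1.0, size=(50, 2))        # [flow, influent conc.]
        y = 50 + 30 * X[:, 0] - 10 * X[:, 1] + rng.normal(0, 3, 50)
        model = LinearRegression().fit(X, y)
        print(f"R^2 = {model.score(X, y):.2f}")        # cf. the 46-91% range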

  10. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  11. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  12. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  13. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  14. Big five general contractors dominate civil construction market of nuclear power plants

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The Japanese construction industry is a key industry accounting for about 20% of the GNP, and investment in construction amounted to 51,200 billion yen in fiscal 1984. 515,000 firms employing about 5.5 million workers belong to the industry. 99.4% of these firms are enterprises capitalized at less than 100 million yen, and most of them are small self-employment enterprises. The Construction Business Law provides that those who wish to engage in construction are required to obtain a permit from the Construction Ministry or from a local prefectural governor. There are five major ('big five') and seven sub-major construction companies in Japan. The big five have formed tie-up relations with three nuclear reactor manufacturers. 76 civil engineering and construction companies recorded sales related to nuclear power in 1983 amounting to 330.9 billion yen, equivalent to 21% of the total nuclear-related sales. The construction of nuclear power plants, the characteristics of this construction, and the activities of the big five in the field of nuclear industry are reported. (Kako, I.)

  15. Players Off the Field. How Jim Delany and Roy Kramer Took over Big-Time College Sports.

    Science.gov (United States)

    Suggs, Welch

    2000-01-01

    Traces the history of the college football bowl system and describes the movement toward replacing the bowl game system with a national championship playoff system. Focuses on the roles of J. Delany, commissioner of the Big Ten Conference, and R. Kramer, commissioner of the Southeastern Conference, in perpetuating the current college football bowl…

  16. How to use Big Data technologies to optimize operations in Upstream Petroleum Industry

    Directory of Open Access Journals (Sweden)

    Abdelkader Baaziz

    2013-12-01

    Full Text Available “Big Data is the oil of the new economy” is the most famous citation of the last three years; it was even adopted by the World Economic Forum in 2011. In fact, Big Data is like crude! It’s valuable, but unrefined it cannot be used: it must be broken down and analyzed for it to have value. But what about the Big Data generated by the petroleum industry, and particularly its upstream segment? Upstream is no stranger to Big Data. Understanding and leveraging data in the upstream segment enables firms to remain competitive throughout planning, exploration, delineation, and field development. Oil & Gas companies conduct advanced geophysical modeling and simulation to support operations, where 2D, 3D and 4D seismic surveys generate significant data during exploration phases. They closely monitor the performance of their operational assets, using tens of thousands of data-collecting sensors in subsurface wells and surface facilities to provide continuous and real-time monitoring of assets and environmental conditions. Unfortunately, this information comes in various and increasingly complex forms, making it a challenge to collect, interpret, and leverage the disparate data. As an example, Chevron’s internal IT traffic alone exceeds 1.5 terabytes a day. Big Data technologies integrate common and disparate data sets to deliver the right information at the appropriate time to the correct decision-maker. These capabilities help firms act on large volumes of data, transforming decision-making from reactive to proactive and optimizing all phases of exploration, development and production. Furthermore, Big Data offers multiple opportunities to ensure safer, more responsible operations; another invaluable effect would be shared learning. The aim of this paper is to explain how to use Big Data technologies to optimize operations. How can Big Data help experts make decisions that lead to the desired outcomes? Keywords: Big Data; Analytics

  17. Nuclear reactor fuel cycle technology with pyroelectrochemical processes

    International Nuclear Information System (INIS)

    Skiba, O.V.; Maershin, A.A.; Bychkov, A.V.; Zhdanov, A.N.; Kislyj, V.A.; Vavilov, S.K.; Babikov, L.G.

    1999-01-01

    A group of dry technologies and processes for vibro-packing granulated fuel, in combination with the unique properties of vibro-packed fuel elements (FEs), makes it possible to implement a new comprehensive approach to the fuel cycle with plutonium fuel. The testing of a big number of FEs with vibro-packed U-Pu oxide fuel in the BOR-60 reactor, the successful testing of experimental FSAs in the BN-600 reactor, and the reliable operation of the experimental and research complex facilities allow the conclusion that it is really possible to develop a safe, economically beneficial U-Pu fuel cycle based on the technologies enumerated above and to use both reactor-grade and weapon-grade plutonium in nuclear reactors with a reliable control and accounting system [ru]

  18. The Influence Of Switching-Off The Big Lamps On The Humidity Operation Hall

    International Nuclear Information System (INIS)

    Wiranto, Slamet; Sriawan

    2001-01-01

    When there is no activity in the Operation Hall, the big lamps there are switched off. Because the water trap of the ventilation system is not functioning well, the humidity of the Operation Hall then increases, and at some points in time it rises above the permitted limit value. To avoid this problem it is necessary to establish the characteristic behaviour by measuring the humidity of the Operation Hall under various conditions and situations. From this characteristic it can be determined that under normal conditions the Operation Hall big lamps may remain switched off, but starting 2 days before reactor start-up all the operation building lamps should be switched on for about 5 days, by which time the operation building humidity returns to its normal value

  19. Materials for high temperature reactor vessels

    International Nuclear Information System (INIS)

    Buenaventura Pouyfaucon, A.

    2004-01-01

    Within the 5th Euratom Framework Programme, a big effort is being made to promote and consolidate the development of the High Temperature Reactor (HTR). Empresarios Agrupados is participating in this project and, among others, also forms part of the HTR-M project (Materials for HTRs). This paper summarises the work carried out by Empresarios Agrupados regarding the material selection for the HTR Reactor Pressure Vessel (RPV). The possible candidate materials, and the most promising ones, are discussed. Design aspects such as the RPV sensitive zones and material damage mechanisms are considered. Finally, the applicability of the existing design Codes and Standards to the design of the HTR RPV is also discussed. (Author)

  20. FMDP Reactor Alternative Summary Report: Volume 2 - CANDU heavy water reactor alternative

    International Nuclear Information System (INIS)

    Greene, S.R.; Spellman, D.J.; Bevard, B.B.

    1996-09-01

    The Department of Energy Office of Fissile Materials Disposition (DOE/MD) initiated a detailed analysis activity to evaluate each of ten plutonium disposition alternatives that survived an initial screening process. This document, Volume 2 of a four volume report, summarizes the results of these analyses for the CANDU reactor based plutonium disposition alternative

  1. FMDP Reactor Alternative Summary Report: Volume 2 - CANDU heavy water reactor alternative

    Energy Technology Data Exchange (ETDEWEB)

    Greene, S.R.; Spellman, D.J.; Bevard, B.B. [and others]

    1996-09-01

    The Department of Energy Office of Fissile Materials Disposition (DOE/MD) initiated a detailed analysis activity to evaluate each of ten plutonium disposition alternatives that survived an initial screening process. This document, Volume 2 of a four volume report, summarizes the results of these analyses for the CANDU reactor based plutonium disposition alternative.

  2. The Big Five personality dimensions and mental health: The mediating role of alexithymia.

    Science.gov (United States)

    Atari, Mohammad; Yaghoubirad, Mahsa

    2016-12-01

    The role of personality constructs in mental health has attracted research attention in the last few decades. The Big Five personality traits have been introduced as parsimonious dimensions of non-pathological traits. The five-factor model of personality includes neuroticism, agreeableness, conscientiousness, extraversion, and openness to experience. The present study aimed to examine the relationship between the Big Five dimensions and mental health, considering the mediating role of alexithymia as an important emotional-processing construct. A total of 257 participants were recruited from non-clinical settings in the general population. All participants completed the Ten-Item Personality Inventory (TIPI), the 20-item Toronto Alexithymia Scale (TAS-20), and the General Health Questionnaire-28 (GHQ-28). Structural equation modeling was utilized to examine the hypothesized mediated model. Findings indicated that the Big Five personality dimensions could significantly predict alexithymia scores. Moreover, alexithymia could predict mental health scores as measured by indices of depression, anxiety, social functioning, and somatic symptoms. The fit indices (GFI=0.94; CFI=0.91; TLI=0.90; RMSEA=0.071; CMIN/df=2.29) indicated that the model fits the data. Therefore, the relationship between the Big Five personality dimensions and mental health is mediated by alexithymia. Copyright © 2016 Elsevier B.V. All rights reserved.
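
    A regression-based mediation check in the spirit of the reported model (personality → alexithymia → mental health) can be sketched as below; this is an illustrative Baron-Kenny-style approximation on simulated scores, not the authors' structural equation model or data:

        # Simulated trait -> mediator -> outcome chain, then OLS estimates of
        # the direct and indirect effects. Coefficients are made up.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 257
        neuroticism = rng.normal(0, 1, n)                      # TIPI-style score
        alexithymia = 0.5 * neuroticism + rng.normal(0, 1, n)  # TAS-20-style mediator
        ghq = 0.6 * alexithymia + 0.1 * neuroticism + rng.normal(0, 1, n)

        X = sm.add_constant(np.column_stack([neuroticism, alexithymia]))
        fit = sm.OLS(ghq, X).fit()
        a = sm.OLS(alexithymia, sm.add_constant(neuroticism)).fit().params[1]
        b = fit.params[2]                  # mediator effect on outcome
        direct = fit.params[1]             # trait effect controlling the mediator
        print(f"indirect a*b = {a * b:.2f}, direct = {direct:.2f}")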

  3. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  4. Review of fast reactor activities in India (1983-84)

    International Nuclear Information System (INIS)

    Paranjpe, S.R.

    1984-01-01

    The last year was very significant for the Indian nuclear energy programme, as the first indigenously built heavy water moderated natural uranium reactor, called Madras Atomic Power Plant Unit-I, was made operational and connected to the grid. The power level has been gradually increased and the reactor has been operating at a power level of 200 MWe (temporarily limited by plutonium build-up during the approach to equilibrium core loading). The 'plutonium peak' will be crossed shortly, clearing the way for raising the reactor to its full power of 235 MWe gross. The second unit of MAPP is well advanced and, barring unforeseen difficulties, is expected to become operational during this financial year. This has been a big morale booster for the programme in general and the fast reactor programme in particular, as plutonium produced in these reactors is expected to be the inventory for Prototype Fast Breeder Reactors. It may be recalled that in the last report to this group, a reference was made to the initiation of some preliminary design studies for such a reactor

  5. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" of each patient's daily life. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.

  6. Lead-based Fast Reactor Development Plan and R&D Status in China

    International Nuclear Information System (INIS)

    Wu Yican

    2013-01-01

    • Lead-based fast reactors have good potential for waste transmutation, fuel breeding and energy production, and have been selected by CAS as the emphasis of advanced reactor development, with the support of the ADS and MFE programs. Sharing of R&D technologies is possible among GIF/ADS/Fusion. • The concepts and test strategy of the series of China lead-based fast reactors (CLEAR) have been developed. The preliminary engineering design and safety analysis of CLEAR-I are underway. • For CLEAR technology R&D, a series of lead-alloy loops and an accelerator-based neutron generator have been constructed or are under construction. • The design and construction of the CLEAR series reactors face big challenges; wide international cooperation on reactor design and technology R&D is welcome

  7. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  8. Forget the hype or reality. Big data presents new opportunities in Earth Science.

    Science.gov (United States)

    Lee, T. J.

    2015-12-01

    Earth science is arguably one of the most mature science disciplines; it constantly acquires, curates, and utilizes a large volume of data of diverse variety. We dealt with big data before there was big data. For example, while developing the EOS program in the 1980s, the EOS Data and Information System (EOSDIS) was developed to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has remained a shining example of modern science data systems over the past two decades. With the explosion of the internet, the use of social media, and the provision of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system; the open-source communities quickly followed and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day, and there is a significant shortage of algorithms and knowledgeable manpower to mine the data. In response, the federal government developed big data programs that fund research and development projects and training programs to tackle these new challenges. Meanwhile, compared to the internet data explosion, the Earth science big data problem has become quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline and solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiry.
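
    A minimal illustration of the MapReduce pattern mentioned above, in plain Python rather than Hadoop (the word-count task and all function names are purely illustrative):

    ```python
    # Word count in the MapReduce style: map emits (key, 1) pairs,
    # shuffle groups the values by key, reduce aggregates each group.
    from collections import defaultdict

    def map_phase(records):
        for record in records:
            for word in record.split():
                yield word, 1

    def shuffle(pairs):
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        return {key: sum(values) for key, values in groups.items()}

    docs = ["big data small data", "big data big compute"]
    print(reduce_phase(shuffle(map_phase(docs))))
    # {'big': 3, 'data': 3, 'small': 1, 'compute': 1}
    ```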

  9. Burnable absorber-integrated Guide Thimble (BigT) - 1. Design concepts and neutronic characterization on the fuel assembly benchmarks

    International Nuclear Information System (INIS)

    Yahya, Mohd-Syukri; Yu, Hwanyeal; Kim, Yonghee

    2016-01-01

    This paper presents the conceptual designs of a new burnable absorber (BA) for the pressurized water reactor (PWR), named the 'Burnable absorber-integrated Guide Thimble' (BigT). The BigT integrates BA materials into a standard guide thimble of a PWR fuel assembly. Neutronic sensitivities and practical design considerations of the BigT concept are highlighted in the first half of the paper. Specifically, the BigT concepts are characterized with respect to BA material and spatial self-shielding variations. In addition, the BigT replaceability requirement, bottom-end design specifications and thermal-hydraulic considerations are also deliberated. Much of the second half of the paper is devoted to demonstrating the practical viability of the BigT absorbers via comparative evaluations against conventional BA technologies in representative 17x17 and 16x16 fuel assembly lattices. For the 17x17 lattice evaluations, all three BigT variants are benchmarked against Westinghouse's existing BA technologies, while in the 16x16 assembly analyses the BigT designs are compared against the traditional integral gadolinia-urania rod design. All analyses clearly show that the BigT absorbers perform as well as the commercial BA technologies in terms of reactivity and power-peaking management. In addition, it has been shown that sufficiently high control rod worth can be obtained with the BigT absorbers in place. All neutronic simulations were completed using the Monte Carlo Serpent code with the ENDF/B-VII.0 library. (author)

  10. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so that everything concerning the central topic of the research can be known. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; covering Big Data technologies; presenting some of the NoSQL databases, which are those that allow the processing of data in unstructured formats; and showing the data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the Big Data environment.

  11. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors, with little attention to scaling to big datasets. This presents a serious impediment, since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system that tackles efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general-purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, without requiring awareness of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computation and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
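
    The kind of rule BigDansing is built to scale can be sketched in a few lines. The example below (hypothetical schema and rule, not BigDansing's actual interface) checks a functional-dependency-style constraint by enumerating pairs of tuples — the naive O(n^2) computation that the system distributes and optimizes:

    ```python
    # Rule: two records with the same zipcode must not disagree on city.
    from itertools import combinations

    def violations(rows, key="zipcode", attr="city"):
        """Return all pairs of rows that violate the zipcode -> city rule."""
        return [
            (r1, r2)
            for r1, r2 in combinations(rows, 2)
            if r1[key] == r2[key] and r1[attr] != r2[attr]
        ]

    rows = [
        {"zipcode": "10001", "city": "New York"},
        {"zipcode": "10001", "city": "NYC"},      # conflicts with the row above
        {"zipcode": "60601", "city": "Chicago"},
    ]
    print(violations(rows))  # one violating pair
    ```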

  12. Political-social reactor problems at Berkeley

    International Nuclear Information System (INIS)

    Little, G.A.

    1980-01-01

    For better than ten years there was little public notice of the TRIGA reactor at UC-Berkeley. Then: a) a non-student persuaded the Student Senate to pass a resolution requesting the Campus Administration to stop operation of the reactor and remove it from campus; b) the presence of the reactor became a campaign issue in a city mayoral election; c) two local residents reported adverse physical reactions before, during, and after a routine tour of the reactor facility; d) the Berkeley City Council began a study of problems associated with radioactive material within the city; e) Friends Of The Earth formally petitioned the NRC to terminate the reactor's license. Campus personnel have expended many man-hours and many pounds of paper in responding to these happenings. Some of the details are of interest, and may be of use to other reactor facilities. (author)

  13. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, related to storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which covers the aspects of big data management from both a technological and a business standpoint.

  14. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  15. Decommissioning of the Neuherberg Research Reactor (FRN)

    International Nuclear Information System (INIS)

    Demmeler, M.; Rau, G.; Strube, D.

    1982-01-01

    The Neuherberg Research Reactor is of type TRIGA MARK III, with 1 MW steady-state power and pulsable up to 2000 MW. In more than ten years of operation, 12,000 MWh and 6,000 reactor pulses were accumulated. In spite of its good technical condition and of continuously safe operation without any failures, decommissioning of the Neuherberg Research Reactor was decided by the GSF board of directors to save maintenance and personnel costs. Safe enclosure was chosen as the mode of decommissioning, which means that the fuel elements will be transferred back to the USA and all other radioactive reactor components will be enclosed in the reactor block. Procedures for licensing the decommissioning, dismantling procedures and timetables are presented

  16. Electrical cabling system associated at a nuclear reactor

    International Nuclear Information System (INIS)

    Dejeux, P.; Desfontaines, G.

    1988-01-01

    This cabling system for an electrical device in a nuclear reactor comprises at least a first cable leading from the device and a second cable comprising a first portion, a second portion and a third portion, the third portion joined to the second by a multiple quick-fit connector capable of connecting at least ten second portions to ten other third portions of the second cable [fr

  17. Description of the Triton reactor

    International Nuclear Information System (INIS)

    1967-09-01

    The Triton reactor is an enriched-uranium pool-type reactor. It began operation in 1959, reaching first criticality on 30 June of that year. Devoted to radiation protection studies, its core can be displaced in the longitudinal direction. The pool can be divided into two unequal compartments by a wall: the Triton core is placed in the small compartment, the Nereide core in the big one. A third, dry compartment, called Naiade II, is separated by a concrete wall containing a window closed by an aluminium plate (2.50 m x 2.70 m). The Naiade II cavity is used for shielding experiments with the Nereide core. After a complete refit, the power of the Triton reactor, raised progressively from 1.2 MW to 2 MW and then 3 MW, reached 6.5 MW in August 1965. The reactor has since specialized in fixed-position irradiations: the Triton core became fixed, while the Nereide core remained mobile. It has since been used for structural-materials irradiation, radioelement production and fundamental research. The following description is valid for the period after August 1965 [fr

  18. Development of small and medium reactors for power and heat production

    International Nuclear Information System (INIS)

    Becka, J.

    1978-01-01

    Data are given on the current state of development of small and medium-power reactors designed mainly for electric power production in small power grids; for heat production for small and medium-power desalination plants, possibly combined with electric power generation; for process steam production and heat supply for district heating systems, again combined with electric power generation; and for propelling big and fast passenger ships. A diagram is shown of the primary system of an integrated PWR derived from the Otto Hahn reactor. The family of standard sizes of the INTERATOM company's integral pressurized water reactors is listed, as are the specifications and design of the CAS 2CG and AS 3G type reactors used mainly for long-distance heating systems. (J.B.)

  19. Relações hierárquicas entre os traços amplos do Big Five Hierarchical relationship between the broad traits of the Big Five

    Directory of Open Access Journals (Sweden)

    Cristiano Mauro Assis Gomes

    2012-01-01

    Full Text Available The Big Five model holds that human personality is composed of dozens of specific factors. Despite this diversity, the specific factors are integrated into five broad traits that sit at the same hierarchical level. The present study offers an alternative hypothesis, postulating hierarchical levels among the broad traits of the model. Participants were 684 junior and high school students of a private school in Belo Horizonte, Minas Gerais, Brazil, aged 10 to 18 years (M = 13.71, SD = 2.11). The Big Five factors were measured by an Inventory of Personality Characteristics, previously named the Personality Adjective Inventory, by Pinheiro, Gomes and Braga (2009). The instrument measures eight of the ten polarities present in the five broad traits of the Big Five. Two models were compared via path analysis: a four-level hierarchical model and a non-hierarchical model. The hierarchical model showed an adequate degree of fit to the data and proved superior to the non-hierarchical model, which does not fit the data. Implications for the Big Five model are discussed.

  20. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  1. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a registration culture, and IT-competent employees and customers, which make a leading position possible — but only if companies get ready for the next big data wave.

  2. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data, the idea that an ever-larger volume of information is constantly being recorded, suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusions obtained by statistical methods increases when they are used on big data, either because of a systematic error (bias) or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but how to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
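
    The first pitfall is easy to demonstrate numerically. The toy simulation below (an assumed example, not from the paper) shows a huge but biased sample giving a precise estimate of the wrong quantity, while a small random sample is noisier but unbiased:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    population = rng.normal(loc=0.0, scale=1.0, size=1_000_000)  # true mean = 0

    big_biased = population[population > -0.5][:100_000]  # coverage bias
    small_random = rng.choice(population, size=1_000)     # unbiased draw

    print("true mean:        ", round(population.mean(), 3))
    print("big biased mean:  ", round(big_biased.mean(), 3))   # systematic error
    print("small random mean:", round(small_random.mean(), 3)) # noisy, near 0
    ```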

  3. Super critical water reactors

    International Nuclear Information System (INIS)

    Dumaz, P.; Antoni, O; Arnoux, P.; Bergeron, A; Renault, C.; Rimpault, G.

    2005-01-01

    Water is used as coolant and moderator in most of the major nuclear power plants currently in operation. In pressurized water reactors (PWR) and boiling water reactors (BWR), water is kept below its critical point (221 bar, 374 °C), which limits the efficiency of the thermodynamic energy-conversion cycle (about 33%). Beyond the critical point one can use "supercritical water"; the attainable pressures and temperatures allow significant efficiency gains. In addition, supercritical water has important properties: in particular, coexistence of vapor and liquid is no longer possible, so boiling — one of the phenomena limiting the specific power of PWRs and BWRs — ceases to be a problem. Since the 1950s, supercritical water reactors have been the subject of more or less detailed studies, then neglected. From the early 1990s this type of concept has attracted renewed interest, and in the international Generation IV framework supercritical water reactors have been retained as one of the main options for study as Generation IV reactors. CEA has been active in this field through participation in a European program, the HPLWR (High Performance Light Water Reactor). In this context, the R&D studies focus on neutronics, thermodynamics and materials. CEA intends to pursue a limited R&D effort in this field, within the framework of international cooperation, preferring the study of fast-spectrum versions. (author)

  4. 1980 nuclear plant survey: no reactors sold; more cancellations

    International Nuclear Information System (INIS)

    Friedlander, G.D.

    1980-01-01

    No sales were reported in 1979 by any of the big four reactor suppliers. Three cancellations were reported, and construction was suspended on Jersey Central Power and Light Co.'s Forked River unit. Since last year's survey, the commercial operation dates of about 80 units have been postponed, by one year to indefinitely, and nuclear commitments are down from last year's 195 units to 193 units. Presently, there are 72 plants on line, with a capacity of more than 53,000 MW. A resumption of new reactor orders is expected in either late 1980 or early 1981

  5. Fast reactor development strategy targets study in China

    International Nuclear Information System (INIS)

    Xu Mi

    2008-01-01

    China is a big developing country that needs huge energy resources and a rapidly growing supply. Considering limited energy resources and environmental issues, nuclear energy is certain to become one of the main energy sources. The Government has decided to develop nuclear power capacity to 40 GW by 2020, and it is envisaged that it will reach 240 GW by 2050. The limits of uranium resources and the uncertainties of the international uranium market stimulate us to consider conscientiously the development of fast breeder reactors and the related closed nuclear fuel cycle. The following are the proposed strategic targets of fast reactor development in China: (1) to realize the operation of commercial fast breeder reactors with a unit size of 800-900 MWe, deployed as multiple reactors per site, by 2030; (2) to develop nuclear power capacity to 240 GW by 2050; (3) to replace, step by step, large-scale fossil fuel utilization with nuclear energy beyond 2050. (authors)

  6. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  7. Safe havens in Europe: Switzerland and the ten dwarfs

    Directory of Open Access Journals (Sweden)

    Martin Paldam

    2013-12-01

    Full Text Available Eleven safe havens exist in Europe, providing offshore banking and low taxes. Ten of these states are very small, while Switzerland is moderately small. All 11 countries are richer than their large neighbors. It is shown that causality runs from smallness to safe-haven status to wealth, and that, theoretically, equilibria are likely to exist in which a certain regulation is substantially lower in a small country than in its big neighbor. This generates a large capital inflow to the safe havens. The pool of funds that may reach the safe havens is shown to be huge. It is far in excess of the absorptive capacity of the safe havens, but it still explains why they are rich. Microstates offer a veil of anonymity to funds passing through, and Switzerland offers safe storage of funds.

  8. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  9. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  10. Summary of space nuclear reactor power systems, 1983--1992

    Energy Technology Data Exchange (ETDEWEB)

    Buden, D.

    1993-08-11

    This report summarizes major developments over the last ten years, which have greatly expanded the space nuclear reactor power systems technology base. In the SP-100 program, after a competition between liquid-metal, gas-cooled, thermionic, and heat-pipe reactors integrated with various combinations of thermoelectric, thermionic, Brayton, Rankine, and Stirling energy conversion systems, three concepts were selected for further evaluation. In 1985, the high-temperature (1,350 K), lithium-cooled reactor with thermoelectric conversion was selected for full-scale development. Since then, significant progress has been achieved, including the demonstration of a 7-y-life uranium nitride fuel pin. The lithium-cooled reactor with thermoelectrics has progressed from a concept, through a generic flight system design, to the design, development, and testing of specific components. Meanwhile, in 1987--88 the USSR orbited a new generation of nuclear power systems beyond the thermoelectric plants on the RORSAT satellites. The US has continued to advance its own thermionic fuel element development, concentrating on a multicell fuel element configuration. Experimental work has demonstrated a single-cell operating time of about 1.5 y. Technology advances have also been made in the Stirling engine; an advanced engine that operates at 1,050 K is ready for testing. Additional concepts have been studied and experiments have been performed on a variety of systems to meet changing needs, such as powers of tens to hundreds of megawatts and highly survivable systems of tens-of-kilowatts power.

  11. Summary of space nuclear reactor power systems, 1983--1992

    International Nuclear Information System (INIS)

    Buden, D.

    1993-01-01

    This report summarizes major developments over the last ten years, which have greatly expanded the space nuclear reactor power systems technology base. In the SP-100 program, after a competition between liquid-metal, gas-cooled, thermionic, and heat-pipe reactors integrated with various combinations of thermoelectric, thermionic, Brayton, Rankine, and Stirling energy conversion systems, three concepts were selected for further evaluation. In 1985, the high-temperature (1,350 K), lithium-cooled reactor with thermoelectric conversion was selected for full-scale development. Since then, significant progress has been achieved, including the demonstration of a 7-y-life uranium nitride fuel pin. The lithium-cooled reactor with thermoelectrics has progressed from a concept, through a generic flight system design, to the design, development, and testing of specific components. Meanwhile, in 1987--88 the USSR orbited a new generation of nuclear power systems beyond the thermoelectric plants on the RORSAT satellites. The US has continued to advance its own thermionic fuel element development, concentrating on a multicell fuel element configuration. Experimental work has demonstrated a single-cell operating time of about 1.5 y. Technology advances have also been made in the Stirling engine; an advanced engine that operates at 1,050 K is ready for testing. Additional concepts have been studied and experiments have been performed on a variety of systems to meet changing needs, such as powers of tens to hundreds of megawatts and highly survivable systems of tens-of-kilowatts power

  12. Lecture 10: The European Bioinformatics Institute - "Big data" for biomedical sciences

    CERN Multimedia

    CERN. Geneva; Dana, Jose

    2013-01-01

    Part 1: Big data for biomedical sciences (Tom Hancocks). Ten years ago saw the completion of the first international 'Big Biology' project, which sequenced the human genome. In the years since, the biological sciences have seen a vast growth in data. In the coming years, advances will come from the integration of experimental approaches and their translation into applied technologies in the hospital, the clinic and even the home. This talk will examine the development of infrastructure, physical and virtual, that will allow millions of life scientists across Europe better access to biological data. Tom studied Human Genetics at the University of Leeds and McMaster University, before completing an MSc in Analytical Genomics at the University of Birmingham. He has worked for the UK National Health Service in diagnostic genetics and in training healthcare scientists and clinicians in bioinformatics. Tom joined the EBI in 2012 and is responsible for the scientific development and delivery of training for the BioMedBridges pr...

  13. SETI as a part of Big History

    Science.gov (United States)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to humans; see, in particular, [17]) and human history (Aztecs to USA; see [16]), and we then extrapolated even that into the future up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we show that the Statistical Drake Equation (namely the statistical extension of the classical Drake Equation typical of SETI) can be regarded as the "frozen in time" part of GBM. This makes SETI a subset of our Big History theory based on GBMs: just as the GBM is the "movie" unfolding in time, so the Statistical Drake Equation is its "still picture", static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may easily be described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50 million living species or more exist, each
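
    For reference, the Geometric Brownian Motion invoked above is the standard lognormal diffusion below (textbook form; reading N_t as the number of species or civilizations follows the abstract, the rest of the notation is conventional):

    ```latex
    % GBM: drift mu, volatility sigma, B_t a standard Brownian motion
    dN_t = \mu N_t\,dt + \sigma N_t\,dB_t
    \quad\Longrightarrow\quad
    N_t = N_0 \exp\!\Big[\big(\mu - \tfrac{\sigma^2}{2}\big)t + \sigma B_t\Big],
    \qquad
    \mathbb{E}[N_t] = N_0\, e^{\mu t}.
    ```

    At each fixed t, N_t is a lognormal random variable; this matches the lognormal number of civilizations given by the Statistical Drake Equation and is the sense in which the abstract calls the latter a "still picture" of the GBM.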

  14. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  15. Spatial distribution of radionuclides in Lake Michigan biota near the Big Rock Point Nuclear Plant

    International Nuclear Information System (INIS)

    Wahlgren, M.A.; Yaguchi, E.M.; Nelson, D.M.; Marshall, J.S.

    1974-01-01

    A survey was made of four groups of biota in the vicinity of the Big Rock Point Nuclear Plant near Charlevoix, Michigan, to determine their usefulness in locating possible sources of plutonium and other radionuclides to Lake Michigan. This 70 MW boiling-water reactor, located on the Lake Michigan shoreline, was chosen because its fuel contains recycled plutonium, and because it routinely discharges very low-level radioactive wastes into the lake. Samples of crayfish (Orconectes sp.), green algae (Chara sp. and Cladophora sp.), and an aquatic macrophyte (Potamogeton sp.) were collected in August 1973, at varying distances from the discharge, and analyzed for 239,240Pu, 90Sr, and five gamma-emitting radionuclides. Comparison samples of reactor waste solution have also been analyzed for these radionuclides. Comparisons of the spatial distributions of the extremely low radionuclide concentrations in biota clearly indicated that 137Cs, 134Cs, 65Zn, and 60Co were released from the reactor; their concentrations decreased exponentially with increasing distance from the discharge. Conversely, concentrations of 239,240Pu, 95Zr, and 90Sr showed no correlation with distance, suggesting any input from Big Rock was insignificant with respect to the atmospheric origin of these isotopes. The significance of these results is discussed, particularly with respect to current public debate over the possibility of local environmental hazards associated with the use of plutonium as a nuclear fuel. (U.S.)

  16. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The book explains that big data is where we use huge quantities of data to make better predictions, based on identifying patterns in the data rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  17. Abstraction networks for terminologies: Supporting management of "big knowledge".

    Science.gov (United States)

    Halper, Michael; Gu, Huanying; Perl, Yehoshua; Ochs, Christopher

    2015-05-01

    Terminologies and terminological systems have assumed important roles in many medical information processing environments, giving rise to the "big knowledge" challenge when terminological content comprises tens of thousands to millions of concepts arranged in a tangled web of relationships. Use and maintenance of knowledge structures on that scale can be daunting. The notion of abstraction network is presented as a means of facilitating the usability, comprehensibility, visualization, and quality assurance of terminologies. An abstraction network overlays a terminology's underlying network structure at a higher level of abstraction. In particular, it provides a more compact view of the terminology's content, avoiding the display of minutiae. General abstraction network characteristics are discussed. Moreover, the notion of meta-abstraction network, existing at an even higher level of abstraction than a typical abstraction network, is described for cases where even the abstraction network itself represents a case of "big knowledge." Various features in the design of abstraction networks are demonstrated in a methodological survey of some existing abstraction networks previously developed and deployed for a variety of terminologies. The applicability of the general abstraction-network framework is shown through use-cases of various terminologies, including the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT), the Medical Entities Dictionary (MED), and the Unified Medical Language System (UMLS). Important characteristics of the surveyed abstraction networks are provided, e.g., the magnitude of the respective size reduction referred to as the abstraction ratio. Specific benefits of these alternative terminology-network views, particularly their use in terminology quality assurance, are discussed. Examples of meta-abstraction networks are presented. The "big knowledge" challenge constitutes the use and maintenance of terminological structures that
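
    The abstraction ratio mentioned above can be written as a simple quotient (this formulation is an assumption for illustration; the survey defines it only as the magnitude of the size reduction):

    ```latex
    \text{abstraction ratio}
    \;=\;
    \frac{\text{number of concepts in the terminology}}
         {\text{number of nodes in the abstraction network}}
    ```

    A ratio well above 1 means the abstraction network offers a much more compact view than the underlying terminology.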

  18. Neutron-physical characteristics of the TVRM-100 reactor with ten ring fuel channels

    International Nuclear Information System (INIS)

    Mikhajlov, V.M.; Myrtsymova, L.A.

    1988-01-01

    Three-dimensional heterogeneous calculations are presented for the TVRM-100 reactor, a research reactor using enriched fuel with heavy-water moderator, coolant and reflector. Achievable burnup depths are given as a function of the number of removable FAs. The maximum unperturbed thermal neutron flux in the reflector is (2-1.8)x10^15 cm^-2 s^-1; the mean flux on the fuel is 2.9x10^14 cm^-2 s^-1. The radial non-uniformity of energy release is 0.67; the maximum peaking per FA is ~3.7. The temperature effect of reactivity is negative, equal to -0.9x10^-4 deg^-1 without accounting for experimental channels. Control rod efficiency in the radial reflector is high, but locating the rods close to experimental devices in the high-neutron-flux area is undesirable. 4 refs.; 5 figs.

  19. Expanding the storage capability at ETRR research reactor at Inshas

    International Nuclear Information System (INIS)

    Mariy, A.; Sultan, M.; Khattab, M.

    2000-01-01

    Storing spent fuel from test reactors in developing countries has become a big dilemma, for the following reasons: transportation of spent fuel is very expensive; there are no reprocessing plants in most developing countries; expanding existing storage facilities in the reactor building requires experience that most developing countries lack; and political motivations of nuclear developed countries intervene, making transportation procedures and logistics to those countries difficult. This paper gives the conceptual design of a new spent fuel storage facility now under construction at the Inshas research reactor (ETRR-1). The location of the new storage facility was chosen to be within the premises of the reactor facility, so that the reactor and the new storage form one material balance area. The paper also proposes some ideas that can enhance the transportation and storage of spent fuel from test reactors, such as intensifying the role of the IAEA in helping countries dispose of spent fuel, and initiating regional spent fuel storage facilities in some developing countries

  20. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various issues connected with big data. In this first part I will try to set out some of the theory of Big Data and

  1. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  2. The Passport to the Big Bang: a trail of discovery of CERN and its sites

    CERN Multimedia

    CERN Bulletin

    2013-01-01

    Sunday 2 June 2013 will see the launch of CERN’s Passport to the Big Bang, a scientific tourist trail linking ten of the Laboratory’s sites in the Pays de Gex and the Canton of Geneva. CERN is organising a public event to celebrate the launch and needs lots of volunteers – you could be one of them!   The exhibition platform in Sergy, in front of the ALICE experiment. Does your grocer insist that the Pays de Gex is going to be swallowed up by a black hole made by the LHC? Do your neighbours ask you questions about the CERN site visible from your houses, leaving you stumped when you don’t have the answers?  Well then, take them on an accelerator tour – but above ground and with no need for access cards! How? By taking advantage of the Passport to the Big Bang, a cross-border scientific tourist trail that will be inaugurated on 2 June. The goal of the Passport to the Big Bang is to provide the local population wi...

  3. The Chernobyl reactor accident source term: Development of a consensus view

    International Nuclear Information System (INIS)

    Guntay, S.; Powers, D.A.; Devell, L.

    1997-01-01

    In August 1986, scientists from the former Soviet Union provided the nuclear safety community with an impressively detailed account of what was then known about the Chernobyl accident. This included assessments of the magnitudes, rates, and compositions of radionuclide releases during the ten days following initiation of the accident. A summary report based on the Soviet report, the oral presentations, and the discussions with scientists from various countries was issued by the International Atomic Energy Agency shortly thereafter. Ten years have elapsed since the reactor accident at Chernobyl. A great deal more data is now available concerning the events, phenomena, and processes that took place. The purpose of this document is to examine what is known about the radioactive materials released during the accident. The accident was peculiar in the sense that radioactive materials were released, at least initially, in an exceptionally energetic plume and were transported far from the reactor site. Release of radioactivity from the plant continued for about ten days. A number of more recent publications and results from scientists in Russia and elsewhere have significantly improved our understanding of the Chernobyl source term. Because of the special features of the reactor design and the peculiarities of the Chernobyl accident, the source term for the Chernobyl accident is of limited applicability to the safety analysis of other types of reactors

  4. The Thermos process heat reactor

    International Nuclear Information System (INIS)

    Lerouge, Bernard

    1979-01-01

    The THERMOS process heat reactor was born from the following idea: since the hot-water energy vector is widely used for heating in cities, why not save traditional fossil fuels by simply substituting a nuclear boiler of comparable power for the classical boiler, installed in the same place? The French Atomic Energy Commission thus has a heating technique for the big French cities which provides better guarantees of national independence and of environmental protection. The THERMOS technique would result in a saving of 40,000 to 80,000 tons of oil per year [fr

  5. Unconventional liquid metal cooled fast reactors

    International Nuclear Information System (INIS)

    Spinrad, B.I.; Rohach, A.F.; Razzaque, M.M.

    1989-06-01

    This report describes the rationale for, design of and analytical studies on an unconventional sodium-cooled power reactor, called the Trench Reactor. It derives its name from the long, narrow sodium pool in which the reactor is placed. Unconventional features include: pool shape; reactor shape (also long and narrow); reflector control; low power density; hot-leg primary pumping; absence of a cold sodium pool; large core boxes rather than a large number of subassemblies; large diameter metal fuel; vessel suspension from cables; and vessel cooling by natural circulation of building atmosphere (nitrogen) at all times. These features all seem feasible. They result in a system that is capable of at least a ten year reload interval and shows good safety through direct physical response to loss-of-heat-sink, loss-of-flow and limited-reactivity nuclear transients. 43 figs., 43 tabs

  6. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  7. Data base of reactor physics experimental results in Kyoto University critical assembly experimental facilities

    International Nuclear Information System (INIS)

    Ichihara, Chihiro; Fujine, Shigenori; Hayashi, Masatoshi

    1986-01-01

    The Kyoto University critical assembly experimental facilities belong to the Kyoto University Research Reactor Institute and comprise a versatile critical assembly constructed for experimental studies of reactor physics and reactor engineering. The facilities are available for common use by universities throughout Japan. In the more than ten years since initial criticality in 1974, various experiments on reactor physics and reactor engineering have been carried out using facilities such as two solid-moderated cores, a light-water-moderated core and a neutron generator. The experiments performed were diverse, and finding required data among them is very troublesome; it has therefore become necessary to build a data base, processable by computer, from the data accumulated over the past ten-plus years. The outline of the data base, the data base CAEX using personal computers, the data base supported by a large computer and so on are reported. (Kako, I.)

  8. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  9. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move, and how is it analyzed? All these questions will be answered during the talk.

  10. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence of an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second later), and the fate of the Universe are all discussed. (U.K.)

  11. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  12. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  13. Modelling the low-tar BIG gasification concept[Biomass Integrated gasification

    Energy Technology Data Exchange (ETDEWEB)

    Andersen, Lars; Elmegaard, B.; Qvale, B.; Henriksen, Ulrrik [Technical univ. of Denmark (Denmark); Bentzen, J.D.; Hummelshoej, R. [COWI A/S (Denmark)

    2007-07-01

    A low-tar, highly efficient biomass gasification concept for medium- to large-scale power plants has been designed. The concept is named 'Low-Tar BIG' (BIG = Biomass Integrated Gasification) and is based on separate pyrolysis and gasification units. The volatile gases from the pyrolysis (containing tar) are partially oxidised in a separate chamber, which dramatically reduces the tar content. Thus, the investment and running costs of a gas cleaning system can be reduced, and the reliability can be increased. Both the pyrolysis and gasification chambers are bubbling fluid beds, fluidised with steam. For moist fuels, the gasifier can be integrated with a steam drying process, where the produced steam is used in the pyrolysis/gasification chamber. In this paper, mathematical models and results from initial tests of a laboratory Low-Tar BIG gasifier are presented. Two types of models are presented: 1. The gasifier-dryer applied in different power plant systems: gas engine, simple-cycle gas turbine, recuperated gas turbine, and Integrated Gasification Combined Cycle (IGCC). The paper determines the differences in efficiency of these systems and shows that the gasifier is applicable to very different fuels with different moisture contents, depending on the system. 2. A thermodynamic Low-Tar BIG model, based on mass and heat balances between four reactors: pyrolysis, partial oxidation, gasification, and gas-solid mixer. The paper describes the results of this study and compares them to actual laboratory tests. The study shows that the Low-Tar BIG process can use very wet fuels (up to 65-70% moisture) and still produce heat and power with a remarkably high electric efficiency. The process thereby offers the unique combination of large-scale gasification, low-cost gas cleaning and use of low-cost fuels, which is very likely the combination needed for a breakthrough of gasification technology. (au)
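
    The "mass and heat balances between four reactors" can be pictured as the standard steady-state balances below, written generically for any one chamber (a sketch of the textbook form, not the paper's actual equation set):

    ```latex
    % Steady state: mass in = mass out; heat plus enthalpy in = enthalpy out
    \sum_{i\,\in\,\mathrm{in}} \dot m_i \;=\; \sum_{j\,\in\,\mathrm{out}} \dot m_j,
    \qquad
    \dot Q \;+\; \sum_{i\,\in\,\mathrm{in}} \dot m_i h_i
    \;=\; \sum_{j\,\in\,\mathrm{out}} \dot m_j h_j
    ```

    where the m-dot terms are mass flows, h the specific enthalpies, and Q-dot the net heat supplied to the chamber; coupling four such chambers gives the model structure the abstract describes.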

  14. Research on the usage of a deep sea fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Otsubo, Akira; Kowata, Yasuki [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1997-09-01

    Many new types of fast reactors have been studied at PNC. Of these, a deep-sea fast reactor has the highest probability of realization, because its development is desired by many specialists in oceanography, meteorology, deep-sea-bottom oil fields, seismology and so on, and because its development does not require a big budget and few technical problems remain to be solved. This report explains the outline and the usage of reactors of 40 kWe and 200 to 400 kWe. The reactor can be used as a power source at an unmanned base for long-term climate prediction and earth science, or at an oil production base in a deep-sea region. It can also supply heat and electric power to a laboratory in a polar region; in the future, it may be used in space. At present, the large FBR development plan is not proceeding successfully, and the target date for FBR realization has slipped later and later. We think it most important to develop this reactor as fast as possible and thereby root fast reactor technology in our present society. (author)

  15. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  16. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unattainable. Big data is generally characterized by three factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  17. Reactivity changes in hybrid thermal-fast reactor systems during fast core flooding

    International Nuclear Information System (INIS)

    Pesic, M.

    1994-09-01

    A new space-dependent kinetic model in the adiabatic approximation, with local feedback reactivity parameters for reactivity determination in coupled systems, is proposed in this thesis. It is applied to accident calculations for the 'HERBE' fast-thermal reactor system and compared to the usual point-kinetics model with core-averaged parameters. The advantages of the new model — a more realistic picture of the reactor kinetics and dynamics during a large local reactivity perturbation, under the same heat transfer conditions — are underlined. The calculated reactivity parameters of the new model were verified in experiments performed on the 'HERBE' coupled core. The model has shown that the 'HERBE' safety system can shut the reactor down safely and quickly even with a high power-trip setting, and even under conditions of a large partial failure of the reactor safety system (author)
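
    For reference, the point-kinetics baseline against which the thesis compares its model is the standard system below; the space-dependent adiabatic extension replaces the single core-averaged reactivity with locally weighted feedback parameters (notation is the textbook convention, not copied from the thesis):

    ```latex
    % Standard point-kinetics equations with six delayed-neutron precursor groups:
    % P = power, C_i = precursor concentrations, rho = reactivity,
    % beta = total delayed fraction, Lambda = generation time, lambda_i = decay constants
    \frac{dP}{dt} = \frac{\rho(t) - \beta}{\Lambda}\,P + \sum_{i=1}^{6} \lambda_i C_i,
    \qquad
    \frac{dC_i}{dt} = \frac{\beta_i}{\Lambda}\,P - \lambda_i C_i .
    ```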

  18. Science Hall of Atomic Energy in Research Reactor Institute, Kyoto University

    International Nuclear Information System (INIS)

    Hayashi, Takeo

    1979-01-01

    The Science Hall of Atomic Energy was built as a subsidiary facility of the Research Reactor Institute, Kyoto University. The purpose of this facility is to accept outside demands concerning the application of the research reactor. The building is a two-story building with a floor area of 901.47 m². There are an exhibition room, a library, and a big lecture room. In the exhibition room, models of the Kyoto University Research Reactor and the Kyoto University Critical Assembly are displayed, and various pictures concerning the application of the reactor are on the walls. In the library, people from outside the Institute can use various books on science; books for boys and girls are also stocked for public use. In the lecture room, various kinds of meetings can be held. (Kato, T.)

  19. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  20. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Authors: Ariel Hamlin, Nabil... (contact: arkady@ll.mit.edu).

  1. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  2. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  3. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we have a need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. This growth creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  4. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  5. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  6. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  7. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedbacks of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from the sensors of those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  8. Ideal MHD beta limits in the BIG DEE tokamak

    International Nuclear Information System (INIS)

    Helton, F.J.; Bernard, L.C.; Greene, J.M.

    1983-01-01

    Using D-D reactions, tokamak reactors become economically attractive when beta (the ratio of volume-averaged pressure to magnetic pressure) exceeds 5 percent. Ideal MHD instabilities are of great concern because they have the potential to limit beta below this range, and so extensive studies have been done to determine ideal MHD beta limits. As beta increases with inverse aspect ratio, elongation, and triangularity, the Doublet III upgrade machine -- BIG DEE -- is particularly suited to studying the possibility of very high beta. The authors have done computations to determine ideal MHD beta limits for various plasma shapes and elongations in BIG DEE. They have determined that for q at the plasma surface greater than 2, beta is limited by the ballooning mode if the wall is reasonably close to the plasma surface (d/a < 1.5, where d and a are the wall and plasma radii, respectively). On the other hand, for q at the plasma surface less than 2, the n=1 external kink is unstable even with a wall close by. Thus, relevant values of limiting beta can be obtained by assuming that the external kink limits the value of q at the limiter to a value greater than 2 and that the ballooning modes limit beta. Under this assumption, a relevant beta limit for the BIG DEE would be over 18%. For such an equilibrium, the wall position necessary to stabilize the n=1 and n=2 modes is 2a, and the equilibrium is stable for n=3.
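    For reference, the beta quoted in this abstract is the conventional volume-averaged ratio (standard tokamak definition, not reproduced from the paper):

    $$\beta=\frac{\langle p\rangle}{B^{2}/2\mu_{0}},$$

    where ⟨p⟩ is the volume-averaged plasma pressure, B the magnetic field strength and mu_0 the vacuum permeability; beta = 0.05 corresponds to the 5 percent threshold mentioned above.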

  9. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  10. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  11. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  12. Characterization of a continuous agitated cell reactor for oxygen dependent biocatalysis.

    Science.gov (United States)

    Toftgaard Pedersen, Asbjørn; de Carvalho, Teresa Melo; Sutherland, Euan; Rehn, Gustav; Ashe, Robert; Woodley, John M

    2017-06-01

    Biocatalytic oxidation reactions employing molecular oxygen as the electron acceptor are difficult to conduct in a continuous flow reactor because of the requirement for high oxygen transfer rates. In this paper, the oxidation of glucose to glucono-1,5-lactone by glucose oxidase was used as a model reaction to study a novel continuous agitated cell reactor (ACR). The ACR consists of ten cells interconnected by small channels. An agitator is placed in each cell, which mixes the content of the cell when the reactor body is shaken by lateral movement. Based on tracer experiments, a hydrodynamic model for the ACR was developed. The model consisted of ten tanks-in-series with back-mixing occurring within and between each cell. The back-mixing was a necessary addition to the model in order to explain the observed phenomenon that the ACR behaved as two continuous stirred tank reactors (CSTRs) at low flow rates, while at high flow rates it behaved as the expected ten CSTRs in series. The performance of the ACR was evaluated by comparing the steady-state conversion at varying residence times with the conversion observed in a stirred batch reactor of comparable size. It was found that the ACR could more than double the overall reaction rate, which was solely due to an increased oxygen transfer rate in the ACR caused by the intense mixing produced by the spring agitators. The volumetric oxygen transfer coefficient, kLa, was estimated to be 344 h⁻¹ in the 100 mL ACR, as opposed to only 104 h⁻¹ in a batch reactor of comparable working volume. Interestingly, the large deviation from plug flow behavior seen in the tracer experiments was found to have little influence on the conversion in the ACR, since both a plug flow reactor (PFR) model and the backflow cell model described the data sufficiently well. Biotechnol. Bioeng. 2017;114: 1222-1230. © 2017 Wiley Periodicals, Inc.
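    The backflow cell model described above is simple enough to sketch numerically. The following is a minimal illustration, not the authors' code: N stirred cells in series with net forward flow Q and an inter-cell back-mixing flow q; all parameter values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    N = 10        # number of cells, as in the ACR
    V = 0.010     # volume of one cell [L]; ten cells give the ~100 mL working volume
    Q = 0.002     # net volumetric flow rate [L/min] (assumed)
    q = 0.004     # back-mixing flow between neighbouring cells [L/min] (assumed)

    def tracer_rhs(t, c):
        """Mass balance for an inert tracer pulse; the feed is tracer-free."""
        dc = np.zeros(N)
        for i in range(N):
            fwd_in   = (Q + q) * c[i - 1] if i > 0 else 0.0       # from the upstream cell
            back_in  = q * c[i + 1] if i < N - 1 else 0.0         # from the downstream cell
            fwd_out  = (Q + q) * c[i] if i < N - 1 else Q * c[i]  # last cell feeds the outlet
            back_out = q * c[i] if i > 0 else 0.0                 # no backflow into the feed
            dc[i] = (fwd_in + back_in - fwd_out - back_out) / V
        return dc

    c0 = np.zeros(N)
    c0[0] = 1.0 / V   # unit mass of tracer injected as a pulse into the first cell
    sol = solve_ivp(tracer_rhs, (0.0, 200.0), c0, dense_output=True, max_step=0.5)

    t = np.linspace(0.0, 200.0, 800)
    E = Q * sol.sol(t)[-1]   # residence-time distribution observed at the outlet
    print("mean residence time [min]:", (t * E).sum() / E.sum())
    ```

    With q = 0 the cascade reproduces the classical ten-tanks-in-series response, while increasing q broadens the residence-time distribution towards the two-CSTR behaviour the paper reports at low flow rates.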

  13. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  14. Ten years of IAEA cooperation with the Russian research reactor fuel return programme

    Energy Technology Data Exchange (ETDEWEB)

    Tozser, S.; Adelfang, P.; Bradley, E. [International Atomic Energy Agency, Vienna (Austria)

    2013-01-15

    The Russian Research Reactor Fuel Return (RRRFR) Programme was launched in 2001. Over its duration, the programme has successfully completed 43 safe shipments, returning 1.6 tons of fresh and spent HEU fuel from countries operating Russian-fuelled research reactors to the country of origin. The IAEA has been a very active supporter of the RRRFR Programme since its inception. Under the auspices of the RRRFR Programme, the Agency has provided a broad range of technical advisory and organizational support for the HEU fuel repatriation, as well as training and advisory assistance supporting research reactor conversion from HEU to LEU. The presentation gives an overview of the RRRFR programme achievements, with special consideration of the IAEA contribution. This includes an overview of the history of fresh and spent fuel shipments, as well as a summary of experience gained during shipment preparation and termination. The presentation focuses on the technical advisory support given by the IAEA during the programme implementation, captures the consolidated knowledge of this unique international programme and shares the most important lessons learned. (orig.)

  15. Backfitting of the FRG reactors

    Energy Technology Data Exchange (ETDEWEB)

    Krull, W [GKSS-Forschungszentrum Geesthacht GmbH, Geesthacht (Germany)

    1990-05-01

    The FRG research reactors: The GKSS research centre is operating two research reactors of the pool type, fuelled with MTR-type fuel elements. The research reactors FRG-1 and FRG-2, with power levels of 5 MW and 15 MW, have been in operation for 31 and 27 years, respectively, making them comparable in age to other research reactors. At present the reactors operate approximately 180 days per year (FRG-1) and between 210 and 250 days per year (FRG-2). Both reactors are located in the same reactor hall in a connecting pool system. Backfitting measures are needed for these and other research reactors to ensure a high level of safety and availability. The main backfitting activities during the last ten years were concerned with: comparison of the existing design with today's demands (criteria, guidelines, standards, etc.); a probabilistic approach for external events such as aeroplane crashes and earthquakes; re-examination of the main accidents, such as startup from low and full power, loss of coolant flow, loss of heat sink, loss of coolant, and fuel plate melting; installation of a new reactor protection system meeting today's demands; and installation of a new crane in the reactor hall. A cold neutron source has been installed to increase the flux of cold neutrons by a factor of 14. The FRG-1 is being converted from 93% enriched U with UAlx fuel to 20% enriched U with U3Si2 fuel. Both cooling towers were repaired. Replacement of instrumentation is planned.

  16. Backfitting of the FRG reactors

    International Nuclear Information System (INIS)

    Krull, W.

    1990-01-01

    The FRG research reactors: The GKSS research centre is operating two research reactors of the pool type, fuelled with MTR-type fuel elements. The research reactors FRG-1 and FRG-2, with power levels of 5 MW and 15 MW, have been in operation for 31 and 27 years, respectively, making them comparable in age to other research reactors. At present the reactors operate approximately 180 days per year (FRG-1) and between 210 and 250 days per year (FRG-2). Both reactors are located in the same reactor hall in a connecting pool system. Backfitting measures are needed for these and other research reactors to ensure a high level of safety and availability. The main backfitting activities during the last ten years were concerned with: comparison of the existing design with today's demands (criteria, guidelines, standards, etc.); a probabilistic approach for external events such as aeroplane crashes and earthquakes; re-examination of the main accidents, such as startup from low and full power, loss of coolant flow, loss of heat sink, loss of coolant, and fuel plate melting; installation of a new reactor protection system meeting today's demands; and installation of a new crane in the reactor hall. A cold neutron source has been installed to increase the flux of cold neutrons by a factor of 14. The FRG-1 is being converted from 93% enriched U with UAlx fuel to 20% enriched U with U3Si2 fuel. Both cooling towers were repaired. Replacement of instrumentation is planned.

  17. Fast reactor database. 2006 update

    International Nuclear Information System (INIS)

    2006-12-01

    Liquid metal cooled fast reactors (LMFRs) have been under development for about 50 years. Ten experimental fast reactors and six prototype and commercial size fast reactor plants have been constructed and operated. In many cases, the overall experience with LMFRs has been rather good, with the reactors themselves and also the various components showing remarkable performance, well in accordance with the design expectations. The fast reactor system has also been shown to have very attractive safety characteristics, resulting to a large extent from the fact that the fast reactor is a low pressure system with large thermal inertia and negative power and temperature coefficients. In addition to the LMFRs that have been constructed and operated, more than ten advanced LMFR projects have been developed, and the latest designs are now close to achieving economic competitiveness with other reactor types. In the current world economic climate, the introduction of a new nuclear energy system based on the LMFR may not be considered by utilities as a near-future option when compared to other potential power plants. However, there is strong agreement between experts in the nuclear energy field that, for sustainability reasons, long term development of nuclear power as a part of the world's future energy mix will require fast reactor technology, and that, given the decline in fast reactor development projects, data retrieval and knowledge preservation efforts in this area are of particular importance. This publication contains detailed design data and main operational data on experimental, prototype, demonstration, and commercial size LMFRs. Each LMFR plant is characterized by about 500 parameters: physics, thermohydraulics, thermomechanics, design and technical data, and relevant sketches. The focus is on practical issues that are useful to engineers, scientists, managers, university students and professors, with complete technical information on a total of 37 LMFR

  18. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  19. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute; Ang, James [Sandia National Laboratories; Bergman, Keren [Columbia University; Borkar, Shekhar [Intel; Carlson, William [Institute for Defense Analyses; Carrington, Laura [University of California, San Diego; Chiu, George [IBM; Colwell, Robert [DARPA; Dally, William [NVIDIA; Dongarra, Jack [University of Tennessee; Geist, Al [Oak Ridge National Laboratory; Haring, Rud [IBM; Hittinger, Jeffrey [Lawrence Livermore National Laboratory; Hoisie, Adolfy [Pacific Northwest National Laboratory; Klein, Dean [Micron; Kogge, Peter [University of Notre Dame; Lethin, Richard [Reservoir Labs; Sarkar, Vivek [Rice University; Schreiber, Robert [Hewlett Packard; Shalf, John [Lawrence Berkeley National Laboratory; Sterling, Thomas [Indiana University; Stevens, Rick [Argonne National Laboratory; Bashor, Jon [Lawrence Berkeley National Laboratory; Brightwell, Ron [Sandia National Laboratories; Coteus, Paul [IBM; Debenedictus, Erik [Sandia National Laboratories; Hiller, Jon [Science and Technology Associates; Kim, K. H. [IBM; Langston, Harper [Reservoir Labs; Murphy, Richard [Micron; Webster, Clayton [Oak Ridge National Laboratory; Wild, Stefan [Argonne National Laboratory; Grider, Gary [Los Alamos National Laboratory; Ross, Rob [Argonne National Laboratory; Leyffer, Sven [Argonne National Laboratory; Laros III, James [Sandia National Laboratories

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  20. Intra- and interspecific responses to Rafinesque’s big-eared bat (Corynorhinus rafinesquii) social calls.

    Energy Technology Data Exchange (ETDEWEB)

    Loeb, Susan, C.; Britzke, Eric, R.

    2010-07-01

    Bats respond to the calls of conspecifics as well as to calls of other species; however, few studies have attempted to quantify these responses or understand the functions of these calls. We tested the response of Rafinesque’s big-eared bats (Corynorhinus rafinesquii) to social calls as a possible method to increase capture success and to understand the function of social calls. We also tested whether calls of bats within the range of the previously designated subspecies differed, whether the responses of Rafinesque’s big-eared bats varied with the geographic origin of the calls, and whether other species responded to the calls of C. rafinesquii. We recorded calls of Rafinesque’s big-eared bats at two colony roost sites in South Carolina, USA. Calls were recorded while bats were in the roosts and as they exited. Playback sequences for each site were created by copying typical pulses into the playback file. Two mist nets were placed approximately 50–500 m from known roost sites; the net with the playback equipment served as the Experimental net and the one without the equipment served as the Control net. Call structures differed significantly between the Mountain and Coastal Plains populations, with calls from the Mountains being of higher frequency and longer duration. Ten of 11 Rafinesque’s big-eared bats were caught in the Control nets, and 13 of 19 bats of other species were captured at Experimental nets, even though overall bat activity did not differ significantly between Control and Experimental nets. Our results suggest that Rafinesque’s big-eared bats are not attracted to conspecifics’ calls and that these calls may act as an intraspecific spacing mechanism during foraging.

  1. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  2. Development of inertia-increased reactor internal pump

    International Nuclear Information System (INIS)

    Tanaka, Masaaki; Matsumura, Seiichi; Kikushima, Jun; Kawamura, Shinichi; Yamashita, Norimichi; Kurosaki, Toshikazu; Kondo, Takahisa

    2000-01-01

    The Reactor Internal Pump (RIP) was adopted for the Reactor Recirculation System (RRS) of Advanced Boiling Water Reactor (ABWR) plants; ten RIPs are located at the bottom of the reactor pressure vessel. In order to simplify the power supply system for the RIPs, a new inertia-increased RIP was developed, which makes it possible to eliminate the Motor-Generator (M-G) sets. The rotating inertia was increased to approximately 2.5 times that of the current RIP by the addition of a flywheel on the main shaft. A full-scale proving test of the inertia-increased RIP under actual plant operating conditions, using a full-scale test loop, was performed to evaluate vibration characteristics and coast-down characteristics. From the results of this proving test, the validity of the new inertia-increased RIP and its power supply system (without M-G sets) was confirmed. (author)

  3. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  4. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  5. Neutron converter at reactor RB; Konvertor neutrona na reaktoru RB

    Energy Technology Data Exchange (ETDEWEB)

    Strugar, P; Sotic, P; Ninkovic, M; Pesic, M [Boris Kidric Institute of nuclear sciences, Vinca, Belgrade (Yugoslavia)

    1977-07-01

    A neutron converter at the RB reactor in the 'Boris Kidric' Institute of Nuclear Sciences - Vinca has been constructed. Preliminary measurements have shown that the converted neutron spectrum is very similar to the fission neutron spectrum. For the same integral reactor power, the measured neutron radiation dose was about ten times larger with the neutron converter. The neutron converter offers wide possibilities, both for investigations in reactor physics where fission neutron spectra are required, and in the fields of neutron dosimetry and biological irradiation (author)

  6. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  7. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  8. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  9. Regulatory Framework for Controlling the Research Reactor Decommissioning Project

    International Nuclear Information System (INIS)

    Melani, Ai; Chang, Soon Heung

    2009-01-01

    Decommissioning is one of the important stages in the construction and operation of research reactors. Currently, there are three research reactors operating in Indonesia. These reactors are operated by the National Nuclear Energy Agency (BATAN). The ages of the three research reactors vary from 22 to 45 years since the reactors reached their first criticality. Regulatory control of the three reactors is conducted by the Nuclear Energy Regulatory Agency (BAPETEN). Control of the reactors is carried out based on the Act No. 10/1997 on Nuclear Energy, Government Regulations and BAPETEN Chairman Decrees concerning nuclear safety, security and safeguards. Nevertheless, BAPETEN still lacks regulations, especially for controlling decommissioning projects. Therefore, in the near future BAPETEN has to prepare the regulations for decommissioning, particularly to anticipate the decommissioning of the oldest research reactors, which will probably be done in the next ten years. In this paper the authors give a list of regulations that should be prepared by BAPETEN for the decommissioning stage of research reactors in Indonesia, based on international regulatory practice

  10. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  11. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  12. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  13. Ten-yearly report on operations at the Garigliano nuclear power station

    International Nuclear Information System (INIS)

    1977-01-01

    This document is the final report on operation of the Garigliano Nuclear Power Station, as required under the participation contract between Enel and Euratom, and refers to the first ten years of commercial operation (1 May 1964 - 31 December 1973) of the power station. In the decade in question, the Garigliano Power Station assumed an important position in the world spectrum of nuclear energy, since it was the first thermal-power reactor in the world to have irradiated considerable quantities of plutonium as a fuel in its own core for the production of energy. An experimental programme on this was started in 1966 with a theoretical study of plutonium recycling and was followed by the charging of the Garigliano reactor with some mixed oxide elements (PuO2-UO2) in 1968 and 1970. The excellent performance of these prototype elements, which among other things were examined in detail at the end of each irradiation cycle, prompted Enel to decide in favour of the use of entire batches of recycled fuel elements of the PuO2-UO2 type in the reactor from 1975 onwards

  14. Pebble bed modular reactor - The first Generation IV reactor to be constructed

    International Nuclear Information System (INIS)

    Ion, S.; Nicholls, D.; Matzie, R.; Matzner, D.

    2004-01-01

    Substantial interest has been generated in advanced reactors over the past few years. This interest is motivated by the view that new nuclear power reactors will be needed to provide low carbon generation of electricity and possibly hydrogen to support the future growth in demand for both of these commodities. Some governments feel that substantially different designs will be needed to satisfy the desires for public perception, improved safety, proliferation resistance, reduced waste and competitive economics. This has motivated the creation of the Generation IV Nuclear Energy Systems programme in which ten countries have agreed on a framework for international cooperation in research for advanced reactors. Six designs have been selected for continued evaluation, with the objective of deployment by 2030. One of these designs is the very high temperature reactor (VHTR), which is a thermal neutron spectrum system with a helium-cooled core utilising carbon-based fuel. The pebble bed modular reactor (PBMR), being developed in South Africa through a worldwide international collaborative effort led by Eskom, the national utility, will represent a key milestone on the way to achievement of the VHTR design objectives, but in the much nearer term. This paper outlines the design objectives, safety approach and design details of the PBMR, which is already at a very advanced stage of development. (author)

  15. Expanding the storage capability at ET-RR-1 research reactor at Inshass

    International Nuclear Information System (INIS)

    Sultan, Mariy M.; Khattab, M.

    1999-01-01

    Storing spent fuel from test reactors in developing countries has become a big dilemma, for the following reasons: the transportation of spent fuel is very expensive; there are no reprocessing plants in most developing countries; expanding existing storage facilities in the reactor building requires experience that most developing countries lack; and political motivations in nuclear developed countries sometimes intervene, which makes the transportation procedures and logistics to those countries difficult. This paper gives the conceptual design of a new spent fuel storage facility now under construction at the Inshass research reactor (ET-RR-1). The location of the new storage facility was chosen to be within the premises of the reactor facility, so that both the reactor and the new storage form one Material Balance Area. The paper also proposes some ideas that can enhance the transportation and storage of spent fuel from test reactors, such as: intensifying the role of the IAEA in helping countries to get rid of their spent fuel; and the initiation of regional spent fuel storage facilities in some developing countries. (author)

  16. RA Research reactor, Annual report 1969

    International Nuclear Information System (INIS)

    Milosevic, D. et al.

    1969-12-01

    During 1969, the RA Reactor was operated at the nominal power of 6.5 MW for 200 days, and for 15 days at lower power levels. Total production amounted to 31131 MWh, which is 3.77% higher than planned. The reactor was used for irradiation and experiments according to the demands of 463 users from the Institute and 63 external users. This report contains detailed data about reactor power and experiments performed in 1969. It is concluded that the reactor operated successfully according to the plan. Had there been no problems with the power supply during the last three months and with the low water level of the Danube in September and October, the past year would have been the most successful so far. The number of scram shutdowns was not higher than during the past two years, in spite of the difficulties in the last quarter. There were three incidents which caused higher personnel exposure during operation. One was the destruction of a can containing silver (because the time spent in the core was too long), which caused surface contamination of the platform; the background radiation was 10 to 100 times higher than normal. The other two cases were caused by failure of the device for handling the fuel slugs in the fuel channels during refuelling. Reactor refuelling was done four times during 1969, and 499 fresh fuel slugs were used. Refuelling applied the approach of 'mixing' the fresh fuel slugs with the 'old' fuel slugs in the fuel channel. Decontamination of surfaces was on the same level as previously, in spite of the problems with silver. Since two staff members have left, the present number of employees is now the minimum needed for reactor operation and maintenance. It is stated that the operation of components and equipment is at a sufficiently high level after ten years of reactor operation. The action plan for 1970 is made according to the same principles as in the previous four years, but the planned production is decreased to 25000 MWh, because control of important components is needed after ten years of operation.

  17. SparkText: Biomedical Text Mining on Big Data Framework.

    Science.gov (United States)

    Ye, Zhan; Tafti, Ahmad P; He, Karen Y; Wang, Kai; He, Max M

    Many new biomedical research articles are published every day, accumulating rich information, such as genetic variants, genes, diseases, and treatments. Rapid yet accurate text mining on large-scale scientific literature can discover novel knowledge to better understand human diseases and to improve the quality of disease diagnosis, prevention, and treatment. In this study, we designed and developed an efficient text mining framework called SparkText on a Big Data infrastructure, which is composed of Apache Spark data streaming and machine learning methods, combined with a Cassandra NoSQL database. To demonstrate its performance for classifying cancer types, we extracted information (e.g., breast, prostate, and lung cancers) from tens of thousands of articles downloaded from PubMed, and then employed Naïve Bayes, Support Vector Machine (SVM), and Logistic Regression to build prediction models to mine the articles. The accuracy of predicting a cancer type by SVM using the 29,437 full-text articles was 93.81%. While competing text-mining tools took more than 11 hours, SparkText mined the dataset in approximately 6 minutes. This study demonstrates the potential for mining large-scale scientific articles on a Big Data infrastructure, with real-time update from new articles published daily. SparkText can be extended to other areas of biomedical research.
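    As a rough illustration of such a pipeline, the sketch below builds a Spark ML text classifier in the spirit of SparkText. The column names, toy input rows and model parameters are assumptions for the example; the paper's actual code and its Cassandra integration are not reproduced here.

    ```python
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import Tokenizer, HashingTF, IDF, StringIndexer
    from pyspark.ml.classification import LogisticRegression
    from pyspark.ml.evaluation import MulticlassClassificationEvaluator

    spark = SparkSession.builder.appName("sparktext-sketch").getOrCreate()

    # Assumed input: one row per article, with its full text and a cancer-type label.
    df = spark.createDataFrame(
        [("carcinoma detected in breast tissue ...", "breast"),
         ("elevated prostate specific antigen levels ...", "prostate"),
         ("nodule in the upper lobe of the left lung ...", "lung")],
        ["text", "cancer_type"])

    pipeline = Pipeline(stages=[
        Tokenizer(inputCol="text", outputCol="words"),      # split text into tokens
        HashingTF(inputCol="words", outputCol="tf", numFeatures=1 << 18),
        IDF(inputCol="tf", outputCol="features"),           # TF-IDF weighting
        StringIndexer(inputCol="cancer_type", outputCol="label"),
        LogisticRegression(maxIter=50),                     # the paper also used Naive Bayes and SVM
    ])

    # Trained and scored on the same toy rows purely to show the API; a real run
    # would split tens of thousands of articles into training and test sets.
    model = pipeline.fit(df)
    accuracy = MulticlassClassificationEvaluator(metricName="accuracy") \
        .evaluate(model.transform(df))
    print(f"accuracy on the toy set: {accuracy:.2%}")
    ```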

  18. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  19. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data, and analyzes the peculiarities of applying the proposed model components. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  20. BigWig and BigBed: enabling browsing of large distributed datasets.

    Science.gov (United States)

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and Linux and UNIX operating systems files, R trees and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
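    For a sense of how this sparse remote access works in practice, the snippet below reads a BigWig file with the third-party pyBigWig library (a separate project from the UCSC utilities named above, and one that needs to be built with curl support for remote files); the URL is a placeholder assumption.

    ```python
    import pyBigWig

    # Open a (hypothetical) remote BigWig file; only the header is read at this point.
    bw = pyBigWig.open("https://example.org/data/signal.bw")

    print(bw.header())   # file-wide summary statistics stored in the header
    print(bw.chroms())   # chromosome names and sizes from the index

    # Queries fetch only the R-tree index blocks and the data blocks overlapping
    # the requested interval, rather than the entire file:
    print(bw.stats("chr1", 1_000_000, 1_010_000, type="mean"))
    print(bw.values("chr1", 1_000_000, 1_000_050))   # per-base values

    bw.close()
    ```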

  1. Top-level DB design for Big Data in ATLAS Experiment at CERN

    CERN Document Server

    Dimitrov, Gancho; The ATLAS collaboration

    2017-01-01

    This presentation describes a system that accumulates a set of key quantities for a very large number of particle collision events recorded by the ATLAS experiment at the LHC (Large Hadron Collider) at CERN. The main project requirements are the handling of tens of billions of rows per year with minimal DB resources, and providing outstanding performance for the fundamental use cases. Various challenges were faced in the process of project development, such as large data volume, large transactions (tens to hundreds of millions of rows per transaction) requiring a significant amount of undo, row duplication checks, adequate table statistics gathering, and SQL execution plan stability. Currently the system hosts about 120 billion rows, as the data ingestion rate has gone beyond the initially foreseen 30 billion rows per year. The crucial DB schema design decisions and the Oracle DB features and techniques will be shared with the audience. By attending this session you will learn how big physics data can be organize...

  2. Basic considerations for the mechanical design of heating reactors

    International Nuclear Information System (INIS)

    Rau, P.

    1997-01-01

    The paper discusses the principal aspects of the mechanical design of the reactor unit for a nuclear district heating plant. It is reasoned that the design must be specifically tailored to the characteristics of the applications, and that the experience gained with the design practice of big nuclear power stations must also be incorporated. Some examples of the design solutions for the SIEMENS NRH-200 are presented for illustration. (author). 7 refs, 10 figs

  3. Basic considerations for the mechanical design of heating reactors

    Energy Technology Data Exchange (ETDEWEB)

    Rau, P [Siemens AG, Unternehmensbereich KWU, Erlangen (Germany)

    1997-09-01

    The paper discusses the principal aspects of the mechanical design of the reactor unit for a nuclear district heating plant. It is reasoned that the design must be specifically tailored to the characteristics of the applications, and that the experience gained with the design practice of big nuclear power stations must also be incorporated. Some examples of the design solutions for the SIEMENS NRH-200 are presented for illustration. (author). 7 refs, 10 figs.

  4. Ewing sarcoma of the left big toe with trans-articular skip lesion

    Directory of Open Access Journals (Sweden)

    Ahmad F. Kamal

    2009-06-01

    Full Text Available We report the case of a patient who had Ewing sarcoma, in whom radiological and histopathological examination revealed a tumor mass in the left big toe along with a trans-articular skip lesion on the left diaphysis of the tibia. In Cipto Mangunkusomo Hospital, from 1995 until 2004, we found 20 Ewing sarcoma cases, but only one skip-lesion Ewing sarcoma. The diagnosis of a trans-articular skip lesion in association with Ewing sarcoma was confirmed in a clinicopathological conference. The initial evaluation of all patients included the recording of the medical history, physical examination, and hematological studies. Radiographs of the chest and the site of the primary tumor were made routinely. Systemic staging was performed with use of a total-body bone scan. Ray amputation of the left big toe and open biopsy from the mass at the mid-shaft of the tibia were done to confirm the diagnosis. The patient underwent induction chemotherapy and above-knee amputation. Ten months after diagnosis, he died because of advanced distant metastasis. (Med J Indones 2008; 18: 139-44) Key words: Ewing sarcoma, trans-articular skip lesion

  5. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  6. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  7. Advances in zirconium technology for nuclear reactor application

    International Nuclear Information System (INIS)

    Ganguly, C.

    2002-01-01

    Zirconium alloys are extensively used as a material for cladding nuclear fuels and for making core structurals of water-cooled nuclear power reactors all over the world, which generate nearly 16 percent of the world's electricity. Only four countries in the world, namely France, USA, Russia and India, have a large zirconium industry and the capability to manufacture reactor-grade zirconium sponge, a number of zirconium alloys and a wide variety of structural components for water-cooled nuclear reactors. The present paper summarises the status of zirconium technology and highlights the achievements of the Nuclear Fuel Complex during the last ten years in developing a wide variety of zirconium alloys and components for the water-cooled nuclear power programme

  8. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Mauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  9. Free Release Standards Utilized at Big Rock Point

    International Nuclear Information System (INIS)

    Robert P. Wills

    2000-01-01

    The decommissioning of Consumers Energy's Big Rock Point (BRP) site involves decommissioning its 75-MW boiling water reactor and all of the associated facilities. Consumers Energy is committed to restoring the site to greenfield conditions. This commitment means that when the decommissioning is complete, all former structures will have been removed, and the site will be available for future use without radiological restrictions. BRP's radiation protection management staff determined that the typical methods used to comply with U.S. Nuclear Regulatory Commission (NRC) regulations for analyzing volumetric material for radionuclides would not fulfill the demands of a facility undergoing decommissioning. The challenge at hand is to comply with regulatory requirements and put a large-scale bulk-release program into production. This report describes Consumers Energy's planned approach to the regulatory aspects of free release

  10. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  11. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  12. Big Data and HPC collocation: Using HPC idle resources for Big Data Analytics

    OpenAIRE

    MERCIER , Michael; Glesser , David; Georgiou , Yiannis; Richard , Olivier

    2017-01-01

    International audience; Executing Big Data workloads upon High Performance Computing (HPC) infrastructures has become an attractive way to improve their performance. However, the collocation of HPC and Big Data workloads is not an easy task, mainly because of their core concepts' differences. This paper focuses on the challenges related to the scheduling of both Big Data and HPC workloads on the same computing platform. In classic HPC workloads, the rigidity of jobs tends to create holes in ...

  13. New generation main control room of enhanced safety NPP with MKER reactor

    International Nuclear Information System (INIS)

    Golovanev, V.E.; Gorelov, A.I.; Proshin, V.A.

    1994-01-01

    Russia is planning to begin, over the next ten-year period, the gradual replacement of RBMK NPP units that have reached the end of their service life with NPP units based on an 800 MW multiloop boiling water power reactor of enhanced safety (MKER-800). The main drawbacks of the RBMK reactor were completely removed in the design of the MKER-800 reactor. Moreover, some special decisions were made to give the MKER-800 self-safety properties. The proposed design of the MKER-800 enhanced-safety reactor is not only fully free from the drawbacks of the RBMK reactors, but also shows a number of advantages of channel-type reactors. This paper presents some preliminary proposals for the MCR design developed by the Research and Development Institute of Power Engineering (RDIPE). 6 refs, 2 figs

  14. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  15. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge, the value of the information. The information value is not only defined by the value extraction from huge data sets, as fast and optimal as possible, but also by the value extraction from uncertain and inaccurate data, in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that the real value can be gained. This article aims to explain the Big data concept, its various classification criteria, its architecture, as well as its impact on worldwide processes.

  16. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    Science.gov (United States)

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem defining to address science challenges.

  17. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone "stars", or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  18. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  19. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
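
    The "spurious correlation" challenge named in the abstract above lends itself to a quick numerical illustration. The following sketch (our own toy example, not from the paper) draws a response and a growing number of features that are all mutually independent noise, yet the largest observed sample correlation grows steadily with dimensionality.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100                         # sample size
y = rng.standard_normal(n)      # response: pure noise
yc = (y - y.mean()) / y.std()

for d in (10, 1_000, 50_000):   # number of (irrelevant) features
    X = rng.standard_normal((n, d))
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    max_corr = np.abs(Xc.T @ yc / n).max()
    print(f"d = {d:6d}: max |corr| = {max_corr:.2f}")
# with everything independent, the maximum spurious correlation still
# grows from roughly 0.2 (d = 10) toward 0.5 (d = 50,000)
```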

  20. Hybrid reactors: recent progress of a demonstration pilot

    International Nuclear Information System (INIS)

    Billebaud, Annick

    2006-12-01

    Accelerator-driven sub-critical reactors have been the subject of many research programmes for more than ten years, with the aim of testing the feasibility of the concept as well as their efficiency as a transmutation tool. Several key points, like the accelerator, the spallation target, or neutronics in a subcritical medium, were investigated extensively in recent years, allowing for technological choices and the design of a low-power European demonstration ADS (a few tens of MWth). Programmes dedicated to subcritical reactor piloting proposed a monitoring procedure to be validated in forthcoming experiments. Accelerator R and D provided the design of a LINAC for an ADS, and research work on accelerator reliability is going on. A spallation target was operated at PSI and the design of a windowless target is in progress. All this research work converges towards the design of a European demonstration ADS, the ETD/XT-ADS, which could be the Belgian MYRRHA project. (author)

  1. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  2. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    ...modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing..., and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications...

  3. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  4. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  5. A first order phase transition from inflationary to big bang universe

    International Nuclear Information System (INIS)

    Horwitz, G.

    1986-01-01

    The microcanonical entropy is calculated for a system of massive, conformally coupled, scalar bosons using a conformal gravitational theory. The resulting entropy is seen to indicate a first order phase transition from an inflationary expansion stage (where the amplitude of the scalar boson follows that of the scale function of the universe and the mass of the scalar boson is the source of the cosmological constant) to a big bang stage (where neither of these conditions holds). Such a first order phase transition involves an entropy increase of some thirty orders of magnitude. In the author's theory, the invariant temperature (proper temperature times scale function) is not zero, nor is it the Hawking temperature, but it is many orders of magnitude smaller than the corresponding temperature of the big bang stage. A specific model for these bosons that provides the phase transition and serves as the source of the cosmological constant is also examined briefly, where the bosons are identified as spontaneously generated primordial black holes as in the cosmological model of Brout, Englert and Casher. In that case, the decay of the black holes provides a decaying cosmological constant and an explicit mechanism for heating up the universe

  6. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  7. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Also, the methodology and working of a system that will use this data are briefly described.

  8. Thermionic nuclear reactor systems

    International Nuclear Information System (INIS)

    Kennel, E.B.

    1986-01-01

    Thermionic nuclear reactors can be expected to be candidate space power supplies for power demands ranging from about ten kilowatts to several megawatts. The conventional ''ignited mode'' thermionic fuel element (TFE) is the basis for most reactor designs to date. Laboratory converters have been built and tested with efficiencies in the range of 7-12% for over 10,000 hours. Even longer lifetimes are projected. More advanced capabilities are potentially achievable in other modes of operation, such as the self-pulsed or unignited diode. Coupled with modest improvements in fuel and emitter material performance, the efficiency of an advanced thermionic conversion system can be extended to the 15-20% range. Advanced thermionic power systems are expected to be compatible with other advanced features such as: (1) intrinsic subcriticality under accident conditions, ensuring 100% safety upon launch abort; (2) intrinsically low radiation levels during reactor shutdown, allowing manned servicing and/or rendezvous; (3) DC-to-DC power conditioning using lightweight power MOSFETs; and (4) AC output using pulsed converters

  9. Inspection of Chooz power plant after ten years operation

    International Nuclear Information System (INIS)

    Saglio, Robert.

    1978-01-01

    This report discusses the results of the complete technical audit of the vessel carried out in 1976 at the Ardennes reactor (CNA, 305 MWe). This audit had a special character insofar as this power plant had never been inspected before: the start-up had taken place in 1967, prior to the development of French regulations (and even to the first version of the ASME code, Section XI). At that time, no inspection had been planned; it nevertheless proved possible to complete the audit in ten days. The automated deployment of focused ultrasonic testing thus appeared to have reached the required reliability and a good sensitivity [fr

  10. Operational experience with Dragon reactor experiment of relevance to commercial reactors

    International Nuclear Information System (INIS)

    Capp, P.D.; Simon, R.A.

    1976-01-01

    An important part of the experience gained during the first ten years of successful power operation of the Dragon Reactor is relevant to the design and operation of future High Temperature Reactors (HTRs). The aspects presented in this paper have been chosen as being particularly applicable to larger HTR systems. Core performance under a variety of conditions is surveyed with particular emphasis on a technique developed for the identification and location of unpurged releasing fuel and the presence of activation and fission products in the core area. The lessons learned during the reflector block replacement are presented. Operating experience with the primary circuit identifies the lack of mixing of gas streams within the hot plenum and the problems of gas streaming in ducts. Helium leakage from the circuit is often greater than the optimum 0.1%/d. Virtually all the leakage problems are associated with the small bore instrument pipework essential for the many experiments associated with the Dragon Reactor Experiment (DRE). Primary circuit maintenance work confirms the generally clean state of the DRE circuit but identifies 137Cs and 110mAg as possible hazards if fuel emitting these isotopes is irradiated. (author)

  11. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms and "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.

  12. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    Science.gov (United States)

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived; notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating; what is dark matter; and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  14. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  15. Comparison between TRU burning reactors and commercial fast reactor

    International Nuclear Information System (INIS)

    Fujimura, Koji; Sanda, Toshio; Ogawa, Takashi

    2001-03-01

    Research and development aimed at stabilizing or shortening the lifetime of the radioactive wastes included in spent nuclear fuel is widely conducted from the viewpoint of reducing the environmental impact. It is an especially effective approach to irradiate and transmute long-lived TRU in fast reactors. Two loading approaches were previously proposed: the former loads a relatively small amount of TRU in all commercial fast reactors, and the latter loads a large amount of TRU in a few dedicated TRU burning reactors. This study is intended to contribute to the feasibility studies on commercialized fast reactor cycle systems. The transmutation and nuclear characteristics of TRU burning reactors were evaluated and compared with those of a conventional transmutation system using commercial-type fast reactors, based upon an investigation of technical information about TRU burning reactors. Major results are summarized as follows. (1) Investigation of technical information about TRU burning reactors. Based on published reports and papers, technical information about TRU burning reactor concepts and transmutation systems using conventional commercial-type fast reactors was investigated. Transmutation and nuclear characteristics and R and D issues were investigated based on these results. Homogeneous loading of about 5 wt% MAs on core fuels in a conventional commercial-type fast reactor may not cause significant impact on the nuclear core characteristics. Transmutation of the MAs produced in about five fast reactors generating the same output is feasible. The helium-cooled MA burning fast reactor core concept proposed by JAERI attains criticality using particle-type nitride fuels which contain more than 60 wt% MA. This reactor could transmute the MAs produced in more than ten 1000 MWe LWRs. Ultra-long-life core concepts attaining more than 30 years of operation without refueling, utilizing the MAs' nuclear characteristics as burnable absorber and fertile nuclides, were proposed. It was pointed out that

  16. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
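
    As a concrete taste of the API the book covers, here is a minimal Python client example (our own illustration, not taken from the book) that runs a query against one of Google's public datasets; it assumes the google-cloud-bigquery package is installed and Google Cloud credentials are configured.

```python
from google.cloud import bigquery

client = bigquery.Client()  # picks up project/credentials from the environment

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = 'TX'
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""
# client.query() submits the job; .result() blocks until it completes
for row in client.query(query).result():
    print(row["name"], row["total"])
```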

  17. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  18. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data imply also specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges, which ultimately seems to be of greater importance than the sheer data volume.

  19. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general-relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  20. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  1. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  2. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  3. Radio-active pollution near natural uranium-graphite-gas reactors

    International Nuclear Information System (INIS)

    Chassany, J.; Pouthier, J.; Delmar, J.

    1967-01-01

    The results of numerous evaluations of the contamination are given: - Reactors in operation, during maintenance operations. - Reactors shut down, during typical repair operations (coolants, exchangers, interior of the vessel, etc.) - Following incidents in the cooling circuit and can ruptures. They show that, except in particular cases, it is the activation products which dominate. Furthermore, after ten years of operation, the points at which contamination liable to emit strong doses accumulates are very localized, and the individual protective equipment has not had to be reinforced. (authors) [fr

  4. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore, we compare and contrast the two geometries throughout.

  5. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two

  6. High Flux Materials Testing Reactor (HFR), Petten

    International Nuclear Information System (INIS)

    1975-09-01

    After conversion to burnable poison fuel elements, the High Flux Materials Testing Reactor (HFR) Petten (Netherlands), operated through 1974 for 280 days at 45 MW. Equipment for irradiation experiments has been replaced and extended. The average annual occupation by experiments was 55% as compared to 38% in 1973. Work continued on thirty irradiation projects and ten development activities

  7. SparkText: Biomedical Text Mining on Big Data Framework

    Science.gov (United States)

    He, Karen Y.; Wang, Kai

    2016-01-01

    Background: Many new biomedical research articles are published every day, accumulating rich information, such as genetic variants, genes, diseases, and treatments. Rapid yet accurate text mining on large-scale scientific literature can discover novel knowledge to better understand human diseases and to improve the quality of disease diagnosis, prevention, and treatment. Results: In this study, we designed and developed an efficient text mining framework called SparkText on a Big Data infrastructure, which is composed of Apache Spark data streaming and machine learning methods, combined with a Cassandra NoSQL database. To demonstrate its performance for classifying cancer types, we extracted information (e.g., breast, prostate, and lung cancers) from tens of thousands of articles downloaded from PubMed, and then employed Naïve Bayes, Support Vector Machine (SVM), and Logistic Regression to build prediction models to mine the articles. The accuracy of predicting a cancer type by SVM using the 29,437 full-text articles was 93.81%. While competing text-mining tools took more than 11 hours, SparkText mined the dataset in approximately 6 minutes. Conclusions: This study demonstrates the potential for mining large-scale scientific articles on a Big Data infrastructure, with real-time update from new articles published daily. SparkText can be extended to other areas of biomedical research. PMID:27685652

  8. SparkText: Biomedical Text Mining on Big Data Framework.

    Directory of Open Access Journals (Sweden)

    Zhan Ye

    Full Text Available Many new biomedical research articles are published every day, accumulating rich information, such as genetic variants, genes, diseases, and treatments. Rapid yet accurate text mining on large-scale scientific literature can discover novel knowledge to better understand human diseases and to improve the quality of disease diagnosis, prevention, and treatment. In this study, we designed and developed an efficient text mining framework called SparkText on a Big Data infrastructure, which is composed of Apache Spark data streaming and machine learning methods, combined with a Cassandra NoSQL database. To demonstrate its performance for classifying cancer types, we extracted information (e.g., breast, prostate, and lung cancers) from tens of thousands of articles downloaded from PubMed, and then employed Naïve Bayes, Support Vector Machine (SVM), and Logistic Regression to build prediction models to mine the articles. The accuracy of predicting a cancer type by SVM using the 29,437 full-text articles was 93.81%. While competing text-mining tools took more than 11 hours, SparkText mined the dataset in approximately 6 minutes. This study demonstrates the potential for mining large-scale scientific articles on a Big Data infrastructure, with real-time update from new articles published daily. SparkText can be extended to other areas of biomedical research.
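
    The classification step described above (text features feeding Naïve Bayes, SVM, or Logistic Regression) can be sketched on a single machine. The toy below uses scikit-learn rather than the paper's Spark/Cassandra stack, and invented one-line "abstracts" instead of the 29,437 PubMed articles, purely to make the idea concrete.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy stand-ins for article text; the real study used full-text articles.
texts = [
    "tumor detected in breast tissue, mammography follow-up",
    "prostate specific antigen elevated in screening",
    "lung nodule on chest ct scan of smoker",
    "breast carcinoma treated with lumpectomy",
    "prostate biopsy confirmed adenocarcinoma",
    "lung cancer staging with pet imaging",
]
labels = ["breast", "prostate", "lung", "breast", "prostate", "lung"]

# TF-IDF features plus a linear SVM, the best performer in the paper.
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(texts, labels)
print(model.predict(["ct scan shows a suspicious lung mass"]))  # expect ['lung']
```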

  9. Application of autoregressive moving average model in reactor noise analysis

    International Nuclear Information System (INIS)

    Tran Dinh Tri

    1993-01-01

    The application of autoregressive (AR) models to the estimation of noise measurements has achieved many successes in reactor noise analysis in the last ten years. The physical processes that take place in a nuclear reactor, however, are described by an autoregressive moving average (ARMA) model rather than by an AR model. Consequently, more accurate results could be obtained by applying the ARMA model instead of the AR model to reactor noise analysis. In this paper the system of the generalised Yule-Walker equations is derived from the equation of an ARMA model, and then a method for its solution is given. Numerical results show the applications of the method proposed. (author)
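
    For orientation, the textbook relations behind the "generalised Yule-Walker equations" mentioned above can be written as follows (a standard form, not reproduced from the paper itself):

```latex
% ARMA(p,q) model for the noise signal x_t driven by white noise e_t:
x_t = \sum_{i=1}^{p} \phi_i \, x_{t-i} + e_t + \sum_{j=1}^{q} \theta_j \, e_{t-j}
% For lags k > q the moving-average terms no longer contribute to the
% autocovariance \gamma(k), which yields the generalised Yule-Walker system
\gamma(k) = \sum_{i=1}^{p} \phi_i \, \gamma(k-i), \qquad k = q+1, \dots, q+p,
% a linear system solvable for the AR coefficients \phi_i from
% autocovariances estimated from the measured reactor noise.
```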

  10. Artificial Intelligence and its Reasonable Application Scenario to Reactor Operation

    International Nuclear Information System (INIS)

    Im, Ki Hong; Suh, Yong-Suk; Park, Cheol; Lim, In-Cheol

    2017-01-01

    This paper presents brief but reasonable scenarios for applying AI or machine learning technologies to research reactors from various perspectives. Two less safety-critical scenarios for applying AI to reactor operation are introduced in this study. However, the AI assistant will not only be an assistant but will also be an operator in the future. What is required is big operation data which can represent all the cases requiring operational decisions, including normal operation and accident data as well, and enough time to train and tune the AI system with this data. We predict that AI studies in this area can begin with mild and safe applications. But in the near future, this technology could be used to handle or automate more severe operations.

  11. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  12. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.
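
    Propensity score analysis, mentioned above, is easy to demonstrate on synthetic data. The sketch below (our own toy, with an assumed confounder and a true treatment effect of 1.0, not an analysis from the paper) fits a logistic propensity model and does greedy 1:1 matching; the matched estimate recovers the effect that the naive comparison overstates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2_000

# Synthetic observational data: a confounder (age) drives both treatment
# assignment and outcome, biasing the naive treated-vs-control comparison.
age = rng.normal(60, 10, n)
treated = rng.random(n) < 1 / (1 + np.exp(-(age - 60) / 5))
outcome = 0.05 * age + 1.0 * treated + rng.normal(0, 1, n)  # true effect = 1.0

# 1. Propensity score: estimated P(treatment | confounders).
ps = LogisticRegression().fit(age.reshape(-1, 1), treated) \
                         .predict_proba(age.reshape(-1, 1))[:, 1]

# 2. Greedy 1:1 nearest-neighbour matching on the propensity score.
t_idx, c_idx = np.where(treated)[0], np.where(~treated)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

naive = outcome[t_idx].mean() - outcome[c_idx].mean()
matched = (outcome[t_idx] - outcome[matches]).mean()
print(f"naive: {naive:.2f}, matched: {matched:.2f}")  # matched is close to 1.0
```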

  13. Simulation of the fuzzy-smith control system for the high temperature gas cooled reactor

    International Nuclear Information System (INIS)

    Li Deheng; Xu Xiaolin; Zheng Jie; Guo Renjun; Zhang Guifen

    1997-01-01

    A Fuzzy-Smith pre-estimate controller (a Smith predictor combined with fuzzy control) is developed to solve the control problem of systems with large time delays, accompanying the development of the mathematical model of the 10 MW high temperature gas cooled test reactor (HTR-10) and the design of its control system. The simulation results show that the Fuzzy-Smith controller has the advantages of both fuzzy control and the Smith predictor: it compensates better for the delay and adapts better to parameter changes of the controlled object. It is therefore applicable to the design of the control system for the high temperature gas cooled reactor
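
    To make the Smith-predictor half of the scheme concrete, here is a minimal discrete-time sketch (our own toy loop with a PI controller and a first-order plant; the gains and plant numbers are assumed, not HTR-10 values or the paper's fuzzy controller): the internal model's undelayed output replaces the lag in the feedback path, with the model-plant mismatch added back from the delayed measurement.

```python
# Minimal Smith-predictor loop: first-order plant with a pure transport
# delay, PI control acting on the delay-compensated feedback signal.
dt = 0.1
a, b = 0.95, 0.05            # plant/model update: x <- a*x + b*u
d = 20                       # transport delay in steps (2 s)
kp, ki = 1.5, 0.15           # hand-tuned PI gains

x = xm = integ = u = 0.0     # plant state, model state, integrator, input
buf_plant = [0.0] * d        # delay line on the plant output
buf_model = [0.0] * d        # delay line on the model output
setpoint = 1.0

for k in range(600):         # 60 s of simulated time
    x = a * x + b * u        # plant responds to the previous input
    xm = a * xm + b * u      # internal model tracks it
    buf_plant.append(x); y_meas = buf_plant.pop(0)    # delayed measurement
    buf_model.append(xm); ym_del = buf_model.pop(0)   # delayed model output

    # Smith predictor: undelayed model output + measured model-plant mismatch
    feedback = xm + (y_meas - ym_del)
    err = setpoint - feedback
    integ += err * dt
    u = kp * err + ki * integ

print(round(x, 3))           # plant output settles near the setpoint
```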

  14. Measurement and analysis of pressure tube elongation in the Douglas Point reactor

    International Nuclear Information System (INIS)

    Causey, A.R.; MacEwan, S.R.; Jamieson, H.C.; Mitchell, A.B.

    1980-02-01

    Elongations of zirconium alloy pressure tubes in CANDU reactors, which occur as a result of neutron-irradiation-induced creep and growth, have been measured over the past 6 years, and the consequences of these elongations have recently been analysed. Elongation rates, previously deduced from extensive measurements of elongations of cold-worked Zircaloy-2 pressure tubes in the Pickering reactors, have been modified to apply to the pressure tubes in the Douglas Point (DP) reactor by taking into account measured differences in texture and dislocation density. Using these elongation rates, and structural data unique to the DP reactor, the analysis predicts elongation behaviour which is in good agreement with pressure tube elongations measured during the ten years of reactor operation. (Auth)

  15. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.

  16. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    Full Text Available The term Big Data has been coined to refer to the extensive volume of data that cannot be managed by traditional data handling methods or techniques. The field of Big Data plays an indispensable role in various domains, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, unseen relationships and other important information that can be utilized to arrive at better decisions. There has been a perpetually expanding demand for big data because of its rapid growth and since it covers different areas of application. The Apache Hadoop open-source technology, written in Java and running on the Linux operating system, was used. The primary contribution of this research is to present an effective and free solution for big data applications in a distributed environment, with its advantages, and to show its ease of use. Looking ahead, there appears to be a need for an analytical review of new developments in big data technology. Healthcare is one of the greatest concerns of the world. Big data in healthcare refers to electronic health data sets that relate to patient healthcare and well-being. Data in the healthcare area is growing beyond the managing capacity of healthcare organizations and is expected to increase significantly in the coming years.
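
    The canonical first Hadoop program is a streaming word count; the script below (the classic textbook example written in Python for Hadoop Streaming, not code from the article) shows the map and reduce halves that Hadoop runs over a distributed file system, relying on Hadoop to sort mapper output by key before the reducer sees it.

```python
#!/usr/bin/env python3
"""Word count for Hadoop Streaming: run with 'map' or 'reduce' as argv[1].
Hadoop sorts the mapper output by key before it reaches the reducer."""
import sys

def mapper():
    for line in sys.stdin:
        for word in line.split():
            print(f"{word.lower()}\t1")

def reducer():
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rstrip("\n").rsplit("\t", 1)
        if word != current and current is not None:
            print(f"{current}\t{count}")
            count = 0
        current = word
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```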

  17. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  18. Limits on cosmological variation of strong interaction and quark masses from big bang nucleosynthesis, cosmic, laboratory and Oklo data

    International Nuclear Information System (INIS)

    Flambaum, V.V.; Shuryak, E.V.

    2002-01-01

    Recent data on the cosmological variation of the electromagnetic fine structure constant from distant quasar (QSO) absorption spectra have inspired a more general discussion of the possible variation of other constants. We discuss the variation of the strong interaction scale and of the quark masses. We derive limits on their relative change from (i) primordial big bang nucleosynthesis, (ii) the Oklo natural nuclear reactor, (iii) quasar absorption spectra, and (iv) laboratory measurements of hyperfine intervals

  19. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  20. REACTOR GROUT THERMAL PROPERTIES

    Energy Technology Data Exchange (ETDEWEB)

    Steimke, J.; Qureshi, Z.; Restivo, M.; Guerrero, H.

    2011-01-28

    Savannah River Site has five dormant nuclear production reactors. Long term disposition will require filling some reactor buildings with grout up to ground level. Portland cement based grout will be used to fill the buildings with the exception of some reactor tanks. Some reactor tanks contain significant quantities of aluminum which could react with Portland cement based grout to form hydrogen. Hydrogen production is a safety concern and gas generation could also compromise the structural integrity of the grout pour. Therefore, it was necessary to develop a non-Portland cement grout to fill reactors that contain significant quantities of aluminum. Grouts generate heat when they set, so the potential exists for large temperature increases in a large pour, which could compromise the integrity of the pour. The primary purpose of the testing reported here was to measure heat of hydration, specific heat, thermal conductivity and density of various reactor grouts under consideration so that these properties could be used to model transient heat transfer for different pouring strategies. A secondary purpose was to make qualitative judgments of grout pourability and hardened strength. Some reactor grout formulations were unacceptable because they generated too much heat, or started setting too fast, or required too long to harden or were too weak. The formulation called 102H had the best combination of characteristics. It is a Calcium Alumino-Sulfate grout that contains Ciment Fondu (calcium aluminate cement), Plaster of Paris (calcium sulfate hemihydrate), sand, Class F fly ash, boric acid and small quantities of additives. This composition afforded about ten hours of working time. Heat release began at 12 hours and was complete by 24 hours. The adiabatic temperature rise was 54 C which was within specification. The final product was hard and displayed no visible segregation. The density and maximum particle size were within specification.
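
    The adiabatic temperature rise quoted above follows from a simple energy balance: with no heat loss, all of the heat of hydration goes into warming the grout. The numbers below are assumed for illustration (the report gives the measured rise, 54 C, but not these inputs):

```python
# Back-of-envelope check of an adiabatic temperature rise from heat of
# hydration and specific heat (illustrative values, not report data).
q_hydration = 65_000.0   # J per kg of grout, assumed for illustration
cp = 1_200.0             # J/(kg*K), typical of cementitious grout (assumed)

delta_T = q_hydration / cp   # no heat loss: all hydration heat warms the grout
print(f"adiabatic temperature rise ~ {delta_T:.0f} K")  # ~54 K with these inputs
```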

  1. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieve a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  2. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education as well as critically explore the perils of applying a data driven approach to education. Despite the claimed value of the...

  3. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  4. Performance analysis of Brayton cycle system for space power reactor

    International Nuclear Information System (INIS)

    Li Zhi; Yang Xiaoyong; Zhao Gang; Wang Jie; Zhang Zuoyi

    2017-01-01

    The closed Brayton cycle is now a leading candidate power conversion system for High Temperature Gas-cooled Reactors because of its high energy conversion efficiency and compact configuration. Helium is the best working fluid for the system owing to its chemical stability and small neutron absorption cross section. However, helium has a small molar mass and a large specific volume, which would lead to larger pipes and heat exchangers. Moreover, the large compressor enthalpy rise of helium would lead to an unacceptably large number of compressor stages. For space use, it is more important to satisfy limits on the system's volume and mass than the requirement on the system's thermal capacity. A noble-gas binary mixture of helium and xenon is therefore proposed as the working fluid for the space Brayton cycle. This paper develops a mathematical model of the space Brayton cycle system in Fortran, then analyzes the properties of the helium-xenon binary mixture and its effects on the power conversion units of the space power reactor, which should be helpful in understanding and designing such a reactor. The results show that xenon degrades the system's thermodynamic performance: the cycle efficiency and specific power decrease as the xenon mole fraction increases. On the other hand, a proper amount of xenon decreases the enthalpy changes in the turbomachines, which is good for turbomachine design. Another optimization method, specific-power optimization, is also proposed for comparison. (author)
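
    A hedged sketch of the working-fluid effect described above, treating the He-Xe mixture as an ideal monatomic gas (so γ = 5/3 for any composition) and assuming a hypothetical 200 K compressor temperature rise; the real-cycle efficiency penalties from turbomachinery and heat-exchanger losses are not modeled here:

    ```python
    # How xenon dilution changes the He-Xe working fluid: for an ideal
    # monatomic mixture the loss-free cycle efficiency is unchanged, but
    # cp per kg falls, and with it the compressor enthalpy rise.
    R_U = 8.314                          # universal gas constant, J/(mol K)
    M_HE, M_XE = 4.003e-3, 131.29e-3     # molar masses, kg/mol

    def mixture(x_xe, dT_comp=200.0):
        """cp (J/kg K) and compressor enthalpy rise (kJ/kg) for a given
        xenon mole fraction and an assumed compressor temperature rise."""
        M = (1 - x_xe) * M_HE + x_xe * M_XE
        cp = 2.5 * R_U / M               # monatomic ideal gas
        return cp, cp * dT_comp / 1e3

    for x in (0.0, 0.1, 0.2, 0.4):
        cp, dh = mixture(x)
        print(f"x_Xe = {x:.1f}: cp = {cp:7.0f} J/(kg K), "
              f"dh_comp = {dh:6.1f} kJ/kg")
    ```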

  5. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  6. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is, and will increasingly be, used as a tool for everything that happens both online and offline. Of course, online is a real habit; Big Data is found throughout this medium, offering many advantages and being a real help for all consumers. In this paper we discuss Big Data as a plus in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit brought by this paper is presented in the cloud section.

  7. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  8. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  9. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  10. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is subject of many publications. Some pay attention to the many possibilities of big data, others warn us for their consequences. This special

  11. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  12. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  13. A design study of reactor core optimization for direct nuclear heat-to-electricity conversion in a space power reactor

    Energy Technology Data Exchange (ETDEWEB)

    Yoshikawa, Hidekazu; Takahashi, Makoto; Shimoda, Hiroshi; Takeoka, Satoshi [Kyoto Univ. (Japan); Nakagawa, Masayuki; Kugo, Teruhiko

    1998-01-01

    To propose a new design concept for a nuclear reactor used in space, research has been conducted on the conceptual design of a new nuclear reactor on the basis of the following three main concepts: (1) thermionic generation by thermionic fuel elements (TFE), (2) reactivity control by a rotary reflector, and (3) reactor cooling by liquid metal. The outcomes of the research are: (1) A calculation algorithm was derived for obtaining convergent conditions by iterating between the nuclear characteristic calculation and the thermal flow characteristic calculation for the space nuclear reactor. (2) Use of this algorithm and a parametric study established that a space nuclear reactor using 97% enriched uranium nitride as the fuel and lithium as the coolant, with a core radius of about 25 cm, a core height of about 50 cm and a generation efficiency of about 7%, can probably be operated continuously for more than ten years at 100 kW solely by reactivity control with the rotary reflector. (3) A new CAD/CAE system was developed to assist design work to comprehensively optimize the core characteristics of the space nuclear reactor. It is composed of the integrated design support system VINDS, using virtual reality, and the distributed system WINDS, which collaboratively supports design work over the Internet. (N.H.)
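
    A minimal sketch of the convergence algorithm in outcome (1): alternate a neutronics solve and a thermal-flow solve until the shared quantities stop changing. Both "solvers" here are hypothetical scalar stand-ins; only the iteration structure mirrors the description:

    ```python
    # Fixed-point coupling of two single-variable stand-in models.
    def neutronics(T_fuel):
        # stand-in: power with a negative fuel-temperature feedback
        return 100.0 * (1.0 - 1.0e-4 * (T_fuel - 900.0))     # kW

    def thermal_flow(power_kw):
        # stand-in: fuel temperature rising with reactor power
        return 700.0 + 2.5 * power_kw                        # K

    T, P = 1000.0, 100.0                  # initial guesses
    for it in range(100):
        P_new = neutronics(T)
        T_new = thermal_flow(P_new)
        if abs(P_new - P) < 1e-6 and abs(T_new - T) < 1e-6:
            break                         # converged: fields consistent
        P, T = P_new, T_new
    print(f"converged after {it} iterations: P = {P:.3f} kW, T = {T:.2f} K")
    ```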

  14. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  15. A large neutral fraction of cosmic hydrogen a billion years after the Big Bang.

    Science.gov (United States)

    Wyithe, J Stuart B; Loeb, Abraham

    2004-02-26

    The fraction of ionized hydrogen left over from the Big Bang provides evidence for the time of formation of the first stars and quasar black holes in the early Universe; such objects provide the high-energy photons necessary to ionize hydrogen. Spectra of the two most distant known quasars show nearly complete absorption of photons with wavelengths shorter than the Lyman alpha transition of neutral hydrogen, indicating that hydrogen in the intergalactic medium (IGM) had not been completely ionized at a redshift of z approximately 6.3, about one billion years after the Big Bang. Here we show that the IGM surrounding these quasars had a neutral hydrogen fraction of tens of per cent before the quasar activity started, much higher than the previous lower limits of approximately 0.1 per cent. Our results, when combined with the recent inference of a large cumulative optical depth to electron scattering after cosmological recombination therefore suggest the presence of a second peak in the mean ionization history of the Universe.

  16. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  17. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: The big bang is replaced by a quantum bounce; the 'horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  18. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  19. A novel concept for CRIEC-driven subcritical research reactors

    International Nuclear Information System (INIS)

    Nieto, M.; Miley, G.H.

    2001-01-01

    A novel scheme is proposed to drive a low-power subcritical fuel assembly by means of a long Cylindrical Radially-convergent Inertial Electrostatic Confinement (CRIEC) device used as a neutron source. The concept is inherently safe in the sense that the fuel assembly remains subcritical at all times. Previous work has been done on the possible implementation of the CRIEC as a subcritical assembly driver for power reactors. However, it has been found that the present technology and stage of development of IEC-based neutron sources cannot meet the neutron flux requirements to drive a system as big as a power reactor. Nevertheless, smaller systems, such as research and training reactors, could be successfully driven with levels of neutron flux that seem more reasonable to achieve in the near future with IEC devices. The need for custom-made, expensive nuclear fission fuel, as in the case of the TRIGA reactors, is eliminated, and the CRIEC presents substantial advantages with respect to accelerator-driven subcritical reactors in terms of simplicity and cost. In the present paper, a conceptual design for a research/training CRIEC-driven subcritical assembly is presented, emphasizing the description, principle of operation and performance of the CRIEC neutron source, highlighting its advantages and discussing some key issues that require study for the implementation of this concept. (author)

  20. Idaho National Laboratory Ten-Year Site Plan Project Description Document

    Energy Technology Data Exchange (ETDEWEB)

    Not Listed

    2012-03-01

    This document describes the currently active and proposed infrastructure projects listed in Appendix B of the Idaho National Laboratory 2013-2022 Ten Year Site Plan (DOE/ID-11449). It was produced in accordance with Contract Data Requirements List I.06. The projects delineated in this document support infrastructure needs at INL's Research and Education Campus, Materials and Fuels Complex, Advanced Test Reactor Complex and the greater site-wide area. The projects provide critical infrastructure needed to meet current and future INL operational and research needs. Execution of these projects will restore, rebuild, and revitalize INL's physical infrastructure; enhance program execution; and make a significant contribution toward reducing complex-wide deferred maintenance.

  1. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  2. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than ten years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of the neuroticism factor. They found associations with empathy in the domains of openness, agreeableness, conscientiousness and extraversion. In our data, women likewise score significantly higher on both empathy and the Big Five...

  3. Importance of tracks on habitat use characterization of Medium and Big mammals in Los Mangos Forest (Puerto Lopez, Meta, Colombia)

    International Nuclear Information System (INIS)

    Guzman Lenis, Angelica; Camargo Sanabria, Angela

    2004-01-01

    Tracks and signs are very useful for detecting medium and big mammals, which usually stay out of sight. They are helpful tools in field research, provide detailed information on the identity and activities of an animal in a place, and can give us indications of habitat use (Aranda, 1981a; Navarro and Munoz, 2000; Villalba and Yanosky, 2000). In this paper we characterize the habitat use of medium and big land mammals in Los Mangos Forest. We used an observation and track-trap transect, and a modification of the Habitat Suitability Index (HSI) to evaluate habitat suitability. We detected six burrows, four footprints and five Seje palm (Oenocarpus bataua) feeding places, in addition to ten tracks recorded over the other days of field work. Using tracks and bitten fruits, we recognized ten species of mammals belonging to five orders. The HSI calculated was 7.30 in the inner forest, indicating that the habitat is appropriate for animals that use burrows. Resources such as food (insects, fruits and prey), refuge, water and resting places converge to create a favorable environment for the immigration and residence of insectivorous, frugivorous and carnivorous mammals. The fertile plain forest is an important habitat in this area because of the species present there; it offers quality resources to the animal species.

  4. Gas reactor international cooperative program. Interim report: assessment of gas-cooled reactor economics

    Energy Technology Data Exchange (ETDEWEB)

    1979-08-01

    A computer analysis of domestic economic incentive is presented. Included are the sample computer data set for ten combinations of reprocessing and reactor assumptions; basic data set and computer output; higher uranium availability computer output; 50 percent higher GCR fabrication cost computer output; 50 percent higher GCR reprocessing cost computer output; year 1990 and year 2000 GCR introduction scenario computer outputs; 75 percent perceived capacity factor for PBR computer output; and capital cost of GCRs 1.2 times that of LWRs.

  5. Clinical experience with TENS and TENS combined with nitrous oxide-oxygen. Report of 371 patients.

    OpenAIRE

    Quarnstrom, F. C.; Milgrom, P.

    1989-01-01

    Transcutaneous electrical nerve stimulation (TENS) alone or TENS combined with nitrous oxide-oxygen (N2O) was administered for restorative dentistry without local anesthesia to 371 adult patients. A total of 55% of TENS alone and 84% of TENS/N2O visits were rated successful. A total of 53% of TENS alone and 82% of TENS/N2O patients reported slight or no pain. In multivariable analyses, pain reports were related to the anesthesia technique and patient fear and unrelated to sex, race, age, toot...

  6. Ten years of KRB Gundremmingen demonstration power station

    International Nuclear Information System (INIS)

    Facius, H. von; Ettemeyer, R.

    1976-01-01

    In August 1976 the first large nuclear power station in the Federal Republic, the KRB Gundremmingen plant with a net power of 237 MWe, completed ten years of operation. The construction of KRB as a demonstration plant was a major step forward on the way to the economic utilization of nuclear power by German utilities. Design and operation of the plant have decisively influenced the further development of light water reactor technology in the Federal Republic. Unlike the Kahl Experimental Nuclear Power Station (VAK), which was a test facility designed to generate experience and to train personnel, the decision to build KRB was from the outset conditional upon the fulfillment of economic criteria. Here are some of the aspects in which KRB has greatly influenced the development of nuclear power station technology: first application of internal steam-water separation instead of a steam drum, with a water content of the steam of less than 1%; construction of a reactor building with all the necessary safety factors; solution of the corrosion and erosion problems linked with the use of a saturated steam turbine; special measures taken to prevent the turbine from speeding up due to post-evaporation effects after shutdown. Detailed comments are devoted to the subjects of availability, causes of failure and repair work. (orig.) [de

  7. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Science.gov (United States)

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins containing the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved regions covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting the similarity of their folds. We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus raises a strong possibility of classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.
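
    For readers unfamiliar with dissociation constants, a simple 1:1 (Langmuir) binding isotherm shows what a Kd of 2-4 µM implies for domain occupancy as free Ca²+ varies; any cooperativity is ignored in this sketch:

    ```python
    # Fraction of a Big domain occupied vs. free Ca2+ for the reported
    # Kd range, assuming simple single-site 1:1 binding.
    def fraction_bound(ca_um, kd_um):
        return ca_um / (kd_um + ca_um)

    for ca in (0.5, 2.0, 4.0, 20.0, 100.0):   # free Ca2+, micromolar
        f2, f4 = fraction_bound(ca, 2.0), fraction_bound(ca, 4.0)
        print(f"[Ca2+] = {ca:6.1f} uM: bound {f2:.2f} (Kd=2) / {f4:.2f} (Kd=4)")
    ```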

  8. International Conference on Physics and Technology of Reactors and Applications

    International Nuclear Information System (INIS)

    2007-01-01

    The first international conference on physics and technology of reactors and applications (PHYTRA 1), which took place in Marrakech (Morocco) from 14 to 16 March 2007, was designed to bring together scientists, teachers and students from universities, research centres, industry and other institutions to exchange knowledge and to discuss ideas and future issues. The programme of the PHYTRA 1 conference covers a wide variety of topics; the conference was organised into three plenary sessions, ten oral technical sessions and two poster sessions. The plenary sessions cover the following topics: the prospects of nuclear energy; the situation of nuclear sciences and energy in Morocco and Africa; and new developments in reactor physics and reactor design [fr

  9. Scaling of silent electrical discharge reactors for hazardous organics destruction

    International Nuclear Information System (INIS)

    Coogan, J.J.; Rosocha, L.A.; Brower, M.J.; Kang, M.; Schmidt, C.A.

    1993-01-01

    Silent electrical discharges are used to produce highly reactive free radicals that destroy hazardous compounds entrained in gaseous effluents at ambient gas temperatures and pressures. We have carried out destruction experiments at Los Alamos on a range of volatile organic compounds (VOCs), including trichloroethylene (TCE), carbon tetrachloride, perchloroethylene (PCE), and chlorofluorocarbons (CFCs). We have measured a "nine-factor", the amount of energy required to reduce the VOC concentration by a factor of ten. For practical reactor power densities, the "nine-factor" can be used to predict the destruction and removal efficiency (DRE) in terms of gas flow rate and the number of reactor modules, as the sketch below illustrates. This report proposes a modular, stackable architecture for scaling up the reactor throughput
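
    A sketch of that DRE prediction. It assumes the usual reading of a nine-factor: the specific energy per unit gas volume that reduces the concentration tenfold, so the surviving fraction is 10^(-E/β). The power, flow and β values are placeholders, not Los Alamos measurements:

    ```python
    # Predicting DRE from a measured nine-factor. With N modules of power
    # P (W) treating a flow Q (L/s), the deposited energy density is
    # E = N * P / Q (J/L) and the surviving fraction is 10**(-E/beta).
    def dre(power_w, flow_lps, n_modules, nine_factor_j_per_l):
        e = n_modules * power_w / flow_lps        # J per litre of gas
        return 1.0 - 10.0 ** (-e / nine_factor_j_per_l)

    beta = 50.0                                   # J/L per decade (placeholder)
    for n in (1, 2, 3, 4):
        print(f"{n} module(s): DRE = {dre(500.0, 5.0, n, beta) * 100:.4f} %")
    ```

    With these placeholder numbers each added module contributes two more "nines" of destruction, which is the sense in which the architecture stacks.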

  10. Evaluating usability of the Halden Reactor Large Screen Display. Is the Information Rich Design concept suitable for real-world installations?

    International Nuclear Information System (INIS)

    Braseth, Alf Ove

    2013-01-01

    Large Screen Displays (LSDs) are beginning to supplement desktop displays in modern control rooms, having the potential to display the big picture of complex processes. Information Rich Design (IRD) is an LSD concept used in many real-life installations in the petroleum domain and, more recently, in nuclear research applications. The objectives of IRD are to provide the big picture and avoid keyhole-related problems while supporting fast visual perception of larger data sets. Two LSDs based on the IRD concept have been developed for large-scale nuclear simulators for research purposes; they have, however, suffered from an unsatisfying user experience. The new Halden Reactor LSD, used to monitor a nuclear research reactor, was designed according to recently proposed Design Principles, compiled in this paper, to mitigate previously experienced problems. This paper evaluates the usability of the Halden Reactor LSD, comparing usability data with the replaced analogue panel and with data for an older IRD large screen display. The results suggest that the IRD concept is suitable for use in real-life applications from a user experience point of view, and that the recently proposed Design Principles have had a positive effect on usability. (author)

  11. Evaluation of the revised electrolytic reduction reactor from a remote operability aspect

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyo Jik; Yoon, K. H.; Kim, K. H. (and others)

    2008-01-15

    This report presents an evaluation of the remote operability of the revised electrolytic reduction (ER) reactor installed in the ACPF at KAERI. All operations have to be implemented in a fully remote manner since the ACPF is a hot cell for handling highly radioactive materials such as spent nuclear fuels. In particular, the ER process is a key process of the Advanced spent fuel Conditioning Process (ACP), and it requires a lot of auxiliary equipment. Moreover, because the ER equipment is large and complicated, the ACPF is not big enough, and one common rail is shared by a bridge-transported servo manipulator, an in-cell crane and a gate crane, remote handling of the ER reactor is difficult. For easier understanding, a short overview of the ER process, the remote handling equipment, the structural configuration of the ACPF and detailed drawings of the ER equipment are presented. Based on four months of remote operational tests, detailed operational procedures are presented along with pictures. The remote handling equipment and tools required in each operation are addressed in detail. Also, the procedure for implementing each remote operation and its task difficulty are evaluated from a remote operability aspect. All the remote tasks are classified according to whether they can be performed remotely or not. Finally, partial improvements and ideas to solve the identified problems are presented. This report will assist in modifying or scaling up the ER reactor.

  12. Semantic Web Technologies and Big Data Infrastructures: SPARQL Federated Querying of Heterogeneous Big Data Stores

    OpenAIRE

    Konstantopoulos, Stasinos; Charalambidis, Angelos; Mouchakis, Giannis; Troumpoukis, Antonis; Jakobitsch, Jürgen; Karkaletsis, Vangelis

    2016-01-01

    The ability to cross-link large scale data with each other and with structured Semantic Web data, and the ability to uniformly process Semantic Web and other data adds value to both the Semantic Web and to the Big Data community. This paper presents work in progress towards integrating Big Data infrastructures with Semantic Web technologies, allowing for the cross-linking and uniform retrieval of data stored in both Big Data infrastructures and Semantic Web data. The technical challenges invo...

  13. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  14. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  15. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
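
    For intuition, a much-simplified sort-based inequality join on a single predicate; the actual IEJoin algorithm handles pairs of inequality predicates with permutation arrays and a bit array, which this sketch does not reproduce:

    ```python
    # All pairs (x, y) with x < y. Sorting one side lets a binary search
    # replace the naive nested-loop comparison of every pair.
    from bisect import bisect_left

    def ineq_join(xs, ys):
        """Pairs (x, y) with x < y; comparisons drop from O(n*m) to
        O((n + m) log n), though the output itself can still be large."""
        xs_sorted = sorted(xs)
        out = []
        for y in ys:
            k = bisect_left(xs_sorted, y)     # count of x strictly below y
            out.extend((x, y) for x in xs_sorted[:k])
        return out

    print(ineq_join([3, 1, 7], [2, 8]))  # -> [(1, 2), (1, 8), (3, 8), (7, 8)]
    ```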

  16. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  17. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  18. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  19. Intelligent Test Mechanism Design of Worn Big Gear

    Directory of Open Access Journals (Sweden)

    Hong-Yu LIU

    2014-10-01

    Full Text Available With the continuous development of the national economy, big gears are widely applied in the metallurgy and mining domains, where they play an important role. In practical production, big-gear abrasion and breakage often occur, affecting normal production and causing unnecessary economic loss. An intelligent test method for worn big gears is put forward, aimed mainly at the constraints of high production cost, long production cycle and labour-intensive manual repair welding. The measurement equations of the involute spur gear were transformed from their original polar coordinate form into rectangular coordinate form. The measurement principle for big-gear abrasion is introduced, a detection principle diagram is given, and the method of realizing the detection route is described. The OADM12 laser sensor was selected, and detection of the big-gear abrasion area is realized by the detection mechanism. Measured data for the unworn and worn gears are fed into a calculation program written in Visual Basic, from which the big-gear abrasion quantity can be obtained. This provides a feasible method for intelligent testing and intelligent repair welding of worn big gears.
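
    The coordinate transformation mentioned above is presumably the standard one for an involute profile; a sketch with an example base-circle radius (the paper's actual gear parameters are not given):

    ```python
    # Involute of a circle of radius rb. Polar form: r = rb / cos(a),
    # theta = tan(a) - a (the involute function). With roll angle
    # t = tan(a), the equivalent rectangular parametric form is below.
    import math

    def involute_point(rb, t):
        """Rectangular coordinates of the involute at roll angle t (rad)."""
        x = rb * (math.cos(t) + t * math.sin(t))
        y = rb * (math.sin(t) - t * math.cos(t))
        return x, y

    rb = 100.0                       # base-circle radius, mm (example value)
    for t in (0.0, 0.1, 0.2, 0.3):
        x, y = involute_point(rb, t)
        r = math.hypot(x, y)         # check: r = rb * sqrt(1 + t**2)
        print(f"t = {t:.1f}: x = {x:7.2f}, y = {y:6.2f}, r = {r:7.2f} mm")
    ```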

  20. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. The technological trends drive Big Data: business processes are more and more executed electronically, consumers produce more and more data themselves - e.g. in social networks - and finally ever increasing digitalization. Currently, several new trends towards new data sources and innovative data analysis appear in medicine and healthcare. From the research perspective, omics-research is one clear Big Data topic. In practice, the electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in the information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare is lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  1. In-service inspections of V-230 reactor

    International Nuclear Information System (INIS)

    Prepechal, J.

    1984-01-01

    It is stated that, despite certain constraints, the configuration of the WWER-440 allows in-service inspections to be made on a fully satisfactory scale. Three factors are discussed whose existence is necessary for the implementation of in-service inspections. The program defining the scope of inspections is satisfactory with regard to the safety and reliability of reactor operation. Its further development must result in reducing time consumption and the radiation burden on personnel. Regulations for the implementation and evaluation of inspections represent the weakest link in the system of in-service inspections. At present, various organizations are dealing with this problem within international cooperation. Equipment for in-service inspections of WWER-440 reactors is relatively good. The most important knowledge gained from the ten pre-service and in-service inspections of reactors of this type made so far is summed up. (Z.M.)

  2. Fuel cycles and advanced core designs for the Gas-Cooled Fast Breeder Reactor

    International Nuclear Information System (INIS)

    Simon, R.H.; Hamilton, C.J.; Hunter, R.S.

    1982-01-01

    Studies indicate that a 1200 MW(e) Gas-Cooled Fast Breeder Reactor could achieve compound system doubling times of under ten years when using advanced oxide or carbide fuels. In addition, when thorium is used in the breeding blankets, enough U-233 can be generated in each GCFR to supply several advanced converter reactors with fissionable material and this symbiotic relationship could provide energy for the world for centuries. (author)
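
    A back-of-envelope sketch of how a compound system doubling time under ten years arises; the inventory and gain numbers are placeholders, and the ln 2 factor assumes continuous reinvestment of bred fuel in new reactors:

    ```python
    # Simple doubling time = fissile inventory / net fissile gain per year.
    # With bred fuel continuously reinvested, the fleet grows exponentially
    # and the compound system doubling time is ln(2) times the simple one.
    import math

    inventory_kg = 3500.0        # fissile inventory per reactor (placeholder)
    net_gain_kg_per_yr = 280.0   # net fissile production per year (placeholder)

    t_simple = inventory_kg / net_gain_kg_per_yr
    t_compound = math.log(2.0) * t_simple
    print(f"simple doubling time   = {t_simple:.1f} y")    # 12.5 y
    print(f"compound doubling time = {t_compound:.1f} y")  # ~8.7 y, under ten
    ```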

  3. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  4. Making big sense from big data in toxicology by read-across.

    Science.gov (United States)

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among others.

  5. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  6. Neutron streaming evaluation for the DREAM fusion power reactor

    International Nuclear Information System (INIS)

    Seki, Yasushi; Nishio, Satoshi; Ueda, Shuzo; Kurihara, Ryoichi

    2000-01-01

    Aiming at a high degree of safety and a benign environmental effect, we have proposed a tokamak fusion reactor concept called DREAM, which stands for DRastically EAsy Maintenance Reactor. The blanket structure of the reactor is made from very low activation SiC/SiC composites and is cooled by non-reactive helium gas. A high net thermal efficiency of about 50% is realized with 900 C helium gas, and high plant availability is possible with a simple maintenance scheme. In the DREAM reactor, neutron streaming is a big problem because cooling pipes with diameters larger than 80 cm are used for blanket heat removal. Neutron streaming through the cooling pipes could cause hot spots in the superconducting magnets adjacent to the pipes, shortening the magnet lifetime or increasing the cryogenic cooling requirement. Neutron streaming could also activate components further away from the fusion plasma, such as the gas turbine. The effect of neutron streaming through the helium cooling pipes was evaluated for the two cooling pipe extraction schemes. The result of a preliminary calculation indicates that gas turbine activation prohibits personnel access in the case of inboard pipe extraction, while, with additional shielding measures, limited contact maintenance is possible in the case of outboard extraction. (author)

  7. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  8. Development of Advanced Monitoring System with Reactor Neutrino Detection Technique for Verification of Reactor Operations

    International Nuclear Information System (INIS)

    Furuta, H.; Tadokoro, H.; Imura, A.; Furuta, Y.; Suekane, F.

    2010-01-01

    Recently, the technique of Gadolinium-loaded liquid scintillator (Gd-LS), developed for reactor neutrino oscillation experiments, has attracted attention as a monitor of reactor operation and of ''nuclear Gain (GA)'' for IAEA safeguards. When the thermal operation power is known, it is in principle possible to non-destructively measure the Pu/U ratio in reactor fuel under operation from the reactor neutrino flux. An experimental program led by Lawrence Livermore National Laboratory and Sandia National Laboratories in the USA has already demonstrated the feasibility of reactor monitoring by neutrinos at the San Onofre Nuclear Power Station, and Pu monitoring by neutrino detection is recognized as a candidate novel technology for detecting undeclared reactor operation. However, further R and D studies of detector design and materials are still necessary to realize a compact and mobile detector for practical use. Considering the neutrino interaction cross-section and the compact detector size, the detector must be set at a short distance (a few tens of meters) from the reactor core to accumulate enough statistics for monitoring. In addition, although previous reactor neutrino experiments were performed underground to reduce the cosmic-ray muon background, feasibility of measurement at ground level is required for the monitor, considering the limited access to reactor sites. Therefore, the detector must be designed to strongly suppress external backgrounds, e.g., cosmic-ray muons and fast neutrons, without huge shields at ground level. We constructed a 0.76 ton Gd-LS detector and carried out a reactor neutrino measurement at the experimental fast reactor JOYO in 2007. The neutrino detector was set up 24.3 m away from the reactor core at ground level, and we came to understand well the properties of the main background, cosmic-ray-induced fast neutrons. Based on this experience, we are constructing a new detector for the next experiment. The detector is a Gd
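
    An order-of-magnitude sketch of the expected inverse-beta-decay (IBD) rate for such a detector, using rounded textbook constants (fission energy, IBD cross section per fission, proton density of liquid scintillator) and ignoring detection efficiency and oscillations; none of these numbers come from the experiment itself:

    ```python
    # Rough IBD interaction rate for a small Gd-LS detector near a core.
    import math

    P_TH     = 140e6     # JOYO thermal power, W (nominal, approximate)
    E_FIS    = 3.2e-11   # energy per fission, J (~200 MeV)
    SIGMA    = 5.8e-43   # IBD cross section per fission, cm^2 (approx.)
    L        = 24.3e2    # core-detector distance, cm
    MASS_G   = 0.76e6    # scintillator mass, g
    NP_PER_G = 7e22      # free protons per gram of scintillator (approx.)

    fissions = P_TH / E_FIS                       # fissions per second
    n_p = MASS_G * NP_PER_G                       # target protons
    rate = fissions / (4 * math.pi * L**2) * SIGMA * n_p
    print(f"~{rate * 86400:.0f} IBD interactions/day (before efficiency)")
    ```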

  9. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article covers the analysis of ‘Big Data’, which has been discussed over the last 10 years. The reasons and factors behind the issue are revealed. It is shown that the factors creating the ‘Big Data’ issue have existed for quite a long time and, from time to time, have caused informational barriers. Such barriers were successfully overcome through science and technology. The analysis presented classifies the ‘Big Data’ issue as a form of information barrier. On this view, the issue can be addressed correctly, and it encourages the development of scientific and computational methods.

  10. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  11. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Summary. Objectives: To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  12. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  13. Reactor operations at SAFARI-1

    International Nuclear Information System (INIS)

    Vlok, J.W.H.

    2003-01-01

    A vigorous commercial programme of isotope production and other radiation services has been followed by the SAFARI-1 research reactor over the past ten years - superimposed on the original purpose of the reactor, which was to provide a basic tool for nuclear research, development and education to the country at an institutional level. A combination of the binding nature of the resulting contractual obligations and tighter regulatory control has demanded an equally vigorous programme of upgrading, replacement and renovation of many systems in order to improve the safety and reliability of the reactor. Not least among these changes is the more effective training and deployment of operations personnel that has been necessitated as the operational demands on the reactor evolved from five days per week to twenty-four hours per day, seven days per week, with more than 300 days per year at full power. This paper briefly sketches the operational history of SAFARI-1 and then focuses on the training and structuring currently in place to meet the operational needs. There is a detailed step-by-step look at the operator's career plan and pre-defined milestones. Shift work, especially the shift cycle, has a negative influence on the operator's career path development, especially owing to his unavailability for training; methods utilised to minimise this influence are presented. The increase in responsibilities regarding the operation of the reactor, ancillaries and experimental facilities as the operator progresses in his career is also discussed. (author)

  14. Harnessing the Power of Big Data to Improve Graduate Medical Education: Big Idea or Bust?

    Science.gov (United States)

    Arora, Vineet M

    2018-06-01

    With the advent of electronic medical records (EMRs) fueling the rise of big data, the use of predictive analytics, machine learning, and artificial intelligence is touted as a transformational tool to improve clinical care. While major investments are being made in using big data to transform health care delivery, little effort has been directed toward exploiting big data to improve graduate medical education (GME). Because our current system relies on faculty observations of competence, it is not unreasonable to ask whether big data in the form of clinical EMRs and other novel data sources can answer questions of importance in GME, such as when a resident is ready for independent practice. The timing is ripe for such a transformation. A recent National Academy of Medicine report called for reforms to how GME is delivered and financed. While many agree on the need to ensure that GME meets our nation's health needs, there is little consensus on how to measure the performance of GME in meeting this goal. During a recent workshop at the National Academy of Medicine on GME outcomes and metrics in October 2017, a key theme emerged: big data holds great promise to inform GME performance at individual, institutional, and national levels. In this Invited Commentary, several examples are presented, such as using big data to inform clinical experience and provide clinically meaningful data to trainees, and using novel data sources, including ambient data, to better measure the quality of GME training.

  15. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees on the definition of big data. Some researchers see it as the future of data analysis, while others consider it hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  16. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  17. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches

  18. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Science.gov (United States)

    2013-01-17

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N259; FXRS1265030000-134-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive... significant impact (FONSI) for the environmental assessment (EA) for Big Stone National Wildlife Refuge...

  19. Big domains are novel Ca²+-binding modules: evidence from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Directory of Open Access Journals (Sweden)

    Rajeev Raman

    Full Text Available BACKGROUND: Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to a Big domain, which would provide a novel functional role for the proteins containing the Big fold. PRINCIPAL FINDINGS: We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved regions covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All the four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have a β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting the similarity of their folds. CONCLUSIONS: We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying the proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.

  20. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future are connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions.

  1. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  2. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  3. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    Full Text Available As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) recently become an indispensable part of ‘Big Data’, the collection, storage, transmission and analysis of the big sensor data attract considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, sensor data is modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs is efficiently aggregated to reduce network resource consumption and the sensor data privacy is effectively protected to meet the ever-growing application requirements.
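    As a rough illustration of the pattern this abstract describes - nodes masking readings according to a sink-issued configuration, cluster heads forwarding only sums, the sink stripping the masks - here is a minimal Python sketch. It is a hypothetical stand-in, not the authors' Sca-PBDA protocol; the seed-derived offsets and all names are invented for the example.

        import random

        def _offset(node_id: int, sink_seed: int) -> float:
            # Node-specific mask; reproducible by the sink from its own seed.
            return random.Random(sink_seed * 1_000_003 + node_id).uniform(-100.0, 100.0)

        def node_report(node_id: int, reading: float, sink_seed: int) -> float:
            """A node masks its reading before sending it to the cluster head."""
            return reading + _offset(node_id, sink_seed)

        def cluster_aggregate(masked_readings: list) -> float:
            """Intra-cluster step: the cluster head forwards only the sum."""
            return sum(masked_readings)

        def sink_recover(cluster_sums: list, node_ids: list, sink_seed: int) -> float:
            """Inter-cluster step: add the cluster sums, strip all known masks."""
            return sum(cluster_sums) - sum(_offset(n, sink_seed) for n in node_ids)

        seed = 42
        readings = {nid: 20.0 + nid for nid in range(6)}      # true sensor values
        clusters = [[0, 1, 2], [3, 4, 5]]
        sums = [cluster_aggregate([node_report(n, readings[n], seed) for n in c])
                for c in clusters]
        print(sink_recover(sums, list(readings), seed))       # ~135.0, the true sum

    Intermediate nodes only ever see masked values, yet the sink still recovers the exact aggregate; the trade-off, as in any such scheme, is that the sink must be able to reconstruct every node's mask.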

  4. Evaluation of the integrity of SEP reactor vessels

    International Nuclear Information System (INIS)

    Hoge, K.G.

    1979-12-01

    A documented review is presented of the integrity of the 11 reactor pressure vessels covered in the Systematic Evaluation Program. This review deals primarily with the design specifications and quality assurance programs used in the vessel construction and the status of material surveillance programs, pressure-temperature operating limits, and inservice inspection programs of the applicable plants. Several generic items such as PWR overpressurization protection and BWR nozzle and safe-end cracking also are evaluated. The 11 vessels evaluated include Dresden Units 1 and 2, Big Rock Point, Haddam Neck, Yankee Rowe, Oyster Creek, San Onofre 1, LaCrosse, Ginna, Millstone 1, and Palisades

  5. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it is also accompanied by all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for ...

  6. Epidemiology in wonderland: Big Data and precision medicine.

    Science.gov (United States)

    Saracci, Rodolfo

    2018-03-01

    Big Data and precision medicine, two major contemporary challenges for epidemiology, are critically examined from two different angles. In Part 1, Big Data collected for research purposes (Big research Data) and Big Data used for research although collected for other primary purposes (Big secondary Data) are discussed in the light of the fundamental common requirement of data validity, which prevails over "bigness". Precision medicine is treated by developing the key point that high relative risks are, as a rule, required to make a variable or combination of variables suitable for the prediction of disease occurrence, outcome or response to treatment; the commercial proliferation of allegedly predictive tests of unknown or poor validity is commented upon. Part 2 proposes a "wise epidemiology" approach to: (a) choosing, in a context imprinted by Big Data and precision medicine, epidemiological research projects actually relevant to population health; (b) training epidemiologists; (c) investigating the impact of the influx of Big Data and computerized medicine on clinical practices and the doctor-patient relation; and (d) clarifying whether today "health" may be redefined, as some maintain, in purely technological terms.
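    A back-of-the-envelope illustration (ours, not the paper's) of the key point about relative risks: with a 1% baseline risk, even a marker carrying a relative risk of 3 leaves a marker-positive person with only a 3% absolute chance of disease, which is of little use as an individual prediction.

        def risk_if_positive(baseline_risk: float, relative_risk: float) -> float:
            # Absolute risk among marker-positive people, approximating the
            # baseline risk by the risk in the marker-negative group.
            return baseline_risk * relative_risk

        for rr in (1.5, 3, 10, 50):
            p = risk_if_positive(0.01, rr)   # 1% baseline risk
            print(f"RR = {rr:>4}: positive marker implies a {p:.1%} disease risk")
        # RR = 1.5 -> 1.5%;  RR = 3 -> 3.0%;  RR = 10 -> 10.0%;  RR = 50 -> 50.0%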

  7. The overpressure protection for the chemical reactors: the batch-size approach

    International Nuclear Information System (INIS)

    Dellavedova, M.; Gigante, L.; Lunghi, A.; Pasturenzi, C.; Cardillo, P.; Gerosa, N.P.; Rota, R.

    2008-01-01

    Small and medium enterprises (SMEs) mainly run batch and semi-batch processes, working on job orders. They generally have multi-purpose reactors with an emergency relief system (ERS) already installed. These are normally sized when the reactor is designed, assuming as the worst incidental scenario a single-phase vapour flow generated by a fire developed outside the apparatus. This assumption can lead to a large underestimation of the vent area if the actual flow is two-phase and, moreover, generated by a runaway reaction. ERS sizing is particularly hazardous and complex for small mills, for example fine chemicals and pharmaceutical companies. These factories usually have narrow financial and personnel resources, and they often use fast process turnovers. In many cases a complete safety study or the replacement of the ERS is not possible, as it would lead to unsustainable costs. The batch-size approach focuses on discontinuous process conditions: the aim of this approach is to find the reactor fill level that leads to a single-phase vapour flow if an incident occurs; this condition is considered safe because the ERS installed on the reactor can then protect the plant from explosions.

  8. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  9. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    Full Text Available In the research presented here, some of the most promising Big Data usage domains are connected with the distinguished player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data related research in the major domains of its use - namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the major business ecosystem player types disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players were explained through new Big Data opportunities and threats and by players’ responsive strategies. System dynamics was used to visualize relationships in the provided model.

  10. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  11. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  12. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  13. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  14. BIG DATA IN TAMIL: OPPORTUNITIES, BENEFITS AND CHALLENGES

    OpenAIRE

    R.S. Vignesh Raj; Babak Khazaei; Ashik Ali

    2015-01-01

    This paper gives an overall introduction to big data and tries to introduce Big Data in Tamil. It discusses the potential opportunities, benefits and likely challenges from a very Tamil and Tamil Nadu perspective. The paper also makes an original contribution by proposing ‘big data’ terminology in Tamil. The paper further suggests a few areas to explore using big data in Tamil on the lines of the Tamil Nadu Government ‘vision 2023’. Whilst big data has something to offer everyone, it ...

  15. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Big Data’s Role in Precision Public Health

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts. PMID:29594091

  17. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have a basic knowledge of R.
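    The book itself works in R; as a language-neutral sketch of the map/reduce pattern Hadoop exposes (including to R), here is a minimal Hadoop-Streaming-style word count in Python. Hadoop Streaming runs the two stages as separate processes with a sort in between, which the comment below simulates locally; the file name wc.py is our invention.

        # Local simulation of the Hadoop Streaming pipeline:
        #   cat input.txt | python3 wc.py map | sort | python3 wc.py reduce
        import sys
        from itertools import groupby

        def mapper(lines):
            for line in lines:
                for word in line.split():
                    print(f"{word}\t1")          # emit (word, 1) pairs

        def reducer(lines):
            # Input arrives sorted by key, so identical words are adjacent.
            pairs = (line.rstrip("\n").split("\t") for line in lines)
            for word, group in groupby(pairs, key=lambda kv: kv[0]):
                print(f"{word}\t{sum(int(n) for _, n in group)}")

        if __name__ == "__main__":
            (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)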

  18. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have got their own tribune on the topic. Perspectives and debates are flourishing, while a consensual definition of big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On the one hand, techniques usually presented as specific to big data, such as machine learning techniques, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields, such as artificial intelligence, are often underestimated if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary, evidence-based approach in forensic science and medicine. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  19. Proceedings of first SWCR-KURRI academic seminar on research reactors and related research topics

    International Nuclear Information System (INIS)

    Kimura, Itsuro; Cong, Zhebao

    1986-01-01

    These are the proceedings of an academic seminar on research reactors and related research topics held at the Southwest Centre for Reactor Engineering Research and Design in Chengdu, Sichuan, People's Republic of China, on September 24-26, 1985. Included are the chairmen's addresses and 10 papers presented at the seminar in English. The titles of these papers are: (1) Nuclear Safety and Safeguards, (2) General Review of Thorium Research in Japanese Universities, (3) Comprehensive Utilization and Economic Analysis of the High Flux Engineering Test Reactor, (4) Present States of Applied Health Physics in Japan, (5) Neutron Radiography with Kyoto University Reactor, (6) Topics of Experimental Works with Kyoto University Reactor, (7) Integral Check of Nuclear Data for Reactor Structural Materials, (8) The Reactor Core, Physical Experiments and the Operation Safety Regulation of the Zero Energy Thermal Reactor for PWR Nuclear Power Plant, (9) HFETR Core Physical Parameters at Power, (10) Physical Consideration for Loads of Operated Ten Cycles in HFETR. (author)

  20. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  1. HIPs at Ten

    Science.gov (United States)

    Kuh, George; O'Donnell, Ken; Schneider, Carol Geary

    2017-01-01

    2017 is the anniversary of the introduction of what are now commonly known as high-impact practices (HIPs). Many of the specific activities pursued under the HIPs acronym have been around in some form for decades, such as study abroad, internships, and student-faculty research. It was about ten years ago that, after conferring HIPs at Ten with…

  2. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patients’ care processes and of single patients’ behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. PMID:25910540

  3. Recycling : The advanced fuel cycle for existing reactors

    International Nuclear Information System (INIS)

    Lamorlette, Guy

    1994-01-01

    In 1993, the installed capacity of the world's 427 nuclear power plants was over 335 GWe. Additional plants representing 67 GWe were under construction or on order. Taking construction schedules into consideration, their start-up will stretch out over a period of ten years. Nuclear power will therefore increase by 20% at best in ten years, translating into a relatively modest 2% average annual growth rate. Of these units, about 80% are light water reactors, whether PWR, BWR, or VVER. All of these reactors use enriched uranium oxide fuel clad with zirconium alloy. From a fuel perspective, these reactors form a fairly homogeneous group. During reactor residence, energy is supplied by fission of three-fourths of the initial uranium-235, but also by plutonium fission, plutonium being formed in the fuel as soon as it is irradiated. The plutonium supplies 40% of the generated power. When the fuel is unloaded, it consists of four elements: fission products and structural materials, such as cladding and end-fittings, which are the real waste, and residual plutonium and uranium, which are energy materials that can be recycled in accordance with French legislation applicable to both non-nuclear and nuclear industries: 'the purpose of this law is to... make use of waste by reusing, recycling or otherwise obtaining reusable material or energy from.'. The nuclear power industry has entered a phase in which most of its capital-intensive projects are behind it. Now, it must devote itself to ensuring the competitiveness of nuclear energy compared to other sources of power generation, while protecting the environment and respecting safety regulations. Significant gains have been achieved by improving fuel performance: optimization of fuel design, utilization of less neutron-absorbent materials, and increases in fuel burn-up have made it possible to increase the amount of energy derived from one kilogram of natural uranium by more than 50%. Recycling of the fuel in light water reactor

  4. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    Science.gov (United States)

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than lifetime prevalence. This study was designed to assess the usability of the DSM-IV-criteria-based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and to adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against the clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of the DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of the current DSM-5 criteria in a clinical setting.

  5. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. In the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be delivered to traffic information users.

  6. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. In the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be delivered to traffic information users

  7. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    The external conditions and competitive environment of agriculture are changing, and this will necessitate a development in the direction of "big business", in which farms become even larger, more industrialised and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  8. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.
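    For orientation, the effective dynamics behind the quoted bounce is usually summarised by a quantum-corrected Friedmann equation of the following form (a standard loop-quantum-cosmology result, not spelled out in the record itself; the critical density is often quoted as roughly 0.41 times the Planck density, though its value is model-dependent):

        % Effective LQC Friedmann equation: the correction term halts
        % expansion/contraction when the density reaches rho_c.
        \[
          H^{2} = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_{c}}\right),
          \qquad \rho_{c} \sim 0.41\,\rho_{\mathrm{Planck}}.
        \]
        % H = 0 at rho = rho_c, so the classical big-bang singularity is
        % replaced by a bounce at Planck-scale density.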

  9. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  10. The use of fault tree analysis to minimize research reactor downtime

    International Nuclear Information System (INIS)

    Dodd, B.; Wang, C.H.; Anderson, T.V.

    1984-01-01

    For many reasons it is often highly desirable to maintain a research reactor in a continuously operable state and, in the event of any failure, to minimize the length of the reactor downtime. In order to keep the length of future downtimes to less than ten days for the sixteen-year-old OSU TRIGA reactor, a fault tree analysis was performed for all of the systems required to keep the reactor operational. As a result of this analysis, it was possible to determine the critical parts and key components. By examining the availability and delivery times for each of these items, it was then possible to make reasoned decisions relating to the advance purchase of spare parts. This paper outlines the above process, along with examples of the fault trees developed and a recent history of the efficacy of this technique. (author)
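    A toy illustration of the method (not the OSU TRIGA tree itself; the tree, probabilities and component names below are invented): evaluate a small fault tree for "reactor down" exactly, then rank basic events by Birnbaum importance, the kind of ranking that supports decisions about which spare parts to purchase in advance.

        from itertools import product

        # Toy tree: the reactor is down if both power feeds fail (AND gate),
        # or the primary pump fails, or instrumentation fails with no spare.
        def top_event(power, backup, pump, instr, no_spare):
            return (power and backup) or pump or (instr and no_spare)

        probs = {"power": 0.05, "backup": 0.10, "pump": 0.02,
                 "instr": 0.08, "no_spare": 0.50}      # illustrative numbers only
        names = list(probs)

        def p_top(fixed=None):
            """Exact (conditional) top-event probability by enumeration."""
            fixed = fixed or {}
            free = [n for n in names if n not in fixed]
            total = 0.0
            for states in product((0, 1), repeat=len(free)):
                s = dict(fixed)
                s.update(zip(free, states))
                if top_event(**s):
                    p = 1.0
                    for n, v in zip(free, states):
                        p *= probs[n] if v else 1.0 - probs[n]
                    total += p
            return total

        print(f"P(reactor down) = {p_top():.4f}")
        for n in names:   # Birnbaum importance: P(top | n failed) - P(top | n ok)
            print(f"{n:>9}: {p_top({n: 1}) - p_top({n: 0}):.4f}")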

  11. Big data processing in the cloud - Challenges and platforms

    Science.gov (United States)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

    Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed: the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and the most difficult to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
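    A compressed sketch of the contrast between the two architectures, on made-up events: Lambda keeps a bulk-recomputed batch view plus a low-latency speed layer over the tail and merges them at query time, while Kappa keeps a single code path that replays one immutable log.

        # Toy event log: (user, amount) purchase events in arrival order.
        log = [("ann", 5), ("bob", 3), ("ann", 7), ("bob", 1), ("ann", 2)]

        def kappa_view(events):
            """Kappa: one stream job handles history and new events alike."""
            totals = {}
            for user, amount in events:
                totals[user] = totals.get(user, 0) + amount
            return totals

        def lambda_view(events, batch_upto):
            """Lambda: periodic batch view + speed layer, merged at query time."""
            batch = kappa_view(events[:batch_upto])    # recomputed rarely, in bulk
            speed = kappa_view(events[batch_upto:])    # incremental, low latency
            return {u: batch.get(u, 0) + speed.get(u, 0) for u in {*batch, *speed}}

        assert lambda_view(log, batch_upto=3) == kappa_view(log)
        print(kappa_view(log))    # same answer, different operational plumbing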

  12. BigData and computing challenges in high energy and nuclear physics

    Science.gov (United States)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-06-01

    In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve in the future when moving from the LHC to the HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently in National Research Center ``Kurchatov Institute''

  13. BigData and computing challenges in high energy and nuclear physics

    International Nuclear Information System (INIS)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-01-01

    In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve in the future when moving from the LHC to the HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world and from academic and commercial cloud providers. We also discuss R and D computing projects started recently in National Research Center ''Kurchatov Institute''

  14. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  15. Life management for a non replaceable structure: the reactor building

    International Nuclear Information System (INIS)

    Torres, V.; Francia, L.

    1998-01-01

    Phase 1 of the UNESA N.P.P. Lifetime Management Project identified and ranked components important to plant life management. The list showed the Reactor Containment Structure in third position, and thirteen concrete structures were among the list's top twenty. Since the Reactor Containment Building, together with the Reactor Vessel, is the only non-replaceable plant component and has a big impact on the plant's technical life, there is increasing interest in understanding its behavior in order to maintain structural integrity. This paper presents: a) IAEA (International Atomic Energy Agency) Coordinated Research Program experiences and studies; under this Program, international experts address the most frequent degradation mechanisms affecting the containment building. b) The IAEA Aging Management Program adapted to our plants. The paper addresses the aging mechanisms affecting the concrete structures, reinforcing steel and prestressing systems, as well as the aging management programs and the mitigation and control methods. Finally, this paper presents a new module called STRUCTURES, included in phase 2 of the above-mentioned project, which will monitor and document the different aging mechanisms and management programs described in item b) regarding the Reactor Containment Building (concrete liner, post-stressing system, anchor elements). This module will also support Maintenance Rule-related practices. (Author)

  16. Pellet bed reactor for multi-modal space power

    International Nuclear Information System (INIS)

    Buden, D.; Williams, K.; Mast, P.; Mims, J.

    1987-01-01

    A review of forthcoming space power needs for both civil and military missions indicates that power requirements will be in the tens of megawatts. The electrical power requirements are envisioned to be twofold: long-duration lower power levels will be needed for station keeping, communications, and/or surveillance; short-duration higher power levels will be required for pulsed power devices. These power characteristics led to the proposal of a multi-modal space power reactor using a pellet bed design. Characteristics desired for such a multimegawatt reactor power source are standby, alert, and pulsed power modes; high-thermal output heat source (approximately 1000 MWt peak power); long lifetime station keeping power (10 to 30 years); high temperature output (1500 K to 1800 K); rapid-burst power transition; high reliability (above 95 percent); and stringent safety standards compliance. The proposed pellet bed reactor is designed to satisfy these characteristics

  17. Victoria Stodden: Scholarly Communication in the Era of Big Data and Big Computation

    OpenAIRE

    Stodden, Victoria

    2015-01-01

    Victoria Stodden gave the keynote address for Open Access Week 2015. "Scholarly communication in the era of big data and big computation" was sponsored by the University Libraries, Computational Modeling and Data Analytics, the Department of Computer Science, the Department of Statistics, the Laboratory for Interdisciplinary Statistical Analysis (LISA), and the Virginia Bioinformatics Institute. Victoria Stodden is an associate professor in the Graduate School of Library and Information Scien...

  18. Big Data: Concept, Potentialities and Vulnerabilities

    Directory of Open Access Journals (Sweden)

    Fernando Almeida

    2018-03-01

    Full Text Available The evolution of information systems and the growth in the use of the Internet and social networks have caused an explosion in the amount of available data relevant to the activities of companies. The treatment of these available data is therefore vital to support operational, tactical and strategic decisions. This paper aims to present the concept of big data and the main technologies that support the analysis of large data volumes. The potential of big data is explored in nine sectors of activity: financial, retail, healthcare, transport, agriculture, energy, manufacturing, public, and media and entertainment. In addition, the main current opportunities, vulnerabilities and privacy challenges of big data are discussed. It was possible to conclude that, despite the potential for big data to grow in the previously identified areas, there are still some challenges that need to be considered and mitigated, namely the privacy of information, the availability of qualified human resources to work with Big Data, and the promotion of a data-driven organizational culture.

  19. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing need among practitioners to understand the phenomenon better, starting from a clearly stated definition. This book aims to be a starting read for executives who want (and need) to keep pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data will be explained, and a series of different strategic approaches will be provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data will be discussed, where some of them are more general - such as ethics, privacy, and ownership – while others concern more specific business situations (e.g., initial public offering, growth st...

  20. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information, and used to make 'smart' decisions.

  1. Slaves to Big Data. Or Are We?

    Directory of Open Access Journals (Sweden)

    Mireille Hildebrandt

    2013-10-01

    Full Text Available In this contribution, the notion of Big Data is discussed in relation to the monetisation of personal data. The claim of some proponents, as well as adversaries, that Big Data implies that ‘n = all’, meaning that we no longer need to rely on samples because we have all the data, is scrutinised and found to be both overly optimistic and unnecessarily pessimistic. A set of epistemological and ethical issues is presented, focusing on the implications of Big Data for our perception, cognition, fairness, privacy and due process. The article then looks into the idea of user-centric personal data management to investigate to what extent it provides solutions for some of the problems triggered by the Big Data conundrum. Special attention is paid to the core principle of data protection legislation, namely purpose binding. Finally, this contribution seeks to inquire into the influence of Big Data politics on self, mind and society, and asks how we can prevent ourselves from becoming slaves to Big Data.

  2. Studies on environment safety and application of advanced reactor for inland nuclear power plants

    International Nuclear Information System (INIS)

    Wei, L.; Jie, L.

    2014-01-01

    To study the environmental safety assessment of inland nuclear power plants (NPPs), the environmental safety impact under normal operation was investigated and the environmental risk of severe accidents was analyzed. Moreover, the site selection requirements and relevant provisions of international nuclear power plants and of China's were comparatively studied. The conclusion is that the environmental safety assessments of inland and coastal nuclear power plants have no essential difference, and that advanced reactors can meet the high environmental safety criteria for inland nuclear power plants. In this respect, it is safe and feasible for China to develop inland nuclear power plants, and China's inland nuclear power plants will be a big market for advanced reactors. (author)

  3. Will Organization Design Be Affected By Big Data?

    Directory of Open Access Journals (Sweden)

    Giles Slinger

    2014-12-01

    Full Text Available Computing power and analytical methods allow us to create, collate, and analyze more data than ever before. When datasets are unusually large in volume, velocity, and variety, they are referred to as “big data.” Some observers have suggested that in order to cope with big data (a) organizational structures will need to change and (b) the processes used to design organizations will be different. In this article, we differentiate big data from relatively slow-moving, linked people data. We argue that big data will change organizational structures as organizations pursue the opportunities presented by big data. The processes by which organizations are designed, however, will be relatively unaffected by big data. Instead, organization design processes will be more affected by the complex links found in people data.

  4. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  5. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of the bachelor thesis is to describe the Big Data issue and the OLAP aggregation operations for decision support that are applied to Big Data using the Apache Hadoop technology. The greater part of the thesis is devoted to a description of this technology. The last chapter deals with the way the aggregation operations are applied and with the issues involved in implementing them, followed by an overall evaluation of the work and the possibilities for future use of the resulting system.

  6. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  7. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  8. Physics design of an ultra-long pulsed tokamak reactor

    International Nuclear Information System (INIS)

    Ogawa, Y.; Inoue, N.; Wang, J.; Yamamoto, T.; Okano, K.

    1993-01-01

    The pulsed tokamak reactor driven only by inductive current drive has recently been revived, because non-inductive current drive efficiency appears too low to realize a steady-state tokamak reactor with sufficiently high energy gain Q. The essential problems of the pulsed operation mode are considered to be material fatigue due to cyclic operation and the expensive energy storage system needed to maintain continuous electric output during the dwell time. To overcome these problems, we have proposed an ultra-long pulsed tokamak reactor called IDLT (Inductively operated Day-Long Tokamak), with major and minor radii of 10 m and 1.87 m, respectively, sufficient to ensure a burning period of about ten hours. Here we discuss the physical features of inductively operated tokamak plasmas, employing engineering constraints similar to those of the ITER CDA design. (author) 9 refs., 2 figs., 1 tab
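
    The burn length of such an inductively driven machine is set by the poloidal flux swing the transformer can deliver against the plasma's resistive loop voltage. A rough sketch of that constraint, using assumed round numbers rather than the IDLT design values:

    ```python
    # Order-of-magnitude sketch (assumed numbers, not the IDLT design values):
    # an inductively driven burn lasts until the transformer's available flux
    # swing is consumed by the resistive loop voltage of the plasma.
    flux_swing_Wb = 400.0     # assumed poloidal flux available for the burn (V*s)
    loop_voltage_V = 0.01     # assumed resistive loop voltage during the burn

    burn_time_h = flux_swing_Wb / loop_voltage_V / 3600.0
    print(f"burn time ~ {burn_time_h:.1f} hours")   # ~11 h with these assumptions
    ```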

  9. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  10. Calculation system for physical analysis of boiling water reactors

    International Nuclear Information System (INIS)

    Bouveret, F.

    2001-01-01

    Although Boiling Water Reactors generate a quarter of worldwide nuclear electricity, they have received little study in France, where interest in them is now emerging. The aim of the work presented here is therefore to contribute to defining a core calculation methodology using CEA (Commissariat a l'Energie Atomique) codes. Vapour production in the reactor core leads to technological options very different from those of pressurised water reactors. We analyse the main physical phenomena of BWRs and offer solutions that take them into account. BWR fuel assembly heterogeneity causes steep thermal flux gradients. The two-dimensional collision probability method with exact boundary conditions makes it possible to calculate the flux in BWR fuel assemblies accurately with the APOLLO-2 lattice code, but it entails a very long calculation time, so we define a new methodology based on a two-level flux calculation. Void fraction variations in the assemblies cause large spectrum changes that must be considered in the core calculation; we suggest using a void history parameter to generate the cross-section libraries for the core calculation, as sketched below. The core calculation code also has to follow the depletion of the main isotope concentrations. A core calculation coupling neutronics and thermal-hydraulics codes highlights points that still require study, the most important being how to take the control blade into account at the different calculation stages. (author)
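
    A minimal sketch of what such a void-history parameterization can look like in practice: few-group cross-sections are tabulated against both the instantaneous void fraction and an exposure-averaged (history) void fraction, and the core simulator interpolates in that table. All axes and values below are invented for illustration; this is not APOLLO-2 output.

    ```python
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Hypothetical tabulation axes: instantaneous void fraction and the
    # exposure-averaged void fraction (the "void history" parameter).
    void_now = np.array([0.0, 0.4, 0.8])
    void_hist = np.array([0.0, 0.4, 0.8])

    # Illustrative thermal-group absorption cross-sections (1/cm), made up:
    # a harder past spectrum (higher void history) shifts the isotopics,
    # hence a different cross-section at the same instantaneous void.
    sigma_a2 = np.array([
        [0.080, 0.083, 0.087],
        [0.071, 0.075, 0.079],
        [0.060, 0.065, 0.070],
    ])  # indexed [void_now, void_hist]

    interp = RegularGridInterpolator((void_now, void_hist), sigma_a2)

    # A core simulator would query this for each node at each depletion step:
    print(interp([[0.55, 0.35]]))  # node at 55% void, 35% void history
    ```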

  11. Big data and biomedical informatics: a challenging opportunity.

    Science.gov (United States)

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasms, to fully exploit the available technologies, and to improve data processing and data management regulations.

  12. Ten year report on the operation of the Latina nuclear power station 1964-1973

    International Nuclear Information System (INIS)

    1976-07-01

    The final report on the operation of the Latina nuclear power station, required under the terms of the contract of participation between ENEL and EURATOM is presented. It covers the first ten years of commercial operation (1 January 1964-31 December 1973) of this power station. Latina uses a British Magnox-type gas-graphite natural uranium reactor with a design thermal capacity of 724 MW. The rated electrical output of the three main turbogenerators was originally 210 MW (3x70), but was reduced to 160 MW in 1971. Construction began in November 1958 and was completed when the reactor first reached criticality in December 1962, the station being connected to the Italian electricity network for the first time in May 1963. The gross rated output of 210 MWe was reached in December 1963 and commercial operation began on 1 January 1964, by which date, however, the power station had already fed 295.5 million kWh into the network

  13. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and a search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  14. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang, at a major public inauguration event. Poster and programme.

  15. Big Data over a 100 G network at Fermilab

    International Nuclear Information System (INIS)

    Garzoglio, Gabriele; Mhashilkar, Parag; Kim, Hyunwoo; Dykstra, Dave; Slyz, Marko

    2014-01-01

    As the need for Big Data in science becomes ever more relevant, networks around the world are upgrading their infrastructure to support high-speed interconnections. To support its mission, the high-energy physics community, as a pioneer in Big Data, has always relied on the Fermi National Accelerator Laboratory to be at the forefront of storage and data movement. This need was reiterated in recent years with the data-taking rate of the major LHC experiments reaching tens of petabytes per year. At Fermilab, this regularly resulted in peaks of data movement on the Wide Area Network (WAN) in and out of the laboratory of about 30 Gbit/s, and on the Local Area Network (LAN) between storage and computational farms of 160 Gbit/s. To address these ever-increasing needs, as of this year Fermilab is connected to the Energy Sciences Network (ESnet) through a 100 Gb/s link. To understand the optimal system- and application-level configuration to interface computational systems with the new high-speed interconnect, Fermilab has deployed a Network Research and Development facility connected to the ESnet 100 G Testbed. For the past two years, the High Throughput Data Program (HTDP) has been using the Testbed to identify gaps in data movement middleware [5] when transferring data at these high speeds. The program has published evaluations of technologies typically used in High Energy Physics, such as GridFTP [4], XrootD [9], and Squid [8]. This work presents the new R and D facility and the continuation of the evaluation program.
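
    For orientation, a quick back-of-the-envelope conversion shows why links of this scale are needed: a yearly volume in the tens of petabytes already implies a multi-Gbit/s sustained rate before any peaking factor is applied. The yearly volumes below are assumed examples, not Fermilab figures:

    ```python
    # Back-of-the-envelope check (not from the paper): what sustained rate does
    # "tens of petabytes per year" imply, versus the quoted 30 Gbit/s WAN peaks?
    BITS_PER_PB = 1e15 * 8
    SECONDS_PER_YEAR = 365 * 24 * 3600

    for pb_per_year in (10, 30, 50):   # assumed yearly volumes
        avg_gbps = pb_per_year * BITS_PER_PB / SECONDS_PER_YEAR / 1e9
        print(f"{pb_per_year:>3} PB/yr ~ {avg_gbps:5.1f} Gbit/s sustained")
    # 10 PB/yr ~ 2.5 Gbit/s sustained; peaks run roughly 10x the average,
    # which is why a 100 G link is the natural next step.
    ```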

  16. Big Data Over a 100G Network at Fermilab

    Science.gov (United States)

    Garzoglio, Gabriele; Mhashilkar, Parag; Kim, Hyunwoo; Dykstra, Dave; Slyz, Marko

    2014-06-01

    As the need for Big Data in science becomes ever more relevant, networks around the world are upgrading their infrastructure to support high-speed interconnections. To support its mission, the high-energy physics community, as a pioneer in Big Data, has always relied on the Fermi National Accelerator Laboratory to be at the forefront of storage and data movement. This need was reiterated in recent years with the data-taking rate of the major LHC experiments reaching tens of petabytes per year. At Fermilab, this regularly resulted in peaks of data movement on the Wide Area Network (WAN) in and out of the laboratory of about 30 Gbit/s, and on the Local Area Network (LAN) between storage and computational farms of 160 Gbit/s. To address these ever-increasing needs, as of this year Fermilab is connected to the Energy Sciences Network (ESnet) through a 100 Gb/s link. To understand the optimal system- and application-level configuration to interface computational systems with the new high-speed interconnect, Fermilab has deployed a Network Research & Development facility connected to the ESnet 100G Testbed. For the past two years, the High Throughput Data Program (HTDP) has been using the Testbed to identify gaps in data movement middleware [5] when transferring data at these high speeds. The program has published evaluations of technologies typically used in High Energy Physics, such as GridFTP [4], XrootD [9], and Squid [8]. This work presents the new R&D facility and the continuation of the evaluation program.

  17. Construction of reactor and outline of breaking state

    International Nuclear Information System (INIS)

    Imanaka, Tetsuji

    1980-01-01

    The Mihama No. 1 reactor of Kansai Electric Power Co., Inc., is the first power-generating PWR in Japan, and it commenced commercial operation in November, 1970. In June, 1972, a leak of steam generator tubes occurred, and the reactor was stopped for about half a year. In the second periodic inspection in March, 1973, the accident of broken fuel rods was discovered, but it was only made public in December, 1976. In August, 1974, the leak of steam generator tubes occurred again, and the reactor was then stopped for a long period. In October, 1978, the reactor entered the test operation called cycling operation. The Kyoto University Reactor Research Institute lent Kansai Electric Power Co. the cask for transporting the broken pieces of the fuel rods, and obtained the investigation report of the Japan Atomic Energy Research Institute based on the contract. The accident investigation group of the KURRI has pursued the causes of the accident and examined the adequacy of the countermeasures. The construction of the reactor is outlined. The upper parts of two fuel rods broke and fell on the baffle supporting plate; the broken fuel assembly was designated C-34. About half of the fuel pellets and several tens of centimetres of cladding tube have not yet been recovered. The state of the breakage and the presumed causes are reported. (Kako, I.)

  18. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  19. Integrating R and Hadoop for Big Data Analysis

    OpenAIRE

    Bogdan Oancea; Raluca Mariana Dragoescu

    2014-01-01

    Analyzing and working with big data could be very difficult using classical means like relational database management systems or desktop software packages for statistics and visualization. Instead, big data requires large clusters with hundreds or even thousands of computing nodes. Official statistics is increasingly considering big data for deriving new statistics because big data sources could produce more relevant and timely statistics than traditional sources. One of the software tools ...

  20. Evaluation of filters in RSPCS (Reactor Service Pool Cooling System) and HWL (Hot Water Layer) in OPAL research reactor at ANSTO (Australian Nuclear Science and Technology Organization) using Gamma Spectrometry System and Liquid Scintillation Counter

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jim In; Foy, Robin; Jung, Seong Moon; Park, Hyeon Suk; Ye, Sung Joon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The Australian Nuclear Science and Technology Organization (ANSTO) has a research reactor, OPAL (Open Pool Australian Lightwater reactor), a state-of-the-art 20 MW reactor serving various purposes. In the OPAL reactor, many kinds of radionuclides are produced by various reactions in the pool water, and these should be identified and quantified for the safe use of OPAL. To do that, it is essential to check the efficiency of the filters that remove radioactive substances from the reactor pool water. There are two main water circuits in OPAL: the RSPCS (Reactor Service Pool Cooling System) and the HWL (Hot Water Layer) circuits. The reactor service pool is connected to the reactor pool via a transfer canal and provides a working area and storage space for spent and other materials. The HWL is the upper part of the reactor pool water and minimizes radiation dose rates at the pool surface. We collected water samples from these circuits and measured their radioactivity using a Gamma Spectrometry System (GSS) and a Liquid Scintillation Counter (LSC) to evaluate the filters. Through these measurements we could conclude that there is likely to be no alpha emitter in the water samples and that, for beta and gamma activity, there are very big differences between inlet and outlet results, so every filter is working efficiently to remove radioactive substances.
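
    The filter evaluation described above reduces to comparing activity concentrations upstream and downstream of each filter. A minimal sketch of that comparison, with invented activities rather than ANSTO measurements:

    ```python
    def filter_efficiency(inlet_bq_per_l: float, outlet_bq_per_l: float) -> float:
        """Fraction of activity removed, inferred from paired water samples."""
        return 1.0 - outlet_bq_per_l / inlet_bq_per_l

    # Illustrative gross beta activities (Bq/L), not ANSTO data:
    print(f"RSPCS filter: {filter_efficiency(850.0, 12.0):.1%} removed")
    print(f"HWL filter:   {filter_efficiency(640.0, 25.0):.1%} removed")
    ```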

  1. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  2. The Chooz power station: ten years of operation

    International Nuclear Information System (INIS)

    Teste du Bailler, Andre

    1977-01-01

    The entry into actual service of the Chooz plant, the first pressurized water reactor ever built in France, occurred on 3 April 1967. Ten years later, one can draw a highly positive balance of the plant's operation, whose availability has been satisfactory apart from the mechanical failure which occurred during startup. The behavior of the equipment, in particular of the components of the primary loop, was satisfactory on the whole, since it allowed a gradual increase in capacity of 15% with respect to the initial design. It also allowed the achievement of noticeable progress in the design of equipment intended for new power stations. Interesting results have also been obtained in the fields of radioprotection, staff working conditions and environmental protection. Finally, the training of the operating teams has been closely followed, whether it concerned the operators directly assigned to plant operation or the trainees gathered in a school specially organized for this purpose and since transferred to a training centre. [fr

  3. An artificial intelligence approach fit for tRNA gene studies in the era of big sequence data.

    Science.gov (United States)

    Iwasaki, Yuki; Abe, Takashi; Wada, Kennosuke; Wada, Yoshiko; Ikemura, Toshimichi

    2017-09-12

    Unsupervised data mining capable of extracting a wide range of knowledge from big data without prior knowledge or particular models is a timely application in the era of big sequence data accumulation in genome research. By handling oligonucleotide compositions as high-dimensional data, we have previously modified the conventional self-organizing map (SOM) for genome informatics and established BLSOM, which can analyze more than ten million sequences simultaneously. Here, we develop BLSOM specialized for tRNA genes (tDNAs) that can cluster (self-organize) more than one million microbial tDNAs according to their cognate amino acid solely depending on tetra- and pentanucleotide compositions. This unsupervised clustering can reveal combinatorial oligonucleotide motifs that are responsible for the amino acid-dependent clustering, as well as other functionally and structurally important consensus motifs, which have been evolutionarily conserved. BLSOM is also useful for identifying tDNAs as phylogenetic markers for special phylotypes. When we constructed BLSOM with 'species-unknown' tDNAs from metagenomic sequences plus 'species-known' microbial tDNAs, a large portion of metagenomic tDNAs self-organized with species-known tDNAs, yielding information on microbial communities in environmental samples. BLSOM can also enhance accuracy in the tDNA database obtained from big sequence data. This unsupervised data mining should become important for studying numerous functionally unclear RNAs obtained from a wide range of organisms.
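
    The input to BLSOM described above is simply each sequence's oligonucleotide composition treated as a high-dimensional vector (4^4 = 256 dimensions for tetranucleotides). A minimal sketch of computing such a vector; this is an illustration of the input representation, not the authors' code:

    ```python
    from itertools import product

    def kmer_composition(seq: str, k: int = 4) -> list[float]:
        """Frequency vector over all 4**k oligonucleotides, the kind of
        high-dimensional input a BLSOM-style map clusters."""
        kmers = ["".join(p) for p in product("ACGT", repeat=k)]
        counts = dict.fromkeys(kmers, 0)
        for i in range(len(seq) - k + 1):
            window = seq[i:i + k]
            if window in counts:        # skips windows containing N, gaps, etc.
                counts[window] += 1
        total = max(sum(counts.values()), 1)
        return [counts[w] / total for w in kmers]

    vec = kmer_composition("GGTTCGATTCCGG" * 6, k=4)  # toy tDNA-like sequence
    print(len(vec), max(vec))                         # 256-dimensional vector
    ```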

  4. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses, as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  5. Development of System Model for Level 1 Probabilistic Safety Assessment of TRIGA PUSPATI Reactor

    International Nuclear Information System (INIS)

    Tom, P.P; Mazleha Maskin; Ahmad Hassan Sallehudin Mohd Sarif; Faizal Mohamed; Mohd Fazli Zakaria; Shaharum Ramli; Muhamad Puad Abu

    2014-01-01

    Nuclear safety is a very big issue in the world. As a consequence of the accident at Fukushima, Japan, the safety of most reactors in the world has been reviewed, including research reactors. To develop the Level 1 Probabilistic Safety Assessment (PSA) of the TRIGA PUSPATI Reactor (RTP), three organizations are involved: Nuclear Malaysia, AELB and UKM. The PSA methodology is a logical, deductive technique which specifies an undesired top event and uses fault trees and event trees to model the various parallel and sequential combinations of failures that might lead to that undesired event. Fault tree (FT) methodology is used in developing the system models. At the lowest level, the basic events (BE) of the fault trees (component failures and human errors) are assigned probability distributions. In this study, the Risk Spectrum software was used to construct the fault trees and analyze the system models. The results of the system model analysis, such as core damage frequency (CDF), minimal cut sets (MCS) and common cause failures (CCF), are used to support decision making for upgrading or modification of the RTP's safety system. (author)
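
    Once the minimal cut sets are known, the simplest quantification is the rare-event approximation: the top-event probability is approximated by the sum, over all minimal cut sets, of the product of the basic-event probabilities in each set. A hedged sketch with invented event names and numbers (not RTP data):

    ```python
    from math import prod

    # Illustrative basic-event probabilities (per demand), made up:
    basic_events = {
        "PUMP_A_FAILS": 1.2e-3,
        "PUMP_B_FAILS": 1.2e-3,
        "VALVE_STUCK":  5.0e-4,
        "OPERATOR_ERR": 3.0e-3,
    }
    # Each minimal cut set is a smallest combination of basic events
    # sufficient to cause the undesired top event:
    minimal_cut_sets = [
        ("PUMP_A_FAILS", "PUMP_B_FAILS"),   # both redundant pumps fail
        ("VALVE_STUCK", "OPERATOR_ERR"),    # valve failure not recovered
    ]

    # Rare-event approximation: P(top) ~ sum of cut-set products.
    p_top = sum(prod(basic_events[e] for e in cs) for cs in minimal_cut_sets)
    print(f"Top event probability ~ {p_top:.2e} per demand")
    ```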

  6. BR2 Reactor: Introduction

    International Nuclear Information System (INIS)

    Moons, F.

    2007-01-01

    The irradiations in the BR2 reactor are carried out in collaboration with or at the request of third parties such as the European Commission, the IAEA, research centres and utilities, reactor vendors or fuel manufacturers. The reactor also contributes significantly to the production of radioisotopes for medical and industrial applications, to neutron silicon doping for the semiconductor industry and to scientific irradiations for universities. Alongside the ongoing programmes on fuel and materials development, several new irradiation devices are in use or in design, amongst others a loop providing enhanced cooling for novel materials testing reactor fuel, a device for high temperature gas cooled fuel, and a rig for the irradiation of metallurgical samples in a Pb-Bi environment. A full-scale 3-D heterogeneous model of BR2 is available. The model describes the real hyperbolic arrangement of the reactor and includes the detailed 3-D space-dependent distribution of the isotopic fuel depletion in the fuel elements. The model is validated on the reactivity measurements of several tens of BR2 operation cycles. The calculations of the axial and radial distributions of the poisoning of the beryllium matrix by 3 He, 6 Li and 3 H are verified against the measured reactivity losses and used to predict the reactivity behavior for the coming decades. The model calculates the main functionals in reactor physics, such as conventional thermal and equivalent fission neutron fluxes, number of displacements per atom, fission rate, thermal power characteristics such as heat flux and linear power density, neutron/gamma heating, the fission energy deposited in fuel plates/rods, the neutron multiplication factor and fuel burn-up. For each reactor irradiation project, a detailed geometry model of the experimental device and of its neighborhood is developed. Neutron fluxes are predicted within approximately 10 percent of the dosimetry measurements. Fission rate, heat flux and

  7. The qualification of U3O8 as research reactor fuel

    International Nuclear Information System (INIS)

    Krull, W.

    1983-01-01

    This report summarizes the present knowledge of the qualification status of U 3 O 8 as low-enriched research reactor fuel. U 3 O 8 is so far qualified to start testing of ten (10) fuel elements with a U-density of 3.1 g U/cc in the FRG-2 research reactor. (orig.) [de

  8. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  9. Physics with Big Karl Brainstorming. Abstracts

    International Nuclear Information System (INIS)

    Machner, H.; Lieb, J.

    2000-08-01

    Before summarizing details of the meeting, a short description of the spectrometer facility Big Karl is given. The facility is essentially a new instrument using refurbished dipole magnets from its predecessor; the large-acceptance quadrupole magnets and the beam optics are new. Big Karl has a design very similar to the focusing spectrometers at MAMI (Mainz), AGOR (Groningen) and the high resolution spectrometer (HRS) in Hall A at Jefferson Laboratory, with ΔE/E = 10 -4 but a somewhat lower maximum momentum. The focal plane detectors, consisting of multiwire drift chambers and scintillating hodoscopes, are similar. Unlike HRS, Big Karl still needs Cerenkov counters and polarimeters in its focal plane: detectors which are necessary to perform some of the experiments proposed during the brainstorming. In addition, Big Karl allows emission angle reconstruction via track measurements in its focal plane with high resolution. In the following, the physics highlights and the proposed and potential experiments are summarized. During the meeting it became obvious that the physics to be explored at Big Karl can be grouped into five distinct categories, and this summary is organized accordingly. (orig.)

  10. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    Science.gov (United States)

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Laurenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  11. Application and Prospect of Big Data in Water Resources

    Science.gov (United States)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of well-developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have emerged and are commonly applied in many fields. However, academic studies have only recently paid attention to Big Data applications in water resources, so water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying Big Data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical frame, whereas we define "Water Big Data" and explain its three-dimensional properties: the time dimension, the spatial dimension and the intelligent dimension. Based on HBase, a classification system for Water Big Data is introduced: hydrology data, ecology data and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, are proposed; a toy MapReduce-style aggregation is sketched below. Finally, the prospect of applying Big Data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data Driven Decision) will be utilized more in water resources management in the future.
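
    To make the MapReduce idea concrete, here is a toy map/reduce-style aggregation in plain Python over the three record classes named above. The record layout and values are invented; a real deployment would run equivalent jobs on Hadoop:

    ```python
    from collections import defaultdict

    # Invented records: (class, source, daily value in class-specific units)
    records = [
        ("hydrology", "station_01", 3.2),     # river discharge, m3/s
        ("hydrology", "station_01", 4.1),
        ("ecology", "plot_A", 0.87),          # e.g. a vegetation index
        ("socio-economic", "city_X", 1.2e6),  # e.g. daily water demand, m3
    ]

    def mapper(rec):
        cls, source, value = rec
        yield (cls, source), value            # key by class and source

    def reducer(key, values):
        return key, sum(values) / len(values) # mean per class/source

    groups = defaultdict(list)                # stand-in for the shuffle phase
    for rec in records:
        for k, v in mapper(rec):
            groups[k].append(v)
    print([reducer(k, vs) for k, vs in groups.items()])
    ```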

  12. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our existing review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and it allows for a focus on the material consequences of big data in society.

  13. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as introducing some applications in big data optimization for interested academics and practitioners, to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  14. Una aproximación a Big Data = An approach to Big Data

    OpenAIRE

    Puyol Moreno, Javier

    2014-01-01

    Big Data can be considered a trend in the advance of technology that has opened the door to a new approach to understanding and decision making, and is used to describe the enormous quantities of data (structured, unstructured and semi-structured) that would be too time-consuming and costly to load into a relational database for analysis. Thus, the concept of Big Data applies to all the information that cannot be processed or analyzed using ...

  15. Toward a Literature-Driven Definition of Big Data in Healthcare.

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    The aim of this study was to provide a definition of big data in healthcare. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. A total of 196 papers were included. Big data can be defined as datasets with log10(n*p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
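
    The proposed criterion is a one-liner to apply: a dataset qualifies as big data when the base-10 logarithm of the product of its number of individuals and number of variables reaches 7, i.e. n*p ≥ 10^7. A minimal check, with invented example sizes:

    ```python
    import math

    def is_big_data(n: int, p: int) -> bool:
        """Baro et al. criterion: log10(n * p) >= 7, i.e. n*p >= 10**7."""
        return math.log10(n * p) >= 7

    print(is_big_data(n=10_000, p=50))      # False: log10(5e5) ~ 5.7
    print(is_big_data(n=1_000_000, p=100))  # True:  log10(1e8) = 8.0
    ```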

  16. Big Data Analytic, Big Step for Patient Management and Care in Puerto Rico.

    Science.gov (United States)

    Borrero, Ernesto E

    2018-01-01

    This letter provides an overview of the application of big data in the healthcare system to improve quality of care, including predictive modelling for risk and resource use, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications, among others. The author delineates the tremendous potential of big data analytics and discusses how it can be successfully implemented in clinical practice, as an important component of a learning health-care system.

  17. Big Data and Biomedical Informatics: A Challenging Opportunity

    Science.gov (United States)

    2014-01-01

    Summary Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasms, to fully exploit the available technologies, and to improve data processing and data management regulations. PMID:24853034

  18. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  19. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  20. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  1. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  2. Granulocyte colony-stimulating factor in toxic epidermal necrolysis (TEN) and Chelsea & Westminster TEN management protocol [corrected].

    Science.gov (United States)

    de Sica-Chapman, A; Williams, G; Soni, N; Bunker, C B

    2010-04-01

    Toxic epidermal necrolysis (TEN) is a rare but life-threatening allergic drug reaction. Skin blistering with epidermal and mucosal necrolysis and subsequent detachment from an inflamed underlying dermis is a hallmark of the condition. The pathogenesis of TEN is not well understood, accounting for controversies about its management and significant delays in initiating potentially beneficial therapy. There are no management protocols based on a robust evidence base. Prompt recognition of the diagnosis and consensus on early management initiatives are necessary in order to improve outcomes and survival in TEN. To date, TEN management has been directed at arresting the allergic reaction and treating the complications. We have identified a need for specific medical interventions to accelerate wound regeneration, an approach that has not previously been adopted in the management of TEN. We observed that in two cases of severe TEN, dramatic re-epithelialization and recovery coincided with the introduction of granulocyte colony-stimulating factor (G-CSF) for neutropenia. We explain how the addition of G-CSF promotes recovery from TEN through enhanced bioregeneration of the damaged tissues and accelerated re-epithelialization. G-CSF has been used for severe neutropenia in TEN, but we recommend, and explain why, as in our Chelsea and Westminster protocol, G-CSF should be considered in treating severe TEN irrespective of the severity of neutropenia.

  3. Study of future reactors

    International Nuclear Information System (INIS)

    Bouchard, J.

    1992-01-01

    Today, more than 420 large reactors with a gross output of close to 350 GWe supply 20 percent of world electricity needs, accounting for less than 5 percent of primary energy consumption. These figures are not expected to change in the near future, due to suspended reactor construction in many countries. Nevertheless, world energy needs continue to grow: the planet's population already exceeds five billion and is forecast to reach ten billion by the middle of the next century. Most less developed countries have a very low rate of energy consumption and, even though some savings can be made in industrialized countries, it will become increasingly difficult to satisfy needs using fossil fuels only. Furthermore, there has been no recent breakthrough in the energy landscape. The physical feasibility of the other great hope of nuclear energy, fusion, has yet to be proved; once this has been done, it will be necessary to solve technological problems and to assess economic viability. Although it is more necessary than ever to pursue fusion programs, there is little likelihood of industrial applications being achieved in the coming decades. Coal and fission are the only ways to produce massive amounts of energy for the next century. Coal must overcome the pollution problems inherent in its use; fission nuclear power has to gain better public acceptance, which is obviously colored by safety and waste concerns. Most existing reactors were commissioned in the 1970s; reactor lifetime is a parameter that has not been clearly established. It will certainly be possible to refurbish some to extend their operation beyond the initial target of 30 or 40 years. But normal advances in technology and safety requirements will make the operation of the oldest reactors increasingly difficult. It becomes necessary to develop new generations of nuclear reactors, both to replace older ones and to revive plant construction in countries that are not yet equipped or that have halted their

  4. Scram and nonlinear reactor system seismic analysis for a liquid metal fast reactor

    International Nuclear Information System (INIS)

    Morrone, A.; Brussalis, W.G.

    1975-01-01

    The paper presents the analysis and results for an LMFBR system which was analyzed for both scram times and seismic responses such as bending moments, accelerations and forces. The reactor system was represented by a one-dimensional nonlinear mathematical model with two degrees of freedom per node (translational and rotational). The model was developed to incorporate as many reactor components as possible without exceeding computer limitations. It consists of 12 reactor components with a total of 71 nodes, 69 beam and pin-jointed elements and 27 gap elements. The gap elements were defined by their clearances, impact spring constants and impact damping constants based on a 50% coefficient of restitution. The horizontal excitation input to the model was the response of the containment building at the location of the reactor vessel supports. It consists of a ten-second Safe Shutdown Earthquake acceleration-time history at 0.005-second intervals with a maximum acceleration of 0.408 g. The analysis was performed with two Westinghouse special-purpose computer programs. The first program calculated the reactor system seismic responses and stored the impact forces on tape. The impact forces on the control rod driveline were converted into vertical frictional forces by multiplying them by a coefficient of friction, and then used by the second program for the scram time determination. The results give time history plots of various seismic responses, and plots of scram times as a function of control rod travel distance for the most critical scram initiation times. The total scram time considering the effects of the earthquake was still acceptable, but about 4 times longer than that calculated without the earthquake. The bending moment and shear force responses were used as input for the structural analysis (stresses, deflections, fatigue) of the various components, in combination with the other applicable loading conditions. (orig./HP) [de
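
    A gap element of the kind described transmits no force while its clearance is open and behaves as a stiff spring in parallel with a damper once contact is made; the resulting normal force is what gets multiplied by a friction coefficient to load the control rod driveline. A minimal sketch with invented parameter values:

    ```python
    def gap_force(rel_disp, rel_vel, clearance, k_impact, c_impact):
        """One-sided nonlinear gap element: zero force while the gap is open,
        then an impact spring in parallel with a damper once it closes."""
        penetration = rel_disp - clearance
        if penetration <= 0.0:
            return 0.0                     # gap open: components move freely
        force = k_impact * penetration + c_impact * rel_vel
        return max(force, 0.0)             # contact cannot pull parts together

    # Invented values: 1 mm clearance, stiff contact spring, modest damping
    normal = gap_force(rel_disp=1.5e-3, rel_vel=0.2,
                       clearance=1.0e-3, k_impact=5.0e8, c_impact=1.0e5)
    friction = 0.3 * abs(normal)           # vertical friction on the driveline
    print(normal, friction)
    ```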

  5. The Information Panopticon in the Big Data Era

    Directory of Open Access Journals (Sweden)

    Martin Berner

    2014-04-01

    Full Text Available Taking advantage of big data opportunities is challenging for traditional organizations. In this article, we take a panoptic view of big data – obtaining information from more sources and making it visible to all organizational levels. We suggest that big data requires the transformation from command and control hierarchies to post-bureaucratic organizational structures wherein employees at all levels can be empowered while simultaneously being controlled. We derive propositions that show how to best exploit big data technologies in organizations.

  6. Kaleido: Visualizing Big Brain Data with Automatic Color Assignment for Single-Neuron Images.

    Science.gov (United States)

    Wang, Ting-Yuan; Chen, Nan-Yow; He, Guan-Wei; Wang, Guo-Tzau; Shih, Chi-Tin; Chiang, Ann-Shyn

    2018-03-03

    Effective 3D visualization is essential for connectomics analysis, where the number of neural images easily reaches over tens of thousands. A formidable challenge is to simultaneously visualize a large number of distinguishable single-neuron images, with reasonable processing time and memory for file management and 3D rendering. In the present study, we proposed an algorithm named "Kaleido" that can visualize up to at least ten thousand single neurons from the Drosophila brain using only a fraction of the memory traditionally required, without increasing computing time. Adding more brain neurons increases memory only nominally. Importantly, Kaleido maximizes color contrast between neighboring neurons so that individual neurons can be easily distinguished. Colors can also be assigned to neurons based on biological relevance, such as gene expression, neurotransmitters, and/or development history. For cross-lab examination, the identity of every neuron is retrievable from the displayed image. To demonstrate the effectiveness and tractability of the method, we applied Kaleido to visualize the 10,000 Drosophila brain neurons obtained from the FlyCircuit database ( http://www.flycircuit.tw/modules.php?name=kaleido ). Thus, Kaleido visualization requires only sensible computer memory for manual examination of big connectomics data.
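
    The color-assignment idea can be sketched as a greedy pass over a neuron-adjacency graph: each neuron takes the candidate hue farthest, on the hue circle, from the hues already given to its neighbors. This is only an illustration of contrast-maximizing assignment, not the Kaleido algorithm itself, and the adjacency data is invented:

    ```python
    import colorsys

    def assign_colors(neighbors: dict, n_hues: int = 24) -> dict:
        """Greedily give each neuron the hue farthest from its already-colored
        neighbors, measured as circular distance on the hue wheel."""
        hues = [i / n_hues for i in range(n_hues)]
        chosen = {}
        for neuron in neighbors:
            taken = [chosen[m] for m in neighbors[neuron] if m in chosen]
            def worst_clash(h):
                if not taken:
                    return 1.0            # no colored neighbors: any hue works
                return min(min(abs(h - t), 1 - abs(h - t)) for t in taken)
            chosen[neuron] = max(hues, key=worst_clash)
        return {n: colorsys.hsv_to_rgb(h, 1.0, 1.0) for n, h in chosen.items()}

    adjacency = {"n1": ["n2", "n3"], "n2": ["n1"], "n3": ["n1"], "n4": []}
    print(assign_colors(adjacency))
    ```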

  7. Experimental development of power reactor intelligent control

    International Nuclear Information System (INIS)

    Edwards, R.M.; Garcia, H.E.; Lee, K.Y.

    1992-01-01

    The US nuclear utility industry initiated an ambitious program to modernize the control systems at a minimum of ten existing nuclear power plants by the year 2000. That program addresses urgent needs to replace obsolete instrumentation and analog controls with highly reliable state-of-the-art computer-based digital systems. Large increases in functionality that could theoretically be achieved in a distributed digital control system are not an initial priority in the industry program but could be logically considered in later phases. This paper discusses the initial development of an experimental sequence for developing, testing, and verifying intelligent fault-accommodating control for commercial nuclear power plant application. The sequence includes an ultra-safe university research reactor (TRIGA) and a passively safe experimental power plant (Experimental Breeder Reactor 2)

  8. WE-H-BRB-00: Big Data in Radiation Oncology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio panel of presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient care quality and to enhance the potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: To discuss current and future sources of big data for use in radiation oncology research. To optimize our current data collection by adopting new strategies from outside radiation oncology. To determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI Google Inc.

  9. WE-H-BRB-00: Big Data in Radiation Oncology

    International Nuclear Information System (INIS)

    2016-01-01

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio panel of presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient care quality and to enhance the potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: To discuss current and future sources of big data for use in radiation oncology research. To optimize our current data collection by adopting new strategies from outside radiation oncology. To determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI Google Inc.

  10. De impact van Big Data op Internationale Betrekkingen

    NARCIS (Netherlands)

    Zwitter, Andrej

    Big Data changes our daily lives, but does it also change international politics? In this contribution, Andrej Zwitter (NGIZ chair at Groningen University) argues that Big Data affects international relations in ways that we are only now beginning to understand. To comprehend how Big Data influences

  11. RA Research reactor, Annual report 1971

    International Nuclear Information System (INIS)

    Milosevic, D. et al.

    1971-12-01

    During 1971, the RA reactor was operated at its nominal power of 6.5 MW for 190 days, and for 50 days at lower power levels. Total production amounted to 31606 MWh, which is 5.3% higher than planned and the highest annual level since the reactor started operation. The reactor was used for irradiation and experiments according to the demands of 425 users, of whom 370 were from the Institute and 55 were external users. This report contains detailed data about reactor power and the experiments performed in 1971. Deviations from the operating plan, namely higher production in June and December, were due to special demands of the users. The total number of interruptions was lower than in any previous year; they were caused mainly by power cuts during reactor operation, and there were no long interruptions caused by equipment failures. There was only one scram shutdown during the year, caused by a false signal of the reactor control instrumentation. Shorter interruptions resulted from the breaking of connectors in the technical water pipe system caused by soil sliding near the pumping station on the Danube. The total personnel exposure dose was lower than in previous years. There was no accident nor any event that could be called accidental. Decontamination of surfaces was less than in previous years. It was concluded that the successful operation in 1971 resulted from efficient work during past years. However, some activities were interrupted due to undefined policy concerning the operation of the RA reactor and financial issues. These include the study of the possibility of using highly enriched fuel, which would increase the useful neutron flux and keep the reactor comparable with similar reactors over the next ten years. Another interrupted project is the construction of the emergency core cooling system, which is important for reactor safety. Financial problems have influenced not only the reactor operation but the number of employees, which could cause negative
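
    As a quick consistency check on the reported figures (our arithmetic, not the report's): 190 full-power days at 6.5 MW account for most of the stated 31606 MWh, and the remainder implies a plausible average power for the 50 reduced-power days:

    ```python
    # Plausibility check of the reported 1971 figures (not from the report):
    full_power_mwh = 190 * 24 * 6.5          # = 29,640 MWh at nominal power
    remainder = 31606 - full_power_mwh       # produced on the 50 low-power days
    avg_low_power_mw = remainder / (50 * 24)
    print(f"{full_power_mwh:.0f} MWh at full power; "
          f"low-power days averaged {avg_low_power_mw:.2f} MW")
    ```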

  12. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  13. Operational power reactor health physics

    International Nuclear Information System (INIS)

    Watson, B.A.

    1987-01-01

    Operational Health Physics can be comprised of a multitude of organizations, both corporate and at the plant sites. The following discussion centers around Baltimore Gas and Electric's (BG and E) Calvert Cliffs Nuclear Power Plant, located in Lusby, Maryland. Calvert Cliffs is a twin Combustion Engineering 825 MWe pressurized water reactor site, with Unit I having a General Electric turbine-generator and Unit II having a Westinghouse turbine-generator. Having just completed each Unit's ten-year Inservice Inspection and Refueling Outage, a total of 20 reactor-years of operating health physics experience has been accumulated at Calvert Cliffs. Because BG and E has only one nuclear site, most health physics functions are performed at the plant site. This is also true for the other BG and E nuclear-related organizations, such as Engineering and Quality Assurance. Utilities with multiple plant sites have a corporate health physics entity usually providing oversight to the various plant programs

  14. Big data and analytics strategic and organizational impacts

    CERN Document Server

    Morabito, Vincenzo

    2015-01-01

    This book presents and discusses the main strategic and organizational challenges posed by Big Data and analytics in a manner relevant to both practitioners and scholars. The first part of the book analyzes strategic issues relating to the growing relevance of Big Data and analytics for competitive advantage, which is also attributable to empowerment of activities such as consumer profiling, market segmentation, and development of new products or services. Detailed consideration is also given to the strategic impact of Big Data and analytics on innovation in domains such as government and education and to Big Data-driven business models. The second part of the book addresses the impact of Big Data and analytics on management and organizations, focusing on challenges for governance, evaluation, and change management, while the concluding part reviews real examples of Big Data and analytics innovation at the global level. The text is supported by informative illustrations and case studies, so that practitioners...

  15. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the Atlas detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  16. Toward a Literature-Driven Definition of Big Data in Healthcare

    Directory of Open Access Journals (Sweden)

    Emilie Baro

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with log(n × p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
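
    A minimal sketch of the volume criterion above, assuming the logarithm is base 10 (the record does not state the base); the dataset sizes in the example are hypothetical:

        import math

        def is_big_data(n: int, p: int, threshold: float = 7.0) -> bool:
            """Apply the volume criterion log10(n * p) >= threshold to a
            dataset with n statistical individuals and p variables."""
            return math.log10(n * p) >= threshold

        # Hypothetical examples: a 10,000-patient registry with 1,000 variables
        # reaches the limit exactly (log10(1e7) = 7); a 500 x 20 study does not.
        print(is_big_data(10_000, 1_000))  # True
        print(is_big_data(500, 20))        # False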

  17. Toward a Literature-Driven Definition of Big Data in Healthcare

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with log(n × p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488

  18. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  19. Breaking the Silence: Disordered Eating and Big Five Traits in College Men.

    Science.gov (United States)

    Dubovi, Abigail S; Li, Yue; Martin, Jessica L

    2016-11-01

    Men remain largely underrepresented in the eating disorder literature and few studies have investigated risk factors for disordered eating among men. The current study examined associations between Big Five personality traits and eating disorder symptoms in a sample of college men (N = 144). Participants completed the Eating Disorder Diagnostic Scale and Ten Item Personality Inventory online. Results suggested that openness was positively associated with purging-type behaviors and that emotional stability was positively related to symptoms of anorexia nervosa and global eating pathology. Findings highlight the prevalence of eating disorder symptoms among college men and suggest that these symptoms are associated with a different constellation of personality traits than is typically reported among women. Implications for targeted prevention and intervention programs and future research are discussed. © The Author(s) 2015.

  20. Big Data - What is it and why it matters.

    Science.gov (United States)

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers. © 2016 Health Libraries Group.

  1. Research on information security in big data era

    Science.gov (United States)

    Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin

    2018-05-01

    Big data is becoming another hotspot in the field of information technology, after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the era of big data. This paper analyzes the challenges to, and causes of, data security brought by big data, discusses the development trend of network attacks against the background of big data, and puts forward the authors' opinions on the development of security defense in technology, strategy and products.

  2. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. Thus, companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also covers the graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  3. Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Baljit Singh Khehra

    2015-03-01

    The fuzzy 2-partition entropy approach has been widely used to select threshold values for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of parameters of the membership functions such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on the Big Bang–Big Crunch Optimization (BBBCO) technique is proposed. The new thresholding approach is called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of parameters of the membership functions that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by a theory of the evolution of the universe, namely the Big Bang and Big Crunch Theory. The proposed algorithm is tested on a number of standard test images. For comparison, three other algorithms are also implemented: Genetic Algorithm (GA)-based, Biogeography-Based Optimization (BBO)-based and recursive approaches. From the experimental results, it is observed that the proposed algorithm is more effective than the GA-based, BBO-based and recursion-based approaches.
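
    For orientation, a minimal sketch of fuzzy 2-partition entropy thresholding; the piecewise-linear membership shape and the parameter names (a, c) are illustrative assumptions, and an exhaustive grid search stands in for the BBBCO optimizer used in the paper:

        import numpy as np

        def fuzzy_2partition_entropy(hist, a, c):
            """Entropy of the fuzzy 2-partition induced on a 256-bin grey-level
            histogram by a 'dark' membership that falls linearly from 1 to 0
            between grey levels a and c (illustrative membership shape)."""
            g = np.arange(256)
            mu_dark = np.clip((c - g) / (c - a), 0.0, 1.0)  # 1 below a, 0 above c
            p = hist / hist.sum()                           # grey-level probabilities
            p_dark = float(np.sum(p * mu_dark))
            probs = np.array([p_dark, 1.0 - p_dark])
            probs = probs[probs > 0]                        # treat 0*log(0) as 0
            return -float(np.sum(probs * np.log(probs)))

        def select_threshold(hist):
            """Search (a, c) for maximum entropy; BBBCO would explore the same
            space stochastically. The threshold is the 0.5-membership point."""
            best_h, best_a, best_c = -1.0, 0, 2
            for a in range(0, 254):
                for c in range(a + 2, 256):
                    h = fuzzy_2partition_entropy(hist, a, c)
                    if h > best_h:
                        best_h, best_a, best_c = h, a, c
            return (best_a + best_c) // 2

    A histogram such as np.histogram(img.ravel(), bins=256, range=(0, 256))[0] would supply hist; the double loop is the only part the paper replaces with BBBCO.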

  4. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  5. Stylometry and characterisation in The Big Bang Theory

    Directory of Open Access Journals (Sweden)

    Maryka van Zyl

    2016-11-01

    Dialogue is an important aspect of televisual character construction. Writers make linguistic choices on behalf of the characters, and these choices can lead viewers to identify a character with a specific stereotypical subculture or social group. This study investigates the linguistic construction of the character Sheldon Cooper in the CBS sitcom The Big Bang Theory. A cluster analysis tree of the speech of each of the five main characters in the first seven seasons (generated with the R script Stylo 0.6.0) indicates that the character Sheldon Cooper differs from the other main characters (Leonard, Penny, Howard and Rajesh) with respect to language style. These differences are investigated further by using the corpus analysis program WordSmith 6.0 to identify keywords and lexical bundles and to compare the use of active and passive verb constructions. Sheldon's choice of scientific or more formal words and his relative preference for the passive construction characterize his language style as explanatory rather than typical of colloquial speech.

  6. Addressing big data issues in Scientific Data Infrastructure

    NARCIS (Netherlands)

    Demchenko, Y.; Membrey, P.; Grosso, P.; de Laat, C.; Smari, W.W.; Fox, G.C.

    2013-01-01

    Big Data are becoming a new technology focus both in science and in industry. This paper discusses the challenges that are imposed by Big Data on the modern and future Scientific Data Infrastructure (SDI). The paper discusses the nature and definition of Big Data, which include such features as Volume,

  7. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    In everyday terms we call the current era the Modern Era, which in the field of Information Technology can also be named the era of Big Data. Our daily lives in today's world are advancing rapidly, never quenching one's thirst. The fields of science, engineering and technology are producing data at an exponential rate, leading to exabytes of data every day. Big data helps us to explore and re-invent many areas, not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis of the area of healthcare using big data and analytics. The main purpose is to emphasize not only the usage of the big data that is being stored all the time, helping to look back into history, but also its analysis, in order to improve medication and services. Although many big data implementations happen to be in-house developments, this proposed implementation aims at a broader extent using Hadoop, which happens to be just the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also focuses on the strengths and drawbacks compared to the conventional techniques available.
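
    As a concrete illustration of the Hadoop-based direction the abstract gestures at, a minimal Hadoop Streaming job in Python that counts records per diagnosis code; the input layout (tab-separated patient_id and diagnosis_code lines) and all names are hypothetical:

        import sys

        def mapper():
            # Emit (diagnosis_code, 1) for every record: patient_id \t diagnosis_code
            for line in sys.stdin:
                parts = line.rstrip("\n").split("\t")
                if len(parts) == 2:
                    print(f"{parts[1]}\t1")

        def reducer():
            # Hadoop Streaming delivers mapper output sorted by key, so counts
            # for one diagnosis code arrive contiguously and can be summed.
            current, count = None, 0
            for line in sys.stdin:
                key, value = line.rstrip("\n").split("\t")
                if key != current:
                    if current is not None:
                        print(f"{current}\t{count}")
                    current, count = key, 0
                count += int(value)
            if current is not None:
                print(f"{current}\t{count}")

        if __name__ == "__main__":
            mapper() if sys.argv[1] == "map" else reducer()

    Such a script would be submitted with the standard streaming jar, e.g. hadoop jar hadoop-streaming.jar -input /ehr/records -output /ehr/counts -mapper 'python3 job.py map' -reducer 'python3 job.py reduce' (paths and file names hypothetical).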

  8. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  9. About Big Data and its Challenges and Benefits in Manufacturing

    OpenAIRE

    Bogdan NEDELCU

    2013-01-01

    The aim of this article is to show the importance of Big Data and its growing influence on companies. It also shows what kind of big data is currently generated and how much big data is estimated to be generated. We can also see how much companies are willing to invest in big data and how much they are currently gaining from it. Also shown are some major influences that big data has on one major segment of the industry (manufacturing) and the challenges that appear.

  10. Nodalization effects on RELAP5 results related to MTR research reactor transient scenarios

    Directory of Open Access Journals (Sweden)

    Khedr Ahmed

    2005-01-01

    The present work deals with the analysis of RELAP5 results obtained from an evaluation study of the total loss of flow transient, with deficiency of the heat removal system, in a research reactor using two different nodalizations. It focuses on the effect of nodalization on the thermal-hydraulic evaluation of the research reactor. The analysis of the RELAP5 results has shown that nodalization has a big effect on the predicted scenario of the postulated transient. Therefore, great care should be taken during the nodalization of the reactor, especially when the available experimental or measured data are insufficient for making a complete qualification of the nodalization. Our analysis also shows that the simulation of the research reactor pool has a great effect on the evaluation of natural circulation flow and on other thermal-hydraulic parameters during the loss of flow transient. For example, the onset time of core boiling changes from less than 2000 s to 15000 s from the beginning of the transient if the pool is simulated by two vertical volumes instead of one vertical volume.

  11. The neutronics studies of a fusion fission hybrid reactor using pressure tube blankets

    International Nuclear Information System (INIS)

    Zheng Youqi; Zu Tiejun; Wu Hongchun; Cao Liangzhi; Yang Chao

    2012-01-01

    In this paper, a fusion-fission hybrid reactor for energy production is proposed, based on the situation of nuclear power in China. Pressurized light water is applied as the coolant. The fuel assemblies are loaded in pressure tubes with a modular structure. A neutronics analysis is performed to arrive at a suitable design and prove its feasibility. The energy multiplication and tritium self-sustainment are evaluated, and the neutron wall load is also considered. From different candidates, PWR spent fuel is selected as the feed fuel. The results show that the hybrid reactor can meet the expected reactor core lifetime of 5 years with 1000 MWe power output. Two ways are discussed: burning the discharged PWR spent fuel and burning the reprocessed plutonium. The energy multiplication is high enough and tritium can be self-sustaining in both cases. The neutron wall load during operation is kept smaller than that of ITER. Using the reprocessed plutonium brings a low neutron wall load, but also brings additional difficulties in operating the hybrid reactor. Using the discharged spent fuel is therefore proposed as the better choice currently.

  12. Big Data Management in US Hospitals: Benefits and Barriers.

    Science.gov (United States)

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

    Big data has been considered an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. By using big data, many hospitals have recognized that there are challenges, including lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.

  13. Gas cooled fast breeder reactors using mixed carbide fuel

    International Nuclear Information System (INIS)

    Kypreos, S.

    1976-09-01

    The fast reactors being developed at the present time use mixed oxide fuel, stainless-steel cladding and liquid sodium as coolant (LMFBR). Theoretical and experimental design work has also been done in the field of gas-cooled fast breeder reactors. The more advanced carbide fuel offers greater potential for developing fuel systems with doubling times in the range of ten years. The thermohydraulic and physics performance of a GCFR utilising this fuel is assessed. One question to be answered is whether helium is an efficient coolant to be coupled with the carbide fuel while preserving its superior neutronic performance. Also, an assessment of the fuel cycle cost in comparison to oxide fuel is presented. (Auth.)
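
    For context, the usual back-of-the-envelope relation behind such doubling-time figures (a textbook estimate, not taken from the report; the linear, non-compounded form is shown):

        \[
          t_D \;\approx\; \frac{M_0}{(BR - 1)\,\dot{M}_{\mathrm{fiss}}}
        \]

    where $M_0$ is the initial fissile inventory, $BR$ the breeding ratio and $\dot{M}_{\mathrm{fiss}}$ the fissile consumption rate at full power. Carbide fuel helps on both fronts: its higher heavy-metal density and harder spectrum raise $BR$, and its higher thermal conductivity permits higher ratings and thus a smaller specific inventory $M_0/P$.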

  14. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big Data presents a new method to reverse this trend and improve profitability. The benefits of Big Data and ...

  15. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints on the use of big data include the cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  16. Studies of thermal hydraulics and heat transfer in cascade subcritical molten salt reactor

    International Nuclear Information System (INIS)

    Aysen, E.M.; Sedov, A.A.; Subbotin, A.S.

    2005-01-01

    Full text of publication follows: The Cascade Subcritical Molten Salt Reactor (CSMSR) consists of three main parts: an accelerator-driven proton-bombarded target, a central zone and a peripheral zone. External neutrons generated as a result of the interaction of protons with the target nuclei are multiplied in the central zone and leak further into the peripheral reactor zone, where efficient burning of Minor Actinides dissolved in a molten fluoride salt composition takes place. The combination of target and two zones is designed so that the preset subcriticality of the reactor is not less than 1% of k_eff. A characteristic feature of the reactor is a high neutron flux density (2·10^15 n/cm²·s) in the central zone and target, and a very high volumetric power density (2000 - 6000 W/cm³) in all parts of the CSMSR. To ensure the workability of the core structures under such high power densities, it is necessary to impose strict limits on the temperatures and temperature gradients developed in the coolants and structures. For this reason, a computational design study was arranged to reveal the heat transfer problems in the coolant and core structures and to find a more appropriate variant of the core and target design, which is a compromise between contradictory requirements: provision of a high neutron flux and coolability of the core structures. In this paper, the results of studies of thermal hydraulics and heat transfer in the core zones and the proton-beam target are presented. Different variants of the target and central zone design, as well as the application of different kinds of coolants in them, are discussed, and the main problems of heat removal in their structures are analyzed. Multidimensional velocity and temperature fields obtained from thermal-hydraulics calculations for free flow of fuelled molten salt in the cylindrical-cavity peripheral CSMSR zone without internal structures are demonstrated. The role of turbulent exchange of momentum and heat for free flow in the
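
    To see why a power density of 2000 - 6000 W/cm³ forces such strict limits, consider a generic steady-state energy balance on a cooled channel (an illustrative estimate, not taken from the paper):

        \[
          q''' \, V \;=\; \dot{m}\, c_p \, \Delta T
        \]

    For fixed mass flow $\dot{m}$ and specific heat $c_p$, the coolant heat-up $\Delta T$ grows linearly with the volumetric power density $q'''$, so at several kW/cm³ only very high flow rates and strong turbulent mixing can keep the temperatures and temperature gradients within the stated limits.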

  17. [Relevance of big data for molecular diagnostics].

    Science.gov (United States)

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or development of new ones. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economy or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, also in individual patients, is generating personal big data and requires new strategies for management in order to develop data-driven individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  18. Big data in psychology: A framework for research advancement.

    Science.gov (United States)

    Adjerid, Idris; Kelley, Ken

    2018-02-22

    The potential for big data to provide value for psychology is significant. However, the pursuit of big data remains an uncertain and risky undertaking for the average psychological researcher. In this article, we address some of this uncertainty by discussing the potential impact of big data on the type of data available for psychological research, addressing the benefits and most significant challenges that emerge from these data, and organizing a variety of research opportunities for psychology. Our article yields two central insights. First, we highlight that big data research efforts are more readily accessible than many researchers realize, particularly with the emergence of open-source research tools, digital platforms, and instrumentation. Second, we argue that opportunities for big data research are diverse and differ both in their fit for varying research goals, as well as in the challenges they bring about. Ultimately, our outlook for researchers in psychology using and benefiting from big data is cautiously optimistic. Although not all big data efforts are suited for all researchers or all areas within psychology, big data research prospects are diverse, expanding, and promising for psychology and related disciplines. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  19. 'Big data' in pharmaceutical science: challenges and opportunities.

    Science.gov (United States)

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  20. The PRISM reactor as a possible option to deal with the british civilian plutonium stockpile

    Energy Technology Data Exchange (ETDEWEB)

    Fichtlscherer, Christopher [IANUS, TU Darmstadt (Germany); Friess, Friederike [IANUS, TU Darmstadt (Germany); ISR, Universitaet fuer Bodenkultur Wien (Boku) (Austria)

    2017-07-01

    Dealing with stocks of separated weapon-usable plutonium is a big challenge for modern society. This work focuses on the British civil plutonium stockpile, which amounts to 103.3 tons. One option is to irradiate the plutonium in a fast reactor under development, namely the GE PRISM reactor. The PRISM reactor is a small modular fast reactor with a thermal power of 840 MW and an electrical output of 311 MW. It is intended to use MOX fuel, and proponents claim that it would thus be possible to produce clean energy while rendering the plutonium proliferation resistant. An MCNP model of the reactor was built, and depletion calculations with different target burnups of the fuel were conducted to check whether the burned material would fulfil the Spent-Fuel Standard. In particular, it was checked whether the spent fuel is self-protecting, meaning that the dose rate does not fall below a limit of 1 Sv/h at 1 meter distance after a cooling period of 30 years. Based on the reactor model calculations, the irradiation time needed for the spent fuel to meet this limit is calculated. Based on the needed target burnup, it can be verified whether it is possible for the PRISM reactor to render the civil plutonium proliferation resistant in only 20 years, as is claimed by its proponents.
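
    A minimal sketch of the kind of self-protection check described above; the burnup-to-dose-rate table is purely hypothetical placeholder data (real values would come from the depletion and shielding calculations in the paper):

        import numpy as np

        # Hypothetical table: assembly burnup (MWd/kgHM) vs. dose rate (Sv/h)
        # at 1 m after 30 years of cooling. Placeholder values only.
        burnup = np.array([5.0, 10.0, 20.0, 40.0, 60.0])
        dose_rate_30y = np.array([0.2, 0.5, 1.2, 2.8, 4.5])

        LIMIT = 1.0  # Sv/h at 1 m: the self-protection criterion quoted above

        def min_self_protecting_burnup():
            """Interpolate the burnup at which the 30-year dose rate first
            reaches the 1 Sv/h limit (assumes monotonically increasing data)."""
            return float(np.interp(LIMIT, dose_rate_30y, burnup))

        print(f"minimum burnup: {min_self_protecting_burnup():.1f} MWd/kgHM")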

  1. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, in the sense of the streaming, continuous integration of large-volume and high-velocity data from all sources to final destinations. Big data ranges over data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering the big size of data, variety of data and frequent chan...

  2. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)². At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios
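
    Schematically, the expansion described in the abstract can be written as (a notational paraphrase, not the paper's own equations):

        \[
          \Phi(t,\mathbf{x};V) \;=\; \Phi^{(0)}(t,\mathbf{x})
            \;+\; \Big(\tfrac{V}{c}\Big)^{2}\, \Phi^{(1)}(t,\mathbf{x})
            \;+\; \mathcal{O}\!\Big(\big(\tfrac{V}{c}\big)^{4}\Big)
        \]

    with the four-dimensional effective theory reproducing $\Phi^{(0)}$ but failing at $\Phi^{(1)}$, where the growing and decaying modes mix.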

  3. The design rationale of the Integral Fast Reactor (IFR)

    International Nuclear Information System (INIS)

    Wade, D.C.; Hill, R.N.

    1997-01-01

    The Integral Fast Reactor (IFR) concept has been developed over the last ten years to provide technical solutions to perceptual concerns associated with nuclear power. Beyond the traditional advanced reactor objectives of increased safety, improved economy and more efficient fuel utilization, the IFR is designed to simplify waste disposal and increase resistance to proliferation. Only a fast reactor with an efficient recycle technology can provide for total consumption of actinides. The basic physics governing reactor design dictates that, for efficient recycle, the fuel form should be limited in burnup only by radiation damage to fuel cladding. The recycle technology must recover essentially all actinides. In a fast reactor, not all fission products need to be removed from the recycled fuel, and there is no need to produce pure plutonium. Recovery, recycle, and ultimate consumption of all actinides resolves several waste-disposal concerns. The IFR can be configured to achieve safe passive response to any of the traditional postulated reactor accident initiators, and can be configured for a variety of power output levels. Passive heat removal is achieved by use of a large inventory sodium coolant and a physical configuration that emphasizes natural circulation. An IFR can be designed to consume excess fissile material, to produce a surplus, or to maintain inventory. It appears that commercial designs should be economically competitive with other available alternatives. (author)

  4. [Big data and their perspectives in radiation therapy].

    Science.gov (United States)

    Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste

    2017-02-01

    The concept of big data indicates a change of scale in the use of data and data aggregation into large databases through improved computer technology. One of the current challenges in the creation of big data in the context of radiation therapy is the transformation of routine care items into dark data, i.e. data not yet collected, and the fusion of databases collecting different types of information (dose-volume histograms and toxicity data for example). Processes and infrastructures devoted to big data collection should not impact negatively on the doctor-patient relationship, the general process of care or the quality of the data collected. The use of big data requires a collective effort of physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond, oncology. Big data involve a new culture to build an appropriate infrastructure legally and ethically. Processes and issues are discussed in this article. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  5. Current applications of big data in obstetric anesthesiology.

    Science.gov (United States)

    Klumpner, Thomas T; Bauer, Melissa E; Kheterpal, Sachin

    2017-06-01

    The narrative review aims to highlight several recently published 'big data' studies pertinent to the field of obstetric anesthesiology. Big data has been used to study rare outcomes, to identify trends within the healthcare system, to identify variations in practice patterns, and to highlight potential inequalities in obstetric anesthesia care. Big data studies have helped define the risk of rare complications of obstetric anesthesia, such as the risk of neuraxial hematoma in thrombocytopenic parturients. Also, large national databases have been used to better understand trends in anesthesia-related adverse events during cesarean delivery as well as outline potential racial/ethnic disparities in obstetric anesthesia care. Finally, real-time analysis of patient data across a number of disparate health information systems through the use of sophisticated clinical decision support and surveillance systems is one promising application of big data technology on the labor and delivery unit. 'Big data' research has important implications for obstetric anesthesia care and warrants continued study. Real-time electronic surveillance is a potentially useful application of big data technology on the labor and delivery unit.

  6. Outline of research program on thorium fuel supported by grant-in-aid for energy research of ministry of education, science and culture

    International Nuclear Information System (INIS)

    Shibata, Toshikazu

    1984-01-01

    Since 1980, the Research Program on Thorium Fuel has been performed with the support of the Grant-in-Aid for Energy Research of the Ministry of Education, Science and Culture of the Japanese Government, on a university basis involving several tens of professors. The main results have been published in the English-language report ''Research on Thorium Fuel (SPEY-9, 1984)''. This report describes the outline and a review of the symposium held on January 31, 1984. It covers nuclear data, reactor physics, thorium fuel, irradiation of thorium, the down-stream (back end of the fuel cycle), biological effects, molten salt reactor engineering and other topics. It has been the first attempt to perform such a big systematic cooperative study in the nuclear field on a university basis in Japan. (author)

  7. The Thermos program for nuclear reactors specialized in district heating

    International Nuclear Information System (INIS)

    Lerouge, B.

    1976-01-01

    Many studies have been made in France on the use of nuclear heat for district heating. After a brief account of the problems raised by the use of thermal waste from big nuclear power stations, the quantitative and qualitative needs of heating networks are analyzed and the Thermos project described. This is a very robust reactor of the pool type, with an output of 100MW, supplying low-pressure water at 100 deg C. The advantages from the aspects of safety and economy are described, and the present state of the project and its possible developments summarized [fr

  8. Volume and Value of Big Healthcare Data.

    Science.gov (United States)

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase in computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  9. Using Big Book to Teach Things in My House

    OpenAIRE

    Effrien, Intan; Lailatus, Sa’diyah; Nuruliftitah Maja, Neneng

    2017-01-01

    The purpose of this study was to determine students' interest in learning using big book media. A big book is an enlarged version of an ordinary book. The big book contains simple words and images that match the content of the sentences and their spelling. From this, researchers can gauge students' interest and the development of their knowledge, as well as train researchers to remain creative in developing learning media for students.

  10. Design and safety of the Sizewell pressurized water reactor

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    The Central Electricity Generating Board propose to build a pressurized water reactor at Sizewell in Suffolk. The PWR Task Force was set up in June 1981 to provide a communications centre for developing firm design proposals for this reactor. These were to follow the Standardized Nuclear Unit Power Plant System designed by Bechtel for the Westinghouse nuclear steam supply system for reactors built in the United States. Changes were required to the design to accommodate, for example, the use of two turbine generators and to satisfy British safety requirements. Differences exist between the British and American licensing procedures. In the UK the statutory responsibility for the safety of a nuclear power station rests unambiguously with the Generating Boards. In the U.S.A. the Nuclear Regulatory Commission issues detailed written instructions, which must be followed precisely. Much of the debate on the safety of nuclear power focuses on the risks of big nuclear accidents. It is necessary to explain to the public what, in a balanced perspective, the risks of accidents actually are. The long-term consequences can be presented in terms of reduction in life expectancy, increased chance of cancer or the equivalent pattern of compulsory cigarette smoking. (author)

  11. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  12. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  13. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  14. Russian RERTR program as a part of Joint US DOE-RF MINATOM collaboration on elimination of the threat connected to the use of HEU in research reactors

    International Nuclear Information System (INIS)

    Arkhangelsky, N.

    2002-01-01

    The Russian RERTR Program started at the end of the 70's; the final goal of the program is to eliminate supplies of HEU in fuel elements and assemblies for foreign research reactors that were designed according to Russian projects. Basic directions of the work include: completion of the development of fuel elements and assemblies on the basis of uranium dioxide; development of fuel on the basis of U-Mo alloy; and development of pin-type fuel elements. Fuel assemblies of the WWR-M2 type with LEU were developed and qualified for use in foreign research reactors that use this type of fuel assembly. These assemblies are ready for supply to several operating foreign research reactors. There are more than 20 sites in Eastern European countries, former Soviet republics and other countries that have big amounts of Russian-origin HEU in fresh and spent fuel. The problem of the shipment of SNF from research reactor sites is also very important for domestic Russian research reactors. For more than ten years from its beginning, the Russian RERTR program developed practically independently of the international RERTR program, and only at the beginning of the 90's did Russian specialists start to contact foreign scientists, after which the exchange of scientific information became more intensive. In September 1994, representatives of Minatom and DOE signed a protocol of intent to reduce the enrichment of uranium in research reactors. The main aspects of collaboration involve: several domestic Russian research reactors, such as WWR-M, IR-8 and others, which were investigated from the point of view of the possibility of reducing enrichment; and financial support of the program from US DOE, which is insufficient. An important part of the international collaboration is the import to Russia of Russian-origin spent and fresh fuel of research reactors. In August 2002 an impressive result of the Russian-American collaboration, with the support of the IAEA and with the help and assistance of the Yugoslavian side, was

  15. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    Science.gov (United States)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data Analytics is a big word nowadays. As data generation capabilities become more demanding and more scalable, data acquisition and storage become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend towards "big data as a service" is talked about everywhere these days. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost. On the other, researchers are working to solve security and other real-time problems of big data migration on cloud-based platforms. This article is specially focused on finding possible ways to migrate big data to the cloud. Technology which supports coherent data migration, and the possibility of doing big data analytics on a cloud platform, is in demand for a new era of growth. This article also gives information about available technologies and techniques for the migration of big data to the cloud.

  16. Hot big bang or slow freeze?

    Science.gov (United States)

    Wetterich, C.

    2014-09-01

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze - a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple "crossover model" without a big bang singularity. In the infinite past space-time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.
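
    One way to read the claim that the Newtonian attraction is unchanged while masses and the gravitational constant vary (a heuristic sketch, not the paper's derivation):

        \[
          F \;=\; \frac{G\, m_1 m_2}{r^2}, \qquad
          m_i \to \lambda(t)\, m_i, \quad G \to \lambda(t)^{-2} G
          \;\;\Rightarrow\;\; G\, m_1 m_2 \ \text{unchanged}
        \]

    Only dimensionless combinations such as $G m^2$ (in units with $\hbar = c = 1$) are observable, which is why the hot big bang and the slow freeze can describe the same physical reality.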

  17. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seed of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD). The analysis shows that (re)use of data in new contexts involves a multifaceted trade-off, not only between economic rationales and quality considerations, but also over control of sensitive personal data and the ethical implications for the citizen. In the DAMD case, data are on the one hand used "in the service of a good cause" to...

  18. Operational Experience with the TRIGA Mark II Reactor of the University of Pavia

    Energy Technology Data Exchange (ETDEWEB)

    Tigliole, A. Borio Di; Alloni, D.; Cagnazzo, M.; Coniglio, M.; Lana, F.; Losi, A.; Magrotti, G.; Manera, S.; Marchetti, F.; Pappalardo, P.; Prata, M.; Provasi, M.C.; Salvini, A.; Scian, G.; Vinciguerra, G. [University of Pavia, Laboratory of Applied Nuclear Energy (L.E.N.A), Via Aselli 41, 27100 Pavia (Italy)

    2011-07-01

    The Laboratory of Applied Nuclear Energy (LENA) is an Interdepartmental Research Centre of the University of Pavia which operates a 250 kW TRIGA Mark II Research Nuclear Reactor, a Cyclotron for the production of radioisotopes and other irradiation facilities. The reactor has been in operation since 1965, and many in-house upgrades were carried out over the past years in order to assure continuous operation of the reactor in the future. The annual reactor operational time at nominal power is in the range of 300 - 400 hours, depending upon the time schedule of experiments and research activities. The reactor is mainly used for NAA activities, BNCT research, sample irradiation and training. Specifically, a few tens of hours of reactor operation per year are dedicated to training courses for University students and for professionals. Besides, the LENA Centre hosts more than one thousand high school students on visits every year. Lately, LENA was certified ISO 9001:2008 for the ''operation and maintenance of the reactor'' and for the ''design and delivery of the irradiation service''. Nowadays the reactor is in a good technical state and, at the moment, there are no political or economic reasons to consider shutting the reactor down. (author)

  19. Curating Big Data Made Simple: Perspectives from Scientific Communities.

    Science.gov (United States)

    Sowe, Sulayman K; Zettsu, Koji

    2014-03-01

    The digital universe is exponentially producing an unprecedented volume of data that has brought benefits as well as fundamental challenges for enterprises and scientific communities alike. This trend is inherently exciting for the development and deployment of cloud platforms to support scientific communities curating big data. The excitement stems from the fact that scientists can now access and extract value from the big data corpus, establish relationships between bits and pieces of information from many types of data, and collaborate with a diverse community of researchers from various domains. However, despite these perceived benefits, to date, little attention is focused on the people or communities who are both beneficiaries and, at the same time, producers of big data. The technical challenges posed by big data are as big as understanding the dynamics of communities working with big data, whether scientific or otherwise. Furthermore, the big data era also means that big data platforms for data-intensive research must be designed in such a way that research scientists can easily search and find data for their research, upload and download datasets for onsite/offsite use, perform computations and analysis, share their findings and research experience, and seamlessly collaborate with their colleagues. In this article, we present the architecture and design of a cloud platform that meets some of these requirements, and a big data curation model that describes how a community of earth and environmental scientists is using the platform to curate data. Motivation for developing the platform, lessons learnt in overcoming some challenges associated with supporting scientists to curate big data, and future research directions are also presented.

  20. Big data analytics in healthcare: promise and potential.

    Science.gov (United States)

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however there remain challenges to overcome.

  1. Data warehousing in the age of big data

    CERN Document Server

    Krishnan, Krish

    2013-01-01

    Data Warehousing in the Age of Big Data will help you and your organization make the most of unstructured data with your existing data warehouse. As Big Data continues to revolutionize how we use data, it doesn't have to create more confusion. Expert author Krish Krishnan helps you make sense of how Big Data fits into the world of data warehousing in clear and concise detail. The book is presented in three distinct parts. Part 1 discusses Big Data, its technologies and use cases from early adopters. Part 2 addresses data warehousing, its shortcomings, and new architecture

  2. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently the Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  3. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

    Big Data analytics and Software Defined Networking (SDN) are helping to manage the extraordinary increase in data usage and in the computer processing power provided by Cloud Data Centres (CDCs). This new book investigates areas where Big Data and SDN can help each other in delivering more efficient services.

  4. The research reactor TRIGA Heidelberg II

    International Nuclear Information System (INIS)

    Maier-Borst, W.; Krauss, O.

    1988-01-01

    The reactor has been in operation since the beginning of 1978. On the basis of the working experience gathered during that time in employing the TRIGA for biomedical research, the irradiation units in particular have been extended or newly developed. Several TRIGA users have reported difficulties in using the rotary irradiation system, and it became obvious that the alternatives to the original Lazy Susan are not commonly known. In this report, the open rotary system fed by a hydraulic rabbit system, which has proved successful in this form during the past ten years, is presented

  5. Nuclear reactors in de-regulated markets: Integration between providers and customers?

    International Nuclear Information System (INIS)

    Girard, Philippe

    2006-01-01

    The deregulation of electricity markets has in most cases coincided with the end of state monopolies, under which financial risks were borne by customers/citizens. Today, despite an economic advantage, nuclear power development faces two main problems: public acceptance and the reticence of investors (banks, utility shareholders). The development of electricity markets provides various financial instruments for hedging financial risks, but it is currently difficult to fix forward contracts for more than three to four years, a period insufficient for financing a nuclear reactor. A solution could be the evolution of nuclear providers into nuclear operators selling electricity (MWh) rather than nuclear capacity (MW), nuclear fuel and services. In this case, their customers would be utilities and big consumers aiming to hedge part of their supplies with long-term contracts or stakes in nuclear reactors, while avoiding some of the constraints specific to nuclear operation. (author)

  6. FMDP Reactor Alternative Summary Report: Volume 3 - partially complete LWR alternative

    International Nuclear Information System (INIS)

    Greene, S.R.; Fisher, S.E.; Bevard, B.B.

    1996-09-01

    The Department of Energy Office of Fissile Materials Disposition (DOE/MD) initiated a detailed analysis activity to evaluate each of the ten plutonium disposition alternatives that survived an initial screening process. This document, Volume 3 of a four-volume report, summarizes the results of these analyses for the partially complete LWR (PCLWR) reactor-based plutonium disposition alternative

  7. FMDP Reactor Alternative Summary Report: Volume 3 - partially complete LWR alternative

    Energy Technology Data Exchange (ETDEWEB)

    Greene, S.R.; Fisher, S.E.; Bevard, B.B. [and others

    1996-09-01

    The Department of Energy Office of Fissile Materials Disposition (DOE/MD) initiated a detailed analysis activity to evaluate each of the ten plutonium disposition alternatives that survived an initial screening process. This document, Volume 3 of a four-volume report, summarizes the results of these analyses for the partially complete LWR (PCLWR) reactor-based plutonium disposition alternative.

  8. Big Data Analytics, Infectious Diseases and Associated Ethical Impacts

    OpenAIRE

    Garattini, C.; Raffle, J.; Aisyah, D. N.; Sartain, F.; Kozlakidis, Z.

    2017-01-01

    The exponential accumulation, processing and accrual of big data in healthcare are only possible through an equally rapidly evolving field of big data analytics. The latter offers the capacity to rationalize, understand and use big data to serve many different purposes, from improved services modelling to prediction of treatment outcomes, to greater patient and disease stratification. In the area of infectious diseases, the application of big data analytics has introduced a number of changes ...

  9. Evaluation of Data Management Systems for Geospatial Big Data

    OpenAIRE

    Amirian, Pouria; Basiri, Anahid; Winstanley, Adam C.

    2014-01-01

    Big Data encompasses the collection, management, processing and analysis of huge amounts of data that vary in type and change with high frequency. Often the data component of Big Data has a positional component as an important part of it, in various forms such as postal address, Internet Protocol (IP) address and geographical location. If the positional components in Big Data are extensively used in storage, retrieval, analysis, processing, visualization and knowledge discovery (geospatial Big Dat...
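
    The kind of positional component described above can be made concrete with a small sketch. The Python below attaches several location forms to one record and computes a great-circle distance of the sort spatial retrieval relies on; all field names, coordinates and the haversine helper are illustrative assumptions, not taken from the paper:

        import math

        # One record carrying several forms of positional component (values invented).
        record = {
            "payload": "sensor reading",
            "postal_address": "1 Example Street, Dublin",
            "ip_address": "192.0.2.17",        # documentation-range IP, i.e. a placeholder
            "location": (53.3498, -6.2603),    # (latitude, longitude) in degrees
        }

        def haversine_km(p, q):
            """Great-circle distance between two (lat, lon) pairs, in kilometres."""
            lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
            h = (math.sin((lat2 - lat1) / 2) ** 2
                 + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
            return 2 * 6371.0 * math.asin(math.sqrt(h))

        # Spatial retrieval then reduces to distance predicates on the location field:
        print(haversine_km(record["location"], (51.8985, -8.4756)))  # Dublin -> Cork, ~220 km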

  10. Ten years after the Chernobyl reactor accident: expected and detected health effects in the CIS

    International Nuclear Information System (INIS)

    Kellerer, A.M.

    1996-01-01

    The author explains the essential aspects of the actual or possible health effects of the reactor accident in the immediately affected areas. Radiation-induced injury to health has primarily manifested itself in thyroid tumors induced by the short-lived radio-iodine. It is possible that the long-lived fission products in the fallout will, in the long run, increase the incidence rate of cancer, especially leukemias, in the population, or have done so already; to date, this possible increase is of an order of magnitude not yet observable in the available statistical data. (orig.) [de]

  11. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  12. West Virginia's big trees: setting the record straight

    Science.gov (United States)

    Melissa Thomas-Van Gundy; Robert. Whetsell

    2016-01-01

    People love big trees, people love to find big trees, and people love to find big trees in the place they call home. Having been suspicious for years, my coauthor, the historian Rob Whetsell, approached me with a species identification challenge. There are several photographs of giant trees used by many people to illustrate the past forests of West Virginia,...

  13. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can bring to it. As a term, social customer relationship management is new and unfamiliar to many. The research is motivated by the scarcity of studies on the topic, the complete absence of Finnish-language research, and the potentially essential role of social customer relationship management in business operations in the future. Studies dealing with big data often concentrate on its technical side rather than on its applicati...

  14. D-branes in a big bang/big crunch universe: Misner space

    International Nuclear Information System (INIS)

    Hikida, Yasuaki; Nayak, Rashmi R.; Panigrahi, Kamal L.

    2005-01-01

    We study D-branes in a two-dimensional Lorentzian orbifold R^{1,1}/Γ with a discrete boost Γ. This space is known as Misner or Milne space, and includes a big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and of emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. The open string 2→2 scattering amplitude on a D1-brane is also computed and found to be less singular than in the closed string case
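
    For orientation, the geometry referred to here can be written out. A sketch of the standard Misner-space construction, with γ the boost parameter (conventions vary between references):

        \[
          ds^{2} = -2\,dx^{+}\,dx^{-},
          \qquad
          (x^{+},\,x^{-}) \;\sim\; \bigl(e^{2\pi\gamma}\,x^{+},\; e^{-2\pi\gamma}\,x^{-}\bigr),
        \]
        which in Milne coordinates $x^{\pm} = \tfrac{t}{\sqrt{2}}\,e^{\pm\psi}$ becomes
        \[
          ds^{2} = -dt^{2} + t^{2}\,d\psi^{2},
          \qquad
          \psi \;\sim\; \psi + 2\pi\gamma,
        \]

    so the compact direction degenerates at $t = 0$, which is the big crunch/big bang point the abstract refers to.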

  15. D-branes in a big bang/big crunch universe: Misner space

    Energy Technology Data Exchange (ETDEWEB)

    Hikida, Yasuaki [Theory Group, High Energy Accelerator Research Organization (KEK), Tsukuba, Ibaraki 305-0801 (Japan)]; Nayak, Rashmi R. [Dipartimento di Fisica and INFN, Sezione di Roma 2, 'Tor Vergata', Rome 00133 (Italy)]; Panigrahi, Kamal L. [Dipartimento di Fisica and INFN, Sezione di Roma 2, 'Tor Vergata', Rome 00133 (Italy)]

    2005-09-01

    We study D-branes in a two-dimensional Lorentzian orbifold R^{1,1}/Γ with a discrete boost Γ. This space is known as Misner or Milne space, and includes a big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and of emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. The open string 2→2 scattering amplitude on a D1-brane is also computed and found to be less singular than in the closed string case.

  16. Astroinformatics: the big data of the universe

    OpenAIRE

    Barmby, Pauline

    2016-01-01

    In astrophysics we like to think that our field was the originator of big data, back when it had to be carried around in big sky charts and books full of tables. These days, it's easier to move astrophysics data around, but we still have a lot of it, and upcoming telescope facilities will generate even more. I discuss how astrophysicists approach big data in general, and give examples from some Western Physics & Astronomy research projects. I also give an overview of ho...

  17. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. A two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 x 10^12/cc. Several arc lines connecting the two bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between the two main solar spots. Then the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. These bright spots then developed into a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated, and these developed into a two-ribbon flare. The time required for this growth was more than one hour. A bright arc connecting the two ribbons was seen; this arc may be a loop prominence system. (Kato, T.)

  18. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  19. Inflated granularity: Spatial “Big Data” and geodemographics

    Directory of Open Access Journals (Sweden)

    Craig M Dalton

    2015-08-01

    Full Text Available Data analytics, particularly the current rhetoric around “Big Data”, tends to be presented as new and innovative, emerging ahistorically to revolutionize modern life. In this article, we situate one branch of Big Data analytics, spatial Big Data, through a historical predecessor, geodemographic analysis, to help develop a critical approach to current data analytics. Spatial Big Data promises an epistemic break in marketing, a leap from targeting geodemographic areas to targeting individuals. Yet it inherits characteristics and problems from geodemographics, including a justification through the market, and a process of commodification through the black-boxing of technology. As researchers develop sustained critiques of data analytics and its effects on everyday life, we must do so with a grounding in the cultural and historical contexts from which data technologies emerged. This article and others (Barnes and Wilson, 2014) develop a historically situated, critical approach to spatial Big Data. This history illustrates connections to the critical issues of surveillance, redlining, and the production of consumer subjects and geographies. The shared histories and structural logics of spatial Big Data and geodemographics create the space for a continued critique of data analyses’ role in society.

  20. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

    In this report we describe the results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a showcase. This includes both

  1. The use of moving bed bio-reactor to laundry wastewater treatment

    Science.gov (United States)

    Bering, Sławomira; Mazur, Jacek; Tarnowski, Krzysztof; Janus, Magdalena; Mozia, Sylwia; Waldemar Morawski, Antoni

    2017-11-01

    A large laboratory-scale biological treatment test of real industrial wastewater, generated in a big industrial laundry, was conducted in the period May 2016-August 2016. The research, aimed at selecting a laundry wastewater treatment technology, included tests of a two-stage Moving Bed Bio Reactor (MBBR) with two reactors filled with Kaldnes K5 carriers (specific area: 800 m2/m3), operated under aerobic conditions. Operating on site in the laundry, the reactors were fed real wastewater from the laundry retention tank. The laundry wastewater, which contained mainly surfactants and impurities originating from washed fabrics, was supplemented with a urea solution to raise the nitrogen content and an acid solution to correct the pH. The daily flow of raw wastewater Qd was 0.6-0.8 m3/d. The determined wastewater quality indicators showed that a substantial decrease in pollutant content was reached: BOD5 by 94.7-98.1%, COD by 86.9-93.5%, and the sum of anionic and nonionic surfactants by 98.7-99.8%. After the start-up period, the quality of the purified wastewater meets the legal requirements regarding the standards for wastewater discharged to the environment.
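
    The percentages quoted above follow from the usual removal-efficiency formula, eta = (C_in - C_out) / C_in. A minimal Python sketch; the influent and effluent concentrations are invented placeholders, chosen only so that the BOD5 result lands inside the reported 94.7-98.1% range:

        def removal_efficiency(c_in, c_out):
            """Fractional pollutant removal between influent and effluent."""
            return (c_in - c_out) / c_in

        # Hypothetical BOD5 concentrations in mg O2/L (placeholders, not measured data).
        bod5_in, bod5_out = 1200.0, 40.0
        print(f"BOD5 removal: {removal_efficiency(bod5_in, bod5_out):.1%}")  # 96.7%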

  2. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data .... Concept Drifting: concept drifting means that the classifier .... transactions generated by a prefix tree structure. EstDec ...

  3. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  4. Thermal performance and efficiency of supercritical nuclear reactors

    International Nuclear Information System (INIS)

    Romney Duffey; Tracy Zhou; Hussam Khartabil

    2009-01-01

    The paper reviews the major advances and innovative aspects of the thermal performance of recent concepts for supercritical water-cooled nuclear reactors (SCWR). The concepts are based on the extensive experience in the thermal power industry with supercritical and ultra-supercritical boilers and turbines. The challenges and goals of increased efficiency, reduced cost, enhanced safety and co-generation have been pursued over the last ten years, and have resulted in both viable concepts and a vibrant, well-defined R and D effort. The supercritical concept has wide acceptance in industry, as it reflects standard engineering practices and current thermal plant technology that is already being deployed. The SCWR concept represents a continuous development of water-cooled reactor technology which utilizes the best and latest advances made in the thermal power industry. (author)
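
    The efficiency argument can be illustrated with a back-of-the-envelope Carnot bound. The outlet and sink temperatures below are rough assumed values, and real Rankine-cycle plants sit well below the bound (roughly 33-35% for current LWRs versus a targeted ~45% for SCWR designs):

        def carnot_bound(t_hot_c, t_cold_c=30.0):
            """Ideal Carnot efficiency for hot/cold reservoir temperatures in Celsius."""
            t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
            return 1.0 - t_cold / t_hot

        for label, t_out in [("LWR-like outlet, ~300 C", 300.0),
                             ("SCWR-like outlet, ~550 C", 550.0)]:
            print(f"{label}: Carnot bound = {carnot_bound(t_out):.1%}")

    Raising the coolant outlet temperature from about 300 C to about 550 C lifts the ideal bound from roughly 47% to 63%, which is the thermodynamic headroom the supercritical concept exploits.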

  5. Evolution of on-power fuelling machines on Canadian natural uranium power reactors

    International Nuclear Information System (INIS)

    Isaac, P.

    1984-10-01

    The evolution of the on-power fuel changing process and fuelling machines on CANDU heavy-water pressure-tube power reactors, from the first nuclear power demonstration plant, the 22 MWe NPD, to the latest plants now in design and development, is described. The high availability of CANDUs is largely dependent on on-power fuelling. The on-power fuelling performance record of the 16 operating CANDU reactors, covering the 22-year period since the first plant became operational, is given. It shows that on-power fuel changing with light (unshielded), highly mobile and readily maintainable fuelling machines has been a success. The fuelling machines have contributed very little to the incapability of the plants and have been a key factor in placing CANDUs in the top-ten list of world performance. Although fuel handling technology has reached a degree of maturity, refinements are continuing. A new single-ended fuel changing concept for horizontal reactors, now under development, is described. It has the potential to reduce capital and operating costs for small reactors and to increase the fuelling capability of possible large reactors of the future

  6. Toward a manifesto for the 'public understanding of big data'.

    Science.gov (United States)

    Michael, Mike; Lupton, Deborah

    2016-01-01

    In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article. © The Author(s) 2015.

  7. What do Big Data do in Global Governance?

    DEFF Research Database (Denmark)

    Krause Hansen, Hans; Porter, Tony

    2017-01-01

    Two paradoxes associated with big data are relevant to global governance. First, while promising to increase the capacities of humans in governance, big data also involve an increasingly independent role for algorithms, technical artifacts, the Internet of things, and other objects, which can reduce the control of human actors. Second, big data involve new boundary transgressions as data are brought together from multiple sources while also creating new boundary conflicts as powerful actors seek to gain advantage by controlling big data and excluding competitors. These changes are not just about new data sources for global decision-makers, but instead signal more profound changes in the character of global governance.

  8. Current status and ageing management of the Dalat Nuclear Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen Nhi Dien [Nuclear Research Institute, Dalat (Viet Nam)

    2000-10-01

    The Dalat Nuclear Research Reactor (DNRR) is a 500 kW swimming pool type reactor loaded with Soviet WWR-M2 fuel elements, moderated and cooled by light water. It was reconstructed and upgraded from the former 250 kW TRIGA Mark-II reactor built in 1963. The renovated reactor first reached criticality in November 1983 and has been in regular operation at nominal power since March 1984. The DNRR is operated mainly in continuous runs of 100 hrs every 4 weeks for radioisotope production, neutron activation analyses and other research purposes. The remaining time is devoted to maintenance work and to short runs for reactor physics studies. From its first start-up to the end of 1998, it totaled about 20,000 hrs of operation at nominal power. After ten years of operation, a general inspection and refurbishment of the reactor were carried out in the 1992-1996 period. In April 1994, refueling work was carried out, adding 11 fresh fuel elements to the reactor core. At present, the reactor operates with a 100-fuel-element configuration. A corrosion study has been carried out by visual inspection of the reactor pool tank and some other internal components which remain unchanged from the previous TRIGA reactor. The inspections were conducted with the assistance of experts from other countries. Some visual inspection results have been obtained, and the nature of the electrochemical corrosion and related aspects have gradually been identified. In this paper, the operational status of the Dalat reactor is presented, and some activities related to the ageing management of the reactor pool tank and its internal components are also discussed. (author)

  9. Current status and ageing management of the Dalat Nuclear Research Reactor

    International Nuclear Information System (INIS)

    Nguyen Nhi Dien

    2000-01-01

    The Dalat Nuclear Research Reactor (DNRR) is a 500 kW swimming pool type reactor loaded with Soviet WWR-M2 fuel elements, moderated and cooled by light water. It was reconstructed and upgraded from the former 250 kW TRIGA Mark-II reactor built in 1963. The renovated reactor first reached criticality in November 1983 and has been in regular operation at nominal power since March 1984. The DNRR is operated mainly in continuous runs of 100 hrs every 4 weeks for radioisotope production, neutron activation analyses and other research purposes. The remaining time is devoted to maintenance work and to short runs for reactor physics studies. From its first start-up to the end of 1998, it totaled about 20,000 hrs of operation at nominal power. After ten years of operation, a general inspection and refurbishment of the reactor were carried out in the 1992-1996 period. In April 1994, refueling work was carried out, adding 11 fresh fuel elements to the reactor core. At present, the reactor operates with a 100-fuel-element configuration. A corrosion study has been carried out by visual inspection of the reactor pool tank and some other internal components which remain unchanged from the previous TRIGA reactor. The inspections were conducted with the assistance of experts from other countries. Some visual inspection results have been obtained, and the nature of the electrochemical corrosion and related aspects have gradually been identified. In this paper, the operational status of the Dalat reactor is presented, and some activities related to the ageing management of the reactor pool tank and its internal components are also discussed. (author)

  10. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  11. 76 FR 7810 - Big Horn County Resource Advisory Committee

    Science.gov (United States)

    2011-02-11

    ..., Wyoming 82801. Comments may also be sent via e-mail to [email protected], with the words Big... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee...

  12. Hot big bang or slow freeze?

    Energy Technology Data Exchange (ETDEWEB)

    Wetterich, C.

    2014-09-07

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  13. Hot big bang or slow freeze?

    International Nuclear Information System (INIS)

    Wetterich, C.

    2014-01-01

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe

  14. Hot big bang or slow freeze?

    Directory of Open Access Journals (Sweden)

    C. Wetterich

    2014-09-01

    Full Text Available We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  15. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in the Big Data domain. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.
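
    As a quick check on the growth claim, a 44-fold rise between 2009 and 2020 implies a compound annual growth rate of about 41%; a short Python sketch, assuming the factor compounds uniformly over the 11-year span:

        # 44x growth over the 11 years from 2009 to 2020 (assumed uniform compounding).
        growth_factor, years = 44.0, 2020 - 2009
        cagr = growth_factor ** (1.0 / years) - 1.0
        print(f"Implied growth rate: {cagr:.1%} per year")  # about 41% per year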

  16. Pre-big bang cosmology and quantum fluctuations

    International Nuclear Information System (INIS)

    Ghosh, A.; Pollifrone, G.; Veneziano, G.

    2000-01-01

    The quantum fluctuations of a homogeneous, isotropic, open pre-big bang model are discussed. By solving exactly the equations for tensor and scalar perturbations we find that particle production is negligible during the perturbative Pre-Big Bang phase

  17. Simulation of Molten Salt Reactor dynamics

    International Nuclear Information System (INIS)

    Krepel, J.; Rohde, U.; Grundmann, U.

    2005-01-01

    The dynamics of the Molten Salt Reactor - one of the 'Generation IV' concepts - was studied in this paper. The graphite-moderated channel-type MSR was selected for the numerical simulation of a reactor with liquid fuel. MSR dynamics is very specific because of two physical peculiarities of a liquid-fuelled reactor: the delayed neutron precursors are drifted by the fuel flow, and the fission energy is released immediately and directly into the coolant. At present, there are not many accessible numerical codes appropriate for MSR simulation; therefore, the DYN3D-MSR code was developed on the basis of the FZR in-house code DYN3D. It allows the calculation of full 3D transient neutronics in combination with parallel channel-type thermal-hydraulics. By means of DYN3D-MSR, several transients typical for a liquid-fuel system were analyzed. These transients were initiated by reactivity insertion, by overcooling of the fuel at the core inlet, by fuel pump start-up or coast-down, or by blockage of selected fuel channels. In the considered transients, the response of the MSR is characterized by an immediate change of the fuel temperature with changing power and a fast negative temperature feedback on the power. The response through the graphite temperature is slower. Furthermore, for big MSR cores fueled with U233, the graphite feedback coefficient can be positive; in this case, the addition of erbium to the graphite can ensure the inherent safety features. The DYN3D-MSR code has been shown to be an effective tool for MSR dynamics studies. (author)
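
    The precursor-drift peculiarity can be illustrated with a one-group point-kinetics toy model in which precursors leave the core with the fuel flow and return after a loop delay, decayed in transit. This is a deliberately simplified Python sketch with invented parameters; it is not the DYN3D-MSR model:

        import math

        BETA = 0.0065      # total delayed-neutron fraction (assumed)
        GEN_TIME = 5.0e-5  # neutron generation time Lambda [s] (assumed)
        DECAY = 0.08       # one-group precursor decay constant [1/s] (assumed)
        TAU_CORE = 4.0     # fuel residence time in the core [s] (assumed)
        TAU_LOOP = 8.0     # fuel transit time through the external loop [s] (assumed)

        def power_after_step(rho, t_end=60.0, dt=1.0e-3):
            """Euler integration of one-group point kinetics with precursor drift."""
            n = 1.0                               # normalized power
            c = BETA / (GEN_TIME * DECAY)         # static-fuel equilibrium precursors
            loop = [c] * int(TAU_LOOP / dt)       # ring buffer: precursors in transit
            for step in range(int(t_end / dt)):
                i = step % len(loop)
                c_back = loop[i] * math.exp(-DECAY * TAU_LOOP)  # decayed on return
                dn = ((rho - BETA) / GEN_TIME) * n + DECAY * c
                dc = ((BETA / GEN_TIME) * n - DECAY * c
                      - c / TAU_CORE + c_back / TAU_CORE)
                loop[i] = c                       # precursors leaving the core now
                n += dn * dt
                c += dc * dt
            return n

        # Because part of the delayed-neutron source decays outside the core, the same
        # reactivity step gives a larger power rise than it would with static fuel.
        print(power_after_step(rho=0.001))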

  18. Analysis of Big Data Maturity Stage in Hospitality Industry

    OpenAIRE

    Shabani, Neda; Munir, Arslan; Bose, Avishek

    2017-01-01

    Big data analytics has an extremely significant impact on many areas in all businesses and industries including hospitality. This study aims to guide information technology (IT) professionals in hospitality on their big data expedition. In particular, the purpose of this study is to identify the maturity stage of the big data in hospitality industry in an objective way so that hotels be able to understand their progress, and realize what it will take to get to the next stage of big data matur...

  19. A Multidisciplinary Perspective of Big Data in Management Research

    OpenAIRE

    Sheng, Jie; Amankwah-Amoah, J.; Wang, X.

    2017-01-01

    In recent years, big data has emerged as one of the prominent buzzwords in business and management. In spite of the mounting body of research on big data across the social science disciplines, scholars have offered little synthesis on the current state of knowledge. To take stock of academic research that contributes to the big data revolution, this paper tracks scholarly work's perspectives on big data in the management domain over the past decade. We identify key themes emerging in manageme...

  20. How Big Science Came to Long Island: the Birth of Brookhaven Lab (429th Brookhaven Lecture)

    International Nuclear Information System (INIS)

    Crease, Robert P.

    2007-01-01

    Robert P. Crease, historian for the U.S. Department of Energy's Brookhaven National Laboratory and Chair of the Philosophy Department at Stony Brook University, will give two talks on the Laboratory's history on October 31 and December 12. Crease's October 31 talk, titled 'How Big Science Came to Long Island: The Birth of Brookhaven Lab,' will cover the founding of the Laboratory soon after World War II as a peacetime facility to construct and maintain basic research facilities, such as nuclear reactors and particle accelerators, that were too large for single institutions to build and operate. He will discuss the key figures involved in starting the Laboratory, including Nobel laureates I.I. Rabi and Norman Ramsey, as well as Donald Dexter Van Slyke, one of the most renowned medical researchers in American history. Crease also will focus on the many problems that had to be overcome in creating the Laboratory and designing its first big machines, as well as the evolving relations of the Laboratory with the surrounding Long Island community and news media. Throughout his talk, Crease will tell fascinating stories about Brookhaven's scientists and their research.