WorldWideScience

Sample records for big ten reactor

  1. Gender Differences in Personality across the Ten Aspects of the Big Five

    OpenAIRE

    Weisberg, Yanna J.; DeYoung, Colin G.; Hirsh, Jacob B.

    2011-01-01

    This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness, and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscie...

  2. Gender differences in personality across the ten aspects of the Big Five

    OpenAIRE

Weisberg, Yanna J.; Hirsh, Jacob B.

    2011-01-01

    This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscien...

  3. Gender differences in personality across the ten aspects of the Big Five

    Directory of Open Access Journals (Sweden)

Weisberg, Yanna J.

    2011-08-01

    Full Text Available This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscientiousness, the gender differences were found to diverge at the aspect level, rendering them either small or undetectable at the Big Five level. These findings clarify the nature of gender differences in personality and highlight the utility of measuring personality at the aspect level.

  4. Simulation of FCC Riser Reactor Based on Ten Lump Model

    Directory of Open Access Journals (Sweden)

    Debashri Paul

    2015-07-01

Full Text Available The ten-lump strategy and reaction scheme are based on the concentrations of the various stocks, i.e., paraffins, naphthenes, aromatics, and aromatic substituent groups (paraffinic and naphthenic groups attached to aromatic rings). The developed model was implemented in the C++ programming language using the Runge-Kutta-Fehlberg method. At a space time of 4.5 s, the gasoline yield is predicted to be 72 mass % for naphthenic and 67 mass % for paraffinic feedstock. The type of feed determines the yields of gasoline and coke: a highly naphthenic charge stock gave the greatest gasoline yield among naphthenic, paraffinic, and aromatic charge stocks. In addition, the effects of space time and temperature on the yields of coke and gasoline and on the conversion of gas oil are presented, as is the effect of the catalyst-to-oil ratio.
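The ten-lump kinetics themselves are not reproduced in the abstract, so the following is only a minimal sketch of the same numerical approach: a hypothetical three-lump series scheme (gas oil to gasoline to coke + gas) integrated with a fixed-step fourth-order Runge-Kutta method standing in for the adaptive Runge-Kutta-Fehlberg scheme. The rate constants k1 and k2 are illustrative, chosen so that the gasoline yield at a space time of 4.5 s lands near the quoted 72 mass %.

```python
# Hypothetical 3-lump cracking model (gas oil -> gasoline -> coke + gas);
# a simplified stand-in for the paper's ten-lump scheme. k1, k2 are
# illustrative rate constants, not values from the paper.
def rates(y, k1=0.45, k2=0.05):
    gasoil, gasoline, coke = y
    r1 = k1 * gasoil          # gas oil cracking to gasoline
    r2 = k2 * gasoline        # gasoline overcracking to coke + gas
    return (-r1, r1 - r2, r2)

def rk4_step(y, h):
    # Classic fixed-step RK4 (the paper uses adaptive Runge-Kutta-Fehlberg).
    def add(a, b, s):
        return tuple(ai + s * bi for ai, bi in zip(a, b))
    k1 = rates(y)
    k2 = rates(add(y, k1, h / 2))
    k3 = rates(add(y, k2, h / 2))
    k4 = rates(add(y, k3, h))
    return tuple(y[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                 for i in range(3))

def integrate(space_time, h=0.01):
    y = (1.0, 0.0, 0.0)       # mass fractions at the riser inlet
    t = 0.0
    while t < space_time:
        y = rk4_step(y, h)
        t += h
    return y

gasoil, gasoline, coke = integrate(4.5)   # space time of 4.5 s, as in the abstract
print(f"gasoline mass fraction: {gasoline:.3f}")
```

Because the three rate terms sum to zero, total mass is conserved at every step, which is a quick internal check on the integration.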

  5. Ten aspects of the Big Five in the Personality Inventory for DSM-5.

    Science.gov (United States)

    DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R

    2016-04-01

    Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. PMID:27032017

  6. Neutrino-driven nucleon fission reactors: Supernovae, quasars, and the big bang

    International Nuclear Information System (INIS)

The purpose of this work is to establish the existence of naturally occurring celestial neutrino-driven nucleon fission chain-reaction reactors as the first step in the development of controlled nucleon fission reactors on Earth. Celestial nucleon fission reactors provide functioning models that serve as starting points for reactor development. Recognizing supernovae, quasars, and the Big Bang as functioning neutrino-driven nucleon fission reactors presents the nuclear industry with a new and significant challenge: to marshal the technological prowess to achieve a controlled nucleon fission chain reaction using the Earth's resources.

  7. Ten-year utilization of the Oregon State University TRIGA Reactor (OSTR)

    International Nuclear Information System (INIS)

The Oregon State University TRIGA Reactor (OSTR) has been used heavily throughout the past ten years to accommodate exclusively university research, teaching, and training efforts. Averages for the past nine years show that OSTR use time has been as follows: 14% for academic and special training courses; 44% for OSU research projects; 6% for non-OSU research projects; 2% for demonstrations for tours; and 34% for reactor maintenance, calibrations, inspections, etc. The OSTR has operated an average of 25.4 hours per week during this nine-year period. Each year, about 20 academic courses and 30 different research projects use the OSTR. Visitors to the facility average about 1,500 per year. No commercial irradiations or services have been performed at the OSTR during this period. Special operator training courses are given at the OSTR at the rate of at least one per year. (author)

  8. TenBig Bangs” in Theory and Practice that Have Made a Difference to Australian Policing in the Last Three Decades

    Directory of Open Access Journals (Sweden)

    Rick Sarre

    2016-05-01

    Full Text Available This paper discusses what could be considered the top ten innovations that have occurred in policing in the last thirty years. The intent is to focus attention on how practice could be further inspired by additional innovation. The innovations are discussed here as “Big Bangs” as a way of drawing attention to the significant impact they have had on policing, in the same way that the cosmological Big Bang was an important watershed event in the universe’s existence. These ten policing innovations ushered in, it is argued, a new mindset, pattern or trend, and they affected Australian policing profoundly; although many had their roots in other settings long before Australian policy-makers implemented them.

  9. Innovations and Enhancements for a Consortium of Big-10 University Research and Training Reactors. Final Report

    International Nuclear Information System (INIS)

    The Consortium of Big-10 University Research and Training Reactors was by design a strategic partnership of seven leading institutions. We received the support of both our industry and DOE laboratory partners. Investments in reactor, laboratory and program infrastructure, allowed us to lead the national effort to expand and improve the education of engineers in nuclear science and engineering, to provide outreach and education to pre-college educators and students and to become a key resource of ideas and trained personnel for our U.S. industrial and DOE laboratory collaborators.

  10. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  11. Big ambitions for small reactors as investors size up power options

    International Nuclear Information System (INIS)

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  12. Ten years of TRIGA reactor research at the University of Texas

    International Nuclear Information System (INIS)

The 1 MW TRIGA Research Reactor at the Nuclear Engineering Teaching Laboratory (NETL) is the second TRIGA at the University of Texas at Austin (UT). A small TRIGA Mark I (10 kW in 1963, 250 kW in 1968) was housed in the basement of the Engineering Building until it was shut down and decommissioned in 1989. The new TRIGA Mark II, with a licensed power of 1.1 MW, reached initial criticality in 1992. Prior to 1990, reactor research at UT usually consisted of projects requiring neutron activation analysis (NAA), but the step up to a much larger reactor with neutron-beam capability required additional personnel to build the neutron research program. The TCNS is currently used to perform prompt gamma activation analysis to determine hydrogen and boron concentrations in various composite materials. The early 1990s were a very active period for neutron-beam projects at the NETL. In addition to the TCNS, a real-time neutron radiography facility (NIF) and a high-resolution neutron depth profiling facility (NDP) were installed in two separate beam ports. The NDP facility was most recently used to investigate alpha damage on stainless steel in support of the U.S. Nuclear Weapons Stewardship programs. In 1999, a sapphire beam filter was installed in the NDP system to reduce the fast-neutron flux at the sample location. A collaborative effort was started in 1997 between UT-Austin and the University of Texas at Arlington to build a reactor-based, low-energy positron beam (TIPS). Limited success in obtaining funding has placed the project on hold. The Nuclear and Radiation Engineering Program has grown rapidly, effectively doubling in size over the past 5 years, but years of low nuclear research funding, an overall stagnation in the U.S. nuclear power industry, and a pervasive public distrust of nuclear energy have caused a precipitous decline in many programs. Recently, the U.S. DOE has encouraged University Research Reactors (URR) in the U.S. to collaborate closely together by forming URR

  13. Ten years of IAEA cooperation with the Russian research reactor fuel return programme

    Energy Technology Data Exchange (ETDEWEB)

    Tozser, S.; Adelfang, P.; Bradley, E. [International Atomic Energy Agency, Vienna (Austria)

    2013-01-15

The Russian Research Reactor Fuel Return (RRRFR) Programme was launched in 2001. Over its duration, the programme successfully completed 43 safe shipments, totalling 1.6 tons of fresh and spent HEU fuel, from countries operating Russian-fuelled research reactors back to the country of origin. The IAEA has been a very active supporter of the RRRFR Programme since its inception. Under the auspices of the RRRFR Programme, the Agency has provided a broad range of technical advisory and organizational support for the HEU fuel repatriation, as well as training and advisory assistance supporting research reactor conversion from HEU to LEU. The presentation gives an overview of the RRRFR Programme's achievements, with special consideration of the IAEA contribution. These include an overview of the shipments' history in terms of fresh and spent fuel, as well as a summary of experience gained during the shipments' preparation and termination. The presentation focuses on the technical advisory support given by the IAEA during the programme's implementation, captures the consolidated knowledge of this unique international programme, and shares the most important lessons learned. (orig.)

  14. Innovations and enhancements in neutronic analysis of the Big-10 university research and training reactors based on the AGENT code system

    International Nuclear Information System (INIS)

Introduction. This paper summarizes salient aspects of the 'virtual' reactor system developed at Purdue University, emphasizing efficient neutronic modeling through AGENT (Arbitrary Geometry Neutron Transport), a deterministic neutron transport code. DOE's Big-10 Innovations in Nuclear Infrastructure and Education (INIE) Consortium was launched in 2002 to enhance scholarship activities pertaining to university research and training reactors (URTRs). Existing and next-generation URTRs are powerful campus tools for nuclear engineering as well as for a number of disciplines that include, but are not limited to, medicine, biology, material science, and food science. Advancing new computational environments for the analysis and configuration of URTRs is an important Big-10 INIE aim. Specifically, Big-10 INIE has pursued development of a 'virtual' reactor, an advanced computational environment to serve as a platform on which to build operations, utilization (research and education), and systemic analysis of URTR physics. The 'virtual' reactor computational system will integrate computational tools addressing the URTR core and near-core physics (transport, dynamics, fuel management and fuel configuration); thermal-hydraulics; beam-line, in-core and near-core experiments; instrumentation and controls; and confinement/containment and security issues. No such integrated computational environment currently exists. The 'virtual' reactor is designed to allow researchers and educators to configure and analyze their systems to optimize experiments, fuel locations for flux shaping, and detector selection and configuration. (authors)

  15. Extended burnup demonstration: reactor fuel program. Pre-irradiation characterization and summary of pre-program poolside examinations. Big Rock Point extended burnup fuel

    International Nuclear Information System (INIS)

This report is a resource document characterizing the 64 fuel rods being irradiated at the Big Rock Point reactor as part of the Extended Burnup Demonstration being sponsored jointly by the US Department of Energy, Consumers Power Company, Exxon Nuclear Company, and General Public Utilities. The program entails extending the exposure of standard BWR fuel to a discharge average of 38,000 MWD/MTU to demonstrate the feasibility of operating fuel of standard design to levels significantly above current limits. The fabrication characteristics of the Big Rock Point EBD fuel are presented along with measurements of rod length, rod diameter, pellet stack height, and fuel rod withdrawal force taken at poolside at burnups up to 26,200 MWD/MTU. A review of the fuel examination data indicates no performance characteristics which might restrict the continued irradiation of the fuel.

  16. Particles fluidized bed receiver/reactor with a beam-down solar concentrating optics: 30-kWth performance test using a big sun-simulator

    Science.gov (United States)

    Kodama, Tatsuya; Gokon, Nobuyuki; Cho, Hyun Seok; Matsubara, Koji; Etori, Tetsuro; Takeuchi, Akane; Yokota, Shin-nosuke; Ito, Sumie

    2016-05-01

A novel concept of a particle fluidized bed receiver/reactor with beam-down solar concentrating optics was tested using a 30-kWth windowed receiver and a big sun-simulator. A fluidized bed of quartz sand particles was created by passing air through the bottom distributor of the receiver, and about 30 kWth of high-flux visible light from the sun-simulator's 19 xenon-arc lamps was directly irradiated onto the top of the fluidized bed through a quartz window. The particle bed temperature at the center of the fluidized bed reached 1050-1200°C under the visible-light irradiation, with an average heat flux of about 950 kW/m², depending on the air flow rate. The output air temperature from the receiver reached 1000-1060°C.
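As a quick back-of-envelope check, taking only the two figures quoted in the abstract (30 kWth delivered at an average flux of about 950 kW/m²) gives the implied size of the irradiated spot on the bed:

```python
# Sanity check on the receiver numbers quoted in the abstract:
# spot area = power / average flux, assuming a circular spot.
import math

power_kw = 30.0          # thermal input from the 19-lamp sun-simulator
flux_kw_m2 = 950.0       # average heat flux on the bed surface

area_m2 = power_kw / flux_kw_m2
diameter_m = 2.0 * math.sqrt(area_m2 / math.pi)

print(f"irradiated area ~ {area_m2 * 1e4:.0f} cm^2, "
      f"spot diameter ~ {diameter_m * 100:.0f} cm")
```

The implied spot is roughly 0.03 m² (about 20 cm across), i.e., the flux and power figures in the abstract are mutually consistent for a bed-scale aperture.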

  17. Big Society, Big Deal?

    Science.gov (United States)

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  18. BigDog

    Science.gov (United States)

    Playter, R.; Buehler, M.; Raibert, M.

    2006-05-01

BigDog's goal is to be the world's most advanced quadruped robot for outdoor applications. BigDog is aimed at the mission of a mechanical mule - a category with few competitors to date: power-autonomous quadrupeds capable of carrying significant payloads, operating outdoors, with static and dynamic mobility, and fully integrated sensing. BigDog is about 1 m tall, 1 m long, and 0.3 m wide, and weighs about 90 kg. BigDog has demonstrated walking and trotting gaits, as well as standing up and sitting down. Since its creation in the fall of 2004, BigDog has logged tens of hours of walking, climbing and running time. It has walked up and down 25- and 35-degree inclines and trotted at speeds up to 1.8 m/s. BigDog has walked at 0.7 m/s over loose rock beds and carried over 50 kg of payload. We are currently working to expand BigDog's rough terrain mobility through the creation of robust locomotion strategies and terrain sensing capabilities.

  19. Integrated plant-safety assessment, Systematic Evaluation Program: Big Rock Point Plant (Docket No. 50-155)

    International Nuclear Information System (INIS)

    The Systematic Evaluation Program was initiated in February 1977 by the US Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  20. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

Full Text Available Demand for, and a spurt in, the collection and accumulation of data has coined the new term "Big Data". Accidentally, incidentally, and through the interactions of people, information, so-called data, is massively generated, and this big data must be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists, and a wide variety of other intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia, and every space where large groups of people leave digital traces and deposit data. Given the rise of Big Data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions, and its biases. Big data, which refers to data sets that are too big to be handled using existing database management tools, is emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology. Big data presents a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as solutions that can be applied to them. Big Data differs from other data in five characteristics: volume, variety, value, velocity, and complexity. The article will focus on some current and future cases of, and causes for, Big Data.

  1. Big Data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin;

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is...... to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations...... into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big...

  2. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel;

    2016-01-01

The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is...... to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations...... into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of sub-themes that each deserve dedicated scrutiny when studying the interplay......

  3. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo; Valentin, Finn

utilize a stochastic actor-oriented model (SAOM) to analyze both network-endogenous mechanisms and the individual agency driving the collaboration network, and further whether being a Big Ego in Big Science translates into increasing performance. Our findings suggest that the selection of collaborators is not based...... on preferential attachment, but rather on an assortativity effect, creating not merely a rich-gets-richer effect but an elitist network with high entry barriers. In this acclaimed democratic and collaborative environment of Big Science, the elite closes in on itself. We propose this tendency to be even......

  4. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  5. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional, challenging requirements on data management. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data, the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation, and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  6. Big Rock Point

    International Nuclear Information System (INIS)

The Big Rock Point Nuclear Plant is the second oldest operating nuclear power plant in the United States. Its 25-yr history is an embodiment of the history of commercial nuclear power. In some respects, its situation today - 5 yr past the midpoint of its design life - can provide operators of other nuclear plants a glimpse of where they will be in another decade. Construction on Big Rock Point began in 1960. It was completed just 2 1/2 yr later at a cost of $27 million. The plant is a General Electric (GE)-designed boiling water direct cycle, forced circulation, high power density reactor. Its construction was undertaken by Consumers Power under the third round of the U.S. Atomic Energy Commission's (AEC's) Power Demonstration Reactor Program. It was an advanced version of GE's Vallecitos boiling water reactor. The plant's fuel was GE's responsibility and, under contract with the AEC, it conducted a fuel research and development (R&D) program involving the plant. Although the plant was designed for research - its original electrical capacity was set at 50 MW(electric) - the unit was subsequently uprated to 69 MW(net electric). The original plant staff included only 44 people and minimal security. Mirroring the industry experience, the number of people on-site had quadrupled

  7. Big Science

    International Nuclear Information System (INIS)

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions

  8. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  9. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  10. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  11. Release plan for Big Pete

    International Nuclear Information System (INIS)

This release plan provides instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of "Big Pete," which was used in the removal of "Spacers" from the N-Reactor. Prior to performing surveys on the rear end portion of "Big Pete," it shall be cleaned (i.e., free of oil, grease, caked soil, and heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  12. Reactors

    International Nuclear Information System (INIS)

Purpose: To provide a spray cooling structure wherein the steam phase in a BWR reactor vessel can be sufficiently cooled and the upper cap and flanges of the vessel can be cooled rapidly while being kept out of direct contact with cold water. Constitution: An apertured shielding is provided parallel to, and spaced apart from, the inner wall surface at the upper portion of a reactor vessel equipped with a spray nozzle, and the lower end of the shielding and the inner wall of the vessel are closed to each other so as to store the cooling water. Upon spray cooling, cooling water jetting out from the nozzle cools the vapor phase in the vessel and then hits against the shielding. Most of the cooling water then falls as it is, while part of it enters through the apertures to the back of the shielding plate, abuts against stoppers, and falls down. The stoppers are formed in an inverted L shape so that the spray water does not come into direct contact with the inner wall of the vessel. (Horiuchi, T.)

  13. Big queues

    CERN Document Server

    Ganesh, Ayalvadi; Wischik, Damon

    2004-01-01

    Big Queues aims to give a simple and elegant account of how large deviations theory can be applied to queueing problems. Large deviations theory is a collection of powerful results and general techniques for studying rare events, and has been applied to queueing problems in a variety of ways. The strengths of large deviations theory are these: it is powerful enough that one can answer many questions which are hard to answer otherwise, and it is general enough that one can draw broad conclusions without relying on special case calculations.

  14. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications. The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  15. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  16. A matrix big bang

    International Nuclear Information System (INIS)

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  17. A Matrix Big Bang

    CERN Document Server

    Craps, B; Verlinde, E; Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  18. Experimental Study of Big Row Spacing Cultivation of Tomato Using Straw Biological Reactor Technology%应用秸秆生物反应堆技术大行距栽培番茄试验研究

    Institute of Scientific and Technical Information of China (English)

    王继涛; 张翔; 温学萍; 赵玮; 俞风娟; 汪金山

    2015-01-01

    应用秸秆生物反应堆技术能有效地改善设施内环境因素、减缓病害发生、提高产量效益,但此项技术在开沟过程中比较费工费力,为了降低秸秆生物反应堆技术劳动用工和生产投入,特开展秸秆生物反应堆技术大行距栽培番茄试验研究。结果表明:仅挖沟、埋秸秆、起垄、铺设滴管、定植环节比对照每公顷节省劳动用工35.7%,节约成本16810.5元/hm2,上市期提前5 d,产量增加26.68%,病虫害发病率明显降低。综合田间生长势及室内考种数据,建议在宁夏地区大面积推广应用秸秆生物反应堆技术大行距栽培番茄。%The application of the straw biological reactor technology can effectively improve the environmental factors within the facility, slow down the occurrence of disease, and improve yield and benefit. With this technology, however, the ditching process takes considerable work and effort. In order to reduce the labor and production inputs required by the technology, an experimental study on big row spacing cultivation of tomato using the straw biological reactor technology was conducted. The results showed that, compared with the control, in the links of ditching, straw burying, ridging, laying of drippers and planting alone, 35.7% of the labor per hectare and 16,810.5 yuan/hm2 of cost could be saved, the marketing time could be advanced by 5 days, the yield could be increased by 26.68%, and the incidence of pests and diseases was lowered significantly. Considering the comprehensive growth potential in the field and the indoor test data, it is suggested that big row spacing cultivation of tomato using the straw biological reactor technology be extended and applied over large areas in Ningxia.

  19. Big nuclear accidents

    International Nuclear Information System (INIS)

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  20. Big Data : Overview

    OpenAIRE

    Richa Gupta; Sunny Gupta; Anuradha Singhal

    2014-01-01

    Big data is data that exceeds the processing capacity of traditional databases. The data is too big to be processed by a single machine. New and innovative methods are required to process and store such large volumes of data. This paper provides an overview on big data, its importance in our live and some technologies to handle big data.

  1. Big Data: Overview

    OpenAIRE

    Gupta, Richa; Gupta, Sunny; Singhal, Anuradha

    2014-01-01

    Big data is data that exceeds the processing capacity of traditional databases. The data is too big to be processed by a single machine. New and innovative methods are required to process and store such large volumes of data. This paper provides an overview on big data, its importance in our live and some technologies to handle big data.

  2. Spinoffs of big nuclear projects

    International Nuclear Information System (INIS)

    Spinoffs have so far been discussed only in connection with space travel. It is well worth investigating whether big nuclear projects, such as the advanced reactor lines or the nuclear fuel cycle, also produce technical spinoffs. One misunderstanding should be cleared up right at the beginning: man did not travel to the moon to invent the teflon-coated frying pan. Nor is nuclear spinoff the actual purpose of the exercise. The high temperature reactor and the fast breeder reactor, or the closing of the nuclear fuel cycle, are justified independent goals of energy policy. However, if the overall benefit to the national economy of nuclear high technology is to be evaluated, the question of technical spinoff must also be considered. (orig.)

  3. Powers of ten

    CERN Multimedia

    Pyramid FILMS

    1977-01-01

    Powers of Ten is a 1977 short documentary film written and directed by Charles Eames and his wife, Ray. The film depicts the relative scale of the Universe in factors of ten (see also logarithmic scale and order of magnitude). The film begins with an aerial image of a man reclining on a blanket; the view is that of one meter across. The viewpoint, accompanied by expository voiceover, then slowly zooms out to a view ten meters across (or 10^1 m in standard form), revealing that the man is picnicking in a park with a female companion. The zoom-out continues, to a view of 100 meters (10^2 m), then 1 kilometer (10^3 m), and so on, increasing the perspective—the picnic is revealed to be taking place near Soldier Field on Chicago's waterfront—and continuing to zoom out to a field of view of 10^24 meters, or the size of the observable universe. The camera then zooms back in to the picnic, and then to views of negative powers of ten—10^-1 m (10 centimeters), and so forth, until we are viewing a carbon nucl...
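
    The film's scheme is simple to reproduce: one field-of-view width per power of ten. A minimal sketch (the exponent range is chosen to match the views mentioned above):

```python
def powers_of_ten(min_exp: int = -1, max_exp: int = 24):
    """Field-of-view widths, one per factor of ten, as (exponent, metres) pairs."""
    return [(exp, 10.0 ** exp) for exp in range(min_exp, max_exp + 1)]

views = dict(powers_of_ten())
# e.g. the exponent-3 view is 1 kilometre (1000 m) across,
# and the exponent-24 view stands in for the observable universe.
```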

  4. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  5. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  6. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  7. Powers of ten

    CERN Document Server

    Innocenti, Pier Giorgio

    1979-01-01

    Powers of Ten is a 1977 short documentary film written and directed by Charles Eames and his wife, Ray. The film depicts the relative scale of the Universe in factors of ten (see also logarithmic scale and order of magnitude). The idea for the film appears to have come from the 1957 book Cosmic View by Kees Boeke. The film begins with an aerial image of a man reclining on a blanket; the view is that of one meter across. The viewpoint, accompanied by expository voiceover, then slowly zooms out to a view ten meters across (or 10^1 m in standard form), revealing that the man is picnicking in a park with a female companion. The zoom-out continues, to a view of 100 meters (10^2 m), then 1 kilometer (10^3 m), and so on, increasing the perspective—the picnic is revealed to be taking place near Soldier Field on Chicago's waterfront—and continuing to zoom out to a field of view of 10^24 meters, or the size of the observable universe. The camera then zooms back in to the picnic, and then to views of negative pow...

  8. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  9. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  10. Mining "big data" using big data services

    OpenAIRE

    Reips, UD; Matzat, U Uwe

    2014-01-01

    While many colleagues within science are fed up with the “big data fad”, empirical analyses we conducted for the current editorial actually show an inconsistent picture: we use big data services to determine whether there really is an increase in writing about big data or even widespread use of the term. Google Correlate (http://www.google.com/trends/correlate/), the first free tool we are presenting here, doesn’t list the term, showing that the number of searches for it is below an absolute min...

  11. From Big Crunch to Big Bang

    OpenAIRE

    Khoury, Justin; Ovrut, Burt A.; Seiberg, Nathan; Steinhardt, Paul J.(Princeton Center for Theoretical Science, Princeton University, Princeton, NJ, 08544, USA); Turok, Neil

    2001-01-01

    We consider conditions under which a universe contracting towards a big crunch can make a transition to an expanding big bang universe. A promising example is 11-dimensional M-theory in which the eleventh dimension collapses, bounces, and re-expands. At the bounce, the model can reduce to a weakly coupled heterotic string theory and, we conjecture, it may be possible to follow the transition from contraction to expansion. The possibility opens the door to new classes of cosmological models. F...

  12. Big fundamental groups: generalizing homotopy and big homotopy

    OpenAIRE

    Penrod, Keith

    2014-01-01

    The concept of big homotopy theory was introduced by J. Cannon and G. Conner using big intervals of arbitrarily large cardinality to detect big loops. We find, for each space, a canonical cardinal that is sufficient to detect all big loops and all big homotopies in the space.

  13. ANALYSIS OF BIG DATA

    OpenAIRE

    Anshul Sharma; Preeti Gulia

    2014-01-01

    Big Data is data that either is too large, grows too fast, or does not fit into traditional architectures. Within such data can be valuable information that can be discovered through data analysis [1]. Big data is a collection of complex and large data sets that are difficult to process and mine for patterns and knowledge using traditional database management tools or data processing and mining systems. Big Data is data whose scale, diversity and complexity require new architecture, technique...

  14. Matrix Big Brunch

    OpenAIRE

    Bedford, J; Papageorgakis, C.; Rodriguez-Gomez, D.; Ward, J.

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  15. The big bang

    International Nuclear Information System (INIS)

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson - the particle produced very soon after the big bang and which vanished from the Universe one-hundredth of a second after the big bang - and the fate of the Universe, are all discussed. (U.K.)

  16. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on the fact that we identify patterns in the data rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  17. Ten years of PAMELA

    Science.gov (United States)

    Spillantini, Piero

    2016-07-01

    The PAMELA experiment was designed as a cosmic-ray observatory at 1 AU, dedicated to the precise and high-statistics study of cosmic-ray fluxes over a three-decade energy range, from a few tens of MeV up to the several hundred GeV region. It is the last step of the 'Russian-Italian Mission' (RIM) program, born in 1992 between several Italian and Russian institutes and with the participation of the Royal Institute of Technology of Stockholm (Sweden) and the University of Siegen (Germany). Launched on 16 June 2006 from the Baikonur cosmodrome on board the Resurs-DK1 Russian satellite by a Soyuz rocket into an elliptical (350-610 km) quasi-polar orbit (70° inclination), it was activated on 21 June 2006 and has since been in continuous data-taking mode for ten years. The PAMELA program pays particular attention to the study of the energy spectra of particles (protons and electrons) and antiparticles (antiprotons and positrons). It also includes the search for possible signals of dark matter annihilation, the search for primordial antimatter (antihelium), the search for new matter in the Universe (strangelets?), and the study of cosmic-ray propagation, solar physics and solar modulation, and the terrestrial magnetosphere. This program is made possible thanks to the outstanding performance of the instrument, the low energy threshold, the quasi-polar orbit, and the 10-year duration of the observation. Protons and helium nuclei are the most abundant components of the cosmic radiation, and precise measurements of their fluxes allow understanding of the acceleration and propagation of cosmic rays in the Galaxy. Their spectral shapes cannot be well described by a single power law: at 230-240 GV they exhibit an abrupt spectral hardening. They challenge the current paradigm of cosmic-ray acceleration in supernova remnants followed by diffusive propagation in the Galaxy. Of paramount importance is the discovery of the anomalous increase of the positron flux at energies higher than 50 GeV (the so-called 'PAMELA anomaly').
    The review of

  18. The ten thousand Kims

    Science.gov (United States)

    Baek, Seung Ki; Minnhagen, Petter; Kim, Beom Jun

    2011-07-01

    In Korean culture, the names of family members are recorded in special family books. This makes it possible to follow the distribution of Korean family names far back in history. It is shown here that these name distributions are well described by a simple null model, the random group formation (RGF) model. This model makes it possible to predict how the name distributions change and these predictions are shown to be borne out. In particular, the RGF model predicts that for married women entering a collection of family books in a certain year, the occurrence of the most common family name 'Kim' should be directly proportional to the total number of married women with the same proportionality constant for all the years. This prediction is also borne out to a high degree. We speculate that it reflects some inherent social stability in the Korean culture. In addition, we obtain an estimate of the total population of the Korean culture down to the year 500 AD, based on the RGF model, and find about ten thousand Kims.
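
    The proportionality claim can be checked mechanically: fit a single constant c through the origin for kim_count ≈ c · total and compare it across years. A sketch with entirely hypothetical numbers (the real counts come from the family books themselves):

```python
def proportionality_constant(totals, kims):
    """Least-squares slope through the origin: best single c with kims ~ c * totals."""
    return sum(t * k for t, k in zip(totals, kims)) / sum(t * t for t in totals)

# Hypothetical yearly data: married women entering the books, and Kims among them.
totals = [1200, 2500, 4100, 5300]
kims = [264, 545, 902, 1166]
c = proportionality_constant(totals, kims)  # a single c close to 0.22 fits every year
```

    A constant c across all years, as here, is exactly the stability the RGF model predicts; real data would show some scatter around it.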

  19. The Ten Thousand Kims

    CERN Document Server

    Baek, Seung Ki; Kim, Beom Jun; 10.1088/1367-2630/13/7/073036

    2011-01-01

    In the Korean culture the family members are recorded in special family books. This makes it possible to follow the distribution of Korean family names far back in history. It is shown here that these name distributions are well described by a simple null model, the random group formation (RGF) model. This model makes it possible to predict how the name distributions change, and these predictions are shown to be borne out. In particular, the RGF model predicts that, for married women entering a collection of family books in a certain year, the occurrence of the most common family name "Kim" should be directly proportional to the total number of married women, with the same proportionality constant for all the years. This prediction is also borne out to a high degree. We speculate that it reflects some inherent social stability in the Korean culture. In addition, we obtain an estimate of the total population of the Korean culture down to the year 500 AD, based on the RGF model, and find about ten thousand Kims.

  20. The ten thousand Kims

    International Nuclear Information System (INIS)

    In Korean culture, the names of family members are recorded in special family books. This makes it possible to follow the distribution of Korean family names far back in history. It is shown here that these name distributions are well described by a simple null model, the random group formation (RGF) model. This model makes it possible to predict how the name distributions change and these predictions are shown to be borne out. In particular, the RGF model predicts that for married women entering a collection of family books in a certain year, the occurrence of the most common family name 'Kim' should be directly proportional to the total number of married women with the same proportionality constant for all the years. This prediction is also borne out to a high degree. We speculate that it reflects some inherent social stability in the Korean culture. In addition, we obtain an estimate of the total population of the Korean culture down to the year 500 AD, based on the RGF model, and find about ten thousand Kims.

  1. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  2. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a registration culture, and IT-competent employees and customers that make a leading position possible, but only if companies get ready for the next big data wave.

  3. Big Ideas in Art

    Science.gov (United States)

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  4. Big Bang Nucleosynthesis constraints on new physics

    International Nuclear Information System (INIS)

    Primordial Nucleosynthesis provides a probe of the physics of the early Universe when the temperature and particle densities are high. The Cosmic Nuclear Reactor may, thereby, lead to constraints on new physics which may be inaccessible to current accelerators. Current Big Bang Nucleosynthesis (BBN) bounds to the existence and/or properties of new particles are reviewed and used to constrain physics 'beyond the standard model.' (orig.)

  5. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled. PMID:26173222

  6. Big Data, Big Knowledge: Big Data for Personalized Healthcare.

    OpenAIRE

    Viceconti, M.; Hunter, P.; Hose, R.

    2015-01-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine soluti...

  7. Ten years after Chernobyl

    International Nuclear Information System (INIS)

    As was amply demonstrated during the EU/IAEA/WHO Summing-up-Conference in Vienna, Austria, April 8-12, 1996, the radiological consequences of the Chernobyl accident were, fortunately, not as serious as frequently presented in the media: 28 people died from acute radiation syndrome in 1986, 14 more of possibly radiation-related causes since. Of the <1000 thyroid cancers in children, 90 to 95% are curable. There have so far been no other demonstrable increases in the former Soviet Union, as well as in Western Europe, of leukemias, solid cancers, or genetic defects, nor are any to be expected in the future. Even among the 'liquidators' with doses ∼100 mSv, of the ∼150 additional expected leukemias during the 10 yr after the accident, none have been observed. The economical, social, and political consequences, however, both in the former Soviet Union and in Western Europe, have been very substantial. Whole countries developed an hysterical 'radiation sickness.' As A. Merkel, the German Minister of Environment and Reactor Safety, who chaired the conference, pointed out, 'the radiation sensitivity of societies far exceeds that of individuals.' It is obvious that important groups in Ukraine, Belarus, and Russia try to blame a large fraction of all economic, social, and health problems during the last decade, which are substantial (∼ 6 yr less life expectancy, twice the homicides and traffic deaths, increased alcoholism, and so forth), on radiation of the Chernobyl accident in an effort to attract more support. Western scientists refute such claims but admit large non-radiation-related problems caused by the accident

  8. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority. PMID:26218867

  9. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where classic systems for the collection, storage, processing, and visualization of data are losing the battle against the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.
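
    The units mentioned above are successive factors of 1000. A small helper makes the scale concrete (decimal SI prefixes assumed, not the binary kibi/mebi family):

```python
# Decimal SI byte units, each 1000x the previous.
UNITS = ["B", "kB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def humanize_bytes(n: float, base: float = 1000.0) -> str:
    """Express a byte count using the largest unit that keeps the value >= 1."""
    for unit in UNITS[:-1]:
        if n < base:
            return f"{n:.1f} {unit}"
        n /= base
    return f"{n:.1f} {UNITS[-1]}"  # yottabytes (or beyond)

print(humanize_bytes(1.5e21))  # 1.5 ZB
```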

  10. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  11. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

    Big data is a phenomenon that can no longer be ignored in our society. It has moved past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, so often mentioned in relation to big data, entail? By way of introduction to thi...

  12. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  13. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  14. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  15. Cloud Computing and Big Data Can Improve the Quality of Our Life

    OpenAIRE

    Chang, Victor

    2015-01-01

    The rise of Cloud Computing and Big Data has played influential roles in the evolution of IT services and has made significant contributions to different disciplines. For example, there are ten services that cannot be achieved without the combined effort from Cloud Computing and Big Data techniques: They are Storage as a Service, Health Informatics as a Service, Financial Software as a Service, Business Intelligence as a Service, Education as a Service, Big Data Processing as a Service, Integ...

  16. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore, we compare and contrast the two geometries throughout.

  17. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter and Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available in the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what big data is, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.
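The cluster-style analysis this record describes is, at heart, the map-reduce pattern: shard the data, count locally, merge the partial results. A minimal single-machine sketch (the tiny corpus is illustrative only; on a real cluster each map call would run on a separate worker):

```python
from collections import Counter
from functools import reduce

# Hypothetical mini-corpus standing in for a huge collection of posts.
documents = [
    "big data needs big clusters",
    "clusters discover patterns in data",
    "patterns in big data",
]

def map_phase(doc):
    # Map: each worker independently counts words in its own shard.
    return Counter(doc.split())

def reduce_phase(a, b):
    # Reduce: partial counts from the workers are merged pairwise.
    return a + b

partials = [map_phase(d) for d in documents]   # parallel on a real cluster
totals = reduce(reduce_phase, partials, Counter())

print(totals.most_common(3))
```

The same split/merge structure underlies frameworks such as Hadoop and Spark; only the scheduling and fault tolerance differ.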

  18. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  19. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along...

  20. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  1. Sharing big biomedical data

    OpenAIRE

    Toga, Arthur W.; Dinov, Ivo D.

    2015-01-01

    Background The promise of Big Biomedical Data may be offset by the enormous challenges in handling, analyzing, and sharing it. In this paper, we provide a framework for developing practical and reasonable data sharing policies that incorporate the sociological, financial, technical and scientific requirements of a sustainable Big Data dependent scientific community. Findings Many biomedical and healthcare studies may be significantly impacted by using large, heterogeneous and incongruent data...

  2. Testing Big Bang Nucleosynthesis

    OpenAIRE

    Steigman, Gary

    1996-01-01

    Big Bang Nucleosynthesis (BBN), along with the cosmic background radiation and the Hubble expansion, is one of the pillars of the standard, hot, big bang cosmology since the primordial synthesis of the light nuclides (D, $^3$He, $^4$He, $^7$Li) must have occurred during the early evolution of a universe described by this model. The overall consistency between the predicted and observed abundances of the light nuclides, each of which spans a range of some nine orders of magnitude, provides impr...

  3. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of the main works of Minsky, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  4. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  5. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance that the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so as to cover everything concerning the central topic of the research. The methodology included reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; presenting some of the NoSQL databases, which are those that allow the processing of data in unstructured formats; and showing the data models and the technologies for analyzing them, concluding with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research is a first step into the field of Big Data.

  6. Ten Smart Snacks for Teens

    Science.gov (United States)

    ... Ten Smart Snacks for Teens: Many adults think ... Help your teen choose healthy snacks using these smart ideas: Make a fruit pizza. Spread 2 tablespoons ...

  7. Ten new primitive binary trinomials

    Science.gov (United States)

    Brent, Richard P.; Zimmermann, Paul

    2009-06-01

    We exhibit ten new primitive trinomials over GF(2) of record degrees 24 036 583, 25 964 951, 30 402 457, and 32 582 657. This completes the search for the currently known Mersenne prime exponents.
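For intuition about what the search verifies, here is a brute-force primitivity check for small trinomials x^n + x^k + 1 over GF(2), with polynomials encoded as integer bitmasks. This is purely illustrative: it tests whether x generates all 2^n - 1 nonzero residues, which is hopelessly slow at the paper's record degrees, where specialized sieving and fast irreducibility tests are required.

```python
def order_of_x(n, k):
    """Multiplicative order of x in GF(2)[x]/(x^n + x^k + 1),
    polynomials encoded as integer bitmasks (bit i = coefficient of x^i)."""
    p = (1 << n) | (1 << k) | 1
    a, m = 1, 0
    while True:
        a <<= 1              # multiply by x
        if a >> n & 1:
            a ^= p           # reduce modulo the trinomial
        m += 1
        if a == 1:
            return m

def is_primitive_trinomial(n, k):
    # Primitive iff x generates the full multiplicative group
    # of 2^n - 1 nonzero elements.
    return order_of_x(n, k) == 2**n - 1

print(is_primitive_trinomial(7, 1))   # x^7 + x + 1 is primitive
print(is_primitive_trinomial(4, 2))   # x^4 + x^2 + 1 = (x^2 + x + 1)^2 is not
```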

  8. ABACC ten years applying safeguards

    International Nuclear Information System (INIS)

    The Argentinian-Brazilian Agency for Accounting and Control of nuclear special materials has been in operation for ten years. The rationale behind the creation of the Agency and the work it has performed during the last decade are described. (author)

  9. Ten Common First Aid Mistakes

    Science.gov (United States)

    ... Ten Common First Aid Mistakes: These days, there are countless resources ... We've listed some of the most common first aid mistakes below, along with the correct response methods. ...

  10. 2007 China Harbor Ten People

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The 2007 China Harbor Ten People award recognized the entrepreneurs who contributed greatly to the port economy and port enterprises this year through their management talent. These ten people embody social responsibility, professional skill, creative ability, and charming personality. With full confidence in China's port economy, these port entrepreneurs are brave enough to explore brand-new areas, so as to promote harbor economic development.

  11. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution. The massive involvement of lay publics as instrumented by social media breaks with the strong expert cultures that have underlain the production and use of data in modern organizations. It also sets apart the interactive and communicative processes by which social data is produced from sensor data and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim...

  12. Big Data and Peacebuilding

    Directory of Open Access Journals (Sweden)

    Sanjana Hattotuwa

    2013-11-01

    Full Text Available Any peace process is an exercise in the negotiation of big data. From centuries-old communal hagiography to the reams of official texts, media coverage and social media updates, peace negotiations generate data. Peacebuilding and peacekeeping today are informed by, often respond to, and contribute to big data. This is no easy task. As recently as a few years ago, before the term big data embraced the virtual on the web, what informed peace process design and implementation was in the physical domain – from contested borders and resources to background information in the form of text. The move from analogue, face-to-face negotiations to online, asynchronous, web-mediated negotiations – which can still include real world meetings – has profound implications for how peace is strengthened in fragile democracies.

  13. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. PMID:24183925

  14. Primordial Big Bang Nucleosynthesis

    OpenAIRE

    Olive, Keith A.

    1999-01-01

    Big Bang Nucleosynthesis is the theory of the production of the light element isotopes of D, He3, He4, and Li7. After a brief review of the essential elements of the standard Big Bang model at a temperature of about 1 MeV, the theoretical input and predictions of BBN are discussed. The theory is tested by the observational determinations of the light element abundances and the current status of these observations is reviewed. Concordance of the standard model and the related observations is f...

  15. Networks & big data

    OpenAIRE

    Litvak, Nelly; van der Meulen, P.

    2015-01-01

    Once a year, the NWO cluster Stochastics – Theoretical and Applied Research (STAR) organises a STAR Outreach Day, a one-day event around a theme that is of a broad interest to the stochastics community in the Netherlands. The last Outreach Day took place at Eurandom on 12 December 2014. The theme of the day was ‘Networks & Big Data’. The topic is very timely. The Vision document 2025 of the Platform Wiskunde Nederland (PWN) mentions big data as one of the six “major societal and scientific tre...

  16. The origin of the future ten questions for the next ten years

    CERN Document Server

    Gribbin, John

    2006-01-01

    How did the universe begin? Where do galaxies come from? How do stars and planets form? Where do the material particles we are made of come from? How did life begin? Today we have only provisional answers to such questions. But scientific progress will improve these answers dramatically over the next ten years, predicts John Gribbin in this riveting book. He focuses on what we know—or think we know—about ten controversial, unanswered issues in the physical sciences and explains how current cutting-edge research may yield solutions in the very near future. With his trademark facility for engaging readers with or without a scientific background, the author explores ideas concerning the creation of the universe, the possibility of other forms of life, and the fate of the expanding cosmos. He examines “theories of everything,” including grand unified theories and string theory, and he discusses the Big Bang theory, the origin of structure and patterns of matter in the galaxies, and dark mass and dark ene...

  17. Passport to the Big Bang moves across the road

    CERN Multimedia

    Corinne Pralavorio

    2015-01-01

    The ATLAS platform of the Passport to the Big Bang circuit has been relocated in front of the CERN Reception.   The ATLAS platform of the Passport to the Big Bang, outside the CERN Reception building. The Passport to the Big Bang platform of the ATLAS Experiment has been moved in front of the CERN Reception to make it more visible and accessible. It had to be dismantled and moved from its previous location in the garden of the Globe of Science and Innovation due to the major refurbishment work in progress on the Globe, and is now fully operational in its new location on the other side of the road, in the Main Reception car-park. The Passport to the Big Bang circuit, inaugurated in 2013, comprises ten platforms installed in front of ten CERN sites and aims to help local residents and visitors to the region understand CERN's research. Dedicated Passport to the Big Bang flyers, containing all necessary information and riddles for you to solve, are available at the CERN Rec...

  18. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  19. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our combined resource includes all necessary areas of Space for grades five to eight. Get the big picture about the Solar System, Galaxies and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, Comets, Asteroids, Meteoroids, Stars and Constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  20. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  1. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  2. Governing Big Data

    Directory of Open Access Journals (Sweden)

    Andrej J. Zwitter

    2014-04-01

    Full Text Available 2.5 quintillion bytes of data are created every day through pictures, messages, GPS data, etc. "Big Data" is seen simultaneously as the new Philosopher's Stone and Pandora's box: a source of great knowledge and power, but equally, the root of serious problems.

  3. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The big bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: Photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  4. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  5. Big is beautiful

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, J.P.

    2007-06-08

    Although big solar systems are both effective and architecturally pleasing, they are still not widespread in Germany. Recently, politicians reacted by improving funding conditions. In order to prevent planning errors, planners and fitters must be better trained, and standardisation of systems must be enhanced. (orig.)

  6. Big ideas: innovation policy

    OpenAIRE

    Van Reenen, John

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  7. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  8. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  9. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  10. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  11. A Novel Burnable Absorber Concept for PWR: BigT (Burnable Absorber-Integrated Guide Thimble)

    International Nuclear Information System (INIS)

    This paper presents the essential BigT design concepts and its lattice neutronic characteristics. Neutronic performance of a newly-proposed BA concept for PWR named BigT is investigated in this study. Preliminary lattice analyses of the BigT absorber-loaded WH 17x17 fuel assembly show a high potential of the concept as it performs relatively well in comparison with commercial burnable absorber technologies, especially in managing reactivity depletion and peaking factor. A sufficiently high control rod worth can still be obtained with the BigT absorbers in place. It is expected that with such performance and design flexibilities, any loading pattern and core management objective, including a soluble boron-free PWR, can potentially be fulfilled with the BigT absorbers. Future study involving full 3D reactor core simulations with the BigT absorbers shall hopefully verify this hypothesis. A new burnable absorber design for Pressurized Water Reactor (PWR) named 'Burnable absorber-Integrated control rod Guide Thimble' (BigT) was recently proposed. Unlike conventional burnable absorber (BA) technologies, the BigT integrates BA materials directly into the guide thimble but still allows insertion of control rod (CR). In addition, the BigT offers a variety of design flexibilities such that any loading pattern and core management objective can potentially be fulfilled

  12. A Novel Burnable Absorber Concept for PWR: BigT (Burnable Absorber-Integrated Guide Thimble)

    Energy Technology Data Exchange (ETDEWEB)

    Yahya, Mohdsyukri; Kim, Yonghee [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Chung, Chang Kyu [KEPCO Engineering and Construction Company, Daejeon (Korea, Republic of)

    2014-05-15

    This paper presents the essential BigT design concepts and its lattice neutronic characteristics. Neutronic performance of a newly-proposed BA concept for PWR named BigT is investigated in this study. Preliminary lattice analyses of the BigT absorber-loaded WH 17x17 fuel assembly show a high potential of the concept as it performs relatively well in comparison with commercial burnable absorber technologies, especially in managing reactivity depletion and peaking factor. A sufficiently high control rod worth can still be obtained with the BigT absorbers in place. It is expected that with such performance and design flexibilities, any loading pattern and core management objective, including a soluble boron-free PWR, can potentially be fulfilled with the BigT absorbers. Future study involving full 3D reactor core simulations with the BigT absorbers shall hopefully verify this hypothesis. A new burnable absorber design for Pressurized Water Reactor (PWR) named 'Burnable absorber-Integrated control rod Guide Thimble' (BigT) was recently proposed. Unlike conventional burnable absorber (BA) technologies, the BigT integrates BA materials directly into the guide thimble but still allows insertion of control rod (CR). In addition, the BigT offers a variety of design flexibilities such that any loading pattern and core management objective can potentially be fulfilled.

  13. Ten Rules of Academic Writing

    OpenAIRE

    Donovan, S.K.

    2011-01-01

    Creative writers are well served with 'how to' guides, but just how much do they help? And how might they be relevant to academic authors? A recent survey of writing tips by twenty-eight creative authors has been condensed to the ten most relevant to the academic, supported by some comments on methodology and applicability.

  14. Ten-dimensional Supergravity Revisited

    NARCIS (Netherlands)

    Bergshoeff, Eric; Roo, Mees de; Kerstan, Sven; Riccioni, Fabio; Diaz Alonso, J.; Mornas, L.

    2006-01-01

    We show that the existing supergravity theories in ten dimensions can be extended with extra gauge fields whose rank is equal to the spacetime dimension. These gauge fields have vanishing field strength but nevertheless play an important role in the coupling of supergravity to spacetime filling bra

  15. Ten Problems in Experimental Mathematics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.; Kapoor, Vishaal; Weisstein, Eric

    2004-09-30

    This article was stimulated by the recent SIAM ''100 Digit Challenge'' of Nick Trefethen, beautifully described in a recent book. Indeed, these ten numeric challenge problems are also listed in a recent book by two of the present authors, where they are followed by the ten symbolic/numeric challenge problems that are discussed in this article. Our intent was to present ten problems that are characteristic of the sorts of problems that commonly arise in ''experimental mathematics''. The challenge in each case is to obtain a high-precision numeric evaluation of the quantity and then, if possible, to obtain a symbolic answer, ideally one with proof. Our goal in this article is to provide solutions to these ten problems and, in the process, to present a concise account of how one combines symbolic and numeric computation, which may be termed ''hybrid computation'', in the process of mathematical discovery.
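    A toy illustration of the numeric-then-symbolic workflow the abstract describes (this example is ours, not one of the article's ten problems): evaluate a slowly converging sum accurately, then match the value against a candidate closed form.

    ```python
    import math
    from fractions import Fraction

    # Numeric stage: the partial sums of 1/n^2 converge slowly, so apply
    # the standard Euler-Maclaurin correction:
    #   sum_{n<=N} 1/n^2 + 1/N - 1/(2N^2) = pi^2/6 + O(N^-3).
    N = 1000
    s = sum(Fraction(1, n * n) for n in range(1, N + 1))  # exact rational sum
    estimate = float(s) + 1.0 / N - 1.0 / (2.0 * N * N)

    # "Symbolic" stage: test the numeric value against a conjectured constant.
    print(abs(estimate - math.pi ** 2 / 6) < 1e-8)  # -> True
    ```

    In practice the matching step uses integer relation algorithms such as PSLQ over a whole table of constants, but the pattern of high-precision evaluation followed by closed-form recognition is the same.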

  16. Nuclear Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hogerton, John

    1964-01-01

    This pamphlet describes how reactors work; discusses reactor design; and describes research, teaching, and materials-testing reactors; production reactors; reactors for electric power generation; reactors for supplying heat; reactors for propulsion; reactors for space; reactor safety; and the reactors of tomorrow. The appendix discusses the characteristics of U.S. civilian power reactor concepts and lists some of the U.S. reactor power projects, with location, type, capacity, owner, and startup date.

  17. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Asst. Prof. Shubhada Talegaon

    2014-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets, so the focus is now on techniques that make use of the complete data set instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis, and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to survey various techniques for analyzing such data.
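    One family of techniques that uses the complete data set rather than a sample is single-pass streaming computation; a minimal sketch using Welford's online algorithm (the function name is ours), which computes mean and variance without holding the data in memory:

    ```python
    def running_stats(stream):
        """Welford's one-pass algorithm: numerically stable mean and
        sample variance over an arbitrarily large iterable."""
        n, mean, m2 = 0, 0.0, 0.0
        for x in stream:
            n += 1
            delta = x - mean
            mean += delta / n
            m2 += delta * (x - mean)
        return mean, m2 / (n - 1) if n > 1 else 0.0

    # Works the same whether the stream is a list or billions of records:
    print(running_stats([1, 2, 3, 4]))  # -> (2.5, 1.6666666666666667)
    ```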

  18. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Prof. Shubhada Talegaon

    2015-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential to extract embedded knowledge from large amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets, so the focus is now on techniques that make use of the complete data set instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis, and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to survey various techniques for analyzing such data.

  19. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of the international development agenda to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development policies, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  20. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  1. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  2. Big Data Refinement

    OpenAIRE

    Boiten, Eerke Albert

    2016-01-01

    "Big data" has become a major area of research and associated funding, as well as a focus of utopian thinking. In the still growing research community, one of the favourite optimistic analogies for data processing is that of the oil refinery, extracting the essence out of the raw data. Pessimists look for their imagery to the other end of the petrol cycle, and talk about the "data exhausts" of our society. Obviously, the refinement community knows how to do "refining". This paper explores...

  3. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  4. Canonical Big Operators

    OpenAIRE

    Bertot, Yves; Gonthier, Georges; Ould Biha, Sidi; Pasca, Ioana

    2008-01-01

    In this paper, we present an approach to describe uniformly iterated “big” operations and to provide lemmas that encapsulate all the commonly used reasoning steps on these constructs. We show that these iterated operations can be handled generically using the syntactic notation and canonical structure facilities provided by the Coq system. We then show how these canonical big operations played a crucial enabling role in the study of various parts of linear algebra and multi-dimensional real a...

  5. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and readable. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  6. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and readable. Volume 7 treats, besides an introduction, many current aspects of quantum mechanics (e.g. teleportation) and electrodynamics (e.g. electrosmog), as well as the climate problem and chaos theory.

  7. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and readable. Volume 8 conveys in an understandable way relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  8. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented and readable. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  9. Think Small Go Big

    Institute of Scientific and Technical Information of China (English)

    汤维维

    2006-01-01

    Before its founding, the Vepoo company went through three start-up pivots. In their words, moving from "think big go small" to "think small go big" took a year. During that time they exhausted their initial seed funding; fortunately, at the last moment, they saw the first light of dawn.

  10. Ten propositions about liquidity crises

    OpenAIRE

    Claudio E. V. Borio

    2015-01-01

    What are liquidity crises? And what can be done to address them? This short paper brings together some personal reflections on this issue, largely based on previous work. In the process, it questions a number of commonly held beliefs that have become part of the conventional wisdom. The paper is organised around ten propositions that cover the following issues: the distinction between idiosyncratic and systematic elements of liquidity crises; the growing reliance on funding liquidity in a mar...

  11. Ten tendencies of criminal justice

    Institute of Scientific and Technical Information of China (English)

    HE Jiahong

    2007-01-01

    A study of the global tendencies of criminal justice will help us design a more scientific and rational pathway for the reform of the existing criminal justice system of China. Over the several hundred years to come, the world's criminal justice is to take on ten tendencies, that is, the tendencies toward unity, civilization, science, rule of law, human rights, justice, efficiency, specialization, standardization and harmony.

  12. Lagrange-Singularitäten

    OpenAIRE

    Sevenheck, Christian

    2003-01-01

    In this thesis, a deformation theory for Lagrangian singularities is developed. We define a complex of modules with a non-linear differential, the so-called Lagrange-de Rham complex, whose first cohomology is isomorphic to the space of infinitesimal Lagrangian deformations. We describe the relation of this complex to the theory of modules over the ring of differential operators. Information on the obstruction theory of Lagrangian deformations is obtained from the second cohomology ...

  13. Classical Propagation of Strings across a Big Crunch/Big Bang Singularity

    OpenAIRE

    Niz, Gustavo(Departamento de Física, Universidad de Guanajuato, DCI, Campus León, C.P. 37150, León, Guanajuato, México); Turok, Neil

    2006-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z_2, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suf...

  14. SETI as a part of Big History

    Science.gov (United States)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then we extrapolated even that into the future, up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we derive a Big History theory based on GBMs: just as the GBM is the “movie” unfolding in time, so the Statistical Drake Equation is its “still picture”, static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may easily be described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50
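    The GBM underlying this model is easy to simulate; a minimal sketch using the exact log-normal per-step update (parameter values here are illustrative, not fitted to the paper's model of species counts):

    ```python
    import math
    import random

    def gbm_path(n0, mu, sigma, t_total, steps, seed=0):
        """One sample path of Geometric Brownian Motion
        dN = mu*N dt + sigma*N dW, advanced with the exact
        log-normal update so the path is positive by construction."""
        rng = random.Random(seed)
        dt = t_total / steps
        n = n0
        path = [n]
        for _ in range(steps):
            dw = rng.gauss(0.0, math.sqrt(dt))
            n *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * dw)
            path.append(n)
        return path

    # With sigma = 0 the path reduces to deterministic exponential growth:
    print(gbm_path(1.0, 0.1, 0.0, 10.0, 100)[-1])  # ~ e = 2.718...
    ```

    The mean of many such paths grows as n0*exp(mu*t), which is the "increasing GBM" behaviour the abstract invokes for the number of living species.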

  15. The Concept of the Use of the Marine Reactor Plant in Small Electric Grids

    International Nuclear Information System (INIS)

    This report considers aspects of using marine nuclear reactors to supply small non-interconnected power systems, as well as isolated settlements and mining enterprises located in regions with an undeveloped infrastructure. Recently, small modular nuclear power plants have been proposed for these purposes. The plant power required for small electric grids lies between 1 and several tens of MWe. A module can be assembled and tested at a machine-building plant and then delivered ready-made to its working site by some means of transport, for instance a barge. After a certain period it can be transported to a repair shop, and at the end of operation to a storage site. Marine nuclear reactors, by virtue of their power, compactness, mass and size, are ideal prototypes for such modules. For instance, the floating power unit currently under construction, intended for operation in the Russian North, is based on the reactor plants of nuclear icebreakers. The reliability and safety of ship reactors are confirmed by their trouble-free operation over approximately 180 reactor-years. Unlike a big stationary nuclear plant operating in base-load mode, a power unit with a marine reactor is fully capable of load-following operation. In contrast to a nuclear icebreaker reactor, it is advisable to increase the core lifetime and to reduce the uranium enrichment. This requires fuel compositions with a higher uranium capacity and a redesign of the core; in particular, a transition from the channel-type core traditional for ship reactors to a cassette design is possible. Other directions for the evolution of ship reactors are possible that do not affect the basic, practice-proven design decisions but promote the self-protection properties of the plant; among them is a reduction of the volumetric power density of the core. (author)

  16. Integrated plant safety assessment. Systematic evaluation program, Big Rock Point Plant (Docket No. 50-155). Final report

    International Nuclear Information System (INIS)

    The Systematic Evaluation Program was initiated in February 1977 by the U.S. Nuclear Regulatory Commission to review the designs of older operating nuclear reactor plants to reconfirm and document their safety. The review provides (1) an assessment of how these plants compare with current licensing safety requirements relating to selected issues, (2) a basis for deciding how these differences should be resolved in an integrated plant review, and (3) a documented evaluation of plant safety when the supplement to the Final Integrated Plant Safety Assessment Report has been issued. This report documents the review of the Big Rock Point Plant, which is one of ten plants reviewed under Phase II of this program. This report indicates how 137 topics selected for review under Phase I of the program were addressed. It also addresses a majority of the pending licensing actions for Big Rock Point, which include TMI Action Plan requirements and implementation criteria for resolved generic issues. Equipment and procedural changes have been identified as a result of the review

  17. From Big Bang to Big Crunch and Beyond

    OpenAIRE

    Elitzur, S.; Giveon, A.; Kutasov, D.; Rabinovici, E.

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a ``big bang'' singularity, expands and then contracts to a ``big crunch'' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spaceti...

  18. Thinking small is big again

    Energy Technology Data Exchange (ETDEWEB)

    Butler, C.; Didsbury, R.; Strati, G.L.; Sur, B. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2012-12-15

    This article introduces a series of articles in the field of small reactors. Activity in this field has been on the upswing in North America. The major advantages of a small modular reactor are improved safety, lower cost, and lower investment risk. Canada has a long history of small reactor designs, for research reactors as well as power reactors; among these reactors are ZEEP, ZED-2, NRX, NRU, WR-1, SLOWPOKE and the CANDU power reactors.

  19. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth, as Eratosthenes did over 2,000 years ago. In Cosmos, Carl Sagan described the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through an online learning environment, the iCollaboratory (www.icollaboratory.org), where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas and solutions in a discussion forum. There is an ongoing database of student measurements and another database that collects data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
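    Eratosthenes' computation reduces to one proportion: the difference of the noon shadow angles at two sites on the same meridian is to 360° as the distance between the sites is to the full circumference. A sketch with his classic figures (about 7.2° between Alexandria and Syene, roughly 800 km apart):

    ```python
    def earth_circumference(angle_a_deg, angle_b_deg, baseline_km):
        """Estimate Earth's circumference from the noon shadow angles
        measured at two sites on (roughly) the same meridian.

        The angle difference equals the arc of latitude between the
        sites, so circumference = 360 / arc * baseline."""
        arc_deg = abs(angle_a_deg - angle_b_deg)
        return 360.0 / arc_deg * baseline_km

    # Eratosthenes' numbers: the sun was overhead at Syene (0 deg shadow)
    # while Alexandria measured about 7.2 deg, ~800 km to the north.
    print(round(earth_circumference(7.2, 0.0, 800)))  # -> 40000
    ```

    This is the same arithmetic the students perform with their shared dowel measurements, with the baseline taken from the map distance between partner schools.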

  20. Big Bang Nucleosynthesis Calculation

    CERN Document Server

    Kurki-Suonio, H

    2001-01-01

    I review standard big bang nucleosynthesis and some versions of nonstandard BBN. The abundances of the primordial isotopes D, He-3, and Li-7 produced in standard BBN can be calculated as a function of the baryon density with an accuracy of about 10%. For He-4 the accuracy is better than 1%. The calculated abundances agree fairly well with observations, but the baryon density of the universe cannot be determined with high precision. Possibilities for nonstandard BBN include inhomogeneous and antimatter BBN and nonzero neutrino chemical potentials.

  1. Big and little OER

    OpenAIRE

    Weller, Martin

    2010-01-01

    Much of the attention around OERs has been on institutional projects which make explicit learning content available. These can be classified as ‘big OER’, but another form of OER is that of small scale, individually produced resources using web 2.0 type services, which are classified as ‘little OER’. This paper examines some of the differences between the use of these two types of OER to highlight issues in open education. These include attitudes towards reputation, the intentionality of the ...

  2. Big Red Telephone, Gone

    Institute of Scientific and Technical Information of China (English)

    Toni Piech

    2006-01-01

    The Chinese big red telephones looked exactly as I imagined the ones serving the direct emergency line between the Kremlin and the White House during the cold-war era would have looked. But here in China, every kiosk seemed to have such a device in the 1990s, and anyone could use it for just 0.2 yuan. The government did not just install public phones on street corners, but let small-business owners participate in telecommunication. Supply and demand were juggled by a kind of Hutong capitalism.

  3. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems and others all use piles of data, which are further used for creating reports in order to ensure the continuity of the services they have to offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  4. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  5. Bigness in compatible systems

    OpenAIRE

    Snowden, Andrew; Wiles, Andrew

    2009-01-01

    Clozel, Harris and Taylor have recently proved a modularity lifting theorem of the following general form: if rho is an l-adic representation of the absolute Galois group of a number field for which the residual representation rho-bar comes from a modular form then so does rho. This theorem has numerous hypotheses; a crucial one is that the image of rho-bar must be "big," a technical condition on subgroups of GL(n). In this paper we investigate this condition in compatible systems. Our main r...

  6. Recommendations for a restart of Molten Salt Reactor development

    International Nuclear Information System (INIS)

    The concept of the molten salt reactor (MSR) refuses to go away. The Generation-IV process lists the MSR as one of the six concepts to be considered for extending fuel resources. Good fuel utilization and good economics are required to meet the often-cited goal of 10 TWe globally, and 1 TWe for the US, from non-carbon energy sources in this century by nuclear fission. A strong incentive for the molten salt reactor design is its good fuel utilization, good economics, amazing flexibility and promised large benefits. It can: - use thorium or uranium; - be designed with lots of graphite to have a fairly thermal neutron spectrum, or without graphite moderator to have a fast-spectrum reactor; - fission uranium isotopes and plutonium isotopes; - operate with non-weapon-grade fissile fuel, or in suitable sites operate with enrichment between reactor-grade and weapon-grade fissile fuel; - be a breeder or near-breeder; - operate at temperatures >1100 degree C if carbon composites are successfully employed. Enhancing the 232U content of the uranium to over 500 ppm makes the fuel undesirable for weapons, but it should not detract from its economic use in liquid-fuel reactors: a big advantage in nonproliferation. The economics of the MSR are enhanced by operating at low pressure and high temperature, and may even lead to the preferred route to hydrogen production. The cost of the electricity produced from low-enriched fuel, averaged over the life of the entire process, has been predicted to be about 10% lower than that from LWRs, and 20% lower for high-enriched fuel, with uncertainties of about 10%. The development cost has been estimated at about 1 B$ (e.g., a 100 M$/y base program for ten years), not including construction of a series of reactors leading up to the deployment of multiple commercial units at an assumed cost of 9 B$ (450 M$/y over 20 years).
A benefit of liquid fuel is that smaller power reactors can faithfully test features of larger reactors, thereby reducing the

  7. Ten questions on nuclear wastes

    International Nuclear Information System (INIS)

    The authors give explanations and answers to ten issues related to nuclear wastes: when a radioactive material becomes a waste; how radioactive wastes, and particularly nuclear wastes in France, are classified; what risks are associated with radioactive wastes; whether the present management of radioactive wastes in France is well controlled; which wastes raise actual problems and what the solutions are; whether the amounts and radio-toxicity of wastes can be reduced; whether all long-lived radionuclides, or part of them, can be transmuted; whether geologic storage of final wastes is inescapable; whether radioactive material can be warehoused over long durations; and how information on radioactive waste management is organised.

  8. Ten Thousand Years of Solitude

    Energy Technology Data Exchange (ETDEWEB)

    Benford, G. (Los Alamos National Lab., NM (USA) California Univ., Irvine, CA (USA). Dept. of Physics); Kirkwood, C.W. (Los Alamos National Lab., NM (USA) Arizona State Univ., Tempe, AZ (USA). Coll. of Business Administration); Harry, O. (Los Alamos National Lab., NM (USA)); Pasqualetti, M.J. (Los Alamos National Lab., NM (USA) Arizona State Univ., Tempe, AZ (USA))

    1991-03-01

    This report documents the authors' work as an expert team advising the US Department of Energy on modes of inadvertent intrusion over the next 10,000 years into the Waste Isolation Pilot Project (WIPP) nuclear waste repository. Credible types of potential future accidental intrusion into the WIPP are estimated as a basis for creating warning markers to prevent inadvertent intrusion. A six-step process is used to structure possible scenarios for such intrusion, and it is concluded that the probability of inadvertent intrusion into the WIPP repository over the next ten thousand years lies between one and twenty-five percent. 3 figs., 5 tabs.

  9. Ten questions about systems biology

    DEFF Research Database (Denmark)

    Joyner, Michael J; Pedersen, Bente K

    2011-01-01

    In this paper we raise 'ten questions' broadly related to 'omics', the term systems biology, and why the new biology has failed to deliver major therapeutic advances for many common diseases, especially diabetes and cardiovascular disease. We argue that a fundamentally narrow and reductionist... understand how whole animals adapt to the real world. We argue that a lack of fluency in these concepts is a major stumbling block for what has been narrowly defined as 'systems biology' by some of its leading advocates. We also point out that it is a failure of regulation at multiple levels that causes many...

  10. Ten questions about systems biology

    DEFF Research Database (Denmark)

    Joyner, Michael J; Pedersen, Bente K

    2011-01-01

    In this paper we raise 'ten questions' broadly related to 'omics', the term systems biology, and why the new biology has failed to deliver major therapeutic advances for many common diseases, especially diabetes and cardiovascular disease. We argue that a fundamentally narrow and reductionist...... to understand how whole animals adapt to the real world. We argue that a lack of fluency in these concepts is a major stumbling block for what has been narrowly defined as 'systems biology' by some of its leading advocates. We also point out that it is a failure of regulation at multiple levels that causes many...

  11. Ten Thousand Years of Solitude?

    International Nuclear Information System (INIS)

    This report documents the authors' work as an expert team advising the US Department of Energy on modes of inadvertent intrusion over the next 10,000 years into the Waste Isolation Pilot Project (WIPP) nuclear waste repository. Credible types of potential future accidental intrusion into the WIPP are estimated as a basis for creating warning markers to prevent inadvertent intrusion. A six-step process is used to structure possible scenarios for such intrusion, and it is concluded that the probability of inadvertent intrusion into the WIPP repository over the next ten thousand years lies between one and twenty-five percent. 3 figs., 5 tabs

  12. Ten years of nuclear power

    International Nuclear Information System (INIS)

    Ten years have elapsed since the world's first nuclear power station began to supply electricity in Russia, and this in turn marked the end of a twelve year stage following the first controlled nuclear chain reaction at Chicago. These periods mark major stages in the development of atomic energy from the realm of abstract ideas to that of everyday industrial application. They followed a period of fundamental research and laboratory work, culminating in Enrico Fermi's demonstration of a system whereby the forces of the atom could be brought under control. Then it was necessary to find ways and means of using the chain reaction for practical purposes and on an industrial scale. And after this had been shown in 1954 to be technically possible, it had still to be developed into an economic process. The nuclear power station has proved itself from the technical and engineering standpoint. The third phase of development has been to bring it to the stage of being economically competitive with alternative sources of energy, and it would appear that we are now reaching that goal - though more slowly than had been envisaged ten years ago

  13. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, 3He, 4He, and 7Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and 4He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, 7Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed
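    The propagation of cross-section uncertainties into predicted abundances, as described above, is commonly done by Monte Carlo sampling. The sketch below is a minimal illustration of that idea only: the abundance function and the 5% fractional rate uncertainty are toy assumptions, not outputs of a real BBN reaction network.

```python
import random
import statistics

def toy_abundance(rate_scale: float) -> float:
    # Toy stand-in for a BBN network result: the predicted abundance
    # falls as the destruction-rate scale factor grows.
    # (Illustrative functional form only, not a real BBN calculation.)
    return 2.5e-5 / rate_scale

def propagate_uncertainty(n_samples=10_000, frac_sigma=0.05, seed=42):
    """Monte Carlo propagation: sample the rate scale from a Gaussian
    with fractional width frac_sigma, push each sample through the
    (toy) abundance prediction, and summarize the resulting spread."""
    rng = random.Random(seed)
    samples = [toy_abundance(rng.gauss(1.0, frac_sigma))
               for _ in range(n_samples)]
    return statistics.mean(samples), statistics.stdev(samples)

mean, sigma = propagate_uncertainty()
print(f"abundance = {mean:.3e} +/- {sigma:.3e}")
```

    In a real analysis every measured reaction rate is sampled from its experimental distribution and the full network is re-integrated per sample; the summary statistics are extracted the same way.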

  14. Flow of catalyst particles in a flue gas desulfurization plant; mass transfer in the domain of a detached flow - two examples (desulfurization, HTGR type reactor) for the application of big computers solving technical problems

    International Nuclear Information System (INIS)

    The research work of the Institute for Reactor Components is mainly experimental in character. Where possible, the experiments are accompanied by numerical calculations. This has the advantage of rendering parameter studies faster and more economical than is the case with experiments, so that physical contexts can become more apparent. However, these calculations are no substitute for experiments. The application of numerical calculations in connection with experimental results can now be demonstrated with two examples. The examples have been selected with the aim of making the presentation of the results sufficiently interesting for all those participating at the colloquium. The theoretical and experimental results are presented in the form of short films. (orig.)

  15. Classical propagation of strings across a big crunch/big bang singularity

    International Nuclear Information System (INIS)

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z2, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001).][P. J. Steinhardt and N. Turok, Science 296, 1436 (2002).][N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004).]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example

  16. Measuring Public Acceptance of Nuclear Technology with Big data

    International Nuclear Information System (INIS)

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying the cheap electricity they produce, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows target marketing when policy is executed.

  17. Measuring Public Acceptance of Nuclear Technology with Big data

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seugkook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying the cheap electricity they produce, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves the cost, time, and effort of measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows target marketing when policy is executed.
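    The "opinion leader" identification the abstract mentions can be illustrated with a minimal degree-centrality sketch. This is a hypothetical illustration, not the paper's actual method: the edge list, user names, and the choice of in-degree as the leadership proxy are all assumptions.

```python
from collections import Counter

def opinion_leaders(mentions, top_n=2):
    """Rank users by in-degree (how often others mention them) in a
    list of (source, target) mention edges -- a simple degree-centrality
    proxy for opinion leadership in social-media data."""
    indegree = Counter(target for _, target in mentions)
    return [user for user, _ in indegree.most_common(top_n)]

# Hypothetical mention edges harvested from posts about nuclear power.
edges = [("u1", "expert_a"), ("u2", "expert_a"), ("u3", "expert_a"),
         ("u2", "news_b"), ("u4", "news_b"), ("u5", "u1")]
print(opinion_leaders(edges))  # ['expert_a', 'news_b']
```

    In practice a retweet or reply graph and a weighted centrality measure would replace this toy edge list, but the ranking logic is the same.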

  18. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN inaugurated the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, at a major public event. Poster and programme.

  19. IZVEDBENI ELEMENTI U BIG BROTHERU

    OpenAIRE

    Radman, Korana

    2009-01-01

    Big Brother offers its audience an "ultimate reality" secured by round-the-clock surveillance by television cameras, a point debated since the programme first aired in Europe and worldwide. With this in mind, this paper approaches Big Brother from the perspective of performance studies, attempting to recognize in it some possible forms of performance.

  20. The Big Read: Case Studies

    Science.gov (United States)

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  1. Resources: Building Big (and Small)

    OpenAIRE

    Kelley, Todd R.

    2007-01-01

    The article offers a set of videos and web resources for elementary teachers to help them explore five different structures, including bridges, domes, skyscrapers, dams, and tunnels, that have been built big to meet human needs and wants. It includes the miniseries video "Building Big" by David Macaulay and the website www.pbs.org/buildingbig.com.

  2. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility to use dark matter mass and its interaction cross section as a smoking gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new venue for achieving the observed relic abundance in which a significantly smaller amount of dark matter--compared to the standard cosmology--is produced and survives until today, diluted only by the cosmic expansion since the radiation dominated era. Once DM mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  3. Ten out of ten for LHC decapole magnets

    CERN Multimedia

    2001-01-01

    CERN's Albert Ijspeert (left) and Avinash Puntambekar of the Indian CAT laboratory with the ten Indian decapole magnets on the test bench. Tests will be carried out by the LHC-MTA group. A batch of 10 superconducting decapole magnets for the LHC has just arrived at CERN from India. These will be used to correct for slight imperfections in the dipole magnets that will steer proton beams around CERN's new accelerator. All magnets have slight imperfections in the fields they produce, and in the LHC dipoles these will be corrected for using sextupoles and decapoles. The sextupoles were the first LHC magnets to be given the production green-light following successful tests of pre-series magnets last year (Bulletin 21/2000, 22 May 2000). Now it is the turn of pre-series decapoles to go on trial at CERN. Of the LHC's 1232 dipole magnets, half will use sextupole correctors only and the other half will use both sextupoles and decapoles. That means that a total of 616 pairs of decapoles are needed. Like the sextupole...

  4. DPF Big One

    International Nuclear Information System (INIS)

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark

  5. Mining “Big Data” using Big Data Services

    OpenAIRE

    2014-01-01

    While many colleagues within science are fed up with the "big data fad", empirical analyses we conducted for the current editorial actually show an inconsistent picture: we use big data services to determine whether there really is an increase in writing about big data or even widespread use of the term. Google Correlate (http://www.google.com/trends/correlate/), the first free tool we are presenting here, doesn't list the term, showing that the number of searches for it is below an absolute min...

  6. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  7. Primary hypoadrenocorticism in ten cats.

    Science.gov (United States)

    Peterson, M E; Greco, D S; Orth, D N

    1989-01-01

    Primary hypoadrenocorticism was diagnosed in ten young to middle-aged cats of mixed breeding. Five of the cats were male, and five were female. Historic signs included lethargy (n = 10), anorexia (n = 10), weight loss (n = 9), vomiting (n = 4), and polyuria (n = 3). Dehydration (n = 9), hypothermia (n = 8), prolonged capillary refill time (n = 5), weak pulse (n = 5), collapse (n = 3), and sinus bradycardia (n = 2) were found on physical examination. Results of initial laboratory tests revealed anemia (n = 3), absolute lymphocytosis (n = 2), absolute eosinophilia (n = 1), and azotemia and hyperphosphatemia (n = 10). Serum electrolyte changes included hyponatremia (n = 10), hyperkalemia (n = 9), hypochloremia (n = 9), and hypercalcemia (n = 1). The diagnosis of primary adrenocortical insufficiency was established on the basis of results of adrenocorticotropic hormone (ACTH) stimulation tests (n = 10) and endogenous plasma ACTH determinations (n = 7). Initial therapy for hypoadrenocorticism included intravenous administration of 0.9% saline and dexamethasone and intramuscular administration of desoxycorticosterone acetate in oil. Three cats were euthanatized shortly after diagnosis because of poor clinical response. Results of necropsy examination were unremarkable except for complete destruction of both adrenal cortices. Seven cats were treated chronically with oral prednisone or intramuscular methylprednisolone acetate for glucocorticoid supplementation and with oral fludrocortisone acetate or intramuscular injections of repository desoxycorticosterone pivalate for mineralocorticoid replacement. One cat died after 47 days of therapy from unknown causes; the other six cats are still alive and well after 3 to 70 months of treatment. PMID:2469793

  8. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  9. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  10. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  11. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    We are now in the Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from complex data which can't be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data are also summarized, and one of the tools has been used for mining large datasets using distributed algorithms.
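    Tools for evolving data streams, as surveyed above, rely on single-pass, bounded-memory algorithms. A classic example of this class (chosen here as an illustration, not drawn from the surveyed tools) is the Misra-Gries heavy-hitters summary:

```python
def misra_gries(stream, k):
    """Misra-Gries summary: finds candidate items occurring more than
    len(stream)/k times using at most k-1 counters -- a single pass
    with bounded memory, suitable for unbounded data streams."""
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k - 1:
            counters[item] = 1
        else:
            # Decrement every counter; drop those that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

stream = ["a", "b", "a", "c", "a", "b", "a", "d", "a"]
print(misra_gries(stream, 3))  # "a" survives as the dominant item
```

    The memory bound (k-1 counters regardless of stream length) is what distinguishes stream algorithms from the batch tools that load whole datasets.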

  12. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  13. The challenges of big data

    Science.gov (United States)

    2016-01-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  14. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current level of big data development and the things it can do, as well as the things that can be done in the near future. The paper focuses on explaining to non-technical and non-database specialists what big data basically is, presents the three most important V's as well as the new ones, the most important solutions used by companies like Google or Amazon, and some interesting perceptions based on this subject.

  15. NCAA Money for Student Assistance Lands in Many Pockets, Big Ten Document Shows

    Science.gov (United States)

    Wolverton, Brad

    2013-01-01

    Amid a national debate about paying college athletes, the NCAA likes to tout its often-overlooked Student Assistance Fund, whose goal is to provide direct financial support to players. The fund--which draws from the association's multibillion-dollar media-rights deals--will distribute some $75-million this year to Division I athletes. The money…

  16. Transgender People at Four Big Ten Campuses: A Policy Discourse Analysis

    Science.gov (United States)

    Dirks, Doris Andrea

    2016-01-01

    This article examines the language used to discuss transgender people on university campuses. This study asks how, despite seemingly benefitting transgender people, the discourses carried by the documents that discuss trans people may actually undermine the intended goals of policy initiatives. For example, a report on the status of transgender…

  17. Detecting and understanding big events in big cities

    OpenAIRE

    Furletti, Barbara; Trasarti, Roberto; Gabrielli, Lorenzo; Smoreda, Zbigniew; Vanhoof, Maarten; Ziemlicki, Cezary

    2015-01-01

    Recent studies have shown the great potential of big data such as mobile phone location data to model human behavior. Big data allow one to analyze people's presence in a territory in a fast and effective way compared with classical surveys (diaries or questionnaires). One of the drawbacks of these collection systems is the incompleteness of the users' traces; people are localized only when they are using their phones. In this work we define a data mining method for identifying people presence an...

  18. Antigravity and the big crunch/big bang transition

    OpenAIRE

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.(Princeton Center for Theoretical Science, Princeton University, Princeton, NJ, 08544, USA); Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition...

  19. Quantum Fields in a Big Crunch/Big Bang Spacetime

    OpenAIRE

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the Big Crunch/Big Bang transition postulated in the ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it re-expands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interacti...

  20. Sailing through the big crunch-big bang transition

    OpenAIRE

    Bars, Itzhak; Steinhardt, Paul; Turok, Neil

    2013-01-01

    In a recent series of papers, we have shown that theories with scalar fields coupled to gravity (e.g., the standard model) can be lifted to a Weyl-invariant equivalent theory in which it is possible to unambiguously trace the classical cosmological evolution through the transition from big crunch to big bang. The key was identifying a sufficient number of finite, Weyl-invariant conserved quantities to uniquely match the fundamental cosmological degrees of freedom across the transition. In so ...

  1. Hey, big spender

    International Nuclear Information System (INIS)

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom-line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in

  2. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV) dark-nucleon binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking-gun evidence for dark nuclei.

  3. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  4. Big Lake Dam Inspection Report

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes an inspection of the Big Lake Dam that was done in September of 1983. The inspection did not reveal any conditions that constitute and...

  5. Le Big Bang en laboratoire

    CERN Multimedia

    Roy, Christelle

    2006-01-01

    Physicists have been dreaming of it for 30 years: thanks to huge particle accelerators, they have been able to observe matter as it was a few instants after the Big Bang (three different articles in 10 pages).

  6. Nuclear reactor PBMR and cogeneration; Reactor nuclear PBMR y cogeneracion

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez S, J. R.; Alonso V, G., E-mail: ramon.ramirez@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    In recent years the costs of nuclear reactor designs for electricity generation have increased; current costs are around 5,000 USD per installed kW, so a big nuclear plant requires investments on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big reactor by dividing the power into parts and the components into modules to lower production costs; one module can be built and, once finished, another begun, deferring the long-term investment and therefore reducing the investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, since the thermal energy of the reactor can feed other processes such as water desalination, steam generation for process industries such as petrochemicals, or even hydrogen production for use as fuel. In this work the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble-bed reactor is described. (Author)
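    The cogeneration idea reduces to a simple energy balance: process-steam output scales with reactor thermal power divided by the enthalpy rise of the water. A back-of-envelope sketch follows; the 200 MWt module size, enthalpy values, and 95% heat-delivery efficiency are illustrative assumptions, not figures from the paper.

```python
def steam_mass_flow(thermal_mw, h_steam_kj, h_feed_kj, efficiency=0.95):
    """Back-of-envelope process-steam production from reactor thermal
    power: mass flow [kg/s] = useful heat [kW] / enthalpy rise [kJ/kg]."""
    return thermal_mw * 1e3 * efficiency / (h_steam_kj - h_feed_kj)

# Assumed values: a 200 MWt pebble-bed module raising superheated
# steam (~3400 kJ/kg) from hot feedwater (~420 kJ/kg).
flow = steam_mass_flow(200, 3400, 420)
print(f"{flow:.1f} kg/s of process steam")
```

    Order-of-magnitude results like this (tens of kg/s per module) are what make the petrochemical cogeneration case plausible; a real design study would use steam-table enthalpies at the actual delivery pressure and temperature.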

  7. Big Data Analytics in Healthcare

    OpenAIRE

    Ashwin Belle; Raghuram Thiagarajan; S. M. Reza Soroushmehr; Fatemeh Navidi; Daniel A Beard; Kayvan Najarian

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is sti...

  8. Big Data and Ambulatory Care

    OpenAIRE

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2014-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an ov...

  9. The role of big laboratories

    International Nuclear Information System (INIS)

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  10. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  11. Nuclear reactor PBMR and cogeneration

    International Nuclear Information System (INIS)

In recent years the cost of nuclear reactor designs for electricity generation has risen to around 5000 USD per installed kW, so a large nuclear plant requires an investment on the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a large reactor by dividing the power into parts and the components into modules to lower production costs: one module can be built and completed before the next is begun, deferring the long-term investment and therefore reducing the investment risk. On the other hand, low-power reactors can be very useful in regions with poor access to the electric grid, where the thermal energy of the reactor can also feed other processes such as water desalination, steam generation for process industries such as petrochemicals, or even hydrogen production for use as fuel. This work describes the possibility of generating high-quality steam for the petrochemical industry using a high-temperature pebble bed reactor. (Author)

  12. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process. PMID:27068058

  13. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: data quality, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871

  14. Big Data in Health: a Literature Review from the Year 2005.

    Science.gov (United States)

    de la Torre Díez, Isabel; Cosgaya, Héctor Merino; Garcia-Zapirain, Begoña; López-Coronado, Miguel

    2016-09-01

    The information stored in healthcare systems has increased over the last ten years, leading it to be considered Big Data. There is a wealth of health information ready to be analysed. However, the sheer volume raises a challenge for traditional methods. The aim of this article is to conduct a cutting-edge study on Big Data in healthcare from 2005 to the present. This literature review will help researchers to know how Big Data has developed in the health industry and open up new avenues for research. Information searches have been made on various scientific databases such as Pubmed, Science Direct, Scopus and Web of Science for Big Data in healthcare. The search criteria were "Big Data" and "health" with a date range from 2005 to the present. A total of 9724 articles were found on the databases. 9515 articles were discarded as duplicates or for not having a title of interest to the study. 209 articles were read, with the resulting decision that 46 were useful for this study. 52.6 % of the articles used were found in Science Direct, 23.7 % in Pubmed, 22.1 % through Scopus and the remaining 2.6 % through the Web of Science. Big Data has undergone extremely high growth since 2011 and its use is becoming compulsory in developed nations and in an increasing number of developing nations. Big Data is a step forward and a cost reducer for public and private healthcare. PMID:27520614
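The screening funnel reported in the abstract above is simple arithmetic; a minimal sanity check in plain Python, with all figures copied from the text:

```python
# Sanity-checking the literature-screening funnel reported in the abstract.
found = 9724            # articles returned by the database searches
discarded = 9515        # duplicates or titles out of scope for the study
read = found - discarded
print(read)             # 209 articles read in full
useful = 46             # articles retained for the review
print(f"{useful / read:.1%} of the full-text articles were retained")
```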

  15. An Antitrust Economic Analysis of Stop & Shop's Proposed Acquisition of the Big V Shop Rite Supermarket Chain

    OpenAIRE

    Cotterill, Ronald W.

    2002-01-01

In early 2002, the Royal Ahold subsidiary, Stop & Shop Supermarkets, offered to purchase the Big V supermarket chain, which was in bankruptcy court after three successive, unsuccessful leveraged buyouts over the past ten years. At a later date, Pathmark Supermarkets joined the offer to purchase. Big V was Wakefern Food Corporation's largest member. The acquisition was a horizontal merger in at least three local markets, Newburgh NJ, Poughkeepsie NY, and Trenton NJ. This research was conducted...

  16. Dual of Big-bang and Big-crunch

    OpenAIRE

    Bak, Dongsu

    2006-01-01

Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are not singular at all, as the coupling goes to zero in the N=4 Super Yang-Mills theory. The cosmological sing...

  17. Turning big bang into big bounce: II. Quantum dynamics

    International Nuclear Information System (INIS)

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.
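The spectral statements in the abstract can be written compactly. The symbols below ($v_{\min}$, $\Delta$) are illustrative placeholders, not the paper's notation:

```latex
\mathrm{spec}\,\hat{V} \;=\; \{\, v_n = v_{\min} + n\,\Delta \;:\; n = 0, 1, 2, \dots \,\}, \qquad \Delta > 0,
```

where the constant level spacing $\Delta$ plays the role of the quantum of the volume, while the energy density operator has a bounded, continuous spectrum.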

  18. N Reactor

    Data.gov (United States)

Federal Laboratory Consortium — The last of Hanford's nine plutonium production reactors to be built was the N Reactor. This reactor was called a dual purpose...

  19. CLOUD COMPUTING WITH BIG DATA: A REVIEW

    OpenAIRE

    Anjali; Er. Amandeep Kaur; Mrs. Shakshi

    2016-01-01

Big data is a collection of huge quantities of data, and big data analytics is the process of examining large amounts of data. Big data and cloud computing are hot issues in information technology, and handling big data is one of today's main problems. Researchers are focusing on how to handle huge amounts of data with cloud computing and how to achieve proper security for big data in the cloud. To handle the big data problem the Hadoop framework is used, in which data is fragmented and executed in parallel....
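The fragment-and-execute-in-parallel idea attributed to Hadoop above can be illustrated without Hadoop itself: a toy map/reduce word count in plain Python, where a thread pool stands in for a cluster (all names here are illustrative, not Hadoop APIs):

```python
from collections import Counter
from multiprocessing.dummy import Pool  # thread pool; stands in for a cluster

def map_fragment(fragment):
    # "map" step: count words within one data fragment
    return Counter(fragment.split())

def reduce_counts(partials):
    # "reduce" step: merge the partial counts into a global result
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

data = ["big data big cloud", "cloud data", "big security"]  # the fragments
with Pool(3) as pool:
    partial_counts = pool.map(map_fragment, data)  # fragments processed in parallel
counts = reduce_counts(partial_counts)
print(counts["big"])  # 3
```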

  20. Big Data Analytics in Healthcare

    Directory of Open Access Journals (Sweden)

    Ashwin Belle

    2015-01-01

    Full Text Available The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  1. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  2. Big Data: Astronomical or Genomical?

    Science.gov (United States)

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade. PMID:26151137

  3. Big Data: Astronomical or Genomical?

    Directory of Open Access Journals (Sweden)

    Zachary D Stephens

    2015-07-01

    Full Text Available Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  4. Progress of Research on Demonstration Fast Reactor Main Pipe Material

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

The main characteristics of the sodium piping system in the demonstration fast reactor are high temperature, thin walls and large diameter, in contrast to the high-pressure, thick-walled piping of a pressurized water reactor system, and the system is long-term

  5. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  6. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term, unfamiliar to many. The research is motivated by the scarcity of prior work on the topic, the complete absence of Finnish-language research, and the potentially essential role social customer relationship management may play in business in the future. Studies of big data often concentrate on its technical side rather than on applicat...

  7. AAPOR Report on Big Data

    OpenAIRE

    Task Force Members Include: Lilli Japec; Frauke Kreuter; Marcus Berg; Paul Biemer; Paul Decker; Cliff Lampe; Julia Lane; Cathy O'Neil; Abe Usher

    2015-01-01

In recent years we have seen an increase in the amount of statistics in society describing different phenomena based on so-called Big Data. The term Big Data is used for a variety of data, as explained in the report; many of these data are characterized not just by their large volume, but also by their variety and velocity, the organic way in which they are created, and the new types of processes needed to analyze them and make inference from them. The change in the nature of the new types of data, thei...

  8. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  9. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

The term "Big Data" is commonly used to describe the rapidly growing mass of information being created today. New conclusions can be drawn and new services developed by connecting, processing and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other fields. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues, and the current actions toward solutions, are also presented. PMID:26614539

  10. The BigBOSS Experiment

    OpenAIRE

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Prieto, C. Allende; Annis, J.; Aubourg, E.; Azzaro, M.; Baltay, S. Bailey. C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra...

  11. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

    During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed thi

  12. Do Big Bottles Kickstart Infant Weight Issues?

    Science.gov (United States)

Do Big Bottles Kickstart Infant Weight Issues? Smaller baby bottles ... 2016 (HealthDay News) -- Feeding babies formula from a big bottle might put them at higher risk for ...

  13. ALGORITHMS FOR TETRAHEDRAL NETWORK (TEN) GENERATION

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

The Tetrahedral Network (TEN) is a powerful 3-D vector structure in GIS, with advantages such as a simple structure, fast topological relation processing and rapid visualization. The difficulty in applying TEN is automatically creating the data structure. Although a raster algorithm has been introduced by some authors, problems of accuracy, memory requirement, speed and integrity remain. In this paper, after a 3-D data model and the structure of TEN are introduced, the raster algorithm is completed and a vector algorithm is presented. Finally, experiments, conclusions and future work are discussed.
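The fast topological relation processing mentioned in the abstract can be sketched with a minimal TEN representation: vertices plus 4-tuples of vertex indices. Class and method names below are illustrative, not from the paper:

```python
# A minimal sketch of a Tetrahedral Network (TEN) data structure.
class TEN:
    def __init__(self):
        self.vertices = []    # (x, y, z) coordinate tuples
        self.tetrahedra = []  # 4-tuples of vertex indices

    def add_vertex(self, x, y, z):
        self.vertices.append((x, y, z))
        return len(self.vertices) - 1

    def add_tetrahedron(self, a, b, c, d):
        self.tetrahedra.append((a, b, c, d))
        return len(self.tetrahedra) - 1

    def adjacent(self, t1, t2):
        # Fast topological relation: two tetrahedra are neighbours iff they
        # share exactly three vertices, i.e. one triangular face.
        return len(set(self.tetrahedra[t1]) & set(self.tetrahedra[t2])) == 3

ten = TEN()
for p in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)]:
    ten.add_vertex(*p)
t0 = ten.add_tetrahedron(0, 1, 2, 3)
t1 = ten.add_tetrahedron(1, 2, 3, 4)
print(ten.adjacent(t0, t1))  # True: they share the face (1, 2, 3)
```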

  14. Backfitting of the FRG reactors

    International Nuclear Information System (INIS)

The FRG research reactors: the GKSS research centre operates two pool-type research reactors fueled with MTR-type fuel elements. The research reactors FRG-1 and FRG-2, with power levels of 5 MW and 15 MW, have been in operation for 31 and 27 years respectively, making them comparably old to other research reactors. At present the reactors operate approximately 180 days (FRG-1) and between 210 and 250 days (FRG-2) per year. Both reactors are located in the same reactor hall, in a connecting pool system. Backfitting measures are needed, for these and other research reactors, to ensure a high level of safety and availability. The main backfitting activities during the last ten years concerned: comparison of the existing design with today's requirements (criteria, guidelines, standards, etc.); a probabilistic approach for external events such as aeroplane crashes and earthquakes; reanalysis of the main accidents, such as startup from low and full power, loss of coolant flow, loss of heat sink, loss of coolant and fuel plate melting; installation of a new reactor protection system meeting today's requirements; and installation of a new crane in the reactor hall. A cold neutron source has been installed to increase the flux of cold neutrons by a factor of 14. The FRG-1 is being converted from 93% enriched uranium with UAlx fuel to 20% enriched uranium with U3Si2 fuel. Both cooling towers were repaired. Replacement of instrumentation is planned

  15. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

This article presented the basic concepts of Big Data and the new field it has given rise to, Data Science. Within Data Science, the notion of reducing the dimensionality of data was discussed and exemplified.

  16. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is fascinating but problematic with regard to causality, atheism, and stereotypes about hunter-gatherers.

  17. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally; it is already having…

  18. YOUNG CITY,BIG PARTY

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

The Shenzhen Universiade united the world's young people through sports. With none of the usual hoopla, no fireworks, and no grand performances by celebrities and superstars, the Shenzhen Summer Universiade lowered the curtain on a big party for youth and college students on August 23.

  19. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

Full Text Available In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and thereby generate market advantages. Thus, companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  20. Characterizing and Subsetting Big Data Workloads

    OpenAIRE

    Jia, Zhen; Zhan, Jianfeng; Wang, Lei; Han, Rui; Mckee, Sally A.; Yang, Qiang; Luo, Chunjie; Li, Jingwei

    2014-01-01

    Big data benchmark suites must include a diversity of data and workloads to be useful in fairly evaluating big data systems and architectures. However, using truly comprehensive benchmarks poses great challenges for the architecture community. First, we need to thoroughly understand the behaviors of a variety of workloads. Second, our usual simulation-based research methods become prohibitively expensive for big data. As big data is an emerging field, more and more software stacks are being p...

  1. Big Graph Mining: Frameworks and Techniques

    OpenAIRE

    Aridhi, Sabeur; Nguifo, Engelbert Mephu

    2016-01-01

Big graph mining is an important research area that has attracted considerable attention. It allows us to process, analyze, and extract meaningful information from large amounts of graph data. Big graph mining has been highly motivated not only by the tremendously increasing size of graphs but also by its huge number of applications. Such applications include bioinformatics, chemoinformatics and social networks. One of the most challenging tasks in big graph mining is pattern mining in big gra...

  2. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.

    2012-06-07

BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
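The quoted resolution allows a quick back-of-envelope check of the resolving element Δλ = λ/R. Pairing the blue end of the range with R = 3000 and the red end with R = 4800 is an assumption for illustration; the record only gives the overall ranges:

```python
# Back-of-envelope resolving element dlambda = lambda / R at the ends of the
# BigBOSS wavelength range (pairing of wavelength and R is assumed).
resolving_element = {}
for wavelength_nm, R in [(340, 3000), (1060, 4800)]:
    resolving_element[wavelength_nm] = wavelength_nm / R
    print(f"{wavelength_nm} nm at R = {R}: dlambda ~ "
          f"{resolving_element[wavelength_nm]:.3f} nm")
```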

  3. Judging Big Deals: Challenges, Outcomes, and Advice

    Science.gov (United States)

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good…
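COUNTER usage reports of the kind the article relies on are typically turned into a cost-per-use figure when judging a Big Deal; a hedged sketch, with package names and numbers invented for illustration (not from the article):

```python
# Cost per download as a rough value metric for journal packages.
packages = {
    "Package A": {"annual_cost": 120_000, "downloads": 48_000},
    "Package B": {"annual_cost": 95_000, "downloads": 3_800},
}
cost_per_use = {
    name: p["annual_cost"] / p["downloads"] for name, p in packages.items()
}
for name, cpu in cost_per_use.items():
    # A high cost per use flags a weak deal relative to its price.
    print(f"{name}: ${cpu:.2f} per download")
```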

  4. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  5. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  6. "Big Data" - Grosse Daten, viel Wissen?

    OpenAIRE

    Hothorn, Torsten

    2015-01-01

    For several years now, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  7. The BigBOSS Experiment

    International Nuclear Information System (INIS)

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 quasars at redshifts 2.2 < z < 3.5. The resulting dark energy figure of merit, computed conservatively for wave numbers up to kmax = 0.15, could grow to over 600 if current work allows us to push the analysis to higher wave numbers (kmax = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
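As a quick sanity check on the quoted spectrograph parameters, the resolution R = λ/Δλ converts directly into the width of one resolution element. The wavelength range and R values come from the abstract; the helper function, and the pairing of the lowest R with the blue end, are illustrative assumptions:

```python
# Width of one spectral resolution element, Δλ = λ / R, for the quoted
# BigBOSS range (340-1060 nm) and resolution (R = 3000-4800).
def resolution_element_nm(wavelength_nm: float, resolution: float) -> float:
    """Return Δλ = λ / R in nanometres."""
    return wavelength_nm / resolution

# Pairing low R with the blue end is assumed here for illustration only.
blue_end = resolution_element_nm(340.0, 3000.0)
red_end = resolution_element_nm(1060.0, 4800.0)
print(f"about {blue_end:.2f} nm per element at 340 nm, {red_end:.2f} nm at 1060 nm")
```

Resolution elements of a few tenths of a nanometre are what allow closely spaced emission features, such as the [OII] doublet the survey targets, to be separated at high redshift.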

  8. ATLAS: Big Data in a Small Package

    Science.gov (United States)

    Denneau, Larry; Tonry, John

    2015-08-01

    For even small telescope projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (ATLAS; Tonry 2011) will robotically survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids (NEAs) on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards -- two 0.5 m F/2.0 telescopes -- each year the ATLAS system will obtain ~10^3 measurements of 10^9 astronomical sources to high photometric accuracy. Detected objects are then archived for further analysis, and alerts for newly discovered NEAs are disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many ``rifle shot'' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of satellites and pieces of space junk that ATLAS will see each night. Additional interrogation will identify interesting phenomena from beyond the solar system occurring over millions of transient sources per night. The data processing and storage requirements for ATLAS demand a ``big data'' approach typical of commercial Internet enterprises. We describe our approach to deploying a nimble, scalable and reliable data processing infrastructure, and promote ATLAS as a steppingstone to eventual processing scales in the era of LSST.
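The "big data" claim can be checked with a back-of-envelope multiplication of the figures quoted above. The bytes-per-measurement value below is an assumption chosen for illustration, not a number from the ATLAS paper:

```python
# Rough catalog-size estimate from the survey figures quoted above.
measurements_per_source = 1e3   # ~10^3 measurements per source per year (abstract)
n_sources = 1e9                 # ~10^9 astronomical sources (abstract)
bytes_per_measurement = 50      # ASSUMED record size; not from the abstract

total_measurements = measurements_per_source * n_sources    # 1e12 per year
catalog_bytes = total_measurements * bytes_per_measurement  # 5e13 bytes
print(f"{total_measurements:.0e} measurements/yr, "
      f"roughly {catalog_bytes / 1e12:.0f} TB/yr of catalog data")
```

Even under this modest per-record assumption the catalog alone reaches tens of terabytes per year; the raw images behind it are what push such projects toward the petabyte scale mentioned above.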

  9. Ten Leading Causes of Death and Injury

    Science.gov (United States)

    ... Overdose, Traumatic Brain Injury, Violence Prevention. Ten Leading Causes of Death and Injury ... Violence-Related Injury Deaths, United States - 2013. Leading Causes of Death Charts. Causes of Death by Age ...

  10. Ten new species of Afrotropical Pterophoridae (Lepidoptera)

    NARCIS (Netherlands)

    Gielis, C.

    2008-01-01

    Ten new Afrotropical species of Pterophoridae are described: Agdistis linnaei spec. nov., Agdistis bouyeri spec. nov., Ochyrotica bjoernstadti spec. nov., Platyptilia aarviki spec. nov., Stenoptilia kiitulo spec. nov., Exelastis caroli spec. nov., Eucapperia continentalis spec. nov., Buckleria vande

  11. Top Ten Workplace Skills for Future Organizations

    OpenAIRE

    Kelly Y. Senter; Austin McClelland, Sr

    2015-01-01

    Many researchers have indicated that there are skills that are going to be essential for the future workforce. This review helps identify these skills and the applicability of these skills to job performance of the future organization competing in a globalized environment. The review provides insight into each of the ten listed skills and also information on how the skills will be useful for future organizations. The review will extend previous literature regarding the identified ten skills e...

  12. Reactor accident-big impacts but small possibilities

    International Nuclear Information System (INIS)

    Accidents are unfortunate incidents that happen in our lives. The government provides facilities and programmes to reduce accidents; people also take a variety of initiatives so that accidents can be avoided, and every family and its members are constantly vigilant to protect against accidents. Some industries with relatively simple operations record more accidents than other industries that are more complex and sophisticated. The authors relate this fact to the power generation sector; according to the authors, accidents in this area are very rare and can be grouped as isolated cases. This article also comments on the two major accidents in nuclear power generation, Chernobyl and Three Mile Island. The authors also hope that current and future technological progress can overcome this problem and convince the public that nuclear energy is safe and low risk.

  13. Global Fluctuation Spectra in Big Crunch/Big Bang String Vacua

    OpenAIRE

    Craps, Ben; Ovrut, Burt A.

    2003-01-01

    We study Big Crunch/Big Bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a Big Crunch and a Big Bang cosmology, as well as additional ``whisker'' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the Big Crunch fluctuation spectrum is altered while passing through the bounce...

  14. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  15. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions about their strategies. It has been said, and demonstrated through case studies, that "more data usually beats better algorithms". With this in mind, companies have started to realize that they can choose to invest in processing larger sets of data rather than in expensive algorithms. The large quantity of data is better used as a whole because of the possible correlations across a larger amount, correlations that can never be found if the data are analyzed in separate or smaller sets. A larger amount of data gives a better output, but working with it can also become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  16. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  17. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  18. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  19. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  20. Antigravity and the big crunch/big bang transition

    CERN Document Server

    Bars, Itzhak; Steinhardt, Paul J; Turok, Neil

    2011-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  1. Web Science Big Wins: Information Big Bang & Fundamental Constants

    OpenAIRE

    Carr, Les

    2010-01-01

    We take for granted a Web that provides free and unrestricted information exchange, but the Web is under pressure to change in order to respond to issues of security, commerce, criminality, privacy. Web Science needs to explain how the Web impacts society and predict the outcomes of proposed changes to Web infrastructure on business and society. Using the analogy of the Big Bang, this presentation describes how the Web spread the conditions of its initial creation throughout the whole of soci...

  2. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  3. Nástroje pro Big Data Analytics

    OpenAIRE

    Miloš, Marek

    2013-01-01

    The thesis covers Big Data, a term for a specific kind of data analysis. It first defines the term Big Data and the need that led to its creation: the rising demand for deeper data processing and analysis tools and methods. The thesis also covers some of the technical aspects of Big Data tools, focusing on Apache Hadoop in detail. The later chapters contain a Big Data market analysis and describe the biggest Big Data competitors and tools. The practical part of the thesis presents a way...

  4. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    Directory of Open Access Journals (Sweden)

    Jaseena K.U.

    2014-12-01

    Full Text Available Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This paper presents a literature review of Big Data mining and of its issues and challenges, with emphasis on the distinguishing features of Big Data. It also discusses some methods to deal with big data.

  5. Fessenheim makes the most of its ten year outage

    International Nuclear Information System (INIS)

    EdF took advantage of the long shutdown opportunity during its Fessenheim 1 ten year outage (April to October 1989) to carry out a number of tests and modifications additional to normal outage operations. The major activities incorporated in this outage were: both normal outage operations and unscheduled operations carried out in response to problems found after inspections had been carried out; a reactor coolant pressure boundary hydro test, to comply with French regulations; a containment test (a mandatory safety requirement); 147 modifications, carried out as part of the 900MWe series upgrading programme; additional design studies initiated by EdF upon request from the Nuclear Facility Central Surveillance Service (SCSIN). (author)

  6. Big data is not a monolith

    CERN Document Server

    Sugimoto, Cassidy R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of text. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  7. Reactor Physics

    International Nuclear Information System (INIS)

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised

  8. Reactor Physics

    International Nuclear Information System (INIS)

    SCK-CEN's Reactor Physics and MYRRHA Department offers expertise in various areas of reactor physics, in particular in neutron and gamma calculations, reactor dosimetry, reactor operation and control, reactor code benchmarking and reactor safety calculations. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 materials testing reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2001 are summarised

  9. Reactor Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ait Abderrahim, A

    2001-04-01

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised.

  10. Big Numbers in String Theory

    CERN Document Server

    Schellekens, A N

    2016-01-01

    This paper contains some personal reflections on several computational contributions to what is now known as the "String Theory Landscape". It consists of two parts. The first part concerns the origin of big numbers, and especially the number $10^{1500}$ that appeared in work on the covariant lattice construction (with W. Lerche and D. Luest). This part contains some new results. I correct a huge but inconsequential error, discuss some more accurate estimates, and compare with the counting for free fermion constructions. In particular I prove that the latter only provide an exponentially small fraction of all even self-dual lattices for large lattice dimensions. The second part of the paper concerns dealing with big numbers, and contains some lessons learned from various vacuum scanning projects.

  11. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  12. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  13. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research. PMID:26844660

  14. Big Bounce in Dipole Cosmology

    OpenAIRE

    Battisti, Marco Valerio; Marciano, Antonino

    2010-01-01

    We derive the cosmological Big Bounce scenario from the dipole approximation of Loop Quantum Gravity. We show that a non-singular evolution takes place for any matter field and that, by considering a massless scalar field as a relational clock for the dynamics, the semi-classical properties of an initial state are preserved on the other side of the bounce. This model thus enhances the relation between Loop Quantum Cosmology and the full theory.

  15. BIG DATA IN BUSINESS ENVIRONMENT

    OpenAIRE

    Logica BANICA; Alina HAGIU

    2015-01-01

    In recent years, dealing with large amounts of data originating from social media sites and mobile communications, along with data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image ...

  16. BIG Data – A Review.

    OpenAIRE

    Anuradha Bhatia; Gaurav Vaswani

    2013-01-01

    As more data becomes available from an abundance of sources both within and outside, organizations are seeking to use those abundant resources to increase innovation, retain customers, and increase operational efficiency. At the same time, organizations are challenged by their end users, who are demanding greater capability and integration to mine and analyze burgeoning new sources of information. Big Data provides opportunities for business users to ask questions they never were able to ask ...

  17. Big data processing with Hadoop

    OpenAIRE

    Wu, Shiqi

    2015-01-01

    Computing technology has changed the way we work, study, and live. The distributed data processing technology is one of the popular topics in the IT field. It provides a simple and centralized computing platform by reducing the cost of the hardware. The characteristics of distributed data processing technology have changed the whole industry. Hadoop, as the open source project of Apache foundation, is the most representative platform of distributed big data processing. The Hadoop distribu...

  18. Big Bang Nucleosynthesis: An Update

    OpenAIRE

    Olive, Keith A.; Scully, Sean T.

    1995-01-01

    The current status of big bang nucleosynthesis is reviewed with an emphasis on the comparison between the observational determination of the light element abundances of D, 3He, 4He and 7Li and the predictions from theory. In particular, we present new analyses for 4He and 7Li. Implications for physics beyond the standard model are also discussed. Limits on the effective number of neutrino flavors are also updated.

  19. Industrialization and the Big Push

    OpenAIRE

    1988-01-01

    This paper explores Rosenstein-Rodan's (1943) idea that simultaneous industrialization of many sectors of the economy can be profitable for all of them, even when no sector can break even industrializing alone. We analyze this idea in the context of an imperfectly competitive economy with aggregate demand spillovers, and interpret the big push into industrialization as a move from a bad to a good equilibrium. We show that for two equilibria to exist, it must be the case that an industrializi...

  20. Pragmatic Interaction between Big Powers

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    It is very difficult to summarize the relationship among big powers in 2004. Looking east, there existed a "small cold war", as some media named it, between Europe and Russia and between the United States and Russia; with regard to the "orange revolution" in Ukraine at the end of the year, a rival show was displayed between America, Europe and Russia. Looking east, a fresh scent seems to fill the air.

  1. Reactor operation

    CERN Document Server

    Shaw, J

    2013-01-01

    Reactor Operation covers the theoretical aspects and design information of nuclear reactors. This book is composed of nine chapters that also consider their control, calibration, and experimentation.The opening chapters present the general problems of reactor operation and the principles of reactor control and operation. The succeeding chapters deal with the instrumentation, start-up, pre-commissioning, and physical experiments of nuclear reactors. The remaining chapters are devoted to the control rod calibrations and temperature coefficient measurements in the reactor. These chapters also exp

  2. Reactor safeguards

    CERN Document Server

    Russell, Charles R

    2013-01-01

    Reactor Safeguards provides information for all who are interested in the subject of reactor safeguards. Much of the material is descriptive although some sections are written for the engineer or physicist directly concerned with hazards analysis or site selection problems. The book opens with an introductory chapter on radiation hazards, the construction of nuclear reactors, safety issues, and the operation of nuclear reactors. This is followed by separate chapters that discuss radioactive materials, reactor kinetics, control and safety systems, containment, safety features for water reactor

  3. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Baltay, S Bailey C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yeche, X Yang Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  4. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster. PMID:23074865

  5. Consequences and experiences - ten years after the Chernobyl accident

    International Nuclear Information System (INIS)

On 26 April 1986, the most serious accident in the history of the nuclear industry occurred at the Chernobyl nuclear power plant in the former Soviet Union, near the present borders of Ukraine, Belarus and Russia. Material released into the atmosphere dispersed and eventually deposited back on the surface of the earth, where it was measurable over the whole northern hemisphere. Millions of people and all segments of life and the economy have been affected by the accident. Radioactive contamination reached several tens of MBq/m2 in the area of 30 km diameter around the reactor in 1986, and plants and animals were exposed to short-lived radionuclides up to external doses of several tens of Gy. In the early phase after the accident, 237 persons were suspected to have acute radiation syndrome as a consequence of the Chernobyl accident, but the diagnosis was confirmed in 134 cases. In that phase, 28 persons died as a consequence of exposure. There are also significant health disorders and symptoms not related to radiation, such as anxiety, depression and various psychosomatic disorders attributable to mental stress, among the population in the region

  6. REACTOR GROUT THERMAL PROPERTIES

    Energy Technology Data Exchange (ETDEWEB)

    Steimke, J.; Qureshi, Z.; Restivo, M.; Guerrero, H.

    2011-01-28

Savannah River Site has five dormant nuclear production reactors. Long-term disposition will require filling some reactor buildings with grout up to ground level. Portland cement-based grout will be used to fill the buildings, with the exception of some reactor tanks. Some reactor tanks contain significant quantities of aluminum, which could react with Portland cement-based grout to form hydrogen. Hydrogen production is a safety concern, and gas generation could also compromise the structural integrity of the grout pour. Therefore, it was necessary to develop a non-Portland-cement grout to fill reactors that contain significant quantities of aluminum. Grouts generate heat when they set, so the potential exists for large temperature increases in a large pour, which could compromise the integrity of the pour. The primary purpose of the testing reported here was to measure the heat of hydration, specific heat, thermal conductivity and density of the various reactor grouts under consideration so that these properties could be used to model transient heat transfer for different pouring strategies. A secondary purpose was to make qualitative judgments of grout pourability and hardened strength. Some reactor grout formulations were unacceptable because they generated too much heat, started setting too fast, took too long to harden, or were too weak. The formulation called 102H had the best combination of characteristics. It is a calcium alumino-sulfate grout that contains Ciment Fondu (calcium aluminate cement), Plaster of Paris (calcium sulfate hemihydrate), sand, Class F fly ash, boric acid and small quantities of additives. This composition afforded about ten hours of working time. Heat release began at 12 hours and was complete by 24 hours. The adiabatic temperature rise was 54 C, which was within specification. The final product was hard and displayed no visible segregation. The density and maximum particle size were within specification.
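The adiabatic temperature rise quoted above is the worst-case bound a transient heat-transfer model must respect: with no heat loss, all heat of hydration warms the grout itself. A minimal sketch of that energy balance (the heat-of-hydration and specific-heat values below are illustrative assumptions chosen to reproduce a 54 C rise, not the measured 102H properties):

```python
def adiabatic_temperature_rise(q_hydration_j_per_kg: float,
                               specific_heat_j_per_kg_k: float) -> float:
    """Worst-case (adiabatic) temperature rise of a curing grout.

    With no heat loss, every joule of hydration heat warms the grout:
        dT = Q / c   (density cancels because Q and c are both per kg).
    """
    return q_hydration_j_per_kg / specific_heat_j_per_kg_k

# Illustrative (assumed) values: Q = 64.8 kJ/kg, c = 1200 J/(kg.K)
delta_t = adiabatic_temperature_rise(64_800.0, 1_200.0)  # -> 54.0 K rise
```

In a real pour the observed peak is lower, because conduction to the surroundings (governed by the measured thermal conductivity and density) removes heat during the 12-24 hour release window.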

  7. Top Ten Workplace Skills for Future Organizations

    Directory of Open Access Journals (Sweden)

    Kelly Y. Senter

    2015-08-01

Many researchers have indicated that certain skills will be essential for the future workforce. This review identifies those skills and their applicability to job performance in future organizations competing in a globalized environment. It provides insight into each of the ten listed skills and explains how each will be useful to future organizations, extending previous literature on the ten skills identified as essential. This analysis of the literature and of the potential applicability of these skills can help guide and focus additional studies of future job performance requirements.

  8. REVIVAL PLANS, Ten Key Sectors Benefit?

    Institute of Scientific and Technical Information of China (English)

    Yantai CHEN; Hongbo CAI; Yang XU

    2009-01-01

To revive China's industrial establishment during the deepest global downturn since World War II, China's central government launched the "Revival Plans of Ten Key Sectors" alongside a 4 trillion yuan stimulus package in early 2009. Formulated by the National Development and Reform Commission (NDRC), these revival plans aim at reinvigorating ten key sectors: iron and steel, automotive, shipbuilding, petrochemicals, textiles, light industry, nonferrous metals, equipment manufacturing, electronics and information technology, and logistics.

  9. Faculties and Institutions (Fakultäten und Einrichtungen)

    OpenAIRE

    2014-01-01

With around 35,000 students, researchers and staff, 14 faculties and 150 institutes, Leipzig University is one of the largest higher-education institutions in the Free State of Saxony. As the state university, it is a significant force in research and teaching far beyond the region. From A as in African Studies to Z as in Zahnmedizin (dentistry), Leipzig University, a classic comprehensive university, covers the whole spectrum from the natural sciences through law and medicine to numerous humanities...

  10. Triennial technical report - 1986, 1987, 1988 - Instituto de Engenharia Nuclear (IEN) -Dept. of Reactors (DERE)

    International Nuclear Information System (INIS)

The research activities carried out during 1986, 1987 and 1988 by the Reactor Department of the Brazilian Nuclear Energy Commission (CNEN-DERE) are summarized. The principal aim of the Department of Reactors is the study and development of fast reactors and thermal research reactors. The DERE also assists the CNEN in areas related to the analysis of power reactor structures, teaches Reactor Physics and Engineering at the university, and provides professional training for the Nuclear Engineering Institute. To carry out its research activities, the DERE has three big facilities: the Argonauta reactor, the CTS-1 sodium circuit, and a water circuit. (M.I.)

  11. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    OpenAIRE

Rashmi Singh; H. K. Verma

    2012-01-01

The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method based on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, the Big Bang–Big Crunch algorithm is used for the design of linear-phase finite impulse response (FIR) filters. The fitness function used here is based on the mean squared error between the actual and the ideal filter response. This paper presents the plot of magnitude response ...
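The BB–BC loop itself is simple to sketch: a "Big Bang" scatters random candidates, a "Big Crunch" contracts them to a fitness-weighted centre of mass, and the scatter shrinks each generation. A minimal one-dimensional sketch (the quadratic objective stands in for the paper's MSE filter-design fitness; all parameter values are illustrative assumptions):

```python
import random

def big_bang_big_crunch(f, lo, hi, n_candidates=60, n_iters=40, seed=1):
    """Minimise a non-negative objective f on [lo, hi] with Big Bang-Big Crunch.

    Big Bang: scatter candidates around the current centre of mass.
    Big Crunch: contract to a 1/fitness-weighted centre; the scatter
    shrinks as 1/k, so the population condenses onto a good region.
    """
    rng = random.Random(seed)
    centre = rng.uniform(lo, hi)
    best_x, best_f = centre, f(centre)
    for k in range(1, n_iters + 1):
        spread = (hi - lo) / (2.0 * k)          # shrinking Big Bang radius
        xs = [min(hi, max(lo, centre + rng.gauss(0.0, spread)))
              for _ in range(n_candidates)]
        fs = [f(x) for x in xs]
        ws = [1.0 / (fx + 1e-12) for fx in fs]  # better fitness => more mass
        centre = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
        for x, fx in zip(xs, fs):
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

x, fx = big_bang_big_crunch(lambda x: (x - 2.0) ** 2, -10.0, 10.0)
```

For FIR design the scalar x would become the vector of filter coefficients and f the mean squared error between the actual and ideal magnitude responses.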

  12. Research Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Martens, Frederick H. [Argonne National Laboratory; Jacobson, Norman H.

    1968-09-01

    This booklet discusses research reactors - reactors designed to provide a source of neutrons and/or gamma radiation for research, or to aid in the investigation of the effects of radiation on any type of material.

  13. Safe Operation of Research Reactors in Germany

    International Nuclear Information System (INIS)

In Germany, experience in the safe operation of research reactors has been gained over the last five decades. In this time, a total of 46 research reactors were built and operated safely. Concerning design, there is, or has been, a very broad range of different types of research reactors. The variety of facilities includes large pool or tank reactors with a thermal power of several tens of megawatts, as well as small educational reactors with negligible thermal power, and critical assemblies. At present, 8 research reactors are still in operation. The other facilities are permanently shut down, in decommissioning, or have already been dismantled completely and released from regulatory control. In this paper, four selected facilities still in operation are presented as examples of the safe operation of research reactors in Germany, including in particular a description of the safety reviews and safety upgrades for the older facilities. (author)

  14. Top-Ten IT Issues: 2009

    Science.gov (United States)

    Agee, Anne Scrivener; Yang, Catherine

    2009-01-01

    This article presents the top-ten IT-related issues in terms of strategic importance to the institution, as revealed by the tenth annual EDUCAUSE Current Issues Survey. These IT-related issues include: (1) Funding IT; (2) Administrative/ERP Information Systems; (3) Security; (4) Infrastructure/Cyberinfrastructure; (5) Teaching and Learning with…

  15. Bridging gaps : ten crosscurrents in media studies

    OpenAIRE

    Fornäs, Johan

    2008-01-01

The final, definitive version of this paper has been published in Media, Culture and Society, 30(6), 895-905, 2008, by SAGE Publications Ltd. All rights reserved. Johan Fornäs, Bridging Gaps: Ten Crosscurrents in Media Studies. http://dx.doi.org/10.1177/0163443708096811. http://www.sagepub.com/

  16. Czech, Slovak science ten years after split

    CERN Multimedia

    2003-01-01

Ten years after the split of Czechoslovakia, Czech and Slovak science face the same difficulties: a shortage of money for research, poor salaries, obsolete equipment and brain drain, especially among the young, according to a feature in the daily Lidove Noviny (1 page).

  17. Ten recommendations for software engineering in research

    OpenAIRE

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  18. Ten recommendations for software engineering in research.

    Science.gov (United States)

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software. PMID:25685331

  19. Ten themes of viscous liquid dynamics

    DEFF Research Database (Denmark)

    Dyre, J. C.

    2007-01-01

Ten 'themes' of viscous liquid physics are discussed, with a focus on how they point to a general description of equilibrium viscous liquid dynamics (i.e., fluctuations) at a given temperature. This description is based on standard time-dependent Ginzburg-Landau equations for the density fields...

  20. Ten new species of Afrotropical Pterophoridae (Lepidoptera)

    OpenAIRE

    Gielis, C.

    2008-01-01

    Ten new Afrotropical species of Pterophoridae are described: Agdistis linnaei spec. nov., Agdistis bouyeri spec. nov., Ochyrotica bjoernstadti spec. nov., Platyptilia aarviki spec. nov., Stenoptilia kiitulo spec. nov., Exelastis caroli spec. nov., Eucapperia continentalis spec. nov., Buckleria vanderwolfi spec. nov., Pselnophorus meruensis spec. nov., and Hellinsia emmelinoida spec. nov. The species are illustrated in colour, and their genitalia in line drawings.

  1. Orbits of Ten Visual Binary Stars

    Institute of Scientific and Technical Information of China (English)

B. Novaković

    2007-01-01

We present the orbits of ten visual binary stars: WDS 01015+6922, WDS 01424-0645, WDS 01461+6349, WDS 04374-0951, WDS 04478+5318, WDS 05255-0033, WDS 05491+6248, WDS 06404+4058, WDS 07479-1212, and WDS 18384+0850. We have also determined their masses, dynamical parallaxes and ephemerides.

  2. IAEA safeguards at research reactors

    International Nuclear Information System (INIS)

The International Atomic Energy Agency applies safeguards at almost 150 facilities classified as research reactors. From a safeguards point of view, these facilities present a spectrum of features that must be addressed from both the nuclear material and the operational viewpoints. The nuclear fuel used by these reactors varies from high-enriched uranium (HEU), up to 93% U-235, to natural uranium, and the thermal power output from over 100 megawatts to less than ten watts. Research reactors are also used for a wide variety of purposes, including materials testing, radioisotope production, training and nuclear physics studies. The effort spent by the Agency in safeguarding these reactors depends upon the thermal power of the reactor and on the quantity and type of nuclear material present. On some research reactors, the Agency devotes more inspection effort than on a large power reactor; on others, very little effort is required. Safeguards are applied according to Agency-State agreements and consist of a combination of nuclear material accounting and containment and surveillance. In this paper, the safeguards activities performed by the State and by the Agency are reviewed for a large (≥ 50 MWt) and for a small (≤ 1 MWt) reactor according to the most common type of agreement. (author)

  3. Research reactors

    International Nuclear Information System (INIS)

This article proposes an overview of research reactors, i.e. nuclear reactors of less than 100 MW. Generally, these reactors are used as neutron generators for basic research in the material sciences and for technological research in support of power reactors. The author gives an overview of the general design of research reactors in terms of core size, number of fissions, neutron flux, and spatial neutron distribution, and outlines that this design is a compromise between a sufficiently compact core, a sufficient experimental volume, and power densities high enough not to affect neutron performance or experimental use. The author evokes the safety framework (the same regulations as for power reactors, more constraining measures after Fukushima, international bodies). He presents the main characteristics and operation of the two families which represent almost all research reactors: first, heavy water reactors (photos, drawings and figures illustrate different examples); and second, light-water-moderated and -cooled reactors, with a distinction between open-core pool reactors like Melusine and Triton, pool reactors with containment, and experimental fast breeder reactors (Rapsodie, the Russian BOR 60, the Chinese CEFR). The author describes the main uses of research reactors: basic research, applied and technological research, safety tests, production of radioisotopes for medicine and industry, analysis of elements present as traces at very low concentrations, non-destructive testing, and doping of monocrystalline silicon ingots. The author then discusses the relationship between research reactors and non-proliferation, and finally evokes perspectives (a decrease in the number of research reactors in the world, the Jules Horowitz project)

  4. Reactor physics and reactor computations

    International Nuclear Information System (INIS)

    Mathematical methods and computer calculations for nuclear and thermonuclear reactor kinetics, reactor physics, neutron transport theory, core lattice parameters, waste treatment by transmutation, breeding, nuclear and thermonuclear fuels are the main interests of the conference

  5. Research reactors

    International Nuclear Information System (INIS)

    There are currently 284 research reactors in operation, and 12 under construction around the world. Of the operating reactors, nearly two-thirds are used exclusively for research, and the rest for a variety of purposes, including training, testing, and critical assembly. For more than 50 years, research reactor programs have contributed greatly to the scientific and educational communities. Today, six of the world's research reactors are being shut down, three of which are in the USA. With government budget constraints and the growing proliferation concerns surrounding the use of highly enriched uranium in some of these reactors, the future of nuclear research could be impacted

  6. Reactor container

    International Nuclear Information System (INIS)

Object: To provide a jet- and missile-protective wall, curved toward the center of the reactor container, on the inside of the reactor container body disposed within a biological shield wall, thereby increasing the safety of the reactor container. Structure: A jet- and missile-protective wall composed of curved surfaces, internally formed with a plurality of arched bulges filled with concrete between inner and outer iron plates and shaped-steel beams, is provided between a reactor container surrounded by a biological shield wall and a thermal shield wall surrounding the reactor pressure vessel, and thermal insulating material fills the space between them. (Yoshino, Y.)

  7. An Overview of Big Data Privacy Issues

    OpenAIRE

    Patrick Hung

    2013-01-01

Big data is the term for a collection of large and complex datasets from different sources that are difficult to process using traditional data management and processing applications. In these datasets, some information must be kept secret from others; on the other hand, some information has to be released to provide information or big data analytical services. The research challenge is how to protect private information in the context of big data. Privacy is described by the ability ...

  8. Social Big Data and Privacy Awareness

    OpenAIRE

    Sang, Lin

    2015-01-01

With the rapid development of big data, data from online social networks have become a major part of it. Big data has made social networks data-oriented rather than social-oriented. Taking this into account, this dissertation presents a qualitative study of how the data-oriented social network affects its users' privacy management today. Within this dissertation, an overview of big data and privacy issues on the social network is presented as a background study. ...

  9. ATLAS: Big Data in a Small Package?

    Science.gov (United States)

    Denneau, Larry

    2016-01-01

For even small astronomy projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (Tonry 2011) will survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards - two 0.5 m F/2.0 telescopes - each night the ATLAS system will measure nearly 10⁹ astronomical sources to a photometric accuracy of <5%, totaling 10¹² individual observations over its initial 3-year mission. This ever-growing dataset must be searched in real time for moving objects and transients, then archived for further analysis, and alerts for newly discovered near-Earth asteroids (NEAs) disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many `rifle shot' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly trailed sources and discriminate them from the thousands of low-Earth orbit (LEO) and geosynchronous orbit (GEO) satellites ATLAS will see each night. Additional interrogation will identify interesting phenomena from millions of transient sources per night beyond the solar system. The data processing and storage requirements for ATLAS demand a `big data' approach typical of commercial internet enterprises. We describe our experience in deploying a nimble, scalable and reliable data processing infrastructure, and suggest ATLAS as a steppingstone to the data processing capability needed as we enter the era of LSST.

  10. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  11. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  12. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  13. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  14. Fitting ERGMs on big networks.

    Science.gov (United States)

    An, Weihua

    2016-09-01

The exponential random graph model (ERGM) has become a valuable tool for modeling social networks. In particular, ERGMs provide great flexibility to account for both covariate effects on tie formation and endogenous network formation processes. However, there are both conceptual and computational issues in fitting ERGMs on big networks. This paper describes a framework and a series of methods (based on existing algorithms) to address these issues. It also outlines the advantages and disadvantages of the methods and the conditions under which they are most applicable. Selected methods are illustrated through examples. PMID:27480375
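For the simplest ERGM, with the edge count as the only sufficient statistic, the pseudolikelihood factorises over dyads and the maximum pseudolikelihood estimate has a closed form: theta is the log-odds of the observed density. A minimal sketch of that special case (illustrative only; it is not the general fitting machinery the paper surveys):

```python
import math
import random

def sample_edges_only(n, theta, seed=0):
    """Draw a graph from the edges-only ERGM: P(G) is proportional to
    exp(theta * edges(G)), so each dyad is an independent Bernoulli
    tie with probability p = sigmoid(theta)."""
    rng = random.Random(seed)
    p = 1.0 / (1.0 + math.exp(-theta))
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            adj[i][j] = adj[j][i] = int(rng.random() < p)
    return adj

def mple_edges_only(adj):
    """Maximum pseudolikelihood estimate of theta: with only the edge
    statistic the dyads are independent, so theta-hat = logit(density)."""
    n = len(adj)
    edges = sum(adj[i][j] for i in range(n) for j in range(i + 1, n))
    density = edges / (n * (n - 1) / 2)
    return math.log(density / (1.0 - density))

# Round trip: simulate at theta = -1 and recover it from the sample.
theta_hat = mple_edges_only(sample_edges_only(200, -1.0))
```

Once dependence terms such as triangles enter the model, this factorisation breaks down, which is exactly where the MCMC-based fitting methods discussed in the paper come in.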

  15. Big deformation in ¹⁷C

    International Nuclear Information System (INIS)

Reaction and interaction cross sections of ¹⁷C on a carbon target have been re-analyzed using the modified Glauber model. The analysis with a deformed Woods-Saxon density/potential suggests a big deformation structure for ¹⁷C. The existence of a tail in the density distribution supports the possibility of a one-neutron halo structure. Under a deformed-core-plus-single-particle assumption, the analysis shows a dominant d-wave for the valence neutron in ¹⁷C. (authors)

  16. Big bang nucleosynthesis: An update

    International Nuclear Information System (INIS)

An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical predictions for the abundances of D, ⁴He, and ⁷Li are discussed and compared to their observational determinations. While concordance for D and ⁴He is satisfactory, the prediction for ⁷Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed

  17. Big Five Personality Traits and Insomnia (Big Five -persoonallisuuspiirteiden yhteydet unettomuuteen)

    OpenAIRE

    Aronen, Aino

    2015-01-01

The aim of the study was to examine whether the Big Five personality traits (neuroticism, extraversion, conscientiousness, openness to experience, and agreeableness) are associated with symptoms of insomnia, namely difficulty falling asleep, night-time awakenings, difficulty staying asleep, and waking up tired after sleep of normal length. According to theories of insomnia, high neuroticism, low extraversion, low conscientiousness, and low agreeableness can...

  18. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users

  19. Big Data Analytics Using Cloud and Crowd

    OpenAIRE

    Allahbakhsh, Mohammad; Arbabi, Saeed; Motahari-Nezhad, Hamid-Reza; Benatallah, Boualem

    2016-01-01

The increasing application of social and human-enabled systems in people's daily lives, on the one hand, and the fast growth of mobile and smartphone technologies, on the other, have resulted in the generation of tremendous amounts of data, also referred to as big data, and a need for analyzing these data, i.e., big data analytics. Recently a trend has emerged to incorporate human computing power into big data analytics to address some shortcomings of existing big data analytics, such as dealing with ...

  20. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as some applications of big data optimization, for the benefit of interested academics and practitioners, and of society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  1. Big data: an introduction for librarians.

    Science.gov (United States)

    Hoy, Matthew B

    2014-01-01

Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and to the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included. PMID:25023020

  2. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have basic knowledge of R.

  3. Urgent Call for Nursing Big Data.

    Science.gov (United States)

    Delaney, Connie W

    2016-01-01

The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to ensure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative, Nursing Knowledge and Big Data Science, includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference. PMID:27332330

  4. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this architecture helps guarantee safe and efficient traffic operation and enables more intelligent and personalized services for traffic information users.

  5. Fuel for advanced CANDU reactors

    International Nuclear Information System (INIS)

    The CANDU reactor system has proven itself to be a world leader in terms of station availability and low total unit energy cost. In 1985 for example, four of the top ten reactor units in the world were CANDU reactors operating in South Korea and Canada. This excellent operating record requires an equivalent performance record of the low-cost, natural uranium fuel. Future CANDU reactors will be an evolution of the present design. Engineering work is under way to refine the existing CANDU 600 and to incorporate state-of-the-art technology, reducing the capital cost and construction schedule. In addition, a smaller CANDU 300 plant has been designed using proven CANDU 600 technology and components but with an innovative new plant layout that makes it cost competitive with coal fired plants. For the long term, work on advanced fuel cycles and major system improvements is underway ensuring that CANDU plants will stay competitive well into the next century

  6. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  7. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were unreachable before. Big data is generally characterized by three factors: volume, velocity, and variety. These three factors distinguish it from traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  8. CloudJet4BigData: Streamlining Big Data via an accelerated socket interface

    OpenAIRE

    Frank Z.Wang

    2014-01-01

    Big data needs to feed users with fresh processing results and cloud platforms can be used to speed up big data applications. This paper describes a new data communication protocol (CloudJet) for long distance and large volume big data accessing operations to alleviate the large latencies encountered in sharing big data resources in the clouds. It encapsulates a dynamic multi-stream/multi-path engine at the socket level, which conforms to Portable Operating System Interface (POSIX) and thereb...

  9. CloudJet4BigData: Streamlining Big Data via an Accelerated Socket Interface

    OpenAIRE

    Wang, Frank Zhigang; Dimitrakos, Theo; Helian, Na; Wu, Sining; Li, Ling; Yates, Rodric

    2014-01-01

    Big data needs to feed users with fresh processing results and cloud platforms can be used to speed up big data applications. This paper describes a new data communication protocol (CloudJet) for long distance and large volume big data accessing operations to alleviate the large latencies encountered in sharing big data resources in the clouds. It encapsulates a dynamic multi-stream/multi-path engine at the socket level, which conforms to Portable Operating System Interface (POSIX) and thereb...
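    The multi-stream idea at the heart of CloudJet can be illustrated with a small sketch. This is an assumption-laden toy, not the actual CloudJet protocol or API: it splits a payload across several connected socket pairs, sends the pieces in parallel threads, and reassembles them in order on the receiving side.

    ```python
    import socket
    import threading

    def send_chunk(sock: socket.socket, chunk: bytes) -> None:
        # Sender side of one stream: write the chunk, then signal EOF.
        sock.sendall(chunk)
        sock.shutdown(socket.SHUT_WR)

    def multi_stream_transfer(data: bytes, n_streams: int = 4) -> bytes:
        """Split `data` across several socket streams and reassemble in order."""
        size = -(-len(data) // n_streams)  # ceiling division
        chunks = [data[i * size:(i + 1) * size] for i in range(n_streams)]
        pairs = [socket.socketpair() for _ in range(n_streams)]
        threads = [threading.Thread(target=send_chunk, args=(tx, c))
                   for (tx, _), c in zip(pairs, chunks)]
        for t in threads:
            t.start()
        # Receive each stream to EOF, preserving the original chunk order.
        received = []
        for _, rx in pairs:
            buf = b""
            while True:
                part = rx.recv(4096)
                if not part:
                    break
                buf += part
            received.append(buf)
        for t in threads:
            t.join()
        for tx, rx in pairs:
            tx.close()
            rx.close()
        return b"".join(received)

    out = multi_stream_transfer(b"big data payload" * 100)
    ```

    The real protocol adds dynamic multi-path selection and works across long-distance links; the sketch only shows why parallel streams help hide per-stream latency.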

  10. Big science transformed science, politics and organization in Europe and the United States

    CERN Document Server

    Hallonsten, Olof

    2016-01-01

    This book analyses the emergence of a transformed Big Science in Europe and the United States, using both historical and sociological perspectives. It shows how technology-intensive natural sciences grew to a prominent position in Western societies during the post-World War II era, and how their development cohered with both technological and social developments. At the helm of post-war science are large-scale projects, primarily in physics, which receive substantial funds from the public purse. Big Science Transformed shows how these projects, popularly called 'Big Science', have become symbols of progress. It analyses changes to the political and sociological frameworks surrounding publicly-funded science, and their impact on a number of new accelerator- and reactor-based facilities that have come to prominence in materials science and the life sciences. Interdisciplinary in scope, this book will be of great interest to historians, sociologists and philosophers of science.

  11. Study of future reactors

    International Nuclear Information System (INIS)

    Today, more than 420 large reactors with a gross output of close to 350 GWe supply 20 percent of world electricity needs, accounting for less than 5 percent of primary energy consumption. These figures are not expected to change in the near future, due to suspended reactor construction in many countries. Nevertheless, world energy needs continue to grow: the planet's population already exceeds five billion and is forecast to reach ten billion by the middle of the next century. Most less-developed countries have a very low rate of energy consumption and, even though some savings can be made in industrialized countries, it will become increasingly difficult to satisfy needs using fossil fuels only. Furthermore, there has been no recent breakthrough in the energy landscape. The physical feasibility of the other great hope of nuclear energy, fusion, has yet to be proved; once this has been done, it will be necessary to solve technological problems and to assess economic viability. Although it is more necessary than ever to pursue fusion programs, there is little likelihood of industrial applications being achieved in the coming decades. Coal and fission are the only ways to produce massive amounts of energy for the next century. Coal must overcome the pollution problems inherent in its use; fission nuclear power has to gain better public acceptance, which is obviously colored by safety and waste concerns. Most existing reactors were commissioned in the 1970s; reactor lifetime is a parameter that has not been clearly established. It will certainly be possible to refurbish some to extend their operation beyond the initial target of 30 or 40 years. But normal advances in technology and safety requirements will make the operation of the oldest reactors increasingly difficult. It becomes necessary to develop new generations of nuclear reactors, both to replace older ones and to revive plant construction in countries that are not yet equipped or that have halted their

  12. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... Register (73 FR 76677) on December 17, 2008. For more about the initial process and the history of this... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... comprehensive conservation plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife...

  13. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    A.G. Thalmayer; G. Saucier; A. Eigenhuis

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  14. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  15. Astronomical surveys and big data

    Science.gov (United States)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-rays, ROSAT, XMM and Chandra in X-rays, GALEX in UV, SDSS and several POSS I- and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  16. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their super-computer, and to the Large Hadron Collider built by Éric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is afoot. Worse, all of scientific research is in peril! Swept into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and today's greatest scientists.

  17. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high-temperature state with zero net baryon number, i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  18. BR2 Reactor: Introduction

    International Nuclear Information System (INIS)

    The irradiations in the BR2 reactor are performed in collaboration with, or at the request of, third parties such as the European Commission, the IAEA, research centres and utilities, reactor vendors or fuel manufacturers. The reactor also contributes significantly to the production of radioisotopes for medical and industrial applications, to neutron silicon doping for the semiconductor industry and to scientific irradiations for universities. Alongside the ongoing programmes on fuel and materials development, several new irradiation devices are in use or in design: among others, a loop providing enhanced cooling for novel materials-testing-reactor fuel, a device for high-temperature gas-cooled fuel, as well as a rig for the irradiation of metallurgical samples in a Pb-Bi environment. A full-scale 3-D heterogeneous model of BR2 is available. The model describes the real hyperbolic arrangement of the reactor and includes the detailed 3-D space-dependent distribution of the isotopic fuel depletion in the fuel elements. The model is validated against the reactivity measurements of several tens of BR2 operation cycles. The accurate calculations of the axial and radial distributions of the poisoning of the beryllium matrix by ³He, ⁶Li and ³T are verified against the measured reactivity losses and used to predict the reactivity behavior for the coming decades. The model calculates the main functionals in reactor physics, such as conventional thermal and equivalent fission neutron fluxes, number of displacements per atom, fission rate, thermal power characteristics such as heat flux and linear power density, neutron/gamma heating, determination of the fission energy deposited in fuel plates/rods, neutron multiplication factor and fuel burn-up. For each reactor irradiation project, a detailed geometry model of the experimental device and of its neighborhood is developed. Neutron fluxes are predicted to within approximately 10 percent of the dosimetry measurements. Fission rate, heat flux and

  19. Verification of Unstructured Mesh Capabilities in MCNP6 for Reactor Physics Problems

    Energy Technology Data Exchange (ETDEWEB)

    Burke, Timothy P. [Los Alamos National Laboratory; Martz, Roger L. [Los Alamos National Laboratory; Kiedrowski, Brian C. [Los Alamos National Laboratory; Martin, William R. [Los Alamos National Laboratory

    2012-08-22

    New unstructured mesh capabilities in MCNP6 (developmental version during summer 2012) show potential for conducting multi-physics analyses by coupling MCNP to a finite element solver such as Abaqus/CAE[2]. Before these new capabilities can be utilized, the ability of MCNP to accurately estimate eigenvalues and pin powers using an unstructured mesh must first be verified. Previous work to verify the unstructured mesh capabilities in MCNP was accomplished using the Godiva sphere [1], and this work attempts to build on that. To accomplish this, a criticality benchmark and a fuel assembly benchmark were used for calculations in MCNP using both the Constructive Solid Geometry (CSG) native to MCNP and the unstructured mesh geometry generated using Abaqus/CAE. The Big Ten criticality benchmark [3] was modeled due to its geometry being similar to that of a reactor fuel pin. The C5G7 3-D Mixed Oxide (MOX) Fuel Assembly Benchmark [4] was modeled to test the unstructured mesh capabilities on a reactor-type problem.

  20. Leak failure analysis for reactor coolant loop in a nuclear power plant

    International Nuclear Information System (INIS)

    Over the years, the use of risk information in the safety assessment of nuclear power plants has gradually increased. For example, risk information, or failure probabilities, are used in risk-based inspection. The objectives of this paper are to evaluate the fatigue failure probabilities of various subcomponents of the main coolant loop in a pressurized water reactor, and to rank their relative risk based on the plant's data. For this purpose, a probabilistic fracture mechanics code based on Monte Carlo simulation techniques was developed, incorporating both circumferential and longitudinal crack modules. It was then utilized to calculate fatigue failure probabilities for through-wall crack, small leak, big leak and LOCA situations subjected to all possible loadings. Various crack sizes and aspect ratios were assumed, and both circumferential and axial directions were considered, to find the relative cumulative failure probabilities over the plant's life as a function of component geometry and material mechanical properties. Special attention was given to elbow sections. In addition, the effect of in-service inspection on the reduction of failure probabilities of these parts was investigated. For weld locations under normal operating conditions, the circumferential weld of the reactor pressure vessel outlet nozzle shows the highest cumulative small-leak failure probability, and the steam generator outlet nozzle shows the lowest, over the sixty years of plant life. A ten-year-interval in-service inspection (ISI) program reduces the small-leak failure probabilities by up to 10%. (author)
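    The Monte Carlo approach described above can be sketched in miniature. The sketch below is a toy under loudly stated assumptions: the crack-depth distribution, fatigue-growth model, and critical depth are illustrative placeholders, not the plant-specific fatigue data or the crack modules used in the paper. The idea is simply to sample a flaw and its growth many times and count the fraction of trials that exceed a critical through-wall depth.

    ```python
    import random

    def simulate_leak_probability(n_trials: int = 100_000, seed: int = 42) -> float:
        """Toy Monte Carlo estimate of a leak failure probability.

        Assumed (hypothetical) model: lognormal initial crack depth,
        Gaussian fatigue growth over plant life, fixed critical depth.
        """
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_trials):
            depth = rng.lognormvariate(0.0, 0.5)   # initial flaw depth (mm), placeholder
            growth = rng.gauss(2.0, 0.5)           # fatigue growth over life (mm), placeholder
            critical = 5.0                         # critical through-wall depth (mm), placeholder
            if depth + max(growth, 0.0) > critical:
                failures += 1
        # Failure probability = fraction of sampled histories that leak.
        return failures / n_trials

    p = simulate_leak_probability()
    ```

    A real probabilistic fracture mechanics code replaces these placeholders with measured flaw distributions, crack-growth laws, and load histories, and repeats the count for each failure mode (through-wall crack, small leak, big leak, LOCA).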

  1. CNPC's Ten Major Technological Events in 2004

    Institute of Scientific and Technical Information of China (English)

    Technological Development Department of CNPC

    2005-01-01

    Editor's note: To make a timely introduction of the latest technologies developed by CNPC, the Technological Development Department of CNPC entrusted the Petroleum Economic & Technological Research Center of CNPC to appraise the oil company's major technological developments. Based on three rounds of voting by nearly 100 oil experts, ten major technological events in 2004 were finally selected from more than 100 technological projects of CNPC according to the measurement standards of innovation, technological maturity, function and scientific value.

  2. The First Ten Years of Swift Supernovae

    OpenAIRE

    Brown, Peter J.; Roming, Peter W. A.; Milne, Peter A.

    2015-01-01

    The Swift Gamma Ray Burst Explorer has proven to be an incredible platform for studying the multiwavelength properties of supernova explosions. In its first ten years, Swift has observed over three hundred supernovae. The ultraviolet observations reveal a complex diversity of behavior across supernova types and classes. Even amongst the standard candle type Ia supernovae, ultraviolet observations reveal distinct groups. When the UVOT data is combined with higher redshift optical data, the rel...

  3. The GATS turns ten: A preliminary stocktaking

    OpenAIRE

    Adlung, Rudolf

    2004-01-01

    The paper discusses the experience to date with the implementation and application of the General Agreement on Trade in Services (GATS), some ten years after its entry into force. One striking observation is the smooth functioning of the Agreement, which has created far fewer tensions and frictions, including at Ministerial Meetings, than its difficult negotiating history might have suggested. This is due in large part to a high degree of flexibility at several levels: Members have more scope ...

  4. String-String Duality in Ten Dimensions

    OpenAIRE

    Hull, C. M.

    1995-01-01

    The heterotic string occurs as a soliton of the type I superstring in ten dimensions, supporting the conjecture that these two theories are equivalent. The conjecture that the type IIB string is self-dual, with the strong coupling dynamics described by a dual type IIB theory, is supported by the occurrence of the dual string as a Ramond-Ramond soliton of the weakly-coupled theory.

  5. The Ten Relationships in Rural Land Circulation

    OpenAIRE

    Cai, Zhirong; Ren, Shuo; Zhang, Zhigang

    2009-01-01

    The ten relationships in rural land circulation are discussed. Among them, the relationship between peasant households and government indicates that the government should only carry out its service and regulatory functions, and that farmers should be the main body of land circulation, because peasants usually have no voice during land circulation. In the relationship between land ownership and the contracting management right, we mainly discuss the transfer of the land contracting management right and p...

  6. Forget the hype or reality. Big data presents new opportunities in Earth Science.

    Science.gov (United States)

    Lee, T. J.

    2015-12-01

    Earth science is arguably one of the most mature science disciplines, constantly acquiring, curating, and utilizing a large volume of data of diverse variety. We dealt with big data before there was "big data." For example, while developing the EOS program in the 1980s, the EOS Data and Information System (EOSDIS) was developed to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has continued to be a shining example of modern science data systems over the past two decades. With the explosion of the internet, the usage of social media, and the provision of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system. The open source communities quickly followed up and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day. There is a significant shortage of algorithms and knowledgeable manpower to mine the data. In response, the federal government developed big data programs that fund research and development projects and training programs to tackle these new challenges. Meanwhile, compared to the internet data explosion, the Earth science big data problem looks quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline and solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiry.
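    The MapReduce pattern mentioned above can be sketched in a few lines of plain Python; this is a toy word count showing the shape of the model, not Hadoop itself. A map phase emits (key, value) pairs, and a reduce phase folds them into per-key totals.

    ```python
    from collections import Counter
    from functools import reduce
    from itertools import chain

    def mapper(line: str) -> list:
        # Map phase: emit a (word, 1) pair for each token in a line.
        return [(word.lower(), 1) for word in line.split()]

    def reducer(acc: Counter, pair: tuple) -> Counter:
        # Reduce phase: sum the counts per key.
        word, count = pair
        acc[word] += count
        return acc

    lines = ["big data needs big ideas", "data beats hype"]
    pairs = chain.from_iterable(mapper(line) for line in lines)
    counts = reduce(reducer, pairs, Counter())
    # counts["big"] == 2, counts["data"] == 2
    ```

    In a real Hadoop deployment the map and reduce functions run on many machines, with a shuffle step grouping pairs by key between the two phases; the functional skeleton is the same.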

  7. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures" a

  8. Why Big Data Is a Big Deal (Ⅱ)

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    A new group of data mining technologies promises to change forever the way we sift through our vast stores of data, making it faster and cheaper. Some of the technologies are actively being used by people on the bleeding edge who need the technology now, like those involved in creating Web-based services that are driven by social media. They're also heavily contributing to these projects. In other vertical industries, businesses are realizing that much more of their value proposition is information-based than they had previously thought, which will allow big data technologies to gain traction quickly, Olofson says. Couple that with affordable hardware and software, and enterprises find themselves in a perfect storm of business transformation opportunities.

  9. A reduced-boron OPR1000 core based on the BigT burnable absorber

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Hwan Yeal; Yahya, Mohd-Syukri; Kim, Yong Hee [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2016-04-15

    Reducing the critical boron concentration in a commercial pressurized water reactor core offers many advantages in view of safety and economics. This paper presents a preliminary investigation of a reduced-boron pressurized water reactor core that achieves a clearly negative moderator temperature coefficient at hot zero power using the newly-proposed 'Burnable absorber-Integrated Guide Thimble' (BigT) absorbers. The reference core is based on a commercial OPR1000 equilibrium configuration. The reduced-boron OPR1000 configuration was determined by simply replacing commercial gadolinia-based burnable absorbers with the optimized BigT-loaded design. The equilibrium cores in this study were directly searched via repetitive Monte Carlo depletion calculations until convergence. The results demonstrate that, with the same fuel management scheme as in the reference core, application of the BigT absorbers can effectively reduce the critical boron concentration at the beginning of cycle by about 65 ppm. More crucially, the analyses indicate the promising potential of the reduced-boron OPR1000 core with the BigT absorbers, as its moderator temperature coefficient at the beginning of cycle is clearly more negative and all other vital neutronic parameters are within practical safety limits. All simulations were completed using the Monte Carlo Serpent code with the ENDF/B-VII.0 library.

  10. The ethics of Big data: analytical survey

    OpenAIRE

    GIBER L.; KAZANTSEV N.

    2015-01-01

    The number of recent publications on the ethical challenges of implementing Big Data signifies the growing interest in all aspects of this issue. The proposed study specifically aims at analyzing ethical issues connected with Big Data.

  11. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  12. The Big Sleep in the Woods

    Institute of Scientific and Technical Information of China (English)

    王玉峰

    2002-01-01

    Now it's the time of the big sleep for the bees and the bears. Even the buds of the plants whose leaves fall off share in it. But the intensity of this winter sleep, or hibernation, depends on who's doing it. The big sleep of the bears, for instance, would probably be thought of as a

  13. Big Science and Long-tail Science

    CERN Multimedia

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  14. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  15. Big Red: A Development Environment for Bigraphs

    DEFF Research Database (Denmark)

    Faithfull, Alexander John; Perrone, Gian David; Hildebrandt, Thomas

    2013-01-01

    We present Big Red, a visual editor for bigraphs and bigraphical reactive systems, based upon Eclipse. The editor integrates with several existing bigraph tools to permit simulation and model-checking of bigraphical models. We give a brief introduction to the bigraphs formalism, and show how these concepts manifest within the tool using a small motivating example bigraphical model developed in Big Red.

  16. Hom-Big Brackets: Theory and Applications

    OpenAIRE

    Cai, Liqiang; Sheng, Yunhe

    2015-01-01

    In this paper, we introduce the notion of hom-big brackets, which is a generalization of Kosmann-Schwarzbach's big brackets. We show that it gives rise to a graded hom-Lie algebra. Thus, it is a useful tool to study hom-structures. In particular, we use it to describe hom-Lie bialgebras and hom-Nijenhuis operators.

  17. Big system: Interactive graphics for the engineer

    Science.gov (United States)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device independent code can be developed to assure maximum graphic terminal transferability.

  18. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined. PMID:9728415
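    The minimax criterion used above (minimum multiple correlation with Big Five marker factors, maximum reliability) can be approximated in a small sketch. With simulated, mutually independent marker factors, the squared multiple correlation of a candidate cluster reduces to the sum of its squared simple correlations with the factors; the factor weights and noise level below are illustrative placeholders, not the article's data.

    ```python
    import random
    import statistics

    def corr(x, y):
        # Pearson correlation between two equal-length sequences.
        mx, my = statistics.fmean(x), statistics.fmean(y)
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    rng = random.Random(1)
    n = 2000
    # Five simulated, mutually independent Big Five marker factor scores.
    big_five = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(5)]
    # A candidate cluster: partly Big Five-dependent plus unique variance
    # (the weights are hypothetical).
    weights = [0.4, 0.0, 0.2, 0.0, 0.0]
    cluster = [sum(w * f[i] for w, f in zip(weights, big_five)) + rng.gauss(0, 1)
               for i in range(n)]

    # For (near-)orthogonal predictors, R^2 is the sum of squared
    # simple correlations; a low R flags a Big Five-independent cluster.
    R2 = sum(corr(f, cluster) ** 2 for f in big_five)
    R = R2 ** 0.5
    ```

    A cluster passing the screen would combine a low R like this with high internal reliability; with correlated predictors one would instead regress the cluster on all five markers and correlate it with the fitted values.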

  19. Kansen voor Big data – WPA Vertrouwen

    NARCIS (Netherlands)

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for

  20. Big Food, Food Systems, and Global Health

    OpenAIRE

    Stuckler, David; Nestle, Marion

    2012-01-01

    In an article that forms part of the PLoS Medicine series on Big Food, guest editors David Stuckler and Marion Nestle lay out why more examination of the food industry is necessary, and offer three competing views on how public health professionals might engage with Big Food.

  1. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non-singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era

  2. BIG Data – A Review.

    Directory of Open Access Journals (Sweden)

    Anuradha Bhatia

    2013-08-01

    As more data becomes available from an abundance of sources both within and outside, organizations are seeking to use those abundant resources to increase innovation, retain customers, and increase operational efficiency. At the same time, organizations are challenged by their end users, who are demanding greater capability and integration to mine and analyze burgeoning new sources of information. Big Data provides opportunities for business users to ask questions they were never able to ask before. How can a financial organization find better ways to detect fraud? How can an insurance company gain a deeper insight into its customers to see who may be the least economical to insure? How does a software company find its most at-risk customers: those who are about to deploy a competitive product? They need to integrate Big Data techniques with their current enterprise data to gain that competitive advantage. Heterogeneity, scale, timeliness, complexity, and privacy problems with Big Data impede progress at all phases of the pipeline that can create value from data. The problems start right away during data acquisition, when the data tsunami requires us to make decisions, currently in an ad hoc manner, about what data to keep and what to discard, and how to store what we keep reliably with the right metadata. Much data today is not natively in structured format; for example, tweets and blogs are weakly structured pieces of text, while images and video are structured for storage and display, but not for semantic content and search: transforming such content into a structured format for later analysis is a major challenge. The value of data explodes when it can be linked with other data; thus data integration is a major creator of value. Since most data is directly generated in digital format today, we have the opportunity and the challenge both to influence the creation to facilitate later linkage and to automatically link previously created data

  3. Reactor building

    International Nuclear Information System (INIS)

    The whole reactor building is accommodated in a shaft and is sealed level with the earth's surface by a building ceiling, which provides protection against penetration due to external effects. The building ceiling is supported on the walls of the reactor building, which line the shaft and transfer the vertical components of forces to the foundations. The thickness of the walls is designed to withstand horizontal pressure waves in the ground. The building ceiling has an opening above the reactor, which must be closed by cover plates. Operating equipment for the reactor can be situated above the building ceiling. (orig./HP)

  4. Heterogeneous reactors

    International Nuclear Information System (INIS)

    The microscopic study of a cell is meant for the determination of the infinite multiplication factor of the cell, which is given by the four-factor formula: k∞ = ηεpf. The analysis of a homogeneous reactor is similar to that of a heterogeneous reactor, but the factors of the four-factor formula cannot be calculated by the formulas developed for the homogeneous case. A great number of methods were developed for the calculation of heterogeneous reactors, and some of them are discussed. (Author)
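
    Once the four factors are known, the four-factor formula quoted in this record can be evaluated directly. A minimal sketch; the factor values below are illustrative assumptions (roughly typical of a natural-uranium lattice), not figures from the record:

```python
def k_infinite(eta, epsilon, p, f):
    """Infinite-medium multiplication factor via the four-factor formula:
    k_inf = eta * epsilon * p * f
    eta:     reproduction factor (neutrons per neutron absorbed in fuel)
    epsilon: fast fission factor
    p:       resonance escape probability
    f:       thermal utilization factor
    """
    return eta * epsilon * p * f

# Illustrative values only (assumed for demonstration):
k_inf = k_infinite(eta=1.31, epsilon=1.03, p=0.89, f=0.88)
print(round(k_inf, 3))  # prints 1.057
```

    A value above 1 means the infinite lattice is supercritical; heterogeneous-reactor methods differ in how each of the four factors is computed, not in this final product.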

  5. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  6. Energy management: the big picture

    International Nuclear Information System (INIS)

    Since the recent dramatic fall in energy prices may have come to an end, energy managers will have to turn to a range of non-price cost reduction techniques. A framework to aid this process is provided. It rests on ten categories of activity. These are: obtaining a refund; negotiating cheaper tariffs; modifying patterns of demand; inspection and maintenance; operating practices; training, awareness and motivation; waste avoidance; retrofit technology; modifying plant and equipment; energy-efficient design. (UK)

  7. Primordial alchemy: from the Big Bang to the present universe

    Science.gov (United States)

    Steigman, Gary

    Of the light nuclides observed in the universe today, D, 3He, 4He, and 7Li are relics from its early evolution. The primordial abundances of these relics, produced via Big Bang Nucleosynthesis (BBN) during the first half hour of the evolution of the universe, provide a unique window on Physics and Cosmology at redshifts ~10^10. Comparing the BBN-predicted abundances with those inferred from observational data tests the consistency of the standard cosmological model over ten orders of magnitude in redshift, constrains the baryon and other particle content of the universe, and probes both Physics and Cosmology beyond the current standard models. These lectures are intended to introduce students, both of theory and observation, to those aspects of the evolution of the universe relevant to the production and evolution of the light nuclides from the Big Bang to the present. The current observational data are reviewed and compared with the BBN predictions, and the implications for cosmology (e.g., universal baryon density) and particle physics (e.g., relativistic energy density) are discussed. While this comparison reveals the stunning success of the standard model(s), there are currently some challenges which leave open the door for more theoretical and observational work with potential implications for astronomy, cosmology, and particle physics.

  8. Primordial Alchemy From The Big Bang To The Present Universe

    CERN Document Server

    Steigman, G

    2002-01-01

    Of the light nuclides observed in the universe today, D, 3He, 4He, and 7Li are relics from its early evolution. The primordial abundances of these relics, produced via Big Bang Nucleosynthesis (BBN) during the first half hour of the evolution of the universe provide a unique window on Physics and Cosmology at redshifts of order 10^10. Comparing the BBN-predicted abundances with those inferred from observational data tests the consistency of the standard model of cosmology over ten orders of magnitude in redshift, constrains the baryon and other particle content of the universe, and probes both Cosmology and Physics beyond their current standard models. These lectures are intended to introduce students, both of theory and observation, to those aspects of the evolution of the universe relevant to the production and evolution of the light nuclides from the Big Bang to the present. The current observational data is reviewed and compared with the BBN predictions and the implications for cosmology (e.g., universal ...

  9. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Cees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  10. The Chernobyl accident ten years later

    International Nuclear Information System (INIS)

    On April 26, 1986 at 1:23 AM a fire and explosion occurred at the fourth unit of the Chernobyl Nuclear Power Plant Complex, located in the Ukraine, that resulted in the destruction of the reactor core and most of the building in which it was housed. Several environmental impacts resulting from the accident will be discussed in this paper, including the effects on plant and wild life, radioactive waste generated and stored or disposed of, effects of evacuations of residents within the subsequently established 10 km and 30 km control zones, impacts of the emergency containment structure (sarcophagus), and potential effects on world opinion and the future development of nuclear power. As an immediate result of the fire, 31 people died (2 from the fire and smoke, and 29 from excessive radiation); 237 cases of acute radiation sickness occurred; the total number of fatalities from chronic diseases induced by the accident is unknown. More than 100,000 people were evacuated from within the subsequently established 30 km control zone; in excess of 50 million curies of radionuclides, including finely dispersed nuclear fuel, fragments of graphite, concrete and other building materials, were released from the reactor into the environment; an estimated one million cubic meters of radioactive waste were generated (LLW, ILW, HLW); more than 5000 tons of materials (sand, boron, dolomite, cement, and lead) were dropped by helicopter to put the fire out; shutdown of the adjacent power plants was performed; and other environmental impacts occurred. The Chernobyl Nuclear Power Plant Unit No 4 was an RBMK-1000. It began operation in 1983; it was rated at 1000 MWe with a thermal power output of 3200 MW(th); the reactor core contained 190 MT of fuel in 1659 assemblies (plus 211 control rods); the average burnup was 10.3 MWd/kg; and the reactor operated on a continuous basis, with maintenance and fuel reload performed during operation
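
    The core parameters quoted for Unit 4 can be cross-checked with simple arithmetic. A sketch assuming continuous full-power operation, which is an idealization (the record itself gives no operating history):

```python
# Figures quoted in the record for Chernobyl Unit 4 (RBMK-1000)
thermal_power_mw = 3200.0   # MW(th)
fuel_mass_kg = 190.0e3      # 190 MT of fuel
burnup_mwd_per_kg = 10.3    # average burnup, MWd/kg

# Specific power: thermal power spread over the whole fuel inventory
specific_power = thermal_power_mw / fuel_mass_kg  # MW per kg

# Effective full-power days needed to reach the quoted average burnup
full_power_days = burnup_mwd_per_kg / specific_power
print(round(specific_power, 4), round(full_power_days))
```

    The quoted figures are mutually consistent with a fuel residence of roughly 600 effective full-power days, plausible for on-load-refuelled RBMK fuel.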

  11. Evidence of the Big Fix

    CERN Document Server

    Hamada, Yuta; Kawana, Kiyoharu

    2014-01-01

    We give evidence for the Big Fix. The theory of wormholes and multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by the experimental data, and we show that it is indeed true for the Higgs vacuum expectation value $v_{h}$. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary $v_{h}$. It turns out that the existence of atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; however, in our case it is required by the fundamental law.

  12. Big data ja yrityksen markkinointi

    OpenAIRE

    Perolainen, Pekka

    2014-01-01

    The aim of this thesis was to study the exploitation of big data in a company's sales work and marketing. Companies have opportunities to use data collected from their own or external sources to make their operations more efficient. A company's own data consists mainly of transaction data, loyalty-card data, logistics data, or sensor data. Camera recordings are also part of the data companies collect; under the law this data counts as personal-register data. It is possible for companies to collect, process and ...

  13. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.

  14. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid-technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...

  15. Big Book of Apple Hacks

    CERN Document Server

    Seibold, Chris

    2008-01-01

    Bigger in size, longer in length, broader in scope, and even more useful than our original Mac OS X Hacks, the new Big Book of Apple Hacks offers a grab bag of tips, tricks and hacks to get the most out of Mac OS X Leopard, as well as the new line of iPods, iPhone, and Apple TV. With 125 entirely new hacks presented in step-by-step fashion, this practical book is for serious Apple computer and gadget users who really want to take control of these systems. Many of the hacks take you under the hood and show you how to tweak system preferences, alter or add keyboard shortcuts, mount drives and

  16. Big Bang nucleosynthesis in crisis?

    International Nuclear Information System (INIS)

    A new evaluation of the constraint on the number of light neutrino species (Nν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial 4He abundance has been underestimated by 0.014±0.004 (1σ) or less than 10% (95% C.L.) of 3He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is Nν=2.1±0.3 (1σ) and the upper limit is Nν<2.6 (95% C.L.), excluding the standard model (Nν=3) at the 98.6% C.L. copyright 1995 The American Physical Society

  17. The safety of big workmanship

    International Nuclear Information System (INIS)

    This book brings together the contributions of a colloquium given in memory of Pierre Londe (1922-1999) and dealing with the safety of big workmanship (large civil engineering structures). The main topics concern: 3-D water flow under pressure inside fractured media; the Rion-Antirion bridge: reliability and earthquake-resistant design of foundations; the Rion-Antirion bridge: design and realization; geology and safety of dams; risk assessment; salt storage cavities: evaluation of tightness; safety of tunnel supports in deformed rock massifs: application to the El Achir tunnel; instability risk of rock formations on the natural slopes of the Alps; the safety approach applied to the civil engineering of nuclear facilities; lessons learnt from accidents of offshore platforms; the engineer faced with natural hazards; science and regulation. (J.S.)

  18. Exploring Relationships in Big Data

    Science.gov (United States)

    Mahabal, A.; Djorgovski, S. G.; Crichton, D. J.; Cinquini, L.; Kelly, S.; Colbert, M. A.; Kincaid, H.

    2015-12-01

    Big Data are characterized by several different 'V's: Volume, Veracity, Volatility, Value and so on. For many datasets, Volume inflated by redundant features often makes the data noisier and harder to extract Value from. This is especially true if one is comparing or combining different datasets and the metadata are diverse. We have been exploring ways to exploit such datasets through a variety of statistical machinery and visualization. We show how we have applied it to time-series from large astronomical sky-surveys. This was done in the Virtual Observatory framework. More recently we have been doing similar work for a completely different domain, viz. biology/cancer. The methodology reuse involves application to diverse datasets gathered through the various centers associated with the Early Detection Research Network (EDRN) for cancer, an initiative of the National Cancer Institute (NCI). Application to Geo datasets is a natural extension.

  19. Political-social reactor problems at Berkeley

    International Nuclear Information System (INIS)

    For more than ten years there was little public notice of the TRIGA reactor at UC-Berkeley. Then: a) A non-student persuaded the Student Senate to pass a resolution requesting the Campus Administration to stop operation of the reactor and remove it from campus. b) The presence of the reactor became a campaign issue in a City Mayoral election. c) Two local residents reported adverse physical reactions before, during, and after a routine tour of the reactor facility. d) The Berkeley City Council began a study of problems associated with radioactive material within the city. e) Friends Of The Earth formally petitioned the NRC to terminate the reactor's license. Campus personnel have expended many man-hours and many pounds of paper in responding to these happenings. Some of the details are of interest, and may be of use to other reactor facilities. (author)

  20. The Top Ten Algorithms in Data Mining

    CERN Document Server

    Wu, Xindong

    2009-01-01

    From classification and clustering to statistical learning, association analysis, and link mining, this book covers the most important topics in data mining research. It presents the ten most influential algorithms used in the data mining community today. Each chapter provides a detailed description of the algorithm, a discussion of available software implementation, advanced topics, and exercises. With a simple data set, examples illustrate how each algorithm works and highlight the overall performance of each algorithm in a real-world application. Featuring contributions from leading researc

  1. KEK: Looking forward after ten years

    International Nuclear Information System (INIS)

    Although KEK was formally established on 1 April 1971, the Japanese National Laboratory for High Energy Physics chose to celebrate its tenth anniversary on 20 November last year, seven years to the day since first beam was injected into the main ring of its 12 GeV proton synchrotron. As well as looking back over ten years of fine achievement, KEK is also able to look forward to a new era of physics with the 3 km circumference TRISTAN electron-positron collider ring

  2. Compaction behavior of ten pyrotechnic materials

    Energy Technology Data Exchange (ETDEWEB)

    Burchett, O.L.; Dietzel, R.W.; Montoya, A.P.

    1980-01-01

    The compaction behavior of ten pyrotechnic materials, KClO4, Ti/KClO4 (33/67), TiH0.65/KClO4 (33/67), TiH1.68/KClO4 (33/67), HNAB, CP, B/CaCrO4 (20/80), Pd/Al (94/6), Pd/Al (90/10), and Pd/Al (80/20), was determined by using a simple compaction test and a two-step iterative procedure to fit an assumed three-parameter pressure-density relationship to the test results.
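
    Fitting an assumed pressure-density relationship to compaction data can be sketched as follows. The functional form, the synthetic data, and the refinement loop are all illustrative assumptions; the report's actual three-parameter relation and its two-step procedure are not given in the abstract:

```python
import math

def density(p, rho0, rho_max, k):
    """Assumed three-parameter pressure-density relation (illustrative
    only; not the form used in the report):
    rho(P) = rho_max - (rho_max - rho0) * exp(-P / k)
    """
    return rho_max - (rho_max - rho0) * math.exp(-p / k)

def sse(params, data):
    """Sum of squared residuals between the model and compaction data."""
    rho0, rho_max, k = params
    return sum((density(p, rho0, rho_max, k) - rho) ** 2 for p, rho in data)

# Synthetic compaction data from known parameters, standing in for
# the pressure-density measurements of the compaction tests
true_params = (1.0, 2.5, 50.0)
data = [(p, density(p, *true_params)) for p in (0, 10, 25, 50, 100, 200)]

# Crude iterative refinement, one parameter at a time (a stand-in for
# the report's two-step iterative procedure, whose details are not given)
guess = [0.8, 2.0, 30.0]
for _ in range(200):
    for i in range(3):
        for step in (0.01, -0.01):
            trial = list(guess)
            trial[i] += step * guess[i]
            if sse(trial, data) < sse(guess, data):
                guess = trial
print([round(g, 3) for g in guess])
```

    With noiseless synthetic data the refinement recovers the generating parameters to within the step granularity; real compaction data would of course carry measurement scatter.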

  3. Ten new withanolides from Physalis peruviana.

    Science.gov (United States)

    Fang, Sheng-Tao; Liu, Ji-Kai; Li, Bo

    2012-01-01

    Ten new withanolides, including four perulactone-type withanolides, perulactones E-H (1-4), three 28-hydroxy-withanolides, withaperuvins I-K (5-7), and three other withanolides, withaperuvins L-N (8-10), together with six known compounds (11-16) were isolated from the aerial parts of Physalis peruviana. The structures of these compounds were elucidated on the basis of extensive spectroscopic analyses (1D and 2D NMR, IR, HR-MS) and chemical methods. PMID:22037277

  4. Ten essential skills for electrical engineers

    CERN Document Server

    Dorr, Barry

    2014-01-01

    Engineers know that, as in any other discipline, getting a good job requires practical, up-to-date skills. An engineering degree provides a broad set of fundamentals. Ten Essential Skills applies those fundamentals to practical tasks required by employers. Written in a user-friendly, no-nonsense format, the book reviews practical skills using the latest tools and techniques, and features a companion website with interview practice problems and advanced material for readers wishing to pursue additional skills. With this book, aspiring and current engineers may approach job interviews confident

  5. A new string in ten dimensions?

    Science.gov (United States)

    Sethi, Savdeep

    2013-09-01

    I suggest the possibility of a new string in ten dimensions. Evidence for this string is presented both from orientifold physics and from K-theory, along with a mystery concerning the M-theory description. Motivated by this possibility, some novel aspects of decoupling limits in heterotic/type I theories are described; specifically, the decoupled theory on type I D-strings is argued to be three-dimensional rather than two-dimensional. These decoupled theories provide the matrix model definitions of the heterotic/type I strings.

  6. Audits reveal ten common environmental problems

    International Nuclear Information System (INIS)

    The old saying that "an ounce of prevention is worth a pound of cure" rings particularly true in environmental matters in the 1990s. Environmental problems can potentially lead to expensive fines, costly cleanups, negative public relations, and even criminal sanctions against members of the corporation. A recurring pattern of problems has been noted during the performance of environmental disposition, acquisition, and compliance assessments of many different operators in most of the producing states. The ten most common problems found in oilfield audits are discussed here in an effort to enhance the awareness of operators

  7. Singularitäten von Phase und Polarisation des Lichts

    OpenAIRE

    Flossmann, Florian

    2006-01-01

    Singularities are of particular interest in optics because, as the structurally stable objects of the light field, they largely determine its topology. In ray optics, such singularities are the caustics that have been known for centuries; in scalar wave optics they are singularities of the phase, and in vectorial wave optics singularities of the polarization. This work deals with the latter two, both experimentally and theoretically. Singularities of the ph...

  8. Relações hierárquicas entre os traços amplos do Big Five / Hierarchical relationship between the broad traits of the Big Five

    Directory of Open Access Journals (Sweden)

    Cristiano Mauro Assis Gomes

    2012-01-01

    The Big Five model holds that human personality is composed of dozens of specific factors. Despite this diversity, the specific factors are integrated into five broad traits that occupy the same hierarchical level. The current study presents an alternative hypothesis, arguing that there are hierarchical levels between the broad traits of the model. Six hundred and eighty-four junior and high school students aged 10 to 18 years (M = 13.71, SD = 2.11) from a private school in the city of Belo Horizonte, Minas Gerais, Brazil participated in the study. The Big Five was measured by the Inventory of Personality Traits, initially named the Personality Adjective Inventory, elaborated by Pinheiro, Gomes and Braga (2009). This instrument measures eight of the ten polarities present in the Big Five model. Two models were compared via path analysis: a four-level hierarchical model and a non-hierarchical model. The hierarchical model showed an adequate degree of fit to the data and proved superior to the non-hierarchical model, which did not fit the data. Implications for the Big Five model are discussed.

  9. Water desalination using different capacity reactors options

    International Nuclear Information System (INIS)

    The Northwest region of Mexico has a deficit of potable water, and alongside this need the region's growth requires additional energy capacity; cogeneration of potable water and nuclear electricity is an option to be assessed. In this paper we perform an economic comparison of cogeneration using a big reactor, the AP1000, and a medium-size reactor, the IRIS; both are PWR-type reactors and will be coupled to the desalination plant using the same method. For this cogeneration case we assess the best reactor option that can cover both needs at maximum potable water production for two different desalination methods: Multistage Flash Distillation and Multi-effect Distillation. (authors)
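
    Screening-level economic comparisons of this kind often start from a levelized-cost formula. The formula below is standard, but every input number is an illustrative assumption, not a figure from the paper:

```python
def lcoe(capital_cost, crf, om_per_year, fuel_per_year, mwh_per_year):
    """Levelized cost of electricity in $/MWh: annualized capital (via a
    capital recovery factor, crf) plus annual O&M and fuel costs, divided
    by annual output. A standard screening formula, not the paper's own
    methodology."""
    return (capital_cost * crf + om_per_year + fuel_per_year) / mwh_per_year

# Purely illustrative inputs (assumed, NOT taken from the paper):
# a 1100 MWe unit running at a 90% capacity factor
annual_mwh = 1100 * 8760 * 0.90
cost = lcoe(capital_cost=5.0e9, crf=0.08,
            om_per_year=9.0e7, fuel_per_year=6.0e7,
            mwh_per_year=annual_mwh)
print(round(cost, 1))  # $/MWh
```

    For cogeneration, the same kind of levelization is applied to the desalination side, with the water and electricity products sharing the common costs.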

  10. Plasma reactor

    OpenAIRE

    Molina Mansilla, Ricardo; Erra Serrabasa, Pilar; Bertrán Serra, Enric

    2008-01-01

    [EN] A plasma reactor that can operate in a wide pressure range, from vacuum and low pressures to atmospheric pressure and higher pressures. The plasma reactor is also able to regulate other important settings and can be used for processing a wide range of different samples, such as relatively large samples or samples with rough surfaces.

  11. Reactor physics

    International Nuclear Information System (INIS)

    Progress in research on reactor physics in 1997 at the Belgian Nuclear Research Centre SCK/CEN is described. Activities in the following four domains are discussed: core physics, ex-core neutron transport, experiments in Materials Testing Reactors, international benchmarks

  12. Fast reactor database. 2006 update

    International Nuclear Information System (INIS)

    Liquid metal cooled fast reactors (LMFRs) have been under development for about 50 years. Ten experimental fast reactors and six prototype and commercial size fast reactor plants have been constructed and operated. In many cases, the overall experience with LMFRs has been rather good, with the reactors themselves and also the various components showing remarkable performance, well in accordance with the design expectations. The fast reactor system has also been shown to have very attractive safety characteristics, resulting to a large extent from the fact that the fast reactor is a low pressure system with large thermal inertia and negative power and temperature coefficients. In addition to the LMFRs that have been constructed and operated, more than ten advanced LMFR projects have been developed, and the latest designs are now close to achieving economic competitiveness with other reactor types. In the current world economic climate, the introduction of a new nuclear energy system based on the LMFR may not be considered by utilities as a near future option when compared to other potential power plants. However, there is a strong agreement between experts in the nuclear energy field that, for sustainability reasons, long term development of nuclear power as a part of the world's future energy mix will require the fast reactor technology, and that, given the decline in fast reactor development projects, data retrieval and knowledge preservation efforts in this area are of particular importance. This publication contains detailed design data and main operational data on experimental, prototype, demonstration, and commercial size LMFRs. Each LMFR plant is characterized by about 500 physics, thermohydraulics and thermomechanics parameters, by design and technical data, and by relevant sketches. The focus is on practical issues, providing engineers, scientists, managers, university students and professors with complete technical information on a total of 37 LMFRs

  13. Ten Years of Infrasound Observation in Korea

    Science.gov (United States)

    Lee, Hee-Il; Che, Il-Young; Kim, Tae Sung

    2010-05-01

    Over the ten years since the installation of our first seismo-acoustic array station (CHNAR) in September 1999, the Korea Institute of Geoscience and Mineral Resources (KIGAM) has been continuously observing infrasound with an infrasound array network in Korea, named KIN (Korean Infrasound Network). This network consists of seven seismo-acoustic arrays (BRDAR, KMPAR, CHNAR, YAGAR, KSGAR, ULDAR and TJIAR). The aperture size of the smallest arrays (KMPAR and TJIAR) is about 300 m and the largest is about 1.4 km. The number of acoustic gauges ranges between 4 (TJIAR) and 18 (YAGAR), and 1 or 5 seismometers are collocated at the center of each acoustic array. All seismic and infrasonic signals of the arrays are digitized at 40 samples/sec and transmitted to KIGAM in real time. Many interesting infrasound signals associated with different kinds of anthropogenic as well as natural sources are detected by KIN. Ten years of seismo-acoustic data were analyzed using the PMCC program, and more than five thousand infrasonic events were identified and catalogued in our infrasound database. This database is used to study the characteristics of seasonally dependent propagation of infrasound waves at local scale, as well as to better understand how atmospheric conditions affect the detection ratio at a specific station throughout the year. It also played a valuable role in discriminating anthropogenic events, such as the second nuclear test on 25 May 2009 in North Korea, from natural earthquakes, which is important in estimating the seismicity in Korea.
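
    Array methods such as PMCC rest on estimating inter-sensor time delays of a coherent signal. A minimal cross-correlation sketch on synthetic traces; the signals and the brute-force lag search below are illustrative assumptions, not KIN data or the PMCC algorithm itself:

```python
import math
import random

def xcorr_lag(x, y, max_lag):
    """Estimate the delay (in samples) of y relative to x by maximizing
    the cross-correlation over integer lags in [-max_lag, max_lag]."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        val = sum(x[i] * y[i + lag] for i in range(n) if 0 <= i + lag < n)
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# Synthetic transient sampled at 40 samples/s (the KIN digitization
# rate); the second trace is the same pulse delayed by 5 samples (125 ms)
rng = random.Random(0)
fs = 40.0
pulse = [math.exp(-0.5 * ((i - 200) / 3.0) ** 2) for i in range(400)]
x = [p + 0.001 * rng.gauss(0, 1) for p in pulse]
y = [0.0] * 5 + [p + 0.001 * rng.gauss(0, 1) for p in pulse][:-5]

lag = xcorr_lag(x, y, max_lag=20)
print(lag, lag / fs)  # delay in samples and in seconds
```

    From such pairwise delays across an array, the back-azimuth and apparent velocity of an arriving infrasound wave can be inverted, which is how events like the quoted nuclear test are located and discriminated.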

  14. Experimental Particle Physics: the Next Ten Years

    International Nuclear Information System (INIS)

    Over the next ten years a number of new accelerator facilities for frontier research will become operational. Within weeks the Large Hadron Collider (LHC) at CERN is scheduled to produce first collisions. The world's most powerful accelerator is designed to become a 'discovery machine'. Around 2013/2014 we expect first beams at BELLE-II at KEK, Japan. This Beauty-Factory will build on the very successful BELLE program with ten times higher luminosity and a discovery reach beyond the Standard Model. Later in the decade, an array of facilities of the FAIR project at GSI, Darmstadt will commence operation. This broad accelerator-based program is complemented by ultra-precision particle physics experiments, both in university laboratories and at facilities such as ILL, Grenoble, and by a strong astroparticle physics program with ground- and satellite-based facilities. These should be truly exciting years for particle physics. Austria is strongly involved in all these research programs and well positioned in the race for major discoveries. (author)

  15. Main characteristic parameters of centrifugal pump and safe operation of reactor

    International Nuclear Information System (INIS)

    Based on the main characteristic parameters of centrifugal pumps, problems arising during reactor operation are discussed. For the High-Flux Engineering Test Reactor, ten typical types of faults encountered in reactor operation are summarized. Basic knowledge of centrifugal pumps can help reactor operators determine whether a centrifugal pump is out of order and how to handle it, so that the safety of the reactor can be assured.

  16. The Chernobyl disaster - ten years on

    International Nuclear Information System (INIS)

    Large areas of Belarus, Russia and the Ukraine are contaminated with very high levels of radioactivity from the fallout of the Chernobyl reactor accident. The most affected areas are in the vicinity of Chernobyl, and east of Gomel (in Belarus), where much of the radioactive plume came down. The article describes the contamination with cesium 137 and iodine 131, as well as the immediate countermeasures taken after the accident, and the long-term action for decontamination of the polluted soil. Information is given on the radiation dose received by the population, in particular the thyroid doses, and prognostic data on thyroid cancer incidence. (orig.)

  17. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets than can be collected by a single investigator's or a single group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects. PMID:25680334

  18. Equity and the big city.

    Science.gov (United States)

    Chakravorty, S

    1994-01-01

    The author attempts to answer two questions: "how does the spatial distribution of population change during the process of development, and how do these changes relate to changes in the size distribution and regional distribution of income. The causal connection between population and income distribution is examined through a simulation model. The theoretical implications of the model's results are empirically examined at several spatial levels: at the national level, with longitudinal data from ten Asian and Latin American nations, and at the regional and subregional levels, with data from Japan, Brazil, and the Philippines. Finally, a multistage model of polarization reversal (with interconnected regional inequality changes) is proposed." PMID:12291804

  19. Big data and the electronic health record.

    Science.gov (United States)

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization. PMID:24887521

  20. BLENDING IOT AND BIG DATA ANALYTICS

    OpenAIRE

    Tulasi.B*; Girish J Vemulkar

    2016-01-01

    Internet is continuously evolving and changing. Internet of Things (IoT) can be considered as the future of Internet applications which involves machine to machine learning (M2M). The actionable intelligence can be derived through fusion of Big Data and real time analytics with IoT. Big Data and IoT can be viewed as two sides of a coin. With the connection between Big Data and the objects on Internet benefits of IoT can be easily reaped. The applications of IoT spread across various domains l...

  1. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  2. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
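    The MapReduce processing schema mentioned above can be illustrated with a minimal, framework-free sketch. The three functions below are illustrative names, not any framework's API; they mimic how a cluster would count words, except that here each phase runs locally in sequence.

```python
from collections import defaultdict

# Minimal sketch of the MapReduce schema: map, shuffle, reduce.
# Function names are illustrative, not part of Hadoop's or Spark's API.

def map_phase(records):
    # Map: emit a (key, 1) pair for every word in every input record.
    for record in records:
        for word in record.split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group intermediate values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the list of values for each key.
    return {key: sum(values) for key, values in groups.items()}

records = ["big data big compute", "big cluster"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts["big"])  # -> 3
```

In Hadoop or Spark the same three phases are distributed: map tasks run on the nodes holding the input blocks, the shuffle moves intermediate pairs across the network, and reduce tasks aggregate each key's group in parallel.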

  3. How to use Big Data technologies to optimize operations in Upstream Petroleum Industry

    Directory of Open Access Journals (Sweden)

    Abdelkader Baaziz

    2013-12-01

    Full Text Available “Big Data is the oil of the new economy” has been the most famous citation of the last three years. It was even adopted by the World Economic Forum in 2011. In fact, Big Data is like crude: it is valuable, but unrefined it cannot be used. It must be broken down and analyzed for it to have value. But what about Big Data generated by the petroleum industry, and particularly its upstream segment? Upstream is no stranger to Big Data. Understanding and leveraging data in the upstream segment enables firms to remain competitive throughout planning, exploration, delineation, and field development. Oil & gas companies conduct advanced geophysics modeling and simulation to support operations, where 2D, 3D and 4D seismic surveys generate significant data during exploration phases. They closely monitor the performance of their operational assets. To do this, they use tens of thousands of data-collecting sensors in subsurface wells and surface facilities to provide continuous and real-time monitoring of assets and environmental conditions. Unfortunately, this information comes in various and increasingly complex forms, making it a challenge to collect, interpret, and leverage the disparate data. As an example, Chevron’s internal IT traffic alone exceeds 1.5 terabytes a day. Big Data technologies integrate common and disparate data sets to deliver the right information at the appropriate time to the correct decision-maker. These capabilities help firms act on large volumes of data, transforming decision-making from reactive to proactive and optimizing all phases of exploration, development and production. Furthermore, Big Data offers multiple opportunities to ensure safer, more responsible operations. Another invaluable effect would be shared learning. The aim of this paper is to explain how to use Big Data technologies to optimize operations and how Big Data can help experts make decisions that lead to the desired outcomes. Keywords: Big Data; Analytics

  4. BIG DATA, BIG CONSEQUENCES? EEN VERKENNING NAAR PRIVACY EN BIG DATA GEBRUIK BINNEN DE OPSPORING, VERVOLGING EN RECHTSPRAAK

    OpenAIRE

    Lodder, A.R.; Meulen, van der, N.; Wisman, T.H.A.; Meij, Lisette; Zwinkels, C.M.M.

    2014-01-01

    In deze verkenning is ingegaan op de privacy aspecten van Big Data analysis binnen het domein Veiligheid en Justitie. Besproken zijn toepassingen binnen de rechtspraak zoals voorspellen van uitspraken en gebruik in rechtszaken. Met betrekking tot opsporing is onder andere ingegaan op predictive policing en internetopsporing. Na een uiteenzetting van de privacynormen en toepassingsmogelijkheden, zijn de volgende zes uitgangspunten voor Big Data toepassingen voorgesteld: 7 A.R. Lodder e.a. ‐ Bi...

  5. NOAA Big Data Partnership RFI

    Science.gov (United States)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  6. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification (MMV) technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  7. Materials requirements for fusion reactors

    International Nuclear Information System (INIS)

    Once the physics of fusion devices is understood, one or more experimental power reactors (EPR) are planned which will produce net electrical power. The structural material for the device will probably be a modification of an austenitic stainless steel. Unlike fission reactors, whose pressure boundaries are subjected to no or only light irradiation, the pressure boundary of a fusion reactor is subjected to high atomic displacement-damage and high production rates of transmutation products, e.g., helium and hydrogen. The design data base must include irradiated materials. Since in situ testing to obtain tensile, fatigue, creep, crack-growth, stress-rupture, and swelling data is currently impossible for fusion reactor conditions, a program of service-temperature irradiations in fission reactors followed by postirradiation testing, simulation of fusion conditions, and low-fluence 14 MeV neutron-irradiation tests are planned. For the Demonstration Reactor (DEMO), expected to be built within ten years after the EPR, higher heat fluxes may require the use of refractory metals, at least for the first 20 cm. A partial data base may be provided by high-flux 14 MeV neutron sources being planned. Many materials other than those for structural components will be required in the EPR and DEMO. These include superconducting magnets, insulators, neutron reflectors and shields, and breeding materials. The rest of the device should utilize conventional materials except that portion involved in tritium confinement and recovery

  8. Big bang nucleosynthesis and CMB constraints on dark energy

    International Nuclear Information System (INIS)

    Current observational data favor cosmological models which differ from the standard model due to the presence of some form of dark energy and, perhaps, by additional contributions to the more familiar dark matter. Primordial nucleosynthesis provides a window on the very early evolution of the universe and constraints from big bang nucleosynthesis (BBN) can bound the parameters of models for dark matter or energy at redshifts of the order of ten billion. The spectrum of temperature fluctuations imprinted on the cosmic microwave background (CMB) radiation opens a completely different window on the universe at epochs from redshifts of the order of ten thousand to nearly the present. The CMB anisotropy spectrum provides constraints on new physics which are independent of and complementary to those from BBN. Here we consider three classes of models for the dark matter or energy: extra particles which were relativistic during the early evolution of the universe ('X'); quintessence models involving a minimally coupled scalar field ('Q'); models with a non-minimally coupled scalar field which modify the strength of gravity during the early evolution of the universe ('G'). We constrain the parameters of these models using data from BBN and the CMB and identify the allowed regions in their parameter spaces consistent with the more demanding joint BBN and CMB constraints. For 'X' and 'Q' such consistency is relatively easy to find; it is more difficult for the 'G' models with an inverse power law potential for the scalar field

  9. Compact Reactor

    International Nuclear Information System (INIS)

    Weyl's Gauge Principle of 1929 has been used to establish Weyl's Quantum Principle (WQP) that requires that the Weyl scale factor should be unity. It has been shown that the WQP requires the following: quantum mechanics must be used to determine system states; the electrostatic potential must be non-singular and quantified; interactions between particles with different electric charges (i.e. electron and proton) do not obey Newton's Third Law at sub-nuclear separations, and nuclear particles may be much different than expected using the standard model. The above WQP requirements lead to a potential fusion reactor wherein deuterium nuclei are preferentially fused into helium nuclei. Because the deuterium nuclei are preferentially fused into helium nuclei at temperatures and energies lower than specified by the standard model, there is no harmful radiation as a byproduct of this fusion process. Therefore, a reactor using this reaction does not need any shielding to contain such radiation. The energy released from each reaction and the absence of shielding makes the deuterium-plus-deuterium-to-helium (DDH) reactor very compact when compared to other reactors, both fission and fusion types. Moreover, the potential energy output per reactor weight and the absence of harmful radiation makes the DDH reactor an ideal candidate for space power. The logic by which the WQP requires the above conditions that make the prediction of DDH possible is summarized. The details of the DDH reaction will be presented along with the specifics of why the DDH reactor may be made to cause two deuterium nuclei to preferentially fuse to a helium nucleus. The presentation will also indicate the calculations needed to predict the reactor temperature as a function of fuel loading, reactor size, and desired output and will include the progress achieved to date

  10. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    Directory of Open Access Journals (Sweden)

    Ms. Rashmi Singh Dr. H. K. Verma

    2012-02-01

    Full Text Available The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method that relies on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, the Big Bang–Big Crunch algorithm is used for the design of linear phase finite impulse response (FIR) filters. The fitness function is based on the mean squared error between the actual and the ideal filter responses. This paper presents plots of the magnitude response of the FIR filters and the error graph. The BB–BC algorithm seems to be a promising tool for FIR filter design, especially in a dynamic environment where filter coefficients have to be adapted and fast convergence is of importance.
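    As a rough illustration of the BB–BC loop described above, here is a sketch on a toy one-dimensional objective. The quadratic stand-in fitness, the parameter values, and the inverse-fitness weighting are assumptions for demonstration, not the paper's actual FIR design setup (which would use the mean squared error over the filter's frequency response as fitness and a coefficient vector as the search point).

```python
import random

# Illustrative sketch of the Big Bang-Big Crunch (BB-BC) loop on a toy
# 1-D objective; names and parameter values are assumptions for demo only.

def fitness(x):
    # Stand-in for the MSE between actual and ideal filter responses;
    # the minimum is at x = 2.
    return (x - 2.0) ** 2

def bb_bc(pop_size=50, iterations=100, search_radius=10.0, seed=1):
    rng = random.Random(seed)
    # Big Bang: scatter an initial random population over the search space.
    population = [rng.uniform(-search_radius, search_radius)
                  for _ in range(pop_size)]
    for k in range(1, iterations + 1):
        # Big Crunch: contract the population to its fitness-weighted
        # center of mass (better candidates get larger weights).
        weights = [1.0 / (fitness(x) + 1e-12) for x in population]
        center = sum(w * x for w, x in zip(weights, population)) / sum(weights)
        # Next Big Bang: re-scatter around the center with shrinking spread,
        # so later generations search ever more locally.
        spread = search_radius / k
        population = [center + rng.gauss(0.0, spread) for _ in range(pop_size)]
    return center

best = bb_bc()
print(round(best, 2))
```

For an FIR design, `fitness` would instead evaluate a candidate coefficient vector against the ideal frequency response, with the crunch and re-scatter applied per coefficient.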

  11. The First Ten Years of Swift Supernovae

    CERN Document Server

    Brown, Peter J; Milne, Peter A

    2015-01-01

    The Swift Gamma-Ray Burst Explorer has proven to be an incredible platform for studying the multiwavelength properties of supernova explosions. In its first ten years, Swift has observed over three hundred supernovae. The ultraviolet observations reveal a complex diversity of behavior across supernova types and classes. Even amongst the standard-candle type Ia supernovae, ultraviolet observations reveal distinct groups. When the UVOT data are combined with higher-redshift optical data, the relative populations of these groups appear to change with redshift. Among core-collapse supernovae, Swift discovered the shock breakout of two supernovae, and the Swift data show a diversity in the cooling phase of the shock breakout of supernovae discovered from the ground and promptly followed up with Swift. Swift observations have resulted in an incredible dataset of UV and X-ray data for comparison with high-redshift supernova observations and theoretical models. Swift's supernova program has the potential to dramaticall...

  12. Classification of ten-dimensional heterotic strings

    International Nuclear Information System (INIS)

    Progress towards the classification of the meromorphic c=24 conformal field theories is reported. It is shown that if such a theory has any spin-1 currents, it is either the Leech lattice CFT, or it can be written as a tensor product of Kac-Moody algebras with total central charge 24. The total number of combinations of Kac-Moody algebras for which meromorphic c=24 theories may exist is 221. The next step towards classification is to obtain all modular-invariant combinations of Kac-Moody characters. The presently available results are sufficient to obtain a complete list of all ten-dimensional heterotic strings. Furthermore there are strong indications for the existence of several (probably at least 20) new meromorphic c=24 theories. (orig.)

  13. Ten years of the Spanish Virtual Observatory

    Science.gov (United States)

    Solano, E.

    2015-05-01

    The main objective of the Virtual Observatory (VO) is to guarantee easy and efficient access to, and analysis of, the information hosted in astronomical archives. The Spanish Virtual Observatory (SVO) is a project that was born in 2004 with the goal of promoting and coordinating VO-related activities at the national level. SVO is also the national contact point for international VO initiatives, in particular the International Virtual Observatory Alliance (IVOA) and the Euro-VO project. The project, led by Centro de Astrobiología (INTA-CSIC), is structured around four major topics: a) VO compliance of astronomical archives, b) VO science, c) VO and data-mining tools, and d) education and outreach. In this paper I will describe the most important results obtained by the Spanish Virtual Observatory in its first ten years of life as well as future lines of work.

  14. Spacelab - Ten years of international cooperation

    Science.gov (United States)

    Bignier, M.; Harrington, J. C.; Sander, M. J.

    1983-01-01

    The history, current status, and future plans of the Spacelab program are reviewed, with a focus on the cooperative relationship between ESA and NASA. The initial decision to undertake the program and the three agreements signed to begin its implementation are examined, and the division of responsibilities and financial contributions is discussed insofar as it affected the management structure. Consideration is given to the major facilities, the 50-mission operational cycle, communications, the currently scheduled activities (through 1985), the prospective later uses, and the ten dedicated discipline laboratories. The importance of continuous mutual support during the planning and development phases is stressed. The program so far is considered a success, in terms of the goals set by the participants and in terms of the resolution of the problems inherent in international technological endeavors.

  15. Ten proofs of the generalized second law

    International Nuclear Information System (INIS)

    Ten attempts to prove the Generalized Second Law of Thermodynamics (GSL) are described and critiqued. Each proof provides valuable insights which should be useful for constructing future, more complete proofs. Rather than merely summarizing previous research, this review offers new perspectives and strategies for overcoming limitations of the existing proofs. A long introductory section addresses some choices that must be made in any formulation of the GSL: Should one use the Gibbs or the Boltzmann entropy? Should one use the global or the apparent horizon? Is it necessary to assume any entropy bounds? If the area has quantum fluctuations, should the GSL apply to the average area? The definition and implications of the classical, hydrodynamic, semiclassical and full quantum gravity regimes are also discussed. A lack of agreement regarding how to define the 'quasi-stationary' regime is addressed by distinguishing it from the 'quasi-steady' regime.

  16. BDGS: A Scalable Big Data Generator Suite in Big Data Benchmarking

    OpenAIRE

    Ming, Zijian; Luo, Chunjie; Gao, Wanling; Han, Rui; Yang, Qiang; Wang, Lei; Zhan, Jianfeng

    2014-01-01

    Data generation is a key issue in big data benchmarking that aims to generate application-specific data sets to meet the 4V requirements of big data. Specifically, big data generators need to generate scalable data (Volume) of different types (Variety) under controllable generation rates (Velocity) while keeping the important characteristics of raw data (Veracity). This gives rise to various new challenges about how we design generators efficiently and successfully. To date, most existing tec...

  17. HOW BIG ARE ’BIG FOUR’ COMPANIES – EVIDENCE FROM ROMANIA

    OpenAIRE

    SORIN ROMULUS BERINDE

    2013-01-01

    The audit market is divided between two main categories of auditors: Big Four auditors and non-Big Four auditors. The generally accepted opinion is that the former cover most audit services. The objective of the study is to quantify the share covered by Big Four auditors at the level of the Romanian market. To this end, data obtained from the audited companies in the North-West Region of Romania were collected and processed; this region is considered representative for extrapolating the results at nat...

  18. BigDataBench: a Big Data Benchmark Suite from Web Search Engines

    OpenAIRE

    Gao, Wanling; Zhu, Yuqing; Jia, Zhen; Luo, Chunjie; Wang, Lei; Li, Zhiguo; Zhan, Jianfeng; Qi, Yong; He, Yongqiang; Gong, Shiming; Li, Xiaona; Zhang, Shujie; Qiu, Bizhu

    2013-01-01

    This paper presents our joint research efforts on big data benchmarking with several industrial partners. Considering the complexity, diversity, workload churns, and rapid evolution of big data systems, we take an incremental approach in big data benchmarking. For the first step, we pay attention to search engines, which are the most important domain in Internet services in terms of the number of page views and daily visitors. However, search engine service providers treat data, applications,...

  19. "Big Data" : big gaps of knowledge in the field of internet science

    OpenAIRE

    Snijders, CCP Chris; Matzat, U Uwe; Reips, UD

    2012-01-01

    Research on so-called 'Big Data' has received a considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as 'small world' properties. Much less is known about underlying micro-processes leading to these properties. The models used by Big Data researchers usually are inspired by mathematical ease of exposition. We propose to follow in...

  20. 6 Top Tools for Taming Big Data%6Top Tools for Taming Big Data

    Institute of Scientific and Technical Information of China (English)

    Jakob Björklund

    2012-01-01

    The industry now has a buzzword, "big data," for how we're going to do something with the huge amount of information piling up. "Big data" is replacing "business intelligence," which subsumed "reporting," which put a nicer gloss on "spreadsheets," which beat out the old-fashioned "printouts." Managers who long ago studied printouts are now hiring mathematicians who claim to be big data specialists to help them solve the same old problem: What's selling and why?

  1. Description of the Triton reactor

    International Nuclear Information System (INIS)

    The Triton reactor is an enriched uranium pool type reactor. It began operation in 1959, reaching first criticality on 30 June of that year. Devoted to radiation protection studies, its core can be displaced in the longitudinal direction. The pool can be separated into two unequal compartments by a wall. The Triton core is placed in the small compartment, the Nereide core in the big compartment. A third compartment without water, called Naiade II, is separated by a concrete wall in which a window closed by an aluminium plate (2.50 m x 2.70 m) is made. The Naiade II hole is used for protection experiments with the Nereide core. After a complete refit, the power of the Triton reactor, which had been raised progressively from 1.2 MW to 2 MW and then to 3 MW, reached 6.5 MW in August 1965. The reactor was then specialized in irradiations in a fixed position: the Triton core became fixed, while the Nereide core was hung so as to remain mobile. Since then it has been used for irradiation of structural materials, for radioelement production and for fundamental research. The following descriptions are valid for the period after August 1965

  2. 'Big bang' of quantum universe

    International Nuclear Information System (INIS)

    The reparametrization-invariant generating functional for the unitary and causal perturbation theory in general relativity in a finite space-time is obtained. The classical cosmology of a Universe and the Faddeev-Popov-DeWitt functional correspond to different orders of decomposition of this functional over the inverse 'mass' of a Universe. It is shown that the invariant content of general relativity as a constrained system can be covered by two 'equivalent' unconstrained systems: the 'dynamic' (with 'dynamic' time as the cosmic scale factor and conformal field variables) and 'geometric' (given by the Levi-Civita type canonical transformation to the action-angle variables which determine initial cosmological states with the arrow of the proper time measured by the watch of an observer in the comoving frame). 'Big Bang', the Hubble evolution, and creation of 'dynamic' particles by the 'geometric' vacuum are determined by 'relations' between the dynamic and geometric systems as pure relativistic phenomena, like the Lorentz-type 'relation' between the rest and comoving frames in special relativity

  3. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  4. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  5. Big-bang nucleosynthesis revisited

    Science.gov (United States)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Y(sub p) is less than or equal to 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio (eta in units of 10 exp -10) 2.6 less than or equal to eta(sub 10) less than or equal to 4.3; which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Y(sub p) of 0.24 constrains the number of light neutrinos to N(sub nu) less than or equal to 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 less than or equal to Y(sub p) less than or equal to 0.245.
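The baryon-density window quoted above follows from the standard conversion between the baryon-to-photon ratio and the baryon density parameter; a hedged sketch of the arithmetic (the conversion factor and the Hubble-parameter range are standard textbook values, not taken from the abstract):

```latex
\eta_{10} \equiv 10^{10}\,\frac{n_b}{n_\gamma}, \qquad
\Omega_b h^2 \simeq \frac{\eta_{10}}{274} \approx 3.65\times10^{-3}\,\eta_{10}
```

With $2.6 \le \eta_{10} \le 4.3$ this gives $0.0095 \lesssim \Omega_b h^2 \lesssim 0.016$; allowing the Hubble parameter $h$ to range over roughly $0.4$ to $0.7$ then spans $\Omega_b \approx 0.02$ to $0.1$, consistent with the quoted contribution of 0.02 to 0.11 of the critical density.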

  6. Astronomical Surveys and Big Data

    CERN Document Server

    Mickaelian, A M

    2015-01-01

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of electromagnetic spectrum are reviewed, from Gamma-ray to radio, such as Fermi-GLAST and INTEGRAL in Gamma-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and II based catalogues (APM, MAPS, USNO, GSC) in optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in radio and many others, as well as most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS) and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era. Astrophysical Virtual Observatories and Computational Astrophysics play a...

  7. Deuterium and big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Measurements of deuterium absorption in high redshift quasar absorption systems provide a direct inference of the deuterium abundance produced by big bang nucleosynthesis (BBN). With measurements and limits from five independent absorption systems, we place strong constraints on the primordial ratio of deuterium to hydrogen, (D/H)p = 3.4 ± 0.3 x 10-5 [1,2]. We employ a direct numerical treatment to improve the estimates of critical reaction rates and reduce the uncertainties in BBN predictions of D/H and 7Li/H by a factor of three[3] over previous efforts[4]. Using our measurements of (D/H)p and new BBN predictions, we find at 95% confidence the baryon density ρb = (3.6 ± 0.4) x 10-31 g cm-3 (Ωbh265 = 0.045 ± 0.006 in units of the critical density), and cosmological baryon-photon ratio η = (5.1 ± 0.6) x 10-10
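The quoted conversion from the baryon mass density to the density parameter can be checked against the critical-density formula ρ_crit = 3H²/(8πG); a quick sketch using standard constants (the constants and unit conversions are assumptions of this check, not values from the paper):

```python
import math

G = 6.674e-8            # gravitational constant, cm^3 g^-1 s^-2
Mpc_cm = 3.0857e24      # centimetres per megaparsec
H0 = 65e5 / Mpc_cm      # H0 = 65 km/s/Mpc expressed in s^-1

# critical density rho_crit = 3 H0^2 / (8 pi G), in g cm^-3
rho_crit = 3 * H0**2 / (8 * math.pi * G)

rho_b = 3.6e-31         # baryon density quoted in the abstract, g cm^-3
Omega_b = rho_b / rho_crit
print(round(Omega_b, 3))  # -> 0.045, matching the quoted Omega_b h65^2
```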

  8. Tick-Borne Diseases: The Big Two

    Science.gov (United States)

    ... Ticks and Diseases Tick-borne Diseases: The Big Two Past Issues / Spring - Summer 2010 Table of Contents ... muscle pain. The red-spotted rash usually happens 2 to 5 days after the fever begins. Antibiotics ...

  9. ARC Code TI: BigView

    Data.gov (United States)

    National Aeronautics and Space Administration — BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running linux. Additionally, it can work in a multi-screen environment...

  10. Heat Waves Pose Big Health Threats

    Science.gov (United States)

    ... https://medlineplus.gov/news/fullstory_159744.html Heat Waves Pose Big Health Threats Kids, elderly among those ... can be inherently dangerous, but the initial heat waves every summer can be particularly perilous to those ...

  11. Scaling big data with Hadoop and Solr

    CERN Document Server

    Karambelkar, Hrishikesh Vijay

    2015-01-01

    This book is aimed at developers, designers, and architects who would like to build big data enterprise search solutions for their customers or organizations. No prior knowledge of Apache Hadoop and Apache Solr/Lucene technologies is required.

  12. Cosmic relics from the big bang

    International Nuclear Information System (INIS)

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab

  13. Fisicos argentinos reproduciran el Big Bang

    CERN Multimedia

    De Ambrosio, Martin

    2008-01-01

    Two groups of Argentine physicists from the La Plata and Buenos Aires universities are working on a series of experiments that will recreate the conditions of the big explosion that was at the origin of the universe. (1 page)

  14. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, understood as the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data ranges over data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers are guided through processing-system integrity and the implementation of intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems manage data monitoring, data processing and decision making in a realistic and effective way. Considering a big size of data, variety of data and frequent chan...

  15. Big Fish and Prized Trees Gain Protection

    Institute of Scientific and Technical Information of China (English)

    Fred Pearce; 吴敏

    2004-01-01

    Decisions made at a key conservation① meeting are good news for big and quirky② fish and commercially prized trees. Several species will enjoy extra protection against trade following rulings made at the Convention on International Trade in Endangered Species (CITES).

  16. Hunting Plan : Big Stone National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Big Stone National Wildlife Refuge Hunting Plan provides guidance for the management of hunting on the refuge. Hunting program objectives include providing a...

  17. Conjecture on Avoidance of Big Crunch

    Institute of Scientific and Technical Information of China (English)

    SUN Cheng-Yi; ZHANG De-Hai

    2006-01-01

    By conjecturing the physics at the Planck scale, we modify the definition of the Hawking temperature and modify the Friedmann equation. It is found that we can avoid the singularity of the big crunch and obtain a bouncing cosmological model.

  18. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Science.gov (United States)

    2011-02-11

    ... Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its Second Revised and Restated Open Access Transmission Tariff. Big Rivers also requests waiver of the...

  19. From data quality to big data quality

    OpenAIRE

    Batini, C; Rula, A; Scannapieco, M; Viscusi, G

    2015-01-01

    This article investigates the evolution of data quality issues from traditional structured data managed in relational databases to Big Data. In particular, the paper examines the nature of the relationship between Data Quality and several research coordinates that are relevant in Big Data, such as the variety of data types, data sources and application domains, focusing on maps, semi-structured texts, linked open data, sensor & sensor networks and official statistics. Consequently a set of str...

  20. Adapting bioinformatics curricula for big data

    OpenAIRE

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S; Jason H Moore

    2015-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these...

  1. Mining Big Data to Predicting Future

    OpenAIRE

    Tyagi, Amit K.; Priya, R.

    2015-01-01

    Due to technological advances, vast data sets (e.g. big data) are increasing nowadays. 'Big Data', a new term, is used to identify these collected datasets. Due to their large size and complexity, we cannot manage them with our current methodologies or data mining software tools. Such datasets provide us with unparalleled opportunities for modelling and predicting the future, along with new challenges. So as an awareness of this and weaknesses as well as the possibilit...

  2. Scientific Big Data Analytics by HPC

    OpenAIRE

    Lippert, Thomas; Mallmann, Daniel; Riedel, Morris

    2016-01-01

    Storing, managing, sharing, curating and especially analysing huge amounts of data face an immense visibility and importance in industry and economy as well as in science and research. Industry and economy exploit "Big Data" for predictive analysis, to increase the efficiency of infrastructures, customer segmentation, and tailored services. In science, Big Data allows for addressing problems with complexities that were impossible to deal with so far. The amounts of data are growing exponentially i...

  3. Effective Dynamics of the Matrix Big Bang

    OpenAIRE

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-01-01

    We study the leading quantum effects in the recently introduced Matrix Big Bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the Big Bang. Surprisingly, the potential decays very rapidly at late times, where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form ...

  4. Dark energy, wormholes, and the Big Rip

    OpenAIRE

    Faraoni, Valerio; Israel, Werner

    2005-01-01

    The time evolution of a wormhole in a Friedmann universe approaching the Big Rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid - two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the Big Rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal.

  5. COBE looks back to the Big Bang

    Science.gov (United States)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  6. Leading Undergraduate Students to Big Data Generation

    OpenAIRE

    Yang, Jianjun; Shen, Ju

    2015-01-01

    People are facing a flood of data today. Data are being collected at unprecedented scale in many areas, such as networking, image processing, virtualization, scientific computation, and algorithms. Such huge data are nowadays called Big Data. Big data is an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process them using traditional data processing applications. In this article, the authors present a unique way which uses network simula...

  7. Cincinnati Big Area Additive Manufacturing (BAAM)

    Energy Technology Data Exchange (ETDEWEB)

    Duty, Chad E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  8. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently the Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a...... example of a global process in which key lexical categories contest, trace and shape how global historical change is experienced and constituted through linguistic categories.

  9. Data Confidentiality Challenges in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize the performance of such a system.
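The paper's actual scheme is not reproduced in the record. As a generic illustration of the idea that useful aggregates can be computed without exposing individual records, here is a toy pairwise-masking "secure sum" sketch (the function name and mask range are invented for this example; real protocols derive the pairwise masks cryptographically):

```python
import random

def masked_submissions(values, seed=0):
    """Each party i adds a mask r[(i, j)] shared with every other party j:
    added when i < j, subtracted when i > j. The pairwise masks cancel in
    the total, so an aggregator summing the masked values learns only the
    overall sum, not any individual input."""
    rng = random.Random(seed)
    n = len(values)
    # r[(i, j)] is a secret agreed between parties i < j (e.g. via key exchange)
    r = {(i, j): rng.randint(-10**6, 10**6)
         for i in range(n) for j in range(i + 1, n)}
    masked = []
    for i, v in enumerate(values):
        m = v
        for j in range(n):
            if i < j:
                m += r[(i, j)]
            elif j < i:
                m -= r[(j, i)]
        masked.append(m)
    return masked

values = [12, 7, 30]
masked = masked_submissions(values)
print(sum(masked))  # -> 49, equal to sum(values), while each entry stays hidden
```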

  10. NEUTRONIC REACTOR

    Science.gov (United States)

    Anderson, H.L.

    1960-09-20

    A nuclear reactor is described comprising fissionable material dispersed in graphite blocks, helium filling the voids of the blocks and the spaces therebetween, and means other than the helium in thermal conductive contact with the graphite for removing heat.

  11. NUCLEAR REACTOR

    Science.gov (United States)

    Miller, H.I.; Smith, R.C.

    1958-01-21

    This patent relates to nuclear reactors of the type which use a liquid fuel, such as a solution of uranyl sulfate in ordinary water, which acts as the moderator. The reactor comprises a spherical vessel about 12 inches in diameter, substantially surrounded by a reflector of beryllium oxide. Conventional control rods and safety rods are operated in slots in the reflector outside the vessel to control the operation of the reactor. An additional means for increasing the safety factor of the reactor, by raising the ratio of delayed neutrons to prompt neutrons, is provided; it consists of a soluble sulfate salt of beryllium dissolved in the liquid fuel in the proper proportion to obtain the desired result.

  12. Nuclear reactors

    International Nuclear Information System (INIS)

    This draft chart contains graphical symbols from which the type of (nuclear) reactor can be seen. They will serve as illustrations for graphical sketches. Important features of the individual reactor types are marked out graphically. The user can combine these symbols to characterize a specific reactor type. The basic graphical symbol is a square with a point in the centre. Functional groups can be depicted for closer specification. If two functional groups are not clearly separated, this is symbolized by a dotted line or a channel. Supply and discharge lines for coolant, moderator and fuel are specified in accordance with DIN 2481 and can be further specified by additional symbols if necessary. The examples in the paper show several different reactor types. (orig./AK)

  13. Multifunctional reactors

    OpenAIRE

    Westerterp, K.R.

    1992-01-01

    Multifunctional reactors are single pieces of equipment in which, besides the reaction, other functions are carried out simultaneously. The other functions can be a heat, mass or momentum transfer operation and even another reaction. Multifunctional reactors are not new, but they have received much emphasis in research in the last decade. A survey is given of modern developments and the first successful applications on a large scale. It is explained why their application in many instances is ...

  14. NUCLEAR REACTOR

    Science.gov (United States)

    Anderson, C.R.

    1962-07-24

    A fluidized bed nuclear reactor and a method of operating such a reactor are described. In the design, means are provided for flowing a liquid moderator upwardly through the center of a bed of pellets of a neutron-fissionable material at such a rate as to obtain particulate fluidization while constraining the lower portion of the bed into a conical shape. A smooth circulation of particles, rising in the center and falling at the outside of the bed, is thereby established. (AEC)

  15. Nuclear reactor

    International Nuclear Information System (INIS)

    In order to reduce neutron embrittlement of the pressure vessel of an LWR, blanked-off elements with the same dimensions as the fuel elements are fitted at the edge of the reactor core. They are parallel to each other and to the edge of the reactor and, taking the place of fuel rods, are plates of neutron-absorbing material (stainless steel, boron steel, borated Al). (HP)

  16. Breeder reactors

    International Nuclear Information System (INIS)

    The reasons for the development of fast reactors are briefly reviewed (a propitious neutron balance oriented towards maximum uranium burnup), and their special requirements (cooling, fissile material density and reprocessing) are discussed. The three stages in the French program of fast reactor development are outlined: Rapsodie at Cadarache, Phenix at Marcoule, and Super Phenix at Creys-Malville. The more specific features of the program of research and development are emphasized: kinetics and the core, the fuel and the components

  17. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  18. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  19. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  20. Analyzing Big Data with the Hybrid Interval Regression Methods

    OpenAIRE

    Chia-Hui Huang; Keng-Chieh Yang; Han-Ying Kao

    2014-01-01

    Big data is a new trend at present, forcing significant impacts on information technologies. In big data applications, one of the most important concerns is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. Recently, the smooth support vector machine (SSVM...
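The record does not detail the authors' SSVM formulation, but the core idea of interval regression, fitting a model to interval-valued targets rather than point values, can be sketched with plain least squares on interval midpoints and radii (a simplified stand-in for the SSVM machinery; the function names and toy data are invented for illustration):

```python
import numpy as np

def fit_interval_regression(X, y_lo, y_hi):
    """Fit two linear models: one to the interval midpoints, one to the
    (nonnegative) interval radii. Returns the two weight vectors."""
    X1 = np.column_stack([np.ones(len(X)), X])   # prepend an intercept column
    mid = (y_lo + y_hi) / 2.0
    rad = (y_hi - y_lo) / 2.0
    w_mid, *_ = np.linalg.lstsq(X1, mid, rcond=None)
    w_rad, *_ = np.linalg.lstsq(X1, rad, rcond=None)
    return w_mid, w_rad

def predict_interval(X, w_mid, w_rad):
    """Predict [lower, upper] bounds for new inputs."""
    X1 = np.column_stack([np.ones(len(X)), X])
    mid = X1 @ w_mid
    rad = np.maximum(X1 @ w_rad, 0.0)            # radii cannot be negative
    return mid - rad, mid + rad

# toy intervals of width 1 centred on y = 2x + 1
X = np.array([0.0, 1.0, 2.0, 3.0])
y_lo = 2 * X + 1 - 0.5
y_hi = 2 * X + 1 + 0.5
w_mid, w_rad = fit_interval_regression(X, y_lo, y_hi)
lo, hi = predict_interval(np.array([4.0]), w_mid, w_rad)
print(lo, hi)  # -> approximately [8.5] [9.5]
```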

  1. Decommissioning of the Neuherberg Research Reactor (FRN)

    International Nuclear Information System (INIS)

    The Neuherberg Research Reactor is a TRIGA Mark III reactor with 1 MW steady-state power, pulsable up to 2000 MW. During more than ten years of operation, 12,000 MWh and 6000 reactor pulses were performed. In spite of its good technical condition and a record of safe operation without any failures, the decommissioning of the Neuherberg research reactor was decided by the GSF board of directors to save maintenance and personnel costs. Safe enclosure was chosen as the mode of decommissioning, which means that the fuel elements will be transferred back to the USA. All other radioactive reactor components will be enclosed in the reactor block. Procedures for licensing the decommissioning, dismantling procedures and timetables are presented

  2. Papers on reactor physics for operators and unit managers

    International Nuclear Information System (INIS)

    The monograph contains papers provided to the Dukovany nuclear power plant personnel with the aim of improving the professional knowledge of reactor operators and unit managers and helping them in their preparation for state examinations. It presents an easy-to-understand explanation of phenomena that unit control room personnel actually encounter. The following topics are covered: radioactivity, nuclear reactions, nuclear fission and the fate of neutrons in the reactor; delayed neutrons; reactor period, reactivity; subcriticality and transition to criticality; heat generation in the reactor; reactivity coefficients; reactivity effects during the fuel cycle; reactivity compensation during power changes; reactor response to reactivity changes; xenon poisoning, samarium poisoning; residual power; unit start-up after refuelling; unit power rise to the minimal controllable level following emergency shutdown; shutdown concentrations; reactor control and safety system; scram rod drop; neutron sensors in the reactor; monitoring system inside the reactor; 3rd unit computer; ''operator's ten commandments''. (P.A.). 36 figs., 2 tabs., 6 refs
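Two of the listed topics, reactivity and reactor period, connect through well-known point-kinetics relations; a small sketch using the textbook one-delayed-group approximation (the values of the delayed-neutron fraction β and effective decay constant λ below are illustrative typical values, not numbers from the monograph):

```python
def reactivity(k_eff):
    """Reactivity in absolute units: rho = (k_eff - 1) / k_eff."""
    return (k_eff - 1.0) / k_eff

def stable_period(rho, beta=0.0065, lam=0.08):
    """Stable reactor period (s) in the one-delayed-group approximation,
    T = (beta - rho) / (lam * rho), valid for small positive rho < beta.
    beta and lam are assumed typical values for a U-235 fuelled reactor."""
    assert 0 < rho < beta, "approximation holds only below prompt criticality"
    return (beta - rho) / (lam * rho)

rho = reactivity(1.001)          # a ~0.1 % reactivity insertion
T = stable_period(rho)
print(round(rho, 5), round(T, 1))  # -> 0.001 68.8 (a period of about a minute)
```

This illustrates why delayed neutrons matter for control: without them the same insertion would give a period of a fraction of a second rather than tens of seconds.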

  3. Ten Years of ENA Imaging from Cassini

    Science.gov (United States)

    Brandt, Pontus; Mitchell, Donald; Westlake, Joseph; Carbary, James; Paranicas, Christopher; Mauk, Barry; Krimigis, Stamatios

    2014-05-01

    In this presentation we will provide a detailed review of the science highlights of the ENA observations obtained by the Ion Neutral Camera (INCA) on board Cassini. Since the launch of Cassini, INCA has unveiled an invisible world of hot plasma and neutral gas around the two biggest objects of our solar system: the giant magnetospheres of Jupiter and Saturn. More than ten years ago, INCA captured the first ENA images of the Jovian system, revealing magnetospheric dynamics and an asymmetric Europa neutral gas torus. Approaching Saturn, INCA observed variability of Saturn's magnetospheric activity in response to changes in solar wind dynamic pressure, which was contrary to expectations and current theories. In orbit around Saturn, INCA continued the surprises, including the first imaging and global characterization of Titan's exosphere, extended out to its gravitational Hill sphere; recurring injections correlating with periodic Saturn Kilometric Radiation (SKR) bursts and magnetic field perturbations; and the discovery of energetic ionospheric outflow. Perhaps most significant, and the focal point of this presentation, is INCA's contribution to the understanding of global magnetospheric particle acceleration and transport, where the combination of ENA imaging and in-situ measurements has demonstrated that transport and acceleration of plasma are likely to occur in a two-step process. First, large-scale injections in the post-midnight sector accelerate and transport plasma inward to about 12 RS, up to energies of several hundred keV. Second, centrifugal interchange acts on the plasma inside this region and provides further heating and transport inward to about 6 RS. We discuss this finding in the context of the two fundamental types of injections (or ENA intensifications) that INCA has revealed during its ten years of imaging.
The first type is large-scale injections appearing beyond 12 RS in the post-midnight sector that have in many cases had an inward component

  4. Boosting Big National Lab Data

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data, and to do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research, for example, the validity of new drugs, the root causes of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, or the optimal ingredient composition for chocolate, or to determine how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influence the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in the data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  5. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts were underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources.
The Partnership recognizes the critical importance of measurement, monitoring, and verification

  6. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts were underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  7. Small government or big government?

    Directory of Open Access Journals (Sweden)

    MATEO SPAHO

    2015-03-01

    Full Text Available Since the beginning of the twentieth century, economists and philosophers have been polarized over the role that the government should have in the economy. On one hand, John Maynard Keynes represented, within the framework of a market economy, a position where the state should intervene in the economy to maintain aggregate demand and employment in the country, without hesitating to create budget deficits and expand public debt. This is especially the case when the domestic economy and global economic trends show weak growth or a recession. It means heavy interference in the economy, with higher income but also high expenditure relative to GDP. On the other side, liberals and neoliberals led by Friedrich Hayek advocated a withdrawal of the government from economic activity, not just in moments of economic growth but also during crises, believing that the market has self-regulating mechanisms within itself. The government, as a result, will have a smaller dimension, with lower income and also low expenditures compared to the GDP of the country. We took the South-Eastern Europe countries, distinguishing those with a "Big Government" from those with a "Small Government", and analyzed their economic performance during the global crisis (2007-2014). In which countries did the public debt grow less? Which country managed to attract more investment, and which were the countries that preserved the purchasing power of their consumers? We shall see whether, during the economic crisis in Eastern Europe, the "Big Government" model or the liberal "Small Government" one has been the more successful.

  8. Big bang nucleosynthesis: Present status

    Science.gov (United States)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations are presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and yields a 2σ upper limit on Nν from these data. In contrast with D/H and 4He, 7Li predictions continue to disagree with observations, perhaps pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.
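The likelihood analysis described above convolves CMB constraints on the baryon density with BBN predictions and light-element observations. A toy sketch of that combination, with a placeholder power-law D/H prediction and invented illustrative numbers (none of the coefficients or uncertainties below are taken from the paper):

```python
# Hypothetical illustration: BBN predicts the primordial D/H ratio as a
# function of the baryon-to-photon ratio eta10 = 10^10 * n_b/n_gamma.
# The power-law form and its coefficients are placeholders, not paper values.
def dh_prediction(eta10, a=2.6e-5, b=-1.6):
    return a * (eta10 / 6.0) ** b

def gaussian_loglike(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2

# Combine a CMB-style Gaussian prior on eta10 with a D/H observation into one
# joint log-likelihood, scanned over eta10 -- a toy version of the convolution
# of Planck chains with BBN output described in the abstract.
def combined_loglike(eta10, dh_obs=2.53e-5, dh_err=0.04e-5,
                     eta_cmb=6.1, eta_cmb_err=0.04):
    return (gaussian_loglike(eta10, eta_cmb, eta_cmb_err)
            + gaussian_loglike(dh_prediction(eta10), dh_obs, dh_err))

# Grid scan for the maximum-likelihood eta10.
best = max((eta / 100 for eta in range(550, 680)), key=combined_loglike)
```

Because the assumed D/H observation agrees with the assumed CMB prior, the joint fit lands near the prior center; tension between the two terms would pull `best` away from it.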

  9. Storytelling Through the Temporal Bands: Collapsing Time With the Power of Ten

    Science.gov (United States)

    McCaffrey, M. S.

    2004-12-01

    Framing the history of the universe with a logarithmic axis in time provides an opportunity to break the temporal continuum into specific segments within the continuum of time. In recent years, the log-ten approach to temporal scaling has been used as a scientific and educational scaffolding for a variety of cosmic and Earth system processes and events, such as in J.M. Mitchell's 1976 "An Overview of Climatic Variability and Its Causal Mechanisms" (Quaternary Research 6, 481-493) and the "temporal bands" presented in the 1986 Bretherton Report and the follow-up "Earth System Science: A Closer View" (1988, NASA). Other efforts, such as the NOAA Climate TimeLine Information Tool (http://www.ngdc.noaa.gov/paleo/ctl) have begun to further flesh out the "powers of ten" framework, which allows time to be effectively collapsed in order to focus on particular aspects of the evolution and existence of the universe. We will present an overview of past efforts to capture the breadth of the universe using log-ten temporal scaling. In addition, particular "stories" from each time scale will be proposed: from the first seconds of the Big Bang (10^10 yrs) to the development of light, galaxies and solar systems and planet Earth (10^9 yrs), from the tectonic processes, evolution of biologic life, and mass extinctions that have occurred at the scales of millions of years, to the orbital processes that serve as the primary trigger of Ice Ages over hundreds of thousands of years, then focusing on the emergence of Homo sapiens from Africa in the past 100,000 years, the development of agriculture and civilizations in the past 10,000 years or so during the Holocene, and then concentrating on shorter time-scales and the events and processes they span.
Whether beginning at the beginning (The Big Bang) or beginning at the sub-annual scale in which our everyday lives are lived, the "powers of ten" provide a scientific framework that holds strong potential for communicating the history and nature
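The "temporal bands" idea above amounts to assigning each event to a power-of-ten bin by the log10 of its age. A minimal sketch, with illustrative round-number ages (not values from the abstract):

```python
import math

# Illustrative event ages in years before present; round numbers for the sketch.
events = {
    "Big Bang": 1.38e10,
    "Formation of Earth": 4.5e9,
    "Dinosaur extinction": 6.6e7,
    "Homo sapiens leaves Africa": 1.0e5,
    "Origin of agriculture": 1.0e4,
}

def temporal_band(age_years):
    """Return the power-of-ten band an age falls into, e.g. '10^9-10^10 yrs'."""
    exp = int(math.floor(math.log10(age_years)))
    return f"10^{exp}-10^{exp + 1} yrs"

bands = {name: temporal_band(age) for name, age in events.items()}
```

Plotted on a logarithmic time axis, each band occupies equal visual space, which is what lets a single framework span the Big Bang and the sub-annual scale of everyday life.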

  10. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of a GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}.
Overall every sedimentary formation investigated
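The province and play counts quoted above can be tallied directly; the sketch below simply checks the arithmetic stated in the abstract (all figures are from the USGS 1995 counts the abstract cites):

```python
# Consistency check of the counts quoted in the abstract: 10 geologic provinces
# and 111 plays in the region, of which 1 province and 4 plays lie in Idaho.
provinces_total, plays_total = 10, 111
provinces_idaho, plays_idaho = 1, 4

provinces_mt_wy = provinces_total - provinces_idaho  # remaining provinces in MT/WY
plays_mt_wy = plays_total - plays_idaho              # remaining plays in MT/WY
```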

  11. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  12. The first ten years of Swift supernovae

    Science.gov (United States)

    Brown, Peter J.; Roming, Peter W. A.; Milne, Peter A.

    2015-09-01

    The Swift Gamma Ray Burst Explorer has proven to be an incredible platform for studying the multiwavelength properties of supernova explosions. In its first ten years, Swift has observed over three hundred supernovae. The ultraviolet observations reveal a complex diversity of behavior across supernova types and classes. Even amongst the standard-candle type Ia supernovae, ultraviolet observations reveal distinct groups. When the UVOT data are combined with higher-redshift optical data, the relative populations of these groups appear to change with redshift. Among core-collapse supernovae, Swift discovered the shock breakout of two supernovae, and the Swift data show a diversity in the cooling phase of the shock breakout of supernovae discovered from the ground and promptly followed up with Swift. Swift observations have resulted in a rich dataset of UV and X-ray data for comparison with high-redshift supernova observations and theoretical models. Swift's supernova program has the potential to dramatically improve our understanding of stellar life and death, as well as the history of our universe.

  13. Ten steps to a cooler planet

    International Nuclear Information System (INIS)

    The overwhelming scientific consensus is that greenhouse gases produced by human activities are accumulating in the atmosphere and changing the intricate balance that has sustained life on earth for millions of years. The estimate is that in Canada about 35 per cent of the greenhouse gases released to the atmosphere come from personal sources. This booklet provides a brief account of the ways that individuals contribute to greenhouse gas emissions and provides some helpful hints on how to reduce this personal contribution to the climate change problem. The ten ways suggested are: (1) use public transport as much as possible, (2) buy smaller cars, (3) use the train instead of flying and make fewer long-distance trips by combining business and vacation travel, (4) set the thermostat one degree lower and save 525 kg of carbon dioxide per year, (5) avoid overhousing by choosing a smaller house and by making every square foot count, (6) buy energy-efficient appliances, (7) reduce the water heater's energy consumption by insulating it, (8) reduce meat intake and buy locally produced food in season, (9) minimize garbage production by reusing and recycling, (10) use green power wherever available and contribute to the development of renewable energy sources.
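Savings from individual steps like these add up linearly, so a personal tally is simple arithmetic. A sketch: only the thermostat figure (525 kg/yr) comes from the booklet; the other numbers are assumed placeholders for illustration.

```python
# Back-of-the-envelope tally of annual CO2 savings from a few of the ten steps.
savings_kg_per_year = {
    "thermostat one degree lower": 525,          # figure quoted in the booklet
    "switch commute to public transport": 1000,  # assumed, illustrative only
    "energy-efficient appliances": 300,          # assumed, illustrative only
}

total = sum(savings_kg_per_year.values())  # total personal reduction, kg CO2/yr
```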

  14. Comparative Genomics of Ten Solanaceous Plastomes

    Directory of Open Access Journals (Sweden)

    Harpreet Kaur

    2014-01-01

    Full Text Available Availability of the complete plastid genomes of ten solanaceous species, Atropa belladonna, Capsicum annuum, Datura stramonium, Nicotiana sylvestris, Nicotiana tabacum, Nicotiana tomentosiformis, Nicotiana undulata, Solanum bulbocastanum, Solanum lycopersicum, and Solanum tuberosum, provided us with an opportunity to conduct an in-depth in silico comparative analysis. The size of the complete chloroplast genomes and of the LSC and SSC regions of the three species of Solanum is comparatively smaller than that of any other species studied to date (exception: the SSC region of A. belladonna). The AT content of coding regions was found to be less than that of noncoding regions. A duplicate copy of the trnH gene in C. annuum and two alternative tRNA genes for proline in D. stramonium were observed for the first time in this analysis. Further, homology search revealed the presence of the rps19 pseudogene and infA genes in A. belladonna and D. stramonium, a region identical to the rps19 pseudogene in C. annuum, and orthologues of the sprA gene in six other species. Among the eighteen intron-containing genes, 3 genes have two introns and 15 genes have one intron. The longest insertion was found in the accD gene in C. annuum. Phylogenetic analysis using concatenated protein-coding sequences gave two clades, one for Nicotiana species and another for Solanum, Capsicum, Atropa, and Datura.
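One comparison reported above, lower AT content in coding than in noncoding regions, reduces to counting bases. A minimal sketch; the sequences below are invented toy examples, not actual solanaceous plastome data:

```python
# AT content of a coding region versus a noncoding region (toy sequences).
def at_content(seq):
    """Return the fraction of A and T bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("A") + seq.count("T")) / len(seq)

coding = "ATGGCTGCAGGTCCGATCGGATGA"     # toy gene-like sequence (starts ATG, ends TGA)
noncoding = "ATTTAATATAGCATTAAATTTATA"  # toy AT-rich spacer-like sequence

coding_at = at_content(coding)        # ~0.42
noncoding_at = at_content(noncoding)  # ~0.92
```

Spacer and intergenic regions in plastomes are typically AT-rich, which is the pattern the abstract reports at genome scale.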

  15. Ten years of Lusi: A review

    Science.gov (United States)

    Miller, Stephen A.

    2016-04-01

    The Lusi mud eruption has continued uninterrupted for ten years, settling into its current steady state as a quasi-periodic geyser system. Many past, current, and future studies aim to quantify this system, which increasing evidence suggests is a new-born, tectonic-scale hydrothermal system linked to the nearby volcano complex. The debate about whether the triggering of Lusi was a natural event or was instead caused by drilling continues, but evidence mounts from the behavior of this system that an anthropogenic cause is highly unlikely. Understanding this system is very important because of its social and economic impact on the surrounding communities, and because of the question of whether it poses future geohazards in the region from future eruptions. A large infrastructure effort and constant maintenance activity have been, and are being, conducted inside the 7 km2 mud-flooded area. This region is framed by a tall embankment that contains the erupted mud and protects the surrounding settlements. This system is also very important for understanding volcanic hydrothermal systems at a larger scale, and for determining whether this new geothermal resource might be exploited. A large effort is underway through an EU grant supporting the Lusi-Lab project (CEED, University of Oslo) and an SNF grant supporting the University of Neuchatel to study this system from geochemical, geophysical, and modeling perspectives. This review talk summarizes what is known and what is still unclear, and revisits the behavior of Lusi since its inception.

  16. Choledochal cysts: our ten year experience.

    LENUS (Irish Health Repository)

    Cianci, F

    2012-04-01

    We present our experience in the management of choledochal cysts from 1999 to 2009. We performed a retrospective review of all charts with a diagnosis of choledochal cyst in our institution over this ten-year period; data were collated using Excel. A total of 17 patients were diagnosed with choledochal cyst: 9 females and 8 males. The average age at diagnosis was 28 months (range 0 to 9 years). The most common presenting symptoms were obstructive jaundice, in 6 patients (35%), and abdominal pain and vomiting, in 4 (23%). Ultrasound (US) was the initial diagnostic test in all cases, with 4 patients requiring further investigations. All patients underwent Roux-en-Y hepaticojejunostomy. The average length of stay was 11 days. Patients were followed up with liver function tests (LFTs) and US 4-6 weeks post-operatively. Three patients developed complications, including a post-operative collection, high drain output requiring blood transfusion, and adhesive bowel obstruction. Our overall experience with choledochal cyst patients has been a positive one, with effective management and low complication rates.

  17. Ten years for the public Web

    CERN Multimedia

    2003-01-01

    Ten years ago, CERN issued a statement declaring that a little-known piece of software called the World Wide Web was in the public domain. Nowadays, the Web is an indispensable part of modern communications. The idea for the Web goes back to March 1989, when CERN computer scientist Tim Berners-Lee wrote a proposal for a 'Distributed Information Management System' for the high-energy physics community. The Web was originally conceived and developed to meet the demand for information sharing between scientists working all over the world. There were many obstacles in the 1980s to the effective exchange of information. There was, for example, a great variety of computer and network systems, with hardly any common features. The main purpose of the Web was to allow scientists to access information from any source in a consistent and simple way. By Christmas 1990, Berners-Lee's idea had become the World Wide Web, with its first server and browser running at CERN. Through 1991, the Web spread to other particle physics ...

  18. Lecture 10: The European Bioinformatics Institute - "Big data" for biomedical sciences

    CERN Document Server

    CERN. Geneva; Dana, Jose

    2013-01-01

    Part 1: Big data for biomedical sciences (Tom Hancocks) Ten years ago, the first international 'Big Biology' project, the sequencing of the human genome, was completed. In the years since, the biological sciences have seen a vast growth in data. In the coming years, advances will come from the integration of experimental approaches and their translation into applied technologies in the hospital, the clinic and even the home. This talk will examine the development of infrastructure, physical and virtual, that will allow millions of life scientists across Europe better access to biological data. Tom studied Human Genetics at the University of Leeds and McMaster University, before completing an MSc in Analytical Genomics at the University of Birmingham. He has worked for the UK National Health Service in diagnostic genetics and in training healthcare scientists and clinicians in bioinformatics. Tom joined the EBI in 2012 and is responsible for the scientific development and delivery of training for the BioMedBridges pr...

  19. Don’t miss the Passport to the Big Bang event this Sunday!

    CERN Multimedia

    CERN Bulletin

    2013-01-01

    Word has been going around for weeks now about the inauguration of the Passport to the Big Bang on Sunday 2 June. Ideal for a family day out or a day with friends, this is a CERN event not to be missed!   The Passport to the Big Bang is a 54-km scientific tourist trail comprising ten exhibition platforms in front of ten CERN sites in the Pays de Gex and the Canton of Geneva. Linked by cycle routes, these ten platforms will mark the same number of stages in the rally for competitive cyclists and the bicycle tour for families taking place this Sunday from 9 a.m. to 12 p.m. But that’s not all: from 2 p.m., you will also have the chance to take part in a huge range of activities provided by clubs and associations from CERN and the local region. Watch an oriental dance show, have a go at building detectors out of Kapla blocks and Lego, meet different reptile species, learn about wind instruments, try your hand at Nordic walking or Zumba fitness, get a better understanding of road safety...

  20. Research reactors - an overview

    Energy Technology Data Exchange (ETDEWEB)

    West, C.D.

    1997-03-01

    A broad overview of different types of research and test reactors is provided in this paper. Reactor designs and operating conditions are briefly described for four reactors. The reactor types described include swimming pool reactors, the High Flux Isotope Reactor, the Mark I TRIGA reactor, and the Advanced Neutron Source reactor. Emphasis in the descriptions is placed on safety-related features of the reactors. 7 refs., 7 figs., 2 tabs.

  1. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-01

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  2. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  3. Big Data, Big Changes

    Institute of Scientific and Technical Information of China (English)

    梁爽

    2014-01-01

    Big data is happening around us all the time; the big data era has arrived. This paper describes the characteristics of big data and analyzes the current state of big data research at home and abroad, as well as future directions for its application. Only by understanding big data afresh, changing how we think about it, adapting business models to its changes, innovating big data management models, strengthening institutional construction, raising legal awareness, and safeguarding personal and national security can we continuously promote the healthy development of big data.

  4. Big Data, Big Changes

    Institute of Scientific and Technical Information of China (English)

    梁爽

    2014-01-01

    Big data is happening around us all the time; the big data era has arrived. This paper describes the characteristics of big data and analyzes the current state of big data research at home and abroad, as well as future directions for its application. Only by understanding big data afresh, changing how we think about it, adapting business models to its changes, innovating big data management models, strengthening institutional construction, raising legal awareness, and safeguarding personal and national security can we continuously promote the healthy development of big data.

  5. "Take ten minutes": a dedicated ten minute medication review reduces polypharmacy in the elderly.

    Science.gov (United States)

    Walsh, E K; Cussen, K

    2010-09-01

    Multiple and inappropriate medications are often the cause of poor health status in the elderly. Medication reviews can improve prescribing. This study aimed to determine if a ten minute medication review by a general practitioner could reduce polypharmacy and inappropriate prescribing in elderly patients. A prospective, randomised study was conducted. Patients over the age of 65 (n = 50) underwent a 10-minute medication review. Inappropriate medications, dosage errors, and discrepancies between prescribed versus actual medication being consumed were recorded. A questionnaire to assess satisfaction was completed following review. The mean number of medications taken by patients was reduced (p < 0.001). A medication was stopped in 35 (70%) patients. Inappropriate medications were detected in 27 (54%) patients and reduced (p < 0.001). Dose errors were detected in 16 (32%). A high level of patient satisfaction was reported. A ten minute medication review reduces polypharmacy, improves prescribing and is associated with high levels of patient satisfaction. PMID:21046863

  6. "Take ten minutes": a dedicated ten minute medication review reduces polypharmacy in the elderly.

    LENUS (Irish Health Repository)

    Walsh, E K

    2010-09-01

    Multiple and inappropriate medications are often the cause for poor health status in the elderly. Medication reviews can improve prescribing. This study aimed to determine if a ten minute medication review by a general practitioner could reduce polypharmacy and inappropriate prescribing in elderly patients. A prospective, randomised study was conducted. Patients over the age of 65 (n = 50) underwent a 10-minute medication review. Inappropriate medications, dosage errors, and discrepancies between prescribed versus actual medication being consumed were recorded. A questionnaire to assess satisfaction was completed following review. The mean number of medications taken by patients was reduced (p < 0.001). A medication was stopped in 35 (70%) patients. Inappropriate medications were detected in 27 (54%) patients and reduced (p < 0.001). Dose errors were detected in 16 (32%). A high level of patient satisfaction was reported. A ten minute medication review reduces polypharmacy, improves prescribing and is associated with high levels of patient satisfaction.

  7. "Take ten minutes": a dedicated ten minute medication review reduces polypharmacy in the elderly.

    LENUS (Irish Health Repository)

    Walsh, E K

    2012-02-01

    Multiple and inappropriate medications are often the cause for poor health status in the elderly. Medication reviews can improve prescribing. This study aimed to determine if a ten minute medication review by a general practitioner could reduce polypharmacy and inappropriate prescribing in elderly patients. A prospective, randomised study was conducted. Patients over the age of 65 (n = 50) underwent a 10-minute medication review. Inappropriate medications, dosage errors, and discrepancies between prescribed versus actual medication being consumed were recorded. A questionnaire to assess satisfaction was completed following review. The mean number of medications taken by patients was reduced (p < 0.001). A medication was stopped in 35 (70%) patients. Inappropriate medications were detected in 27 (54%) patients and reduced (p < 0.001). Dose errors were detected in 16 (32%). A high level of patient satisfaction was reported. A ten minute medication review reduces polypharmacy, improves prescribing and is associated with high levels of patient satisfaction.

  8. Enhancement of β-catenin activity by BIG1 plus BIG2 via Arf activation and cAMP signals.

    Science.gov (United States)

    Li, Chun-Chun; Le, Kang; Kato, Jiro; Moss, Joel; Vaughan, Martha

    2016-05-24

    Multifunctional β-catenin, with critical roles in both cell-cell adhesion and Wnt-signaling pathways, was among HeLa cell proteins coimmunoprecipitated by antibodies against brefeldin A-inhibited guanine nucleotide-exchange factors 1 and 2 (BIG1 or BIG2) that activate ADP-ribosylation factors (Arfs) by accelerating the replacement of bound GDP with GTP. BIG proteins also contain A-kinase anchoring protein (AKAP) sequences that can act as scaffolds for multimolecular assemblies that facilitate and limit cAMP signaling temporally and spatially. Direct interaction of BIG1 N-terminal sequence with β-catenin was confirmed using yeast two-hybrid assays and in vitro synthesized proteins. Depletion of BIG1 and/or BIG2 or overexpression of guanine nucleotide-exchange factor inactive mutant, but not wild-type, proteins interfered with β-catenin trafficking, leading to accumulation at perinuclear Golgi structures. Both phospholipase D activity and vesicular trafficking were required for effects of BIG1 and BIG2 on β-catenin activation. Levels of PKA-phosphorylated β-catenin S675 and β-catenin association with PKA, BIG1, and BIG2 were also diminished after BIG1/BIG2 depletion. Inferring a requirement for BIG1 and/or BIG2 AKAP sequence in PKA modification of β-catenin and its effect on transcription activation, we confirmed dependence of S675 phosphorylation and transcription coactivator function on BIG2 AKAP-C sequence. PMID:27162341

  9. Ten years after the Chernobyl Accident

    International Nuclear Information System (INIS)

    About 5 percent of the total amount of cesium released in the Chernobyl reactor accident was deposited in Sweden, with the middle part of the country receiving the highest fallout. During the first period after the accident, cows in these areas were not allowed to graze. Due to the time of the year there were very few problems with cultivated crops, even during the first summer. Game, reindeer, fresh-water fish, wild berries and mushrooms, however, were contaminated to a great extent, and even after 10 years high concentrations of 137Cs can still be found in these animals and in mushrooms, though to a lesser extent in wild berries. Intensive controls of the Cs content are still being carried out in reindeer at the time of slaughtering. During the last few years, hand instruments for estimating the Cs content of live animals (mostly reindeer) have become available, making it possible to slaughter only animals estimated to have Cs levels below the limit value. When offered for sale, the limit value for 137Cs is 300 Bq/kg for 'basic foodstuffs' and 1500 Bq/kg for meat from game, reindeer, fresh-water fish, nuts, wild berries and mushrooms. High levels of 137Cs will be found in reindeer and fresh-water fish from some areas for many years to come. 8 refs, 11 figs

  10. Reactor operations at SAFARI-1

    International Nuclear Information System (INIS)

    A vigorous commercial programme of isotope production and other radiation services has been followed by the SAFARI-1 research reactor over the past ten years - superimposed on the original purpose of the reactor to provide a basic tool for nuclear research, development and education to the country at an institutional level. A combination of the binding nature of the resulting contractual obligations and tighter regulatory control has demanded an equally vigorous programme of upgrading, replacement and renovation of many systems in order to improve the safety and reliability of the reactor. Not least among these changes is the more effective training and deployment of operations personnel that has been necessitated as the operational demands on the reactor evolved from five days per week to twenty-four hours per day, seven days per week, with more than 300 days per year at full power. This paper briefly sketches the operational history of SAFARI-1 and then focuses on the training and structuring currently in place to meet the operational needs. There is a detailed step-by-step look at the operator's career plan and pre-defined milestones. Shift work, and in particular the shift cycle, has a negative influence on the operator's career path development, largely because of his unavailability for training; methods used to minimise this influence are presented. The increase of responsibilities regarding the operation of the reactor, ancillaries and experimental facilities as the operator progresses through his career is discussed. (author)

  11. Ten years of energy policy in Catalonia

    International Nuclear Information System (INIS)

    Catalonia is located in the north-east corner of Spain, on the Mediterranean coast and bordering France in the north. It is one of the most industrialized and developed regions of Spain, with a per capita income of 10879 ECU per year (1989), 21.9% higher than the Spanish average, and it contributes 20.3% of the Spanish GDP. Primary energy consumption was 16.5 Mtoe in 1989, covered by the following sources: 3.5% coal, 51.8% oil, 9.4% natural gas, 3.6% hydro, 30.8% nuclear, 0.1% electrical import/export balance and 0.8% waste residuals. Oil dependence is distorted by the existence of a major petrochemical industry that uses more than 2.6 Mtoe of oil derivatives for non-energy purposes. Final energy demand in 1989 was 8.8 Mtoe: 40.9% for industry, 36.9% for transportation and 22.3% for the domestic and services sector. In the 1960s, Catalonia's socio-economic development was accompanied by a spectacular increase in the demand for primary energy: from slightly above 2.5 Mtoe in 1960 to 5.6 Mtoe ten years later. While that decade was characterized by a steady increase in total consumption, the 1970s were also years of major change in the field of energy on an international scale (the oil crisis) and in the politics of the Spanish State. 6 refs., 3 figs., 6 tabs

  12. Ten years of nuclear law development

    International Nuclear Information System (INIS)

    I took over the legal column in atw in early 1998. My second contribution was about the 8th amendment to the German Atomic Energy Act. My last but one article covered the 10th act amending the Atomic Energy Act focusing on the revision of the reliability audit and the regulations about competence for the Asse II mine. What are the changes in German atomic energy law over this ten-year period? What will be the future of atomic energy law in Germany? The term 'Atomic Energy Act' conceals the fact that the Atomic Energy Act of the 8th amendment does not have much in common any more with the Act of the 10th amendment. The dividing line appeared in the 9th amendment, which put into effect one of the key objectives of the red-green coalition government of the autumn of 1998: Terminating the peaceful use of nuclear power 'if possible by consensus' and without any indemnification of licensees. Although the Atomic Energy Act of April 22, 2002 formally kept its name, the original purpose of this piece of legislation was turned into the opposite by mentioning as the first objective the orderly termination of the use of nuclear power for commercial generation of electricity. On a European level, nuclear power has been re-evaluated in the meantime for various obvious reasons, and it is to be hoped that also Germany will find a way back to using nuclear power within the broad energy mix. With this contribution, which is my last one, I say goodbye to the readers of the legal column in atw. Thank you for your interest over all the years. (orig.)

  13. Reactor utilization

    International Nuclear Information System (INIS)

    In 1962, the RA reactor operated almost three times as much as in 1961, producing a total of 25 555 MWh. A diagram containing comparative data on reactor operation for 1960, 1961 and 1962, the percentage of fuel used and U-235 burnup shows the increase in reactor operation. The number of samples irradiated was 659, and the number of experiments performed was 16; the mean power level was 5.93 MW. Fuel was added to the core twice during the reporting year: the core was enlarged from 56 to 68 fuel channels and later to 84 fuel channels. Fuel was added to the core when the reactivity worth had decreased to the minimum operating level due to burnup. In addition, 5 central fuel channels were exchanged for fresh fuel in February for the purpose of irradiation in the VISA-2 channel

  14. Reactor Neutrinos

    CERN Document Server

    Lasserre, T; Lasserre, Thierry; Sobel, Henry W.

    2005-01-01

    We review the status and the results of reactor neutrino experiments, which are at the cutting edge of neutrino research. Short-baseline experiments have provided the measurement of the reactor neutrino spectrum, and are still searching for important phenomena such as the neutrino magnetic moment. They could open the door to the measurement of coherent neutrino scattering in the near future. Middle- and long-baseline oscillation experiments at Chooz and KamLAND have played a relevant role in neutrino oscillation physics in recent years. It is now widely accepted that a new middle-baseline disappearance reactor neutrino experiment with multiple detectors could provide a clean measurement of the last undetermined neutrino mixing angle theta13. We conclude by opening on possible uses of neutrinos for society: non-proliferation of nuclear materials and geophysics.

  15. Big Data over a 100 G network at Fermilab

    International Nuclear Information System (INIS)

    As the need for Big Data in science becomes ever more relevant, networks around the world are upgrading their infrastructure to support high-speed interconnections. To support its mission, the high-energy physics community, a pioneer in Big Data, has always relied on the Fermi National Accelerator Laboratory to be at the forefront of storage and data movement. This need was reiterated in recent years when the data-taking rate of the major LHC experiments reached tens of petabytes per year. At Fermilab, this regularly resulted in peaks of data movement on the Wide Area Network (WAN) in and out of the laboratory of about 30 Gbit/s, and on the Local Area Network (LAN) between storage and computational farms of 160 Gbit/s. To address these ever-increasing needs, as of this year Fermilab is connected to the Energy Sciences Network (ESnet) through a 100 Gb/s link. To understand the optimal system- and application-level configuration to interface computational systems with the new high-speed interconnect, Fermilab has deployed a Network Research and Development facility connected to the ESnet 100 G Testbed. For the past two years, the High Throughput Data Program (HTDP) has been using the Testbed to identify gaps in data movement middleware [5] when transferring data at these high speeds. The program has published evaluations of technologies typically used in High Energy Physics, such as GridFTP [4], XrootD [9], and Squid [8]. This work presents the new R and D facility and the continuation of the evaluation program.
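The throughput figures quoted above translate directly into transfer-time budgets. A back-of-the-envelope sketch of the arithmetic (idealized: sustained line rate with no protocol overhead, so real GridFTP or XrootD transfers will take longer):

```python
def transfer_time_hours(data_bytes: float, link_gbps: float) -> float:
    """Ideal transfer time in hours: volume (bytes) * 8 bits / line rate (bits/s)."""
    return data_bytes * 8 / (link_gbps * 1e9) / 3600

# Moving one petabyte over the 100 Gb/s ESnet link, ideal case:
print(round(transfer_time_hours(1e15, 100), 1))   # 22.2 (hours)

# The same petabyte at the earlier 30 Gbit/s WAN peak rate:
print(round(transfer_time_hours(1e15, 30), 1))    # 74.1 (hours)
```

At tens of petabytes per year, the gap between these two figures is what motivates both the 100 Gb/s upgrade and the middleware evaluations.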

  16. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing need among practitioners to understand the phenomenon better, starting from a clearly stated definition. This book aims to be a starting read for executives who want (and need) to keep pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data are explained, and a series of different strategic approaches is provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data are discussed: some of them are more general - such as ethics, privacy, and ownership - while others concern more specific business situations (e.g., initial public offering, growth st...

  17. Nuclear reactors

    International Nuclear Information System (INIS)

    A nuclear reactor has a large prompt negative temperature coefficient of reactivity. A reactor core assembly of a plurality of fluid-tight fuel elements is located within a water-filled tank. Each fuel element contains a solid homogeneous mixture of 50-79 w/o zirconium hydride, 20-50 w/o uranium and 0.5-1.5 w/o erbium. The uranium is not more than 20 percent enriched, and the ratio of hydrogen atoms to zirconium atoms is between 1.5:1 and 7:1. The core has a long lifetime, e.g., at least about 1200 days

  18. Nuclear reactors

    International Nuclear Information System (INIS)

    In a liquid-cooled nuclear reactor, the combination is described of a single-walled vessel containing liquid coolant in which the reactor core is submerged, and a containment structure, primarily of material for shielding against radioactivity, surrounding at least the liquid-containing part of the vessel with clearance therebetween and having that surface thereof which faces the vessel made compatible with the liquid, thereby providing a leak jacket for the vessel. The structure is preferably a metal-lined concrete vault, and cooling means are provided for protecting the concrete against reaching a temperature at which damage would occur. (U.S.)

  19. Big Data, Big machines, Big Science : vers une société sans sujet et sans causalité ?

    OpenAIRE

    Ibekwe-SanJuan, Fidelia

    2014-01-01

    International audience. The latest 'advances' in information and communication technologies (ICT) have accelerated the virtualization of many sectors of activity. Big Data, cloud computing, Open Data and the participatory web are bringing about major upheavals in science and society. One effect that raises concern is the growing reliance on algorithms for processing massive data (Big Data) as a way of steering affairs. The B...

  20. Big Data Solution for CTBT Monitoring Using Global Cross Correlation

    Science.gov (United States)

    Gaillard, P.; Bobrov, D.; Dupont, A.; Grenouille, A.; Kitov, I. O.; Rozhkov, M.

    2014-12-01

    Due to the mismatch between data volumes and the performance of the information technology infrastructure used in seismic data centers, it is becoming more and more difficult to process all the data with traditional applications in a reasonable elapsed time. To fulfill their missions, the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO/IDC) and the Département Analyse Surveillance Environnement of the Commissariat à l'Energie atomique et aux énergies alternatives (CEA/DASE) collect, process and produce complex data sets whose volume is growing exponentially. In the medium term, computer architectures, data management systems and application algorithms will require fundamental changes to meet the needs. This problem is well known and identified as a "Big Data" challenge. To tackle this major task, the CEA/DASE is taking part for two years in the "DataScale" project. Started in September 2013, DataScale gathers a large set of partners (research laboratories, SMEs and big companies) around a common objective: to design efficient solutions exploiting the synergy between Big Data technologies and High Performance Computing (HPC). The project will evaluate the relevance of these technological solutions by implementing a demonstrator for seismic event detection based on massive waveform cross-correlation. The IDC has developed expertise in such techniques, leading to an algorithm called "Master Event", and provides a high-quality dataset for an extensive cross-correlation study. The objective of the project is to enhance the Master Event algorithm and to reanalyze 10 years of waveform data from the International Monitoring System (IMS) network using a dedicated HPC infrastructure operated by the "Centre de Calcul Recherche et Technologie" at the CEA of Bruyères-le-Châtel. The dataset used for the demonstrator includes more than 300,000 seismic events, tens of millions of raw detections and more than 30 terabytes of continuous seismic data
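The "Master Event" approach described above rests on normalized waveform cross-correlation: a template from a known event is slid along continuous data, and windows that correlate strongly with it are flagged as candidate detections. A minimal pure-Python sketch of that kernel (function and signal names are illustrative, not taken from the IDC codebase):

```python
import math

def normalized_xcorr(template, trace):
    """Sliding normalized cross-correlation of a master-event template
    against a continuous trace; values near 1 flag near-duplicate waveforms."""
    n = len(template)
    mt = sum(template) / n
    st = math.sqrt(sum((x - mt) ** 2 for x in template))
    out = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        mw = sum(w) / n
        sw = math.sqrt(sum((x - mw) ** 2 for x in w))
        num = sum((a - mt) * (b - mw) for a, b in zip(template, w))
        out.append(num / (st * sw + 1e-12))  # epsilon guards flat windows
    return out

# Synthetic check: a sine "template" buried in an otherwise flat trace
tpl = [math.sin(0.1 * k) for k in range(100)]
trace = [0.0] * 300 + tpl + [0.0] * 300
cc = normalized_xcorr(tpl, trace)
print(cc.index(max(cc)))  # 300: the correlation peak marks the template's offset
```

In production such scans run over terabytes of continuous data, which is why the project pairs the algorithm with an HPC infrastructure; FFT-based correlation replaces this O(n·m) loop at scale.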

  1. Effects of variation of fundamental constants from big bang to atomic clocks

    International Nuclear Information System (INIS)

    Full text: Theories unifying gravity with other interactions suggest temporal and spatial variation of the fundamental 'constants' in the expanding Universe. I discuss effects of the variation of the fine-structure constant α = e²/ħc, the strong interaction and the quark mass. The measurements of these variations cover the lifespan of the Universe from a few minutes after the Big Bang to the present time and give controversial results. There are some hints of variation in Big Bang nucleosynthesis, quasar absorption spectra and Oklo natural nuclear reactor data. A very promising method to search for the variation of the fundamental constants consists in comparing different atomic clocks. A billion-fold enhancement of the variation effects occurs in transitions between accidentally degenerate atomic energy levels. Copyright (2005) Australian Institute of Physics
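The definition cited in the abstract, α = e²/ħc in Gaussian units (equivalently e²/4πε₀ħc in SI), can be checked numerically. A quick sketch using CODATA values for the constants:

```python
import math

# CODATA values in SI units.
e = 1.602176634e-19      # elementary charge, C (exact since the 2019 SI redefinition)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s (exact)
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

# Fine-structure constant: dimensionless, so any claimed variation is
# meaningful independent of the unit system.
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(round(1 / alpha, 3))  # 137.036
```

Searches of the kind described above look for fractional drifts in this dimensionless number over cosmological time.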

  2. Intra- and interspecific responses to Rafinesque’s big-eared bat (Corynorhinus rafinesquii) social calls.

    Energy Technology Data Exchange (ETDEWEB)

    Loeb, Susan, C.; Britzke, Eric, R.

    2010-07-01

    Bats respond to the calls of conspecifics as well as to the calls of other species; however, few studies have attempted to quantify these responses or understand the functions of these calls. We tested the response of Rafinesque's big-eared bats (Corynorhinus rafinesquii) to social calls as a possible method to increase capture success and to understand the function of social calls. We also tested whether the calls of bats within the ranges of the previously designated subspecies differed, whether the responses of Rafinesque's big-eared bats varied with the geographic origin of the calls, and whether other species responded to the calls of C. rafinesquii. We recorded calls of Rafinesque's big-eared bats at two colony roost sites in South Carolina, USA. Calls were recorded while bats were in the roosts and as they exited. Playback sequences for each site were created by copying typical pulses into the playback file. Two mist nets were placed approximately 50–500 m from known roost sites; the net with the playback equipment served as the Experimental net and the one without the equipment served as the Control net. Call structures differed significantly between the Mountain and Coastal Plains populations, with calls from the Mountains being of higher frequency and longer duration. Ten of 11 Rafinesque's big-eared bats were caught in the Control nets, and 13 of 19 bats of other species were captured at Experimental nets, even though overall bat activity did not differ significantly between Control and Experimental nets. Our results suggest that Rafinesque's big-eared bats are not attracted to conspecifics' calls and that these calls may act as an intraspecific spacing mechanism during foraging.

  3. About Big Data and its Challenges and Benefits in Manufacturing

    OpenAIRE

    Bogdan NEDELCU

    2013-01-01

    The aim of this article is to show the importance of big data and its growing influence on companies. It also shows what kinds of big data are currently generated and how much big data is estimated to be generated. We can also see how much companies are willing to invest in big data and how much they are currently gaining from it. The article also presents some major influences that big data has on one major industry segment (manufacturing) and the challenges that arise.

  4. Tenåringsdrikking i utviklingspsykologisk perspektiv

    Directory of Open Access Journals (Sweden)

    Hilde Pape

    2009-10-01

    Full Text Available SUMMARY Why is alcohol so popular among young people? This important question has been the subject of few empirical studies, so research-based knowledge about the positive and reinforcing properties of alcohol is of limited extent. By contrast, numerous studies have focused on the various harms that follow from teenage drinking, and the results of this research have underscored the need for active preventive efforts. Insight into the perceived benefits of alcohol is nevertheless necessary for developing effective prevention strategies. Against this background, the article focuses on the psychosocial functions of young people's drinking habits; questions of peer pressure and modelling are also touched upon. The aim is to convey central findings from recent research in the field. In summary, the results suggest that alcohol holds a particular appeal for adolescents who are well adjusted and socially inclined. Drinking also appears to contribute to the developmental process of adolescence, although primarily through indirect effects. The implications of these findings for prevention are outlined in the concluding section.

  5. Westinghouse's small and medium reactor portfolio

    International Nuclear Information System (INIS)

    Full text: Westinghouse has been a pioneer in the civil nuclear power industry: the first commercial nuclear reactor was a Westinghouse reactor at Shippingport, PA, in the United States of America. The company was founded in 1886 by the inventor and entrepreneur George Westinghouse. Today, Westinghouse Electric Company is a nuclear technology company, and 60% of the electricity produced from nuclear power in the world is based on Westinghouse technology. Westinghouse is working with partners worldwide to build its 1100 MWe advanced passive PWR. It is also developing small and medium size reactors to fill market niches and for what is known as Generation IV reactors. These reactors (<700 MWe) are suitable where one or more of the following limitations exist: grid, financing, site, etc. IRIS is one such reactor that utilizes a simplified, integral configuration. This integral, advanced PWR at 335 MWe locates major components inside the reactor pressure vessel to eliminate system piping and other components. IRIS is being developed by an international development team that includes ten countries and twenty-four organizations. PBMR (Pebble Bed Modular Reactor) is a Generation IV high-temperature gas reactor that supports co-generation operation. The PBMR design is being developed through a partnership between Westinghouse Electric Co. and PBMR (Pty) Ltd of the Republic of South Africa. The PBMR design is sized at 200 MWt and 80 MWe to support a broad range of process steam applications. Furthermore, the PBMR achieves inherent safety levels through the use of innovative TRISO fuel. In addition, Westinghouse's parent company, Toshiba Corporation, is developing the 4S sodium fast reactor, a 10-50 MWe reactor that is ideal for isolated areas with small power demand. The conference presentation will include specific product features and the development status of the small and medium reactors in the Westinghouse portfolio

  6. Bohmian Quantization of the Big Rip

    CERN Document Server

    Pinto-Neto, Nelson; 10.1103/PhysRevD.80.083509

    2009-01-01

    It is shown in this paper that minisuperspace quantization of homogeneous and isotropic geometries with phantom scalar fields, when examined in the light of the Bohm-de Broglie interpretation of quantum mechanics, does not eliminate, in general, the classical big rip singularity present in the classical model. For some values of the Hamilton-Jacobi separation constant present in a class of quantum state solutions of the Wheeler-DeWitt equation, the big rip can be either completely eliminated or may still constitute a future attractor for all expanding solutions. This is contrary to the conclusion presented in Ref.[1], using a different interpretation of the wave function, where the big rip singularity is completely eliminated ("smoothed out") through quantization, independently of such separation constant and for all members of the above mentioned class of solutions. This is an example of the very peculiar situation where different interpretations of the same quantum state of a system are predicting different...

  7. Hot big bang or slow freeze?

    International Nuclear Information System (INIS)

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe

  8. One Second After the Big Bang

    CERN Document Server

    CERN. Geneva

    2014-01-01

    A new experiment called PTOLEMY (Princeton Tritium Observatory for Light, Early-Universe, Massive-Neutrino Yield) is under development at the Princeton Plasma Physics Laboratory with the goal of challenging one of the most fundamental predictions of the Big Bang – the present-day existence of relic neutrinos produced less than one second after the Big Bang. Using a gigantic graphene surface to hold 100 grams of a single-atomic layer of tritium, low noise antennas that sense the radio waves of individual electrons undergoing cyclotron motion, and a massive array of cryogenic sensors that sit at the transition between normal and superconducting states, the PTOLEMY project has the potential to challenge one of the most fundamental predictions of the Big Bang, to potentially uncover new interactions and properties of the neutrinos, and to search for the existence of a species of light dark matter known as sterile neutrinos.

  9. Big Data Issues: Performance, Scalability, Availability

    Directory of Open Access Journals (Sweden)

    Laura Matei

    2014-03-01

    Full Text Available Nowadays, Big Data is probably one of the most discussed topics not only in the area of data analysis but, I believe, in the whole realm of information technology. Simply typing the words "big data" into an online search engine like Google retrieves approximately 1,660,000,000 results. With such a buzz gathered around this term, I could not help but wonder what the phenomenon means. The ever greater portion that the combination of the Internet, cloud computing and mobile devices has been occupying in our lives leads to an ever-increasing amount of data that must be captured, communicated, aggregated, stored, and analyzed. These sets of data that we are generating are called Big Data.

  10. Big Bear Exploration Ltd. 1998 annual report

    International Nuclear Information System (INIS)

    During the first quarter of 1998 Big Bear completed a purchase of additional assets in the Rainbow Lake area of Alberta, in which the light oil purchase was financed with new equity and bank debt. The business plan was to immediately exploit these light oil assets, which would result in increased reserves, production and cash flow. Although drilling results in the first quarter on the Rainbow Lake properties were mixed, oil prices started to free-fall and drilling costs were much higher than expected. As a result, the company completed a reduced program, which resulted in less incremental loss and cash flow than budgeted. On April 29, 1998, Big Bear entered into an agreement with Belco Oil and Gas Corp. and Moan Investments Ltd. for the issuance of convertible preferred shares at a gross value of $15,750,000, which shares were eventually converted at 70 cents per share to common equity. As a result of the continued plunge in oil prices, the lending value of the company's assets continued to fall, requiring it to take action in order to meet its financial commitments. Late in the third quarter Big Bear issued equity for proceeds of $11,032,000, which further reduced the company's debt. Although the company had been extremely active in identifying and pursuing acquisition opportunities, it became evident that Belco Oil and Gas Corp. and Big Bear did not share common criteria for acquisitions, which resulted in the restructuring of their relationship in the fourth quarter. With the future of oil prices in question, Big Bear decided to change its focus to natural gas and to refocus its efforts on acquiring natural gas assets to fuel its growth. The purchase of Blue Range put Big Bear in a difficult position in terms of the latter's growth. In summary, what started as a difficult year ended in disappointment

  11. Nuclear reactor

    International Nuclear Information System (INIS)

    In an improved reactor core for a high conversion BWR reactor, a Pu-breeding BWR type reactor, an FBR type reactor, etc., two types of fuel assemblies are loaded such that fuel assemblies using a channel box with a smaller irradiation deformation ratio are loaded in the high conversion region, while the other fuel assemblies are loaded in the burner region. This makes it possible to suppress the irradiation deformation within an allowable limit in the high conversion region, where the fast neutron flux is high and the load from the inside of the channel box due to the pressure loss is large. At the same time, in the burner region, where the fast neutron flux is low and the load from the inside of the channel box is small, the irradiation deformation can be kept within an allowable limit without deteriorating the neutron economy, since a channel box with a smaller neutron absorption cross section or reduced wall thickness is charged there. As a result, it is possible to prevent structural deformations such as swelling of the channel box, bending of the entire assemblies, and bending of fuel rods. (K.M.)

  12. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  13. Fast reactor operating experience

    International Nuclear Information System (INIS)

    At the beginning of electricity generation from nuclear power there was the breeder, which fulfilled its duty in a number of smaller test and experimental reactors within national programs. Over the years, some of those reactors have attained impressive availabilities, while others have helped to improve our knowledge through the negative results they contributed. Worldwide, a decisive step was taken by the mid- to late sixties in the planning and construction of medium sized demonstration fast breeder power plants (250 to 350 MW). In the Federal Republic of Germany, this step was taken belatedly in building the SNR-300. BN-350 in the USSR, Phenix in France, and PFR in the United Kingdom have now been in operation for some ten years. Over that period, valuable experience has been accumulated in sodium technology. The operating behavior of all components and systems working in sodium is described as excellent; the hazards associated with sodium, the fire hazard in particular, thus often seem to be greatly overrated. Leakages have been brought under control; it has always been possible so far to trace them back to systemic faults produced in the welding process. The ability of fast sodium cooled reactors to produce more nuclear fuel than they consume has been demonstrated in Phenix, whose breeding ratio has been measured to be 1.16. The first true large breeder, Super Phenix in France, is to be commissioned as early as 1985. By building another three breeder power plants, the European partners in an association hope to achieve the commercial breakthrough of the breeder line. (orig.)

  14. Big Bang riddles and their revelations

    OpenAIRE

    Magueijo, Joao; Baskerville, Kim

    1999-01-01

    We describe how cosmology has converged towards a beautiful model of the Universe: the Big Bang Universe. We praise this model, but show there is a dark side to it. This dark side is usually called "the cosmological problems": a set of coincidences and fine tuning features required for the Big Bang Universe to be possible. After reviewing these "riddles" we show how they have acted as windows into the very early Universe, revealing new physics and new cosmology just as the Universe came i...

  15. SQL Engines for Big Data Analytics

    OpenAIRE

    Xue, Rui

    2015-01-01

    Traditional relational database systems cannot accommodate the need to analyze data of large volume and in various formats, i.e., Big Data. Apache Hadoop, as the first generation of open-source Big Data solutions, provided a stable distributed data storage and resource management system. However, as a MapReduce framework, the only channel for utilizing the parallel computing power of Hadoop is its API. Given a problem, one has to code a corresponding MapReduce program in Java, which is time...
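    The MapReduce programming model the abstract refers to can be illustrated with a minimal, single-process sketch: a map phase emits (key, value) pairs, a shuffle groups pairs by key, and a reduce phase aggregates each group. This is an illustrative word-count example in plain Python, not Hadoop's actual Java API; the function names `map_phase` and `reduce_phase` are invented for the sketch.

    ```python
    from itertools import groupby

    def map_phase(documents):
        # Map: emit a (word, 1) pair for every word in every document.
        return [(word, 1) for doc in documents for word in doc.split()]

    def reduce_phase(pairs):
        # Shuffle: sort pairs so equal keys are adjacent, then group by key.
        pairs = sorted(pairs)
        # Reduce: sum the counts within each group.
        return {word: sum(count for _, count in group)
                for word, group in groupby(pairs, key=lambda kv: kv[0])}

    docs = ["big data big analytics", "big data"]
    counts = reduce_phase(map_phase(docs))
    # counts == {'analytics': 1, 'big': 3, 'data': 2}
    ```

    In Hadoop the map and reduce phases run in parallel across a cluster and the shuffle moves data over the network; the abstract's point is that expressing even a query this simple requires writing such a program by hand, which motivated SQL-on-Hadoop engines.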

  16. Kansen voor Big data – WPA Vertrouwen

    OpenAIRE

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for ePrivacy. As such, it is a requirement for realizing economic value of services based on (personal) data. Businesses play a role in guaranteeing data security and privacy of data subjects, but als...

  17. The big head and the long tail

    DEFF Research Database (Denmark)

    Helles, Rasmus

    2013-01-01

    This paper discusses how the advent of big data challenges established theories in Internet studies to redevelop existing explanatory strategies in order to incorporate the possibilities offered by this new empirical resource. The article suggests that established analytical procedures and theoretical frameworks used in Internet studies can be fruitfully employed to explain high-level structural phenomena that are only observable through the use of big data. The present article exemplifies this by offering a detailed analysis of how genre analysis of Web sites may be used to shed light on the...

  18. New physics and the new big bang

    International Nuclear Information System (INIS)

    The old concept of the big bang is reviewed, and modifications that have recently occurred in the theory are described. The concept of the false vacuum is explained, and its role in the cosmic inflation scenario is shown. The way inflation solves critical problems of the old big bang scenario is indicated. The potential of supersymmetry and Kaluza-Klein theories for the development of a superunified theory of physical forces is discussed. Superstrings and their possible role in a superunified theory, including their usefulness in solving the problem of infinities, are considered.

  19. Effective dynamics of the matrix big bang

    International Nuclear Information System (INIS)

    We study the leading quantum effects in the recently introduced matrix big bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the big bang. Surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics

  20. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    MASTER THE ABILITY TO APPLY BIG DATA ANALYTICS TO MASSIVE AMOUNTS OF STRUCTURED AND UNSTRUCTURED DATA Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a