WorldWideScience

Sample records for big ten reactor

  1. Simulation of FCC Riser Reactor Based on Ten Lump Model

    Directory of Open Access Journals (Sweden)

    Debashri Paul

    2015-07-01

    Full Text Available The ten-lump strategy and reaction scheme are based on the concentrations of the various stocks, i.e., paraffins, naphthenes, aromatics and aromatic substituent groups (paraffinic and naphthenic groups attached to aromatic rings). The model was implemented in C++ and solved with the Runge-Kutta-Fehlberg method. At a space time of 4.5 s, the gasoline yield is predicted to be 72 mass % and 67 mass % for naphthenic and paraffinic feedstock respectively. The type of feed determines the yield of gasoline and coke: a highly naphthenic charge stock gave the greatest yield of gasoline among naphthenic, paraffinic and aromatic charge stocks. In addition, the effects of space time and temperature on the yields of coke and gasoline and on the conversion of gas oil are presented, along with the effect of the catalyst-to-oil ratio.
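    As an illustration of the numerical method named in this abstract, the sketch below integrates a lumped cracking scheme with the Runge-Kutta-Fehlberg (RKF45) method out to the quoted space time of 4.5 s. For brevity it uses the classic 3-lump scheme (gas oil -> gasoline -> gas + coke) rather than the paper's ten lumps, and the rate constants are illustrative placeholders, not the paper's fitted values; extending it to ten lumps would only change the dimension of State and the rhs function.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>
#include <initializer_list>
#include <utility>

using State = std::array<double, 3>; // y[0]=gas oil, y[1]=gasoline, y[2]=gas+coke

// Illustrative 3-lump cracking kinetics (placeholder rate constants, 1/s).
State rhs(const State& y) {
    const double k1 = 0.7, k2 = 0.4, k3 = 0.1;
    return { -(k1 + k3) * y[0] * y[0],          // gas oil cracks (2nd order)
             k1 * y[0] * y[0] - k2 * y[1],      // gasoline forms, then overcracks
             k3 * y[0] * y[0] + k2 * y[1] };    // gas + coke
}

// One adaptive Runge-Kutta-Fehlberg (RKF45) step with the embedded
// 4th/5th-order error estimate; returns true if the step was accepted.
bool rkf45_step(State& y, double& s, double& h, double tol) {
    auto blend = [&](std::initializer_list<std::pair<double, const State*>> c) {
        State r = y;
        for (const auto& t : c)
            for (std::size_t i = 0; i < r.size(); ++i)
                r[i] += h * t.first * (*t.second)[i];
        return r;
    };
    const State k1 = rhs(y);
    const State k2 = rhs(blend({{1.0 / 4, &k1}}));
    const State k3 = rhs(blend({{3.0 / 32, &k1}, {9.0 / 32, &k2}}));
    const State k4 = rhs(blend({{1932.0 / 2197, &k1}, {-7200.0 / 2197, &k2},
                                {7296.0 / 2197, &k3}}));
    const State k5 = rhs(blend({{439.0 / 216, &k1}, {-8.0, &k2},
                                {3680.0 / 513, &k3}, {-845.0 / 4104, &k4}}));
    const State k6 = rhs(blend({{-8.0 / 27, &k1}, {2.0, &k2},
                                {-3544.0 / 2565, &k3}, {1859.0 / 4104, &k4},
                                {-11.0 / 40, &k5}}));
    State y5;
    double err = 0.0;
    for (std::size_t i = 0; i < y.size(); ++i) {
        y5[i] = y[i] + h * (16.0 / 135 * k1[i] + 6656.0 / 12825 * k3[i] +
                            28561.0 / 56430 * k4[i] - 9.0 / 50 * k5[i] +
                            2.0 / 55 * k6[i]);
        const double y4 = y[i] + h * (25.0 / 216 * k1[i] + 1408.0 / 2565 * k3[i] +
                                      2197.0 / 4104 * k4[i] - 1.0 / 5 * k5[i]);
        err = std::max(err, std::fabs(y5[i] - y4));
    }
    const bool accept = err <= tol;
    if (accept) { y = y5; s += h; }
    // Standard step-size update from the embedded error estimate.
    h *= std::clamp(0.84 * std::pow(tol / std::max(err, 1e-14), 0.25), 0.1, 4.0);
    return accept;
}

int main() {
    State y = {1.0, 0.0, 0.0};        // riser inlet: all gas oil (mass fractions)
    double s = 0.0, h = 0.01;
    while (s < 4.5) {                 // integrate to a space time of 4.5 s
        if (4.5 - s < 1e-12) break;
        h = std::min(h, 4.5 - s);
        rkf45_step(y, s, h, 1e-8);
    }
    std::printf("s = 4.5 s: gas oil %.3f, gasoline %.3f, gas+coke %.3f\n",
                y[0], y[1], y[2]);
}
```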

  2. Gender differences in personality across the ten aspects of the Big Five

    Directory of Open Access Journals (Sweden)

    Yanna J Weisberg

    2011-08-01

    Full Text Available This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscientiousness, the gender differences were found to diverge at the aspect level, rendering them either small or undetectable at the Big Five level. These findings clarify the nature of gender differences in personality and highlight the utility of measuring personality at the aspect level.

  3. Ten aspects of the Big Five in the Personality Inventory for DSM-5.

    Science.gov (United States)

    DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R

    2016-04-01

    Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders.

  4. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner in the SMR marketplace, in terms of technological capability, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  5. TenBig Bangs” in Theory and Practice that Have Made a Difference to Australian Policing in the Last Three Decades

    Directory of Open Access Journals (Sweden)

    Rick Sarre

    2016-05-01

    Full Text Available This paper discusses what could be considered the top ten innovations in policing of the last thirty years. The intent is to focus attention on how practice could be further inspired by additional innovation. The innovations are discussed here as “Big Bangs” as a way of drawing attention to the significant impact they have had on policing, in the same way that the cosmological Big Bang was an important watershed event in the universe’s existence. These ten policing innovations ushered in, it is argued, a new mindset, pattern or trend, and they affected Australian policing profoundly, although many had their roots in other settings long before Australian policy-makers implemented them.

  6. Reactors

    CERN Document Server

    International Electrotechnical Commission. Geneva

    1988-01-01

    This standard applies to the following types of reactors: shunt reactors, current-limiting reactors including neutral-earthing reactors, damping reactors, tuning (filter) reactors, earthing transformers (neutral couplers), arc-suppression reactors, smoothing reactors, with the exception of the following reactors: small reactors with a rating generally less than 2 kvar single-phase and 10 kvar three-phase, reactors for special purposes such as high-frequency line traps or reactors mounted on rolling stock.

  7. Reactor

    Science.gov (United States)

    Evans, Robert M.

    1976-10-05

    1. A neutronic reactor having a moderator, coolant tubes traversing the moderator from an inlet end to an outlet end, bodies of material fissionable by neutrons of thermal energy disposed within the coolant tubes, and means for circulating water through said coolant tubes characterized by the improved construction wherein the coolant tubes are constructed of aluminum having an outer diameter of 1.729 inches and a wall thickness of 0.059 inch, and the means for circulating a liquid coolant through the tubes includes a source of water at a pressure of approximately 350 pounds per square inch connected to the inlet end of the tubes, and said construction including a pressure reducing orifice disposed at the inlet ends of the tubes reducing the pressure of the water by approximately 150 pounds per square inch.
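    A quick arithmetic check of the figures quoted in the claim; the formulas (inner diameter = outer diameter minus twice the wall thickness, downstream pressure = inlet pressure minus the orifice drop) are elementary, and only the numbers from the text are used.

```cpp
#include <cstdio>

int main() {
    const double od_in = 1.729, wall_in = 0.059;       // tube OD and wall, inches
    const double inlet_psi = 350.0, orifice_drop_psi = 150.0;
    const double id_in = od_in - 2.0 * wall_in;        // inner diameter
    const double downstream_psi = inlet_psi - orifice_drop_psi;
    std::printf("inner diameter: %.3f in\n", id_in);                   // 1.611 in
    std::printf("pressure after orifice: %.0f psi\n", downstream_psi); // 200 psi
}
```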

  8. Monte Carlo and deterministic simulations of activation ratio experiments for 238U(n,f), 238U(n,g) and 238U(n,2n) in the Big Ten benchmark critical assembly

    Energy Technology Data Exchange (ETDEWEB)

    Descalle, M; Clouse, C; Pruet, J

    2009-07-28

    The authors have compared calculations of critical assembly activation ratios using 3 different Monte Carlo codes and one deterministic code. There is excellent agreement: discrepancies between the different Monte Carlo codes are at the 1-2% level. Notably, the deterministic calculations with 87 groups are also in good agreement with the continuous-energy Monte Carlo results. The three codes underestimate the 238U(n,f) reaction, suggesting that there is room for improvement in the evaluation, or in the evaluations of other reactions influencing the spectrum in Big Ten. Until statistical uncertainties are implemented in Mercury, they strongly advise long runs to guarantee sufficient convergence of the flux at high energies, and they strongly encourage comparing Mercury results to a well-developed and documented code such as MCNP5 and/or COG. It may be that ENDL2008 will be available for use in COG within a year. Finally, it may be worthwhile to add a 'standard' reaction-rate tally similar to those implemented in COG and MCNP5, if the goal is to expand the central fission and activation ratio simulations to include isotopes that are not part of the specifications for the assembly material composition.
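    As a sketch of the quantity being compared, an activation ratio from a multigroup calculation is just a ratio of spectrum-averaged reaction rates, R = sum_g(phi_g * sigma_g^A) / sum_g(phi_g * sigma_g^B). The flux and cross-section arrays below are placeholders; in practice they come from the transport solution (e.g. the 87-group deterministic run) and the evaluated nuclear data library.

```cpp
#include <cstdio>
#include <vector>

// Spectrum-integrated reaction rate: sum over groups of flux times cross section.
double reaction_rate(const std::vector<double>& phi, const std::vector<double>& sigma) {
    double rr = 0.0;
    for (std::size_t g = 0; g < phi.size(); ++g) rr += phi[g] * sigma[g];
    return rr;
}

int main() {
    const std::size_t G = 87;            // groups in the deterministic run
    std::vector<double> phi(G, 1.0);     // placeholder group fluxes
    std::vector<double> sig_nf(G, 0.5);  // placeholder 238U(n,f) group XS (barns)
    std::vector<double> sig_ng(G, 0.1);  // placeholder 238U(n,g) group XS (barns)
    const double ratio = reaction_rate(phi, sig_nf) / reaction_rate(phi, sig_ng);
    std::printf("238U(n,f)/238U(n,g) activation ratio: %.4f\n", ratio);
}
```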

  9. Big Data: Big Confusion? Big Challenges?

    Science.gov (United States)

    2015-05-01

    12th Annual Acquisition Research Symposium presentation: "Big Data: Big Confusion? Big Challenges?" by Mary Maureen... One point recoverable from the slides: 90% of the data in the world today was created in the last two years.

  10. Big Data and Cycling

    NARCIS (Netherlands)

    Romanillos, Gustavo; Zaltz Austwick, Martin; Ettema, Dick; De Kruijf, Joost

    2016-01-01

    Big Data has begun to create significant impacts in urban and transport planning. This paper covers the explosion in data-driven research on cycling, most of which has occurred in the last ten years. We review the techniques, objectives and findings of a growing number of studies that we have classified...

  11. Big data, big governance

    NARCIS (Netherlands)

    Reep, Frans van der

    2016-01-01

    “Of course it is nice that my refrigerator orders milk by itself on the basis of data-related patterns. Deep learning based on big data holds great promise,” says Frans van der Reep of Inholland. No wonder that this will be a main theme at the Hannover Messe during ScienceGuide's Wissenstag.

  12. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  13. Update on the debate about the existence and utility of the Big Five: a ten-year follow-up on Carroll's "the Five-Factor Personality Model: how complete and satisfactory is it?"

    Science.gov (United States)

    Merenda, Peter F

    2008-12-01

    This paper is a follow-up comment on John B. Carroll's critique of the Big Five Model and his suggestion years ago on how to design and conduct research properly on the structure of personality and its assessment. The status of research on personality factor models is discussed, and conclusions are reached regarding the likely consequences and further prospects of the failure of personality theorists and practitioners to follow through on Carroll's poignant suggestion for required effort.

  14. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    Astrophysics and cosmology are rich with data. The advent of wide-area digital cameras on large aperture telescopes has led to ever more ambitious surveys of the sky. Data volumes of entire surveys a decade ago can now be acquired in a single night and real-time analysis is often desired. Thus, modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with label and measurement noise. We argue that this makes astronomy a great domain for computer science research, as it pushes the boundaries of data analysis. In the following, we will present this exciting application area for data scientists. We will focus on exemplary results, discuss main challenges...

  15. Experimental Study of Big Row Spacing Cultivation of Tomato Using Straw Biological Reactor Technology

    Institute of Scientific and Technical Information of China (English)

    王继涛; 张翔; 温学萍; 赵玮; 俞风娟; 汪金山

    2015-01-01

    The application of straw biological reactor technology can effectively improve the environmental factors within the facility, slow the occurrence of disease and improve yield and benefit, but the ditching step of this technology is labor-intensive. In order to reduce the labor and production inputs involved, an experimental study on big row spacing cultivation of tomato using the straw biological reactor technology was conducted. The results showed that, compared with the control, in the ditching, straw-burying, ridging, dripline-laying and planting steps alone, labor per hectare was reduced by 35.7% and costs by 16,810.5 yuan/hm2; the marketing date was advanced by 5 days, the yield was increased by 26.68%, and the incidence of pests and diseases was significantly lower. Considering the comprehensive growth potential in the field and the indoor test data, it is suggested that big row spacing cultivation of tomato using the straw biological reactor technology be extended and applied over large areas in Ningxia.

  16. Big Data

    Directory of Open Access Journals (Sweden)

    Prachi More

    2013-05-01

    Full Text Available The surge in the collection and accumulation of data has coined the new term “Big Data”. Data is generated massively, accidentally, incidentally and through people's interactions, and this Big Data must be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists and many other varieties of intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google, Facebook, Wikipedia and every space where large groups of people leave digital traces and deposit data. Given the rise of Big Data as both a phenomenon and a methodological persuasion, it is time to start critically interrogating this phenomenon, its assumptions and its biases. Big data refers to data sets that are too big to be handled using existing database management tools; such data sets are emerging in many important applications, such as Internet search, business informatics, social networks, social media, genomics, and meteorology, and they present a grand challenge for database and data analytics research. This paper is a blend of non-technical and introductory-level technical detail, ideal for the novice. We conclude with some technical challenges as well as solutions that can be applied to them. Big Data differs from other data in five characteristics: volume, variety, value, velocity and complexity. The article focuses on some current and future cases of and causes for BIG DATA.

  17. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  18. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Ruppert, Evelyn; Flyverbom, Mikkel

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of sub-themes that each deserve dedicated scrutiny when studying the interplay between big...

  19. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo;

    ...scientific discoveries are not led by so-called lone "stars", or Big Egos, but instead by groups of people from a multitude of institutions and locations, having a diverse knowledge set and capable of tackling more and more complex problems. This poses the question of whether Big Egos continue to dominate in this rising paradigm of big science. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize...

  20. Online stress corrosion crack and fatigue usages factor monitoring and prognostics in light water reactor components: Probabilistic modeling, system identification and data fusion based big data analytics approach

    Energy Technology Data Exchange (ETDEWEB)

    Mohanty, Subhasish M. [Argonne National Lab. (ANL), Argonne, IL (United States); Jagielo, Bryan J. [Argonne National Lab. (ANL), Argonne, IL (United States); Oakland Univ., Rochester, MI (United States); Iverson, William I. [Argonne National Lab. (ANL), Argonne, IL (United States); Univ. of Illinois at Urbana-Champaign, Champaign, IL (United States); Bhan, Chi Bum [Argonne National Lab. (ANL), Argonne, IL (United States); Pusan National Univ., Busan (Korea, Republic of); Soppet, William S. [Argonne National Lab. (ANL), Argonne, IL (United States); Majumdar, Saurin M. [Argonne National Lab. (ANL), Argonne, IL (United States); Natesan, Ken N. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-10

    Nuclear reactors in the United States account for roughly 20% of the nation's total electric energy generation, and maintaining their safety with regard to the structural integrity of key components is critical not only for the long-term use of such plants but also for the safety of personnel and the public living around the plant. Early detection of damage signatures, such as stress corrosion cracking and material degradation related to thermal-mechanical loading, in safety-critical components is a necessary requirement for long-term and safe operation of nuclear power plant systems.

  1. Big Data

    OpenAIRE

    2013-01-01

    The surge in the collection and accumulation of data has coined the new term “Big Data”. Data is generated massively, accidentally, incidentally and through people's interactions, and this Big Data must be used smartly and effectively. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists and many other varieties of intelligentsia debate the potential benefits and costs of analysing information from Twitter, Google,...

  2. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using conventional data-handling applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, combat crime and so on, we require larger data sets than before. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. This paper presents an overview of the Hadoop architecture, the different tools used for big data, and its security issues.

  3. Real-Time Pathogen Detection in the Era of Whole-Genome Sequencing and Big Data: Comparison of k-mer and Site-Based Methods for Inferring the Genetic Distances among Tens of Thousands of Salmonella Samples.

    Science.gov (United States)

    Pettengill, James B; Pightling, Arthur W; Baugher, Joseph D; Rand, Hugh; Strain, Errol

    2016-01-01

    The adoption of whole-genome sequencing within the public health realm for molecular characterization of bacterial pathogens has been followed by an increased emphasis on real-time detection of emerging outbreaks (e.g., food-borne Salmonellosis). In turn, large databases of whole-genome sequence data are being populated. These databases currently contain tens of thousands of samples and are expected to grow to hundreds of thousands within a few years. For these databases to be of optimal use one must be able to quickly interrogate them to accurately determine the genetic distances among a set of samples. Being able to do so is challenging due to both biological (evolutionary diverse samples) and computational (petabytes of sequence data) issues. We evaluated seven measures of genetic distance, which were estimated from either k-mer profiles (Jaccard, Euclidean, Manhattan, Mash Jaccard, and Mash distances) or nucleotide sites (NUCmer and an extended multi-locus sequence typing (MLST) scheme). When analyzing empirical data (whole-genome sequence data from 18,997 Salmonella isolates) there are features (e.g., genomic, assembly, and contamination) that cause distances inferred from k-mer profiles, which treat absent data as informative, to fail to accurately capture the distance between samples when compared to distances inferred from differences in nucleotide sites. Thus, site-based distances, like NUCmer and extended MLST, are superior in performance, but accessing the computing resources necessary to perform them may be challenging when analyzing large databases.
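    To make the k-mer side of this comparison concrete, the sketch below computes the Jaccard distance D = 1 - |A ∩ B| / |A ∪ B| between the k-mer sets of two sequences. This is only the definitional, brute-force form; production tools such as Mash approximate it with MinHash sketches, and the toy sequences here stand in for assembled genomes.

```cpp
#include <cstdio>
#include <string>
#include <unordered_set>

// Collect the set of distinct k-mers (length-k substrings) of a sequence.
std::unordered_set<std::string> kmers(const std::string& seq, std::size_t k) {
    std::unordered_set<std::string> s;
    for (std::size_t i = 0; i + k <= seq.size(); ++i) s.insert(seq.substr(i, k));
    return s;
}

// Jaccard distance between the k-mer sets of two sequences.
double jaccard_distance(const std::string& a, const std::string& b, std::size_t k) {
    const auto A = kmers(a, k), B = kmers(b, k);
    std::size_t inter = 0;
    for (const auto& km : A) inter += B.count(km);
    const std::size_t uni = A.size() + B.size() - inter;
    return uni == 0 ? 0.0 : 1.0 - double(inter) / double(uni);
}

int main() {
    // Toy sequences; real inputs would be assembled Salmonella genomes.
    std::printf("d = %.3f\n", jaccard_distance("ACGTACGTAC", "ACGTACGTTC", 4));
}
```

    Note that a k-mer absent from one set counts against similarity here, which is exactly the property the abstract identifies as problematic for incomplete or contaminated assemblies.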

  4. Nuclear Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hogerton, John

    1964-01-01

    This pamphlet describes how reactors work; discusses reactor design; describes research, teaching, and materials testing reactors; production reactors; reactors for electric power generation; reactors for supplying heat; reactors for propulsion; reactors for space; reactor safety; and reactors of tomorrow. The appendix discusses characteristics of U.S. civilian power reactor concepts and lists some of the U.S. reactor power projects, with location, type, capacity, owner, and startup date.

  5. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, and these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  6. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  7. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  8. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  9. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off not only between economic rationales and quality considerations, but also over control of sensitive personal data and the ethical implications for the citizen. In the DAMD case, data is on the one hand used 'in the service of a good cause' to...

  10. Big Man

    Institute of Scientific and Technical Information of China (English)

    郑秀文

    2012-01-01

    梁炳 ("Edmond") says that after his concert he will go travelling with his wife; wherever the plane touches down on this earth, having a companion at your side is happiness. His concert is called Big Man; at first I misread it as a Big Mac concert and wondered why anyone would name a concert after a giant hamburger. Only later did I see my mistake. But on reflection, on the road to growing up, who hasn't lived like a silly loaf of bread: a lump of dough exposed to the wide world, with time and life's experiences as the yeast, so that over the years you and I both ferment and grow. Friendship is also a yeast that spurs each other's growth. Seeing that he long ago grew from a boy into a man, I realize I can no longer call myself a 'girl' either. In my eyes he has changed a great deal: his playful, outgoing nature has narrowed. The two of us now,

  11. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  12. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Jeppesen, Jacob; Vaarst Andersen, Kristina; Lauto, Giancarlo

    In this paper we investigate the micro-mechanisms governing the structural evolution of a scientific collaboration. Empirical evidence indicates that we have transcended into a new paradigm with a new modus operandi where scientific discoveries are not led by so-called lone "stars", or big egos, but instead by groups of people from a multitude of institutions, having a diverse knowledge set and capable of operating more and more complex instrumentation. Using a dataset consisting of full bibliometric coverage from a Large Scale Research Facility, we utilize a stochastic actor oriented model...

  13. TENS in childbirth (Tens bij bevallingen)

    NARCIS (Netherlands)

    Tuin-Nuis, F.D.F.

    2000-01-01

    TENS (Transcutaneous Electrical Nerve Stimulation) is a pain-relief method based on the Gate Control Theory of Melzack and Wall. Electrical pulses delivered through the skin are thought to influence the conduction of nociceptive signals (pain stimuli) and to prompt the body to produce endorphins...

  14. Affordances: Ten Years On

    Science.gov (United States)

    Brown, Jill P.; Stillman, Gloria

    2014-01-01

    Ten years ago the construct, affordance, was rising in prominence in scholarly literature. A proliferation of different uses and meanings was evident. Beginning with its origin in the work of Gibson, we traced its development and use in various scholarly fields. This paper revisits our original question with respect to its utility in mathematics…

  15. IREDA: ten years on

    Energy Technology Data Exchange (ETDEWEB)

    Sahai, I.M.

    1998-04-01

    The Indian Renewable Energy Development Agency has a ten-year history of promoting small hydro energy sources for many communities beyond the reach of the national grid. Sources of funding for hydro installations in India are mentioned. It is suggested that the expansion of the hydro schemes would benefit from greater co-ordination between central and state governments. (UK)

  16. Powers of ten

    CERN Multimedia

    Pyramid FILMS

    1977-01-01

    Powers of Ten is a 1977 short documentary film written and directed by Charles Eames and his wife, Ray. The film depicts the relative scale of the Universe in factors of ten (see also logarithmic scale and order of magnitude). The film begins with an aerial image of a man reclining on a blanket; the view is that of one meter across. The viewpoint, accompanied by expository voiceover, then slowly zooms out to a view ten meters across (or 10^1 m in standard form), revealing that the man is picnicking in a park with a female companion. The zoom-out continues, to a view of 100 meters (10^2 m), then 1 kilometer (10^3 m), and so on, increasing the perspective (the picnic is revealed to be taking place near Soldier Field on Chicago's waterfront) and continuing to zoom out to a field of view of 10^24 meters, or the size of the observable universe. The camera then zooms back in to the picnic, and then to views of negative powers of ten: 10^-1 m (10 centimeters), and so forth, until we are viewing a carbon nucl...

  17. On Big Data Benchmarking

    OpenAIRE

    Han, Rui; Lu, Xiaoyi

    2014-01-01

    Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...

  18. Big data=Big marketing?!

    Institute of Scientific and Technical Information of China (English)

    肖明超

    2012-01-01

    When the Internet first emerged, a saying was popular: "On the Internet, nobody knows you're a dog." But today, more than 20 years later, that saying has long since been tossed into the dustbin of history, because, driven by technology and the rapid development of mobile Internet, social networks and e-commerce, consumers' "movements" have become ever easier to track. Consumers' attention, behavioral trajectories, conversations, preferences and shopping histories on the Internet can all be captured, and the consumer has entered an almost transparently lived "Age of Big Data". Not only is data becoming more available; artificial intelligence (AI) technologies, including natural language processing, pattern recognition and machine learning, are making data ever easier for computers to understand,

  19. H Reactor

    Data.gov (United States)

    Federal Laboratory Consortium — The H Reactor was the first reactor to be built at Hanford after World War II.It became operational in October of 1949, and represented the fourth nuclear reactor on...

  20. Ten Outstanding Women Chosen

    Institute of Scientific and Technical Information of China (English)

    1995-01-01

    Before the celebration of the 85th anniversary of International Women's Day, the All-China Women's Federation sponsored an activity to choose ten outstanding Chinese women, which involved the Ministries of Labor and Personnel, the General Political Department of the People's Liberation Army, the All-China Federation of Trade Unions, the Youth League of China, the China Association of Science and about a dozen Chinese news agencies. The results were recently announced and include the following women: Yue Xicui, one of the third generation of women aviators. Since she joined the air force she has accumulated 5,180 hours

  1. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  2. Nuclear reactor PBMR and cogeneration; Reactor nuclear PBMR y cogeneracion

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez S, J. R.; Alonso V, G., E-mail: ramon.ramirez@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2013-10-15

    In recent years the costs of nuclear reactor designs for electricity generation have increased; at the moment costs run around 5,000 USD per installed kW, so a big nuclear plant requires an investment of the order of billions of dollars. Reactors designed as low-power modular units seek to lighten the initial investment of a big reactor by dividing the power into parts and dividing the components into modules to lower production costs; one module can be built and completed before construction of the next begins, deferring the long-term investment and thereby reducing investment risk. On the other hand, low-power reactors can be very useful in regions where access to the electric grid is difficult, and the thermal energy of the reactor can feed other processes such as water desalination, steam generation for process industries like petrochemicals, or even the possible production of hydrogen for use as fuel. This work describes the possibility of generating steam of high quality for the petrochemical industry using a high-temperature pebble-bed reactor. (Author)

  3. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  4. Five Big Ideas

    Science.gov (United States)

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  5. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  6. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  7. Ten years of PAMELA

    Science.gov (United States)

    Spillantini, Piero

    2016-07-01

    The Pamela experiment was designed as a cosmic-ray observatory at 1 AU, dedicated to the precise, high-statistics study of CR fluxes over a three-decade energy range, from a few tens of MeV up to the several hundred GeV region. It is the last step of the 'Russian-Italian Mission' (RIM) program, born in 1992 between several Italian and Russian institutes with the participation of the Royal Institute of Technology of Stockholm (Sweden) and the Siegen University (Germany). Launched on 16 June 2006 from the Baikonur cosmodrome aboard the Russian Resurs-DK1 satellite by a Soyuz rocket, into an elliptical (350-610 km) quasi-polar orbit (70° inclination), it was activated on 21 June 2006 and has since been in continuous data-taking mode for ten years. The Pamela program pays particular attention to the energy spectra of particles (protons and electrons) and antiparticles (antiprotons and positrons). It also includes the search for possible signals of dark matter annihilation, the search for primordial antimatter (antihelium), the search for new matter in the Universe (strangelets?), the study of cosmic-ray propagation, solar physics and solar modulation, and the terrestrial magnetosphere. This program is made possible by the outstanding performance of the instrument, the low energy threshold, the quasi-polar orbit and the ten-year duration of the observation. Protons and helium nuclei are the most abundant components of the cosmic radiation, and precise measurements of their fluxes allow understanding of the acceleration and propagation of cosmic rays in the Galaxy. Their spectral shapes cannot be well described by a single power law: at 230-240 GV they exhibit an abrupt spectral hardening. These results challenge the current paradigm of cosmic-ray acceleration in supernova remnants followed by diffusive propagation in the Galaxy. Of paramount importance is the discovery of the anomalous increase of the positron flux at energies higher than 50 GeV (the so-called 'Pamela anomaly'). The review of

  8. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, related to the storage, analytics and visualization of big data; the human dimension of big data; and the process management dimension, which addresses, from a technological and business standpoint, the aspects of big data management.

  9. Reactor Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ait Abderrahim, A

    2001-04-01

    The Reactor Physics and MYRRHA Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control and non-destructive analysis of reactor fuel. This expertise is applied in the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department such as the reactor pressure steel vessel programme, the BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments by means of neutron and gamma calculations. The activities of the Fuzzy Logic and Intelligent Technologies in Nuclear Science programme cover several domains outside the department. Progress and achievements in these topical areas in 2000 are summarised.

  10. The Big Chills

    Science.gov (United States)

    Bond, G. C.; Dwyer, G. S.; Bauch, H. A.

    2002-12-01

    mechanisms that have not yet been identified. While the Younger Dryas event is dramatic, the Big Chills of the Holocene are clearly significant abrupt changes in their own right. Because they were a recurring feature of the interglacial climate we live in presently, they are especially relevant to the prediction of sudden changes in the future, more so probably than abrupt changes during the last glacial which took place within boundary conditions that are not likely to occur again soon, perhaps within tens of thousands of years.

  11. Reactor safeguards

    CERN Document Server

    Russell, Charles R

    1962-01-01

    Reactor Safeguards provides information for all who are interested in the subject of reactor safeguards. Much of the material is descriptive although some sections are written for the engineer or physicist directly concerned with hazards analysis or site selection problems. The book opens with an introductory chapter on radiation hazards, the construction of nuclear reactors, safety issues, and the operation of nuclear reactors. This is followed by separate chapters that discuss radioactive materials, reactor kinetics, control and safety systems, containment, safety features for water reactor

  12. Reactor operation

    CERN Document Server

    Shaw, J

    2013-01-01

    Reactor Operation covers the theoretical aspects and design information of nuclear reactors. This book is composed of nine chapters that also consider their control, calibration, and experimentation. The opening chapters present the general problems of reactor operation and the principles of reactor control and operation. The succeeding chapters deal with the instrumentation, start-up, pre-commissioning, and physical experiments of nuclear reactors. The remaining chapters are devoted to the control rod calibrations and temperature coefficient measurements in the reactor. These chapters also exp

  13. Reactor Neutrinos

    OpenAIRE

    Soo-Bong Kim; Thierry Lasserre; Yifang Wang

    2013-01-01

    We review the status and the results of reactor neutrino experiments. Short-baseline experiments have provided the measurement of the reactor neutrino spectrum, and their interest has been recently revived by the discovery of the reactor antineutrino anomaly, a discrepancy between the state-of-the-art prediction of the reactor neutrino flux and the measurements at baselines shorter than one kilometer. Middle- and long-baseline oscillation experiments at Daya Bay, Double Chooz, and RENO provided very ...

  14. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  15. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions, based on identifying patterns in the data rather than trying to understand the underlying causes in more detail. It highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  16. REACTOR GROUT THERMAL PROPERTIES

    Energy Technology Data Exchange (ETDEWEB)

    Steimke, J.; Qureshi, Z.; Restivo, M.; Guerrero, H.

    2011-01-28

    Savannah River Site has five dormant nuclear production reactors. Long term disposition will require filling some reactor buildings with grout up to ground level. Portland cement based grout will be used to fill the buildings with the exception of some reactor tanks. Some reactor tanks contain significant quantities of aluminum which could react with Portland cement based grout to form hydrogen. Hydrogen production is a safety concern and gas generation could also compromise the structural integrity of the grout pour. Therefore, it was necessary to develop a non-Portland cement grout to fill reactors that contain significant quantities of aluminum. Grouts generate heat when they set, so the potential exists for large temperature increases in a large pour, which could compromise the integrity of the pour. The primary purpose of the testing reported here was to measure heat of hydration, specific heat, thermal conductivity and density of various reactor grouts under consideration so that these properties could be used to model transient heat transfer for different pouring strategies. A secondary purpose was to make qualitative judgments of grout pourability and hardened strength. Some reactor grout formulations were unacceptable because they generated too much heat, or started setting too fast, or required too long to harden or were too weak. The formulation called 102H had the best combination of characteristics. It is a Calcium Alumino-Sulfate grout that contains Ciment Fondu (calcium aluminate cement), Plaster of Paris (calcium sulfate hemihydrate), sand, Class F fly ash, boric acid and small quantities of additives. This composition afforded about ten hours of working time. Heat release began at 12 hours and was complete by 24 hours. The adiabatic temperature rise was 54 C which was within specification. The final product was hard and displayed no visible segregation. The density and maximum particle size were within specification.
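    The abstract's stated purpose for these measurements is transient heat-transfer modelling of a pour, so the following is a minimal sketch of such a model: a 1-D explicit finite-difference slab of curing grout with a volumetric hydration heat source released over the 12-24 hour window and sized to the 54 C adiabatic rise quoted above. The property values (density, specific heat, conductivity), slab size and boundary conditions are assumed placeholders, not the report's measured data.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int N = 51;                        // grid nodes across a 1 m slab
    const double L = 1.0, dx = L / (N - 1);
    const double rho = 2000.0;               // density, kg/m^3 (placeholder)
    const double cp = 1000.0;                // specific heat, J/(kg K) (placeholder)
    const double k = 1.0;                    // conductivity, W/(m K) (placeholder)
    const double alpha = k / (rho * cp);     // thermal diffusivity, m^2/s
    const double dt = 0.4 * dx * dx / alpha; // stable explicit time step
    // Hydration heat sized so an adiabatic pour rises ~54 C (the quoted
    // value), released uniformly over the 12-24 h window from the text.
    const double dT_ad = 54.0;
    const double heating_rate = dT_ad / (12.0 * 3600.0);   // K/s while releasing
    std::vector<double> T(N, 25.0), Tn(N, 25.0);           // pour starts at 25 C
    double Tmax = 25.0;
    for (double t = 0.0; t < 48.0 * 3600.0; t += dt) {
        const double src = (t >= 12.0 * 3600.0 && t < 24.0 * 3600.0)
                               ? heating_rate : 0.0;
        for (int i = 1; i < N - 1; ++i)      // interior nodes: diffusion + source
            Tn[i] = T[i] + dt * (alpha * (T[i + 1] - 2.0 * T[i] + T[i - 1])
                                     / (dx * dx) + src);
        Tn[0] = Tn[N - 1] = 25.0;            // slab faces held at ambient
        T.swap(Tn);
        Tmax = std::max(Tmax, *std::max_element(T.begin(), T.end()));
    }
    std::printf("peak grout temperature: %.1f C\n", Tmax);
}
```

    Running the slab thinner or thicker, or with insulated faces, is how different pouring strategies would be compared against the adiabatic limit.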

  17. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers, which make a leading position possible, but only if companies ready themselves for the next big data wave.

  18. Big Boss Interval Games

    NARCIS (Netherlands)

    Alparslan-Gok, S.Z.; Brânzei, R.; Tijs, S.H.

    2008-01-01

    In this paper big boss interval games are introduced and various characterizations are given. The structure of the core of a big boss interval game is explicitly described and plays an important role relative to interval-type bi-monotonic allocation schemes for such games. Specifically, each element

  19. Big Ideas in Art

    Science.gov (United States)

    Day, Kathleen

    2008-01-01

    In this article, the author shares how she was able to discover some big ideas about art education. She relates how she found great ideas to improve her teaching from the book "Rethinking Curriculum in Art." She also shares how she designed a "Big Idea" unit in her class.

  20. Progress of Research on Demonstration Fast Reactor Main Pipe Material

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The main characteristics of the sodium pipe system in a demonstration fast reactor are high temperature, thin walls and large diameter, in contrast to the high-pressure, thick-walled piping of a pressurized water reactor system, and the system is long-term

  1. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  2. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  3. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  4. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte and yottabyte, to express the amount of data. This growth creates a situation in which the classic systems for the collection, storage, processing and visualization of data are losing the battle against the volume, speed and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years; moreover, it is recognized in the business world and, increasingly, in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service, by presenting a big data analytics services-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The approach proposed in this paper may facilitate research and development in business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  5. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    2016-01-01

    Big data is a phenomenon that can no longer be ignored in our society. It has moved past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, so often mentioned in relation to big data, entail? As an introduction to...

  6. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  7. Inhomogeneous Big Bang Cosmology

    CERN Document Server

    Wagh, S M

    2002-01-01

    In this letter, we outline an inhomogeneous model of the Big Bang cosmology. For the inhomogeneous spacetime used here, the universe originates in the infinite past as one dominated by vacuum energy and ends in the infinite future as one consisting of "hot and relativistic" matter. The spatial distribution of matter in the considered inhomogeneous spacetime is arbitrary. Hence, observed structures can arise in this cosmology from a suitable "initial" density contrast. Different problems of the standard model of Big Bang cosmology are also resolved in the present inhomogeneous model. This inhomogeneous model of the Big Bang Cosmology predicts "hot death" for the universe.

  8. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  9. 2007 China Harbor Ten People

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    2007 China Harbor Ten People honors the entrepreneurs who contributed most to the port economy and port enterprises this year through their talented management. These ten people embody social responsibility, professional skill, creative ability, and charming personality. Bearing full confidence in China's port economy, these port entrepreneurs are brave enough to explore brand-new areas, so as to promote harbor economic development.

  10. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  11. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  12. Multifunctional reactors

    NARCIS (Netherlands)

    Westerterp, K.R.

    1992-01-01

    Multifunctional reactors are single pieces of equipment in which, besides the reaction, other functions are carried out simultaneously. The other functions can be a heat, mass or momentum transfer operation and even another reaction. Multifunctional reactors are not new, but they have received much

  13. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined 'open data' and 'big data', these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of 'order' and 'relationality'. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two…

  14. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  15. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of the international development agenda to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development policies, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact…

  16. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic: the quantity of information grows exponentially, which raises its main challenge, the value of the information. That value is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in innovative ways using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to define clearly the scope and the necessary outputs of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and its architecture, as well as its impact on processes worldwide.

  17. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.
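
    For reference, the standard objects named in this abstract can be written down compactly (textbook material, not quoted from the thesis): the FRW metric and the Friedmann equation, whose matter-filled solutions drive the scale factor a(t) to zero at a finite time in the past, the big bang singularity.

        % FRW metric of a homogeneous, isotropic universe (k = -1, 0, +1):
        ds^2 = -dt^2 + a(t)^2 \left[ \frac{dr^2}{1 - k r^2} + r^2 \, d\Omega^2 \right]
        % Friedmann equation governing the scale factor a(t):
        \left( \frac{\dot{a}}{a} \right)^2 = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2}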

  18. Big Data Analytics

    Indian Academy of Sciences (India)

    2016-08-01

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 zettabytes (zetta = 10^21) were generated, consisting mostly of unstructured data such as emails, blogs, Twitter and Facebook posts, images, and videos. This is called big data. It is possible to analyse such huge data collections with clusters of thousands of inexpensive computers to discover patterns in the data that have many applications. But analysing massive amounts of data available on the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what big data is, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics.
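
    As a toy illustration of the cluster-style pattern discovery this abstract alludes to (my own sketch, not from the article), here is the canonical split-apply-combine word count in Python; frameworks such as MapReduce distribute exactly this pattern across thousands of inexpensive machines:

        from collections import Counter

        def map_phase(document):
            # Runs independently on each machine, one document at a time.
            return Counter(document.lower().split())

        def reduce_phase(partial_counts):
            # Merges the per-machine counts into one global result.
            total = Counter()
            for counts in partial_counts:
                total.update(counts)
            return total

        docs = ["big data needs big theory", "big data analytics"]
        print(reduce_phase([map_phase(d) for d in docs]).most_common(3))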

  19. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  20. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this study set out to examine and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics of Big Data management, so that everything concerning the central topic of the research could be known. The methodology included reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; presenting some of the NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and showing the data models and the technologies for analyzing them, ending with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research is a first step toward understanding the Big Data environment.

  1. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper objective is to assess, in light of the main works of Minsky, his view and analysis of what he called the "Big Government" as that huge institution which, in parallels with the "Big Bank" was capable of ensuring stability in the capitalist system and regulate its inherently unstable financial system in mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  2. ANALYTICS OF BIG DATA

    OpenAIRE

    Asst. Prof. Shubhada Talegaon

    2014-01-01

    Big Data analytics has started to impact all types of organizations, as it carries the potential power to extract embedded knowledge from big amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, senti...

  3. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  4. Big data need big theory too

    OpenAIRE

    Coveney, Peter V.; Dougherty, Edward R; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  5. Big data need big theory too.

    OpenAIRE

    Coveney, P. V.; Dougherty, E. R.; Highfield, R. R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, ma...

  6. Big data need big theory too

    Science.gov (United States)

    Dougherty, Edward R.; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035

  7. The origin of the future ten questions for the next ten years

    CERN Document Server

    Gribbin, John

    2006-01-01

    How did the universe begin? Where do galaxies come from? How do stars and planets form? Where do the material particles we are made of come from? How did life begin? Today we have only provisional answers to such questions. But scientific progress will improve these answers dramatically over the next ten years, predicts John Gribbin in this riveting book. He focuses on what we know—or think we know—about ten controversial, unanswered issues in the physical sciences and explains how current cutting-edge research may yield solutions in the very near future. With his trademark facility for engaging readers with or without a scientific background, the author explores ideas concerning the creation of the universe, the possibility of other forms of life, and the fate of the expanding cosmos. He examines “theories of everything,” including grand unified theories and string theory, and he discusses the Big Bang theory, the origin of structure and patterns of matter in the galaxies, and dark mass and dark ene...

  8. SETI as a part of Big History

    Science.gov (United States)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then we extrapolated even that into the future up to ten million years (see [18]), the minimum time requested for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model in the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we derive … a book about GBMs will be written by the author. Mass Extinctions of the geological past are one more topic that may be cast in the language of a decreasing GBM over a short time lapse, since Mass Extinctions are sudden all-time lows in the number of living species. In this paper, we give formulae for the decreasing GBMs of Mass Extinctions, like the K-Pg one of 64 million years ago. Finally, we note that the Big History Equation is just the extension of the Drake Equation to 13.8 billion years of cosmic evolution. So, the relevant GBM starting at the Big Bang epoch (13.8 billion years ago) and growing up to nowadays in a stochastically increasing fashion becomes the GBM…
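
    To make the paper's central object concrete: a Geometric Brownian Motion N(t) obeys dN = mu*N dt + sigma*N dW and has the closed-form solution N(t) = N(0) exp((mu - sigma^2/2) t + sigma W(t)). A minimal simulation sketch in Python (the parameter values are illustrative assumptions of mine, not the paper's fitted values):

        import math
        import random

        def simulate_gbm(n0, mu, sigma, t_max, steps):
            # One path of dN = mu*N*dt + sigma*N*dW, using the exact log-space update.
            dt = t_max / steps
            path = [n0]
            for _ in range(steps):
                dw = random.gauss(0.0, math.sqrt(dt))
                path.append(path[-1] * math.exp((mu - 0.5 * sigma**2) * dt + sigma * dw))
            return path

        # Illustrative only: mu > 0 gives the stochastically increasing GBM of the text;
        # mu < 0 over a short time lapse models a Mass Extinction-style decrease.
        path = simulate_gbm(n0=1.0, mu=0.05, sigma=0.2, t_max=10.0, steps=1000)
        print(f"final value: {path[-1]:.3f}")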

  9. Reactor vessel

    OpenAIRE

    Makkee, M.; Kapteijn, F.; Moulijn, J.A

    1999-01-01

    A reactor vessel (1) comprises a reactor body (2) through which channels (3) are provided whose surface comprises longitudinal inwardly directed parts (4) and is provided with a catalyst (6), as well as buffer bodies (8, 12) connected to the channels (3) on both sides of the reactor body (2) and comprising connections for supplying (9, 10, 11) and discharging (13, 14, 15) via the channels (3) gases and/or liquids entering into a reaction with each other and substances formed upon this reactio...

  10. NUCLEAR REACTOR

    Science.gov (United States)

    Miller, H.I.; Smith, R.C.

    1958-01-21

    This patent relates to nuclear reactors of the type which use a liquid fuel, such as a solution of uranyl sulfate in ordinary water, which acts as the moderator. The reactor comprises a spherical vessel having a diameter of about 12 inches, substantially surrounded by a reflector of beryllium oxide. Conventional control rods and safety rods are operated in slots in the reflector outside the vessel to control the operation of the reactor. An additional means for increasing the safety factor of the reactor by raising the ratio of delayed neutrons to prompt neutrons is provided, and consists of a soluble sulfate salt of beryllium dissolved in the liquid fuel in the proper proportion to obtain the result desired.

  11. Reactor Neutrinos

    Directory of Open Access Journals (Sweden)

    Soo-Bong Kim

    2013-01-01

    Full Text Available We review the status and the results of reactor neutrino experiments. Short-baseline experiments have provided the measurement of the reactor neutrino spectrum, and interest in them has recently been revived by the discovery of the reactor antineutrino anomaly, a discrepancy between the state-of-the-art prediction of the reactor neutrino flux and the measurements at baselines shorter than one kilometer. Middle- and long-baseline oscillation experiments at Daya Bay, Double Chooz, and RENO provided very recently the most precise determination of the neutrino mixing angle θ13. This paper provides an overview of the upcoming experiments and of the projects under development, including the determination of the neutrino mass hierarchy and the possible use of neutrinos for society, for nonproliferation of nuclear materials, and for geophysics.
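
    For context, the θ13 determinations cited here rest on the standard two-flavour electron-antineutrino survival probability at kilometre-scale baselines (textbook formula, not quoted from the review), where L is the baseline and E the neutrino energy:

        P(\bar{\nu}_e \to \bar{\nu}_e) \approx 1 - \sin^2 2\theta_{13} \, \sin^2\!\left( \frac{\Delta m^2_{31} L}{4E} \right)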

  12. Chemical Reactors.

    Science.gov (United States)

    Kenney, C. N.

    1980-01-01

    Describes a course, including content, reading list, and presentation on chemical reactors at Cambridge University, England. A brief comparison of chemical engineering education between the United States and England is also given. (JN)

  13. Reactor Neutrinos

    OpenAIRE

    Lasserre, T.; Sobel, H.W.

    2005-01-01

    We review the status and the results of reactor neutrino experiments, which are at the cutting edge of neutrino research. Short-baseline experiments have provided the measurement of the reactor neutrino spectrum, and are still searching for important phenomena such as the neutrino magnetic moment. They could open the door to the measurement of coherent neutrino scattering in the near future. Middle- and long-baseline oscillation experiments at Chooz and KamLAND have played a relevant role in neutrin...

  14. Ten Problems in Experimental Mathematics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.; Kapoor, Vishaal; Weisstein, Eric

    2004-09-30

    This article was stimulated by the recent SIAM '100 Digit Challenge' of Nick Trefethen, beautifully described in a recent book. Indeed, these ten numeric challenge problems are also listed in a recent book by two of the present authors, where they are followed by the ten symbolic/numeric challenge problems that are discussed in this article. Our intent was to present ten problems that are characteristic of the sorts of problems that commonly arise in 'experimental mathematics'. The challenge in each case is to obtain a high-precision numeric evaluation of the quantity, and then, if possible, to obtain a symbolic answer, ideally one with proof. Our goal in this article is to provide solutions to these ten problems, and in the process present a concise account of how one combines symbolic and numeric computation, which may be termed 'hybrid computation', in the process of mathematical discovery.
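
    As a small taste of such hybrid computation (my own toy example, not one of the ten problems), the Python library mpmath can evaluate a quantity to high precision and then attempt to recognize a closed form for it:

        from mpmath import mp, pi, e, identify

        mp.dps = 30  # work with 30 significant digits

        # Numeric phase: evaluate a constant to high precision.
        x = 3*pi + 4*e/7
        print(x)

        # Symbolic phase: search for a closed form in terms of the given constants.
        print(identify(x, ['pi', 'e']))  # expect an expression equivalent to 3*pi + (4/7)*e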

  15. Ten Rules of Academic Writing

    NARCIS (Netherlands)

    Donovan, S.K.

    2011-01-01

    Creative writers are well served with 'how to' guides, but just how much do they help? And how might they be relevant to academic authors? A recent survey of writing tips by twenty-eight creative authors has been condensed to the ten most relevant to the academic, supported by some comments on metho

  16. A Ten-Year Reflection

    Science.gov (United States)

    Phillip, Cyndi

    2016-01-01

    Five initiatives launched during Cyndi Phillip's term as American Association of School Librarians (AASL) President (2006-2007) continue to have an impact on school librarians ten years later. They include the rewriting of AASL's learning standards, introduction of the SKILLS Act, the presentation of the Crystal Apple Award to Scholastic Library…

  17. Ten-dimensional Supergravity Revisited

    NARCIS (Netherlands)

    Bergshoeff, Eric; Roo, Mees de; Kerstan, Sven; Riccioni, Fabio; Diaz Alonso, J.; Mornas, L.

    2006-01-01

    We show that the exisiting supergravity theories in ten dimensions can be extended with extra gauge fields whose rank is equal to the spacetime dimension. These gauge fields have vanishing field strength but nevertheless play an important role in the coupling of supergravity to spacetime filling bra

  18. Focus : big data, little questions?

    OpenAIRE

    Uprichard, Emma

    2013-01-01

    Big data. Little data. Deep data. Surface data. Noisy, unstructured data. Big. The world of data has gone from being analogue and digital, qualitative and quantitative, transactional and a by-product, to, simply, BIG. It is as if we couldn’t quite deal with its omnipotence and just ran out of adjectives. BIG. With all the data power it is supposedly meant to entail, one might have thought that a slightly better descriptive term might have been latched onto. But, no. BIG. Just BIG.

  19. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only … framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  20. Big and Small

    CERN Document Server

    Ekers, R D

    2010-01-01

    Technology leads discovery in astronomy, as in all other areas of science, so growth in technology leads to the continual stream of new discoveries which makes our field so fascinating. Derek de Solla Price had analysed the discovery process in science in the 1960s and he introduced the terms 'Little Science' and 'Big Science' as part of his discussion of the role of exponential growth in science. I will show how the development of astronomical facilities has followed this same trend from 'Little Science' to 'Big Science' as a field matures. We can see this in the discoveries resulting in Nobel Prizes in astronomy. A more detailed analysis of discoveries in radio astronomy shows the same effect. I include a digression to look at how science progresses, comparing the roles of prediction, serendipity, measurement and explanation. Finally I comment on the differences between the 'Big Science' culture in Physics and in Astronomy.

  1. Passport to the Big Bang moves across the road

    CERN Multimedia

    Corinne Pralavorio

    2015-01-01

    The ATLAS platform of the Passport to the Big Bang circuit has been relocated in front of the CERN Reception.   The ATLAS platform of the Passport to the Big Bang, outside the CERN Reception building. The Passport to the Big Bang platform of the ATLAS Experiment has been moved in front of the CERN Reception to make it more visible and accessible. It had to be dismantled and moved from its previous location in the garden of the Globe of Science and Innovation due to the major refurbishment work in progress on the Globe, and is now fully operational in its new location on the other side of the road, in the Main Reception car-park. The Passport to the Big Bang circuit, inaugurated in 2013, comprises ten platforms installed in front of ten CERN sites and aims to help local residents and visitors to the region understand CERN's research. Dedicated Passport to the Big Bang flyers, containing all necessary information and riddles for you to solve, are available at the CERN Rec...

  2. Small Punch Test on Before and Post Irradiated Domestic Reactor Pressure Steel

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Problems may be caused when applying standard specimens to study the properties of irradiated reactor materials because of their big dimensions, e.g.: the inner temperature gradient of the specimen is high when irradiated, the radiation

  3. Ten tendencies of criminal justice

    Institute of Scientific and Technical Information of China (English)

    HE Jiahong

    2007-01-01

    A study of the global tendencies of criminal justice will help us design a more scientific and rational pathway for the reform of the existing criminal justice system of China. In the several hundred years to come, the world's criminal justice will take on ten tendencies, that is, the tendency toward unity, civilization, science, rule of law, human rights, justice, efficiency, specialization, standardization and harmony.

  4. Measuring Public Acceptance of Nuclear Technology with Big data

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seugkook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Surveys can be conducted only on people in a specific region and time interval, and it may be misleading to generalize the results to represent the attitude of the public. For example, the opinions of a person living in a metropolitan area, far from the dangers of nuclear reactors and enjoying cheap electricity produced by the reactors, and of a person living in proximity to nuclear power plants, subject to tremendous damage should a nuclear meltdown occur, certainly differ on the topic of nuclear generation. To conclude, big data is a useful tool to measure the public acceptance of nuclear technology efficiently (i.e., it saves cost, time, and effort in measurement and analysis), and this research was able to provide a case for using big data to analyze public acceptance of nuclear technology. Finally, the analysis identified opinion leaders, which allows targeted outreach when policy is executed.
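
    As a rough illustration of the kind of opinion mining described (a sketch under my own assumptions; the paper does not publish its pipeline, and the keyword lists are hypothetical), a minimal keyword-based sentiment tally over social media posts could look like this in Python:

        POSITIVE = {"safe", "cheap", "clean", "reliable"}        # hypothetical keyword lists
        NEGATIVE = {"meltdown", "dangerous", "radiation", "accident"}

        def score_post(text):
            # Crude sentiment score: +1 per positive keyword, -1 per negative keyword.
            words = set(text.lower().split())
            return len(words & POSITIVE) - len(words & NEGATIVE)

        posts = [
            "nuclear power is cheap and reliable",
            "worried about radiation and meltdown risks",
        ]
        scores = [score_post(p) for p in posts]
        print(scores, "mean:", sum(scores) / len(scores))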

  5. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  6. The big bang

    Science.gov (United States)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The big bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: Photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  7. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  8. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  9. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  10. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with sp

  11. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  12. Space big book

    CERN Document Server

    Homer, Charlene

    2007-01-01

    Our Combined resource includes all necessary areas of Space for grades five to eight. Get the big picture about the Solar System, Galaxies and the Universe as your students become fascinated by the interesting information about the Sun, Earth, Moon, Comets, Asteroids Meteoroids, Stars and Constellations. Also, thrill your young astronomers as they connect Earth and space cycles with their daily life.

  13. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  14. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, the wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent the diversity of big data analytics workloads? Big data dwarfs are abstractions that capture frequently appearing operations in big data computing. One dwarf represen...

  15. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  16. Business and Science - Big Data, Big Picture

    Science.gov (United States)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  17. U-opportunities why is ten equal to ten ?

    CERN Document Server

    Julia, B L

    2002-01-01

    It seems to me at this time that two recent developments may permit fast progress on our way to understanding the symmetry structure of toroidally (compactified and) reduced M-theory. The first indication of a (possibly) thin spot in the wall that prevents us from deriving a priori the U-duality symmetries of these models is to be found in the analysis of the hyperbolic billiards that control the chaotic time evolution of (quasi)homogeneous anisotropic String, Supergravity or Einstein cosmologies near a spacelike singularity. What happens is that U-duality symmetry controls chaos via negative constant curvature. On the other hand, it was noticed in 1982 that (symmetrizable) 'hyperbolic' Kac-Moody algebras have maximal rank ten, exactly like superstring models, and that two of these four rank-ten algebras matched physical theories. My second reason for optimism actually also predates the previous breakthrough: it was the discovery in 1998 of surprising superalgebras extending U-dualities to all (p+1)-forms (assoc...

  18. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  19. Ten questions about systems biology

    DEFF Research Database (Denmark)

    Joyner, Michael J; Pedersen, Bente K

    2011-01-01

    In this paper we raise 'ten questions' broadly related to 'omics', the term systems biology, and why the new biology has failed to deliver major therapeutic advances for many common diseases, especially diabetes and cardiovascular disease. We argue that a fundamentally narrow and reductionist … to understand how whole animals adapt to the real world. We argue that a lack of fluency in these concepts is a major stumbling block for what has been narrowly defined as 'systems biology' by some of its leading advocates. We also point out that it is a failure of regulation at multiple levels that causes many…

  20. Ten Thousand Years of Solitude

    Energy Technology Data Exchange (ETDEWEB)

    Benford, G. (Los Alamos National Lab., NM (USA) California Univ., Irvine, CA (USA). Dept. of Physics); Kirkwood, C.W. (Los Alamos National Lab., NM (USA) Arizona State Univ., Tempe, AZ (USA). Coll. of Business Administration); Harry, O. (Los Alamos National Lab., NM (USA)); Pasqualetti, M.J. (Los Alamos National Lab., NM (USA) Arizona State Univ., Tempe, AZ (USA))

    1991-03-01

    This report documents the authors' work as an expert team advising the US Department of Energy on modes of inadvertent intrusion over the next 10,000 years into the Waste Isolation Pilot Plant (WIPP) nuclear waste repository. Credible types of potential future accidental intrusion into the WIPP are estimated as a basis for creating warning markers to prevent inadvertent intrusion. A six-step process is used to structure possible scenarios for such intrusion, and it is concluded that the probability of inadvertent intrusion into the WIPP repository over the next ten thousand years lies between one and twenty-five percent. 3 figs., 5 tabs.

  1. Sonochemical Reactors.

    Science.gov (United States)

    Gogate, Parag R; Patil, Pankaj N

    2016-10-01

    Sonochemical reactors are based on the generation of cavitational events using ultrasound and offer immense potential for the intensification of physical and chemical processing applications. The present work gives a critical analysis of the underlying mechanisms for intensification, the available reactor configurations, and an overview of the different applications exploited successfully, though mostly at laboratory scales. Guidelines have also been presented for optimum selection of the important operating parameters (frequency and intensity of irradiation, temperature and liquid physicochemical properties) as well as the geometric parameters (type of reactor configuration and the number/position of the transducers) so as to maximize the process intensification benefits. The key areas for future work to transform this technique, successful at laboratory/pilot scale, into a commercial technology have also been discussed. Overall, it has been established that there is immense potential for sonochemical reactors for process intensification leading to greener processing and economic benefits. Combined efforts from a wide range of disciplines such as materials science, physics, chemistry and chemical engineering are required to harness the benefits at commercial-scale operation.

  2. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  3. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of the international development agenda to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development policies, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion…

  4. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  5. ANALYTICS OF BIG DATA

    Directory of Open Access Journals (Sweden)

    Prof. Shubhada Talegaon

    2015-10-01

    Full Text Available Big Data analytics has started to impact all types of organizations, as it carries the potential power to extract embedded knowledge from big amounts of data and react to it in real time. Current technology enables us to efficiently store and query large datasets; the focus is now on techniques that make use of the complete data set, instead of sampling. This has tremendous implications in areas like machine learning, pattern recognition and classification, sentiment analysis, and social network analysis, to name a few. Therefore, there are a number of requirements for moving beyond standard data mining techniques. The purpose of this paper is to survey various techniques for analyzing such data.

  6. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  7. DARPA's Big Mechanism program.

    Science.gov (United States)

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  8. DARPA's Big Mechanism program

    Science.gov (United States)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  9. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, oriented to everyday life, and engaging. Volume 8 conveys, in an accessible way, relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  10. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, oriented to everyday life, and engaging. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-connections to other disciplines.

  11. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, oriented to everyday life, and engaging. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  12. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, oriented to everyday life, and engaging. Volume 7 covers, alongside an introduction, many current aspects of quantum mechanics (e.g. 'beaming', i.e. quantum teleportation) and electrodynamics (e.g. electrosmog), as well as the climate problem and chaos theory.

  13. Big Data Knowledge Mining

    Directory of Open Access Journals (Sweden)

    Huda Umar Banuqitah

    2016-11-01

    Full Text Available The Big Data (BD) era has arrived. Big data applications are ascendant: information accumulation has grown beyond the ability of present software tools to capture, manage and process it within a tolerably short time. Volume is not the only characteristic that defines big data; velocity, variety, and value matter as well. Many resources contain BD that should be processed. The biomedical research literature is one among many domains that hides rich knowledge. MEDLINE is a huge biomedical research database which remains a significantly underutilized source of biological information. Discovering useful knowledge in such a huge corpus leads to many problems related to the type of information, such as the related concepts of the domain of the texts and the semantic relationships among them. In this paper, a two-level agent-based system for self-supervised relation extraction from MEDLINE using the Unified Medical Language System (UMLS) knowledge base is proposed. The model uses a self-supervised approach for relation extraction (RE) by constructing enhanced training examples using information from UMLS with hybrid text features. The model incorporates the Apache Spark and HBase BD technologies together with multiple data mining and machine learning techniques and a multi-agent system (MAS). The system shows a better result in comparison with the current state of the art and the naive approach in terms of accuracy, precision, recall and F-score.
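
    For reference, the evaluation metrics named at the end of this abstract have the standard definitions sketched below (general knowledge, not specific to this paper); a minimal Python helper:

        def precision_recall_f1(tp, fp, fn):
            # tp/fp/fn: true positives, false positives, false negatives of extracted relations.
            precision = tp / (tp + fp) if tp + fp else 0.0
            recall = tp / (tp + fn) if tp + fn else 0.0
            f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
            return precision, recall, f1

        print(precision_recall_f1(tp=80, fp=20, fn=40))  # -> (0.8, 0.666..., 0.727...)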

  14. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma.

  15. Approaching Conformality with Ten Flavors

    CERN Document Server

    Appelquist, Thomas; Buchoff, Michael I; Cheng, Michael; Cohen, Saul D; Fleming, George T; Kiskis, Joe; Lin, Meifeng; Na, Heechang; Neil, Ethan T; Osborn, James C; Rebbi, Claudio; Schaich, David; Schroeder, Chris; Voronov, Gennady; Vranas, Pavlos

    2012-01-01

    We present first results for lattice simulations, on a single volume, of the low-lying spectrum of an SU(3) Yang-Mills gauge theory with ten light fermions in the fundamental representation. Fits to the fermion mass dependence of various observables are found to be globally consistent with the hypothesis that this theory is within or just outside the strongly-coupled edge of the conformal window, with mass anomalous dimension approximately equal to 1 over the range of scales simulated. We stress that we cannot rule out the possibility of spontaneous chiral-symmetry breaking at scales well below our infrared cutoff. We discuss important systematic effects, including finite-volume corrections, and consider directions for future improvement.

  16. Ten out of ten for LHC decapole magnets

    CERN Multimedia

    2001-01-01

    CERN's Albert Ijspeert (left) and Avinash Puntambekar of the Indian CAT laboratory with the ten Indian decapole magnets on the test bench. Tests will be carried out by the LHC-MTA group. A batch of 10 superconducting decapole magnets for the LHC has just arrived at CERN from India. These will be used to correct for slight imperfections in the dipole magnets that will steer proton beams around CERN's new accelerator. All magnets have slight imperfections in the fields they produce, and in the LHC dipoles these will be corrected for using sextupoles and decapoles. The sextupoles were the first LHC magnets to be given the production green-light following successful tests of pre-series magnets last year (Bulletin 21/2000, 22 May 2000). Now it is the turn of pre-series decapoles to go on trial at CERN. Of the LHC's 1232 dipole magnets, half will use sextupole correctors only and the other half will use both sextupoles and decapoles. That means that a total of 616 pairs of decapoles are needed. Like the sextupole...

  17. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth? celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth? provides an online learning environment where students do science the same way Eratosthenes did. A notable precedent was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; we draw on the teacher's guide developed by that project. How Big Is Earth? expands on that project through the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. An ongoing database collects student measurements, and another collects data on teacher and student learning from surveys, discussions, and online self-reflection. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
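
    The underlying arithmetic is simple enough to script. A minimal sketch, using roughly Eratosthenes' own numbers (the ~800 km Alexandria-Syene distance and the shadow length are assumptions for illustration):

      import math

      def circumference_km(stick_height_m, shadow_length_m, distance_km):
          # The noon shadow angle at one site, compared with a site where
          # the sun is directly overhead (no shadow), spans the same
          # fraction of a full circle as the distance between the sites.
          angle_deg = math.degrees(math.atan(shadow_length_m / stick_height_m))
          return 360.0 / angle_deg * distance_km

      # A 1 m stick casting a ~0.126 m shadow gives the ~7.2 degree angle
      # Eratosthenes measured between Alexandria and Syene.
      print(round(circumference_km(1.0, 0.1263, 800)))  # ~40000 km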

  18. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  19. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems and others all use piles of data, which are further turned into reports, to ensure continuity of the services they offer. The processing behind the results these entities request represents a challenge for software developers and for the companies that provide the IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  20. Classical propagation of strings across a big crunch/big bang singularity

    Science.gov (United States)

    Niz, Gustavo; Turok, Neil

    2007-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z2, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001); P. J. Steinhardt and N. Turok, Science 296, 1436 (2002); N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004)]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example.

  1. Can Pleasant Goat and Big Big Wolf Save China's Animation Industry?

    Institute of Scientific and Technical Information of China (English)

    Guo Liqin

    2009-01-01

    "My dreamed husband is big big wolf," claimed Miss Fang, a young lady who works in KPMG Beijing Office. This big big wolf is a lovely cartoon wolf appeared in a Pleasant Goat and Big Big Wolf produced independently by Chinese.

  2. Asteroids Were Born Big

    CERN Document Server

    Morbidelli, Alessandro; Nesvorny, David; Levison, Harold F

    2009-01-01

    How big were the first planetesimals? We attempt to answer this question by conducting coagulation simulations in which the planetesimals grow by mutual collisions and form larger bodies and planetary embryos. The size frequency distribution (SFD) of the initial planetesimals is considered a free parameter in these simulations, and we search for the one that produces at the end objects with a SFD that is consistent with asteroid belt constraints. We find that, if the initial planetesimals were small (e.g. km-sized), the final SFD fails to fulfill these constraints. In particular, reproducing the bump observed at diameter D~100km in the current SFD of the asteroids requires that the minimal size of the initial planetesimals was also ~100km. This supports the idea that planetesimals formed big, namely that the size of solids in the proto-planetary disk ``jumped'' from sub-meter scale to multi-kilometer scale, without passing through intermediate values. Moreover, we find evidence that the initial planetesimals ...

  3. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launches the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, at a big public event. Poster and programme.

  4. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external conditions and competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized and more concentrated. Big business will be a dominant development in Danish agriculture - but not the only one...

  5. Hybrid adsorptive membrane reactor

    Science.gov (United States)

    Tsotsis, Theodore T. (Inventor); Sahimi, Muhammad (Inventor); Fayyaz-Najafi, Babak (Inventor); Harale, Aadesh (Inventor); Park, Byoung-Gi (Inventor); Liu, Paul K. T. (Inventor)

    2011-01-01

    A hybrid adsorbent-membrane reactor in which the chemical reaction, membrane separation, and product adsorption are coupled. Also disclosed are a dual-reactor apparatus and a process using the reactor or the apparatus.

  6. D and DR Reactors

    Data.gov (United States)

    Federal Laboratory Consortium — The world's second full-scale nuclear reactor was the D Reactor at Hanford which was built in the early 1940's and went operational in December of 1944. D Reactor ran...

  7. Hybrid adsorptive membrane reactor

    Science.gov (United States)

    Tsotsis, Theodore T.; Sahimi, Muhammad; Fayyaz-Najafi, Babak; Harale, Aadesh; Park, Byoung-Gi; Liu, Paul K. T.

    2011-03-01

    A hybrid adsorbent-membrane reactor in which the chemical reaction, membrane separation, and product adsorption are coupled. Also disclosed are a dual-reactor apparatus and a process using the reactor or the apparatus.

  8. The Rise of Big Data in Neurorehabilitation.

    Science.gov (United States)

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  9. Big Ten School in Cyberspace: A Brief History of Penn State's World Campus.

    Science.gov (United States)

    Hons, Christopher

    2002-01-01

    Describes Pennsylvania State University's World Campus, a virtual university. Topics include vision, expressed in a mission statement; resources needed; technical support; program development; measurement of results; funding; working adult students; online instructor-student interactions; partnerships with other universities and corporations; and…

  10. Transgender People at Four Big Ten Campuses: A Policy Discourse Analysis

    Science.gov (United States)

    Dirks, Doris Andrea

    2016-01-01

    This article examines the language used to discuss transgender people on university campuses. This study asks how, despite seemingly benefitting transgender people, the discourses carried by the documents that discuss trans people may actually undermine the intended goals of policy initiatives. For example, a report on the status of transgender…

  11. NCAA Money for Student Assistance Lands in Many Pockets, Big Ten Document Shows

    Science.gov (United States)

    Wolverton, Brad

    2013-01-01

    Amid a national debate about paying college athletes, the NCAA likes to tout its often-overlooked Student Assistance Fund, whose goal is to provide direct financial support to players. The fund--which draws from the association's multibillion-dollar media-rights deals--will distribute some $75-million this year to Division I athletes. The money…

  12. BIG DATA AND STATISTICS

    Science.gov (United States)

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies.

  13. Big Bounce Genesis

    CERN Document Server

    Li, Changhong; Cheung, Yeuk-Kwan E

    2014-01-01

    We report on the possibility of using the dark matter mass and its interaction cross section as a smoking-gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model-independent study of dark matter production in the contraction and expansion phases of the bounce universe reveals a new venue for achieving the observed relic abundance, in which a significantly smaller amount of dark matter--compared to the standard cosmology--is produced and survives until today, diluted only by the cosmic expansion since the radiation-dominated era. Once the DM mass and its interaction strength with ordinary matter are determined by experiments, this alternative route becomes a signature of the bounce universe scenario.

  14. Big Hero 6

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    See how Big Hero 6 lets an ordinary person become a superhero and save the city! Hiro Hamada, 14, lives in the future city of San Fransokyo. He has a robot friend, Baymax. Baymax is big and soft. His job is to nurse sick people. One day, a bad man wants to take control of San Fransokyo. Hiro hopes to save the city with Baymax. But Baymax is just a nursing robot. This is not a problem for Hiro, however. He knows a lot about robots. He makes a suit of armor for Baymax and turns him into a super robot!

  15. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  16. Avoiding a Big Catastrophe

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Before last October, the South China tiger had almost slipped into mythical status as it had been absent for so long from the public eye. In the previous 20-plus years, these tigers could not be found in the wild in China and the number of those in captivity numbered only around 60. The species—a direct descendent of the earliest tigers thought to have originated in China 2 million years ago—is functionally extinct, according to experts. The big cat's return to the media spotlight was completely unexpected. On October 12, 2007, a digital picture, showing a wild South China tiger

  17. FMDP Reactor Alternative Summary Report: Volume 2 - CANDU heavy water reactor alternative

    Energy Technology Data Exchange (ETDEWEB)

    Greene, S.R.; Spellman, D.J.; Bevard, B.B. [and others]

    1996-09-01

    The Department of Energy Office of Fissile Materials Disposition (DOE/MD) initiated a detailed analysis activity to evaluate each of ten plutonium disposition alternatives that survived an initial screening process. This document, Volume 2 of a four volume report, summarizes the results of these analyses for the CANDU reactor based plutonium disposition alternative.

  18. Reactor and method of operation

    Science.gov (United States)

    Wheeler, John A.

    1976-08-10

    A nuclear reactor having a flattened reactor activity curve across the reactor includes fuel extending over a lesser portion of the fuel channels in the central portion of the reactor than in the remainder of the reactor.

  19. Big Bang of Massenergy and Negative Big Bang of Spacetime

    Science.gov (United States)

    Cao, Dayong

    2017-01-01

    There is a balance between the Big Bang of massenergy and a negative Big Bang of spacetime in the universe. Some scientists have also considered an anti-Big Bang that could produce antimatter. The paper supposes there is a structure balance between the Einstein field equation and a negative Einstein field equation: a balance between massenergy structure and spacetime structure, a balance between the energy of the nucleus of stellar matter and the dark energy of the nucleus of dark matter-dark energy, and a balance between the particle and the wave, i.e., a balance system between massenergy (particle) and spacetime (wave). This may help explain the problems of the Big Bang. http://meetings.aps.org/Meeting/APR16/Session/M13.8

  20. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
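
    As a flavour of what driving BigQuery from code looks like, here is a minimal sketch using the google-cloud-bigquery Python client against one of Google's public datasets; the dataset, table and query are illustrative choices, not examples taken from the book:

      from google.cloud import bigquery

      client = bigquery.Client()  # uses application-default credentials

      query = """
          SELECT name, SUM(number) AS total
          FROM `bigquery-public-data.usa_names.usa_1910_2013`
          WHERE state = 'TX'
          GROUP BY name
          ORDER BY total DESC
          LIMIT 10
      """
      for row in client.query(query).result():  # runs the job and waits for it
          print(row.name, row.total)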

  1. Big Data: present and future

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2014-05-01

    Full Text Available The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current level of big data development, what it can do today, and what it will be able to do in the near future. It focuses on explaining to non-technical specialists and to technical specialists without a database background what big data basically is, presents the three most important V's as well as the newer ones, surveys the most important solutions used by companies like Google and Amazon, and offers some interesting observations on the subject.

  2. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest.

  3. Big Data: present and future

    OpenAIRE

    Mircea Raducu TRIFU; Mihaela Laura IVAN

    2014-01-01

    The paper explains the importance of the Big Data concept, a concept that even now, after years of development, is for most companies just a cool keyword. The paper also describes the current level of big data development, what it can do today, and what it will be able to do in the near future. It focuses on explaining to non-technical specialists and to technical specialists without a database background what big data basically is, presents the three most important V's, as well as the new ...

  4. Big Data Mining: Tools & Algorithms

    Directory of Open Access Journals (Sweden)

    Adeel Shiraz Hashmi

    2016-03-01

    Full Text Available We are now in the Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from complex data that can't be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data are also summarized, and one of the tools has been used for mining large datasets using distributed algorithms.
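
    One classic building block behind such stream-mining tools is the count-min sketch, which approximates item frequencies over an unbounded stream in fixed memory. A minimal, self-contained sketch; the width and depth parameters are illustrative:

      import hashlib

      class CountMinSketch:
          """Fixed-memory approximate frequency counter for data streams."""

          def __init__(self, width=1024, depth=4):
              self.width, self.depth = width, depth
              self.table = [[0] * width for _ in range(depth)]

          def _buckets(self, item):
              for row in range(self.depth):
                  digest = hashlib.md5(f"{row}:{item}".encode()).hexdigest()
                  yield row, int(digest, 16) % self.width

          def add(self, item):
              for row, col in self._buckets(item):
                  self.table[row][col] += 1

          def estimate(self, item):
              # Collisions can only inflate a cell, so the minimum across
              # rows is the tightest (over)estimate of the true count.
              return min(self.table[row][col] for row, col in self._buckets(item))

      cms = CountMinSketch()
      for word in ["big", "data", "big", "stream", "big"]:
          cms.add(word)
      print(cms.estimate("big"))  # 3 (never less; more only under collisions)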

  5. The challenges of big data

    Science.gov (United States)

    2016-01-01

    ABSTRACT The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  6. Big Data is invading big places as CERN

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  7. Big Data and Perioperative Nursing.

    Science.gov (United States)

    Westra, Bonnie L; Peterson, Jessica J

    2016-10-01

    Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data has the potential to affect care far beyond the original patient.

  8. Big, Fat World of Lipids

    Science.gov (United States)

    By Emily Carlson, posted August 9, 2012. Cholesterol ... ways to diagnose and treat lipid-related conditions. Lipid Encyclopedia: just as genomics and proteomics spurred advances ...

  9. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C. [and others]

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  10. Clinical experience with TENS and TENS combined with nitrous oxide-oxygen. Report of 371 patients.

    OpenAIRE

    Quarnstrom, F. C.; Milgrom, P.

    1989-01-01

    Transcutaneous electrical nerve stimulation (TENS) alone or TENS combined with nitrous oxide-oxygen (N2O) was administered for restorative dentistry without local anesthesia to 371 adult patients. A total of 55% of TENS alone and 84% of TENS/N2O visits were rated successful. A total of 53% of TENS alone and 82% of TENS/N2O patients reported slight or no pain. In multivariable analyses, pain reports were related to the anesthesia technique and patient fear and unrelated to sex, race, age, toot...

  11. Ten Steps to Making Evaluation Matter

    Science.gov (United States)

    Sridharan, Sanjeev; Nakaima, April

    2011-01-01

    This paper proposes ten steps to make evaluations matter. The ten steps are a combination of the usual recommended practice such as developing program theory and implementing rigorous evaluation designs with a stronger focus on more unconventional steps including developing learning frameworks, exploring pathways of evaluation influence, and…

  12. Top-Ten IT Issues, 2010

    Science.gov (United States)

    Ingerman, Bret L.; Yang, Catherine

    2010-01-01

    The eleventh annual EDUCAUSE Current Issues Survey shows some very familiar themes among the top-ten IT issues of strategic importance to technology leaders in higher education. Indeed, all ten of the issues from the 2009 survey are back, albeit in a slightly different order. In addition, Strategic Planning returns as an issue of renewed…

  13. I Can Create Mental Images to Retell and Infer Big Ideas

    Science.gov (United States)

    Miller, Debbie

    2013-01-01

    As teachers, we are always reflecting on and refining our craft. In this article, the author shares how her understanding and implementation of comprehension strategy instruction has evolved over the past ten years. These shifts include her current thinking about the gradual release of responsibility instructional model, how content and big ideas…

  14. Big Bang Nucleosynthesis: 2015

    CERN Document Server

    Cyburt, Richard H; Olive, Keith A; Yeh, Tsung-Han

    2015-01-01

    Big-bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. We briefly overview the essentials of this physics, and present new calculations of light element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. We provide fits to these results as a function of baryon density and of the number of neutrino flavors, N_nu. We review recent developments in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom, N_eff. These measurements allow for a tight test of BBN and of cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. We include a ...

  15. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV) dark-nucleon binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking-gun evidence for dark nuclei.

  16. Big Data Comes to School

    OpenAIRE

    Bill Cope; Mary Kalantzis

    2016-01-01

    The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-me...

  17. Big Data for Precision Medicine

    OpenAIRE

    Daniel Richard Leff; Guang-Zhong Yang

    2015-01-01

    This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of onl...

  18. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  19. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
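
    The spurious-correlation pitfall is easy to demonstrate numerically. A small sketch with NumPy (the sample size, feature count and seed are arbitrary): with many independent noise features and few samples, the best-correlated feature looks impressively predictive purely by chance.

      import numpy as np

      rng = np.random.default_rng(0)
      n, p = 50, 10_000                    # few samples, many features
      y = rng.standard_normal(n)
      X = rng.standard_normal((n, p))      # pure noise, no true signal

      # Pearson correlation of every feature with y.
      Xc = (X - X.mean(axis=0)) / X.std(axis=0)
      yc = (y - y.mean()) / y.std()
      corrs = Xc.T @ yc / n

      # Typically around 0.5-0.6 even though every feature is independent of y.
      print(f"max |correlation| over {p} noise features: {np.abs(corrs).max():.2f}")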

  20. Powering Big Data for Nursing Through Partnership.

    Science.gov (United States)

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  1. Technology, Safety and Costs of Decommissioning Nuclear Reactors At Multiple-Reactor Stations

    Energy Technology Data Exchange (ETDEWEB)

    Wittenbrock, N. G.

    1982-01-01

    Safety and cost information is developed for the conceptual decommissioning of large (1175-MWe) pressurized water reactors (PWRs) and large (1155-MWe) boiling water reactors (BWRs) at multiple-reactor stations. Three decommissioning alternatives are studied: DECON (immediate decontamination), SAFSTOR (safe storage followed by deferred decontamination), and ENTOMB (entombment). Safety and costs of decommissioning are estimated by determining the impact of probable features of multiple-reactor-station operation that are considered to be unavailable at a single-reactor station, and applying these estimated impacts to the decommissioning costs and radiation doses estimated in previous PWR and BWR decommissioning studies. The multiple-reactor-station features analyzed are: the use of interim onsite nuclear waste storage with later removal to an offsite nuclear waste disposal facility, the use of permanent onsite nuclear waste disposal, the dedication of the site to nuclear power generation, and the provision of centralized services. Five scenarios for decommissioning reactors at a multiple-reactor station are investigated. The number of reactors on a site is assumed to be either four or ten; nuclear waste disposal is varied between immediate offsite disposal, interim onsite storage, and immediate onsite disposal. It is assumed that the decommissioned reactors are not replaced in one scenario but are replaced in the other scenarios. Centralized service facilities are provided in two scenarios but are not provided in the other three. Decommissioning of a PWR or a BWR at a multiple-reactor station probably will be less costly and result in lower radiation doses than decommissioning an identical reactor at a single-reactor station. Regardless of whether the light water reactor being decommissioned is at a single- or multiple-reactor station: • the estimated occupational radiation dose for decommissioning an LWR is lowest for SAFSTOR and highest for DECON • the estimated

  2. ALGORITHMS FOR TETRAHEDRAL NETWORK (TEN) GENERATION

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The Tetrahedral Network (TEN) is a powerful 3-D vector structure in GIS, with advantages such as a simple structure, fast topological relation processing and rapid visualization. The difficulty in applying TEN is automatically creating the data structure. Although a raster algorithm has been introduced by some authors, problems in accuracy, memory requirement, speed and integrity remain. In this paper, the raster algorithm is completed and a vector algorithm is presented, after a 3-D data model and the structure of TEN have been introduced. Finally, experiments, conclusions and future work are discussed.
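
    As a flavour of why topological queries in a TEN are fast, here is a minimal sketch of a face-indexed store that answers neighbour queries by dictionary lookup; the coordinates and tetrahedra are made up, and the paper's own raster and vector generation algorithms are not reproduced:

      from collections import defaultdict
      from itertools import combinations

      # Vertex ids -> coordinates (needed for geometric tests, not for topology).
      vertices = {0: (0, 0, 0), 1: (1, 0, 0), 2: (0, 1, 0),
                  3: (0, 0, 1), 4: (1, 1, 1)}
      tetrahedra = [(0, 1, 2, 3), (1, 2, 3, 4)]   # share the face (1, 2, 3)

      # Index every triangular face -> the tetrahedra containing it.
      face_to_tets = defaultdict(list)
      for tet_id, tet in enumerate(tetrahedra):
          for face in combinations(sorted(tet), 3):  # 4 faces per tetrahedron
              face_to_tets[face].append(tet_id)

      def neighbours(tet_id):
          """Tetrahedra sharing a triangular face with tet_id."""
          return {other
                  for face in combinations(sorted(tetrahedra[tet_id]), 3)
                  for other in face_to_tets[face]
                  if other != tet_id}

      print(neighbours(0))  # {1}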

  3. Big Data in Health: a Literature Review from the Year 2005.

    Science.gov (United States)

    de la Torre Díez, Isabel; Cosgaya, Héctor Merino; Garcia-Zapirain, Begoña; López-Coronado, Miguel

    2016-09-01

    The information stored in healthcare systems has increased over the last ten years, leading it to be considered Big Data. There is a wealth of health information ready to be analysed. However, the sheer volume raises a challenge for traditional methods. The aim of this article is to conduct a cutting-edge study on Big Data in healthcare from 2005 to the present. This literature review will help researchers to know how Big Data has developed in the health industry and open up new avenues for research. Information searches have been made on various scientific databases such as Pubmed, Science Direct, Scopus and Web of Science for Big Data in healthcare. The search criteria were "Big Data" and "health" with a date range from 2005 to the present. A total of 9724 articles were found on the databases. 9515 articles were discarded as duplicates or for not having a title of interest to the study. 209 articles were read, with the resulting decision that 46 were useful for this study. 52.6 % of the articles used were found in Science Direct, 23.7 % in Pubmed, 22.1 % through Scopus and the remaining 2.6 % through the Web of Science. Big Data has undergone extremely high growth since 2011 and its use is becoming compulsory in developed nations and in an increasing number of developing nations. Big Data is a step forward and a cost reducer for public and private healthcare.

  4. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  5. Considerations on Geospatial Big Data

    Science.gov (United States)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  6. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  7. Reactor Physics Programme

    Energy Technology Data Exchange (ETDEWEB)

    De Raedt, C

    2000-07-01

    The Reactor Physics and MYRRHA Research Department of SCK-CEN offers expertise in various areas of reactor physics, in particular in neutronics calculations, reactor dosimetry, reactor operation, reactor safety and control, and non-destructive analysis of reactor fuel. This expertise is applied within the Department's own research projects in the VENUS critical facility, in the BR1 reactor and in the MYRRHA project (this project aims at designing a prototype Accelerator Driven System). Available expertise is also used in programmes external to the Department, such as the reactor pressure vessel steel programme, BR2 reactor dosimetry, and the preparation and interpretation of irradiation experiments. Progress and achievements in 1999 in the following areas are reported on: (1) investigations on the use of military plutonium in commercial power reactors; (2) neutron and gamma calculations performed for BR-2 and for other reactors; (3) the updating of neutron and gamma cross-section libraries; (4) the implementation of reactor codes; (5) the management of the UNIX workstations; and (6) fuel cycle studies.

  8. CLOUD COMPUTING WITH BIG DATA: A REVIEW

    OpenAIRE

    Anjali; Er. Amandeep Kaur; Mrs. Shakshi

    2016-01-01

    Big data is a collection of huge quantities of data, and big data processing means examining large amounts of it. Big data and cloud computing are hot issues in information technology, and handling big data is one of the main problems today. Researchers are focusing on how to handle huge amounts of data with cloud computing and how to achieve adequate security for big data in the cloud. To handle the big data problem, the Hadoop framework is used, in which data is fragmented and processed in parallel....

  9. Health evaluation of the 2nd International "Quit and Win" Antinicotine Campaign participants ten years later.

    Science.gov (United States)

    Kowalska, Alina; Stelmach, Włodzimierz; Krakowiak, Jan; Rzeźnicki, Adam; Pikala, Małgorzata; Dziankowska-Zaborszczyk, Elzbieta; Drygas, Wojciech

    2008-01-01

    Smoking is one of the most frequently observed types of negative health behaviour among Poles. This work presents the health self-evaluations of participants in the 'Quit and Win' competition ten years after their decision to stop smoking, and analyses the relationship between this evaluation and smoking-related behaviour among people living in big cities versus small towns and villages. Among the 648 respondents, the majority, 302 people (46.6%), evaluated their health as good, 236 (36.4%) as average, 76 (11.7%) as very good, 29 (4.5%) as bad, and 5 (0.8%) as very bad. Negative health evaluations were most frequent among respondents still smoking and living in big cities, and least frequent among non-smokers living in small towns and villages.

  10. Big Data: Astronomical or Genomical?

    Science.gov (United States)

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  11. Big Data: Astronomical or Genomical?

    Directory of Open Access Journals (Sweden)

    Zachary D Stephens

    2015-07-01

    Full Text Available Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  12. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  13. 淀粉Big Bang!

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Big Bang, also known as the "Great Explosion", refers to the continuous expansion of the universe at its birth from a primordial state of extremely high density and temperature. In other words, starting from the Big Bang, our present universe slowly took shape. OK, starting with this issue, "Shao Dian" (少电) will set off a Big Bang on Weibo: a Dianfen (淀粉, "starch") big bang! How exactly will it explode? I think that, having seen the layout of this page, you already understand most of it.

  14. Multiwavelength astronomy and big data

    Science.gov (United States)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage and analysis). Present astronomical databases and archives contain billions of objects observed at various wavelengths, both galactic and extragalactic, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for discovery of astronomical objects and accumulation of observational data for further analysis, interpretation, and achieving scientific results. We review the main characteristics of astronomical surveys, compare photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss the Big Data in astronomy and related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to have a possibly overall understanding on the Universe, cosmic numbers and their relationship to modern computational facilities.

  15. Big Data Analytics in Healthcare

    Directory of Open Access Journals (Sweden)

    Ashwin Belle

    2015-01-01

    Full Text Available The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  16. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  17. Was There A Big Bang?

    CERN Document Server

    Soberman, Robert K

    2008-01-01

    The big bang hypothesis is widely accepted despite numerous physics conflicts. It rests upon two experimental supports, galactic red shift and the cosmic microwave background. Both are produced by dark matter, shown here to be hydrogen dominated aggregates with a few percent of helium nodules. Scattering from these non-radiating intergalactic masses produce a red shift that normally correlates with distance. Warmed by our galaxy to an Eigenvalue of 2.735 K, drawn near the Earth, these bodies, kept cold by ablation, resonance radiate the Planckian microwave signal. Several tests are proposed that will distinguish between this model and the big bang.

  18. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of these information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions of the effective use of the opportunities: proper infrastructure, well defined regulatory environment with particular emphasis on data protection and privacy. These issues and the current actions for solution are also presented.

  19. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is fascinating but problematic with respect to causality, atheism, and stereotypes about hunter-gatherers.

  20. LMFBR type reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kawakami, Hiroto

    1995-02-07

    The reactor container of the present invention incorporates a hot pool and is kept entirely at the reactor-inlet temperature, with a substantially uniform transient temperature response. Namely, if the temperature at the reactor core inlet changes, the temperature of the entire container follows the change, so no large axial temperature gradient arises and no large thermal stresses due to an axial temperature distribution are produced. Suppressing these thermal stresses improves the reliability of the reactor container. In addition, since reactor inlet pipelines no longer need to be routed inside the reactor, the container is made compact, and the heat-shielding structures above the reactor and the protective structure for the container walls are simplified. Further, secondary coolant fills the outside of the reactor container, simplifying the shielding. These combined effects improve economy and reliability. (N.H.).

  1. Big society, big data. The radicalisation of the network society

    NARCIS (Netherlands)

    Frissen, V.

    2011-01-01

    During the British election campaign of 2010, David Cameron produced the idea of the ‘Big Society’ as a cornerstone of his political agenda. At the core of the idea is a stronger civil society and local community coupled with a more withdrawn government. Although many commentators have dismissed this...

  2. Little Science to Big Science: Big Scientists to Little Scientists?

    Science.gov (United States)

    Simonton, Dean Keith

    2010-01-01

    This article presents the author's response to Hisham B. Ghassib's essay entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Professor Ghassib's (2010) essay presents a provocative portrait of how the little science of the Babylonians, Greeks, and Arabs became the Big Science of the modern industrial…

  3. Light water reactor safety

    CERN Document Server

    Pershagen, B

    2013-01-01

    This book describes the principles and practices of reactor safety as applied to the design, regulation and operation of light water reactors, combining a historical approach with an up-to-date account of the safety, technology and operating experience of both pressurized water reactors and boiling water reactors. The introductory chapters set out the basic facts upon which the safety of light water reactors depends. The central section is devoted to the methods and results of safety analysis. The accidents at Three Mile Island and Chernobyl are reviewed and their implications for light water...

  4. Nuclear reactor physics

    CERN Document Server

    Stacey, Weston M

    2010-01-01

    Nuclear reactor physics is the core discipline of nuclear engineering. Nuclear reactors now account for a significant portion of the electrical power generated worldwide, and new power reactors with improved fuel cycles are being developed. At the same time, the past few decades have seen an ever-increasing number of industrial, medical, military, and research applications for nuclear reactors. The second edition of this successful comprehensive textbook and reference on basic and advanced nuclear reactor physics has been completely updated, revised and enlarged to include the latest developments...

  5. Data Partitioning View of Mining Big Data

    OpenAIRE

    Zhang, Shichao

    2016-01-01

    There are two main approaches to mining big data in memory. One is to partition a big dataset into several subsets and mine each subset in memory; global patterns can then be obtained by synthesizing all the local patterns discovered in these subsets. The other is the statistical sampling method. This indicates that data partitioning should be an important strategy for mining big data. This paper recalls our work on mining big data with data partitioning and shows some interesti...
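
    As a concrete illustration of the partition-and-synthesize idea described above, the following minimal Python sketch splits a dataset into subsets, mines a trivial "local pattern" (item frequencies) from each subset, and merges the local results into a global pattern. The partition count, the toy data, and the support threshold are illustrative assumptions, not values from the paper.

        from collections import Counter
        from itertools import islice

        def partitions(data, k):
            """Split `data` into k roughly equal subsets (the in-memory chunks)."""
            size = (len(data) + k - 1) // k
            it = iter(data)
            while True:
                chunk = list(islice(it, size))
                if not chunk:
                    return
                yield chunk

        def mine_local(subset):
            """Local pattern: item frequencies within a single subset."""
            return Counter(subset)

        def synthesize(local_patterns, min_support):
            """Global pattern: merge local counts, keep items above a threshold."""
            total = Counter()
            for pattern in local_patterns:
                total.update(pattern)
            return {item: n for item, n in total.items() if n >= min_support}

        data = ["a", "b", "a", "c", "a", "b", "d", "a", "c", "b"]
        local_results = [mine_local(s) for s in partitions(data, k=3)]
        print(synthesize(local_results, min_support=3))  # {'a': 4, 'b': 3}

    Simple count merging works here only because frequencies are additive across subsets; for patterns that do not decompose this cleanly (e.g., frequent itemsets near the support boundary), a second verification pass over the partitions is typically needed.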

  6. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... comprehensive conservation plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge...: r3planning@fws.gov. Include ``Big Stone Draft CCP/EA'' in the subject line of the message. Fax:...

  7. Research on the usage of a deep sea fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Otsubo, Akira; Kowata, Yasuki [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1997-09-01

    Many new types of fast reactors have been studied at PNC. A deep sea fast reactor has the highest probability of realization among the reactors studied, because its development is desired by many specialists in oceanography, meteorology, deep sea bottom oil fields, seismology and so on, and because the development does not require a big budget and few technical problems remain to be solved. This report explains the outline and the usage of the reactor at 40 kWe and at 200 to 400 kWe. The reactor can be used as a power source at an unmanned base for long-term climate prediction and earth science, and at an oil production base in a deep sea region. It can also supply heat and electric power to a laboratory in the polar region, and in the future it may be used in space. At present, the large FBR development plan is not proceeding successfully and the target date for realizing the FBR has slipped later and later. We think it is most important to develop this reactor as fast as possible and to root fast reactor technology in our present society. (author)

  8. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. So, the companies that turn to Big Data have a competitive advantage over other firms. Looking from the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also refers to the graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  9. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  10. The Big European Bubble Chamber

    CERN Multimedia

    1977-01-01

    The 3.70 metre Big European Bubble Chamber (BEBC), dismantled on 9 August 1984. During operation it was one of the biggest detectors in the world, producing direct visual recordings of particle tracks. 6.3 million photos of interactions were taken with the chamber in the course of its existence.

  11. YOUNG CITY, BIG PARTY

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Shenzhen Universiade united the world's young people through sports. With none of the usual hoopla, no fireworks, and no grand performances by celebrities and superstars, the Shenzhen Summer Universiade lowered the curtain on a big party for youth and college students on August 23.

  12. True Randomness from Big Data

    Science.gov (United States)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
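
    The provable extractor constructions behind this abstract are not reproduced here; as a hedged, minimal illustration of the general task (condensing a large, imperfect source into near-uniform bits), the Python sketch below hashes fixed-size blocks of a byte stream. The block size and the stand-in source are arbitrary assumptions, and a cryptographic hash gives only heuristic, not provable, uniformity.

        import hashlib

        def extract_bits(source: bytes, block_size: int = 4096) -> bytes:
            """Toy extractor: hash fixed-size blocks of a big, imperfect source
            and concatenate the digests. Heuristic only; it lacks the formal
            guarantees of the extractors studied in the paper."""
            out = bytearray()
            for i in range(0, len(source), block_size):
                out.extend(hashlib.sha256(source[i:i + block_size]).digest())
            return bytes(out)

        # Usage: feed any large recorded dataset (search logs, sensor dumps, ...).
        noisy_source = bytes(range(256)) * 64   # stand-in for a "big source"
        print(len(extract_bits(noisy_source)), "bytes of extracted output")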

  14. ATLAS: Big Data in a Small Package

    Science.gov (United States)

    Denneau, Larry; Tonry, John

    2015-08-01

    For even small telescope projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (ATLAS; Tonry 2011) will robotically survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids (NEAs) on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards -- two 0.5 m F/2.0 telescopes -- each year the ATLAS system will obtain ~10^3 measurements of 10^9 astronomical sources, with results reported within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many ``rifle shot'' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of satellites and pieces of space junk that ATLAS will see each night. Additional interrogation will identify interesting phenomena from beyond the solar system occurring among millions of transient sources per night. The data processing and storage requirements for ATLAS demand a ``big data'' approach typical of commercial Internet enterprises. We describe our approach to deploying a nimble, scalable and reliable data processing infrastructure, and promote ATLAS as a stepping stone to eventual processing scales in the era of LSST.
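
    A rough scale check of the figures quoted above, with an assumed record size (the bytes-per-record value is an illustration, not an ATLAS specification):

        # ~10^3 measurements per year of ~10^9 sources, at an assumed 100 bytes
        # per catalog record (position, flux, errors, metadata).
        measurements = 1e3 * 1e9
        bytes_per_record = 100
        total = measurements * bytes_per_record
        print(f"~{total / 1e12:.0f} TB of catalog records per year")
        # ~100 TB/year of catalog alone; the raw images are larger still,
        # which is what pushes a small telescope project to the petabyte scale.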

  15. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
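
    The quoted resolution R = λ/Δλ fixes the smallest resolvable wavelength interval at each wavelength. A short check using only the numbers in the abstract:

        # Delta-lambda = lambda / R at the survey's wavelength limits.
        for lam_nm in (340.0, 1060.0):       # wavelength range from the abstract
            for R in (3000.0, 4800.0):       # quoted resolution range
                print(f"lambda = {lam_nm:6.1f} nm, R = {R:4.0f} -> "
                      f"delta_lambda = {lam_nm / R:.3f} nm")
        # e.g. 0.113 nm at 340 nm with R = 3000, and 0.221 nm at 1060 nm
        # with R = 4800.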

  16. Spinning fluids reactor

    Science.gov (United States)

    Miller, Jan D; Hupka, Jan; Aranowski, Robert

    2012-11-20

    A spinning fluids reactor includes a reactor body (24) having a circular cross-section and a fluid contactor screen (26) within the reactor body (24). The fluid contactor screen (26) has a plurality of apertures and a circular cross-section concentric with the reactor body (24) for a length, thus forming an inner volume (28) bounded by the fluid contactor screen (26) and an outer volume (30) bounded by the reactor body (24) and the fluid contactor screen (26). A primary inlet (20) can be operatively connected to the reactor body (24) and can be configured to produce a flow-through first spinning flow of a first fluid within the inner volume (28). A secondary inlet (22) can similarly be operatively connected to the reactor body (24) and can be configured to produce a second flow of a second fluid within the outer volume (30), which is optionally spinning.
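
    A minimal sketch of the geometry the claim describes: the screen divides the cylindrical body into an inner cylinder and an outer annulus. The radii and length below are hypothetical, chosen only to show the computation; they appear nowhere in the patent abstract.

        import math

        def region_volumes(r_screen_m, r_body_m, length_m):
            """Inner volume (28) bounded by the screen, and outer volume (30)
            between screen and body, per the patent's description."""
            inner = math.pi * r_screen_m ** 2 * length_m
            outer = math.pi * (r_body_m ** 2 - r_screen_m ** 2) * length_m
            return inner, outer

        inner, outer = region_volumes(r_screen_m=0.10, r_body_m=0.15, length_m=1.0)
        print(f"inner ~ {inner * 1e3:.1f} L, outer ~ {outer * 1e3:.1f} L")
        # inner ~ 31.4 L, outer ~ 39.3 L for these made-up dimensions.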

  17. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them no longer suffice. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, addressing privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  18. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  19. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the 'Big Data Revolution'...

  20. Big sagebrush seed bank densities following wildfires

    Science.gov (United States)

    Big sagebrush (Artemisia spp.) is a critical shrub for many wildlife species, including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires, and big sagebrush seed is generally short-lived and does not s...

  1. A survey of big data research

    OpenAIRE

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions.

  2. REVIVAL PLANS, Ten Key Sectors Benefit?

    Institute of Scientific and Technical Information of China (English)

    Yantai CHEN; Hongbo CAI; Yang XU

    2009-01-01

    To revive China's industrial establishment during the deepest global downturn since World War II, China's central government launched the "Revival Plans of Ten Key Sectors" plus the 4 trillion yuan stimulus package in early 2009. Formulated by the National Development and Reform Commission (NDRC), these revival plans aim at reinvigorating ten key sectors: the iron and steel, automotive, shipbuilding, petrochemical, textile, light industry, nonferrous metals, equipment manufacturing, electronics and information technology, and logistics sectors.

  3. Big science transformed science, politics and organization in Europe and the United States

    CERN Document Server

    Hallonsten, Olof

    2016-01-01

    This book analyses the emergence of a transformed Big Science in Europe and the United States, using both historical and sociological perspectives. It shows how technology-intensive natural sciences grew to a prominent position in Western societies during the post-World War II era, and how their development cohered with both technological and social developments. At the helm of post-war science are large-scale projects, primarily in physics, which receive substantial funds from the public purse. Big Science Transformed shows how these projects, popularly called 'Big Science', have become symbols of progress. It analyses changes to the political and sociological frameworks surrounding publicly-funded science, and their impact on a number of new accelerator and reactor-based facilities that have come to prominence in materials science and the life sciences. Interdisciplinary in scope, this book will be of great interest to historians, sociologists and philosophers of science.

  4. Reactor Vessel Surveillance Program for Advanced Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kyeong-Hoon; Kim, Tae-Wan; Lee, Gyu-Mahn; Kim, Jong-Wook; Park, Keun-Bae; Kim, Keung-Koo

    2008-10-15

    This report provides the design requirements of a reactor vessel surveillance program for an integral type reactor, in accordance with the requirements of Korean MEST (Ministry of Education, Science and Technology Development) Notice 2008-18. It covers the requirements for the design of surveillance capsule assemblies, including their test specimens, test block materials, handling tools, and monitors of the surveillance capsule neutron fluence and temperature. In addition, it provides design requirements for the irradiation surveillance program for reactor vessel materials, the layout of specimens and monitors in the surveillance capsule, the procedures for installation and retrieval of the surveillance capsule assemblies, and the layout of the surveillance capsule assemblies in the reactor.

  5. Czech, Slovak science ten years after split

    CERN Multimedia

    2003-01-01

    Ten years after the split of Czechoslovakia, Czech and Slovak science are facing the same difficulties: a shortage of money for research, poor salaries, obsolete equipment and brain drain, especially of the young, according to a feature in the daily Lidove Noviny (1 page).

  6. 'Safe handling of nanotechnology' ten years on

    Science.gov (United States)

    Maynard, Andrew D.; Aitken, Robert J.

    2016-12-01

    In 2006, a group of scientists proposed five grand challenges to support the safe handling of nanotechnology. Ten years on, Andrew Maynard and Robert Aitken -- two of the original authors -- look at where we have come, and where we still need to go.

  7. Top-Ten IT Issues: 2009

    Science.gov (United States)

    Agee, Anne Scrivener; Yang, Catherine

    2009-01-01

    This article presents the top-ten IT-related issues in terms of strategic importance to the institution, as revealed by the tenth annual EDUCAUSE Current Issues Survey. These IT-related issues include: (1) Funding IT; (2) Administrative/ERP Information Systems; (3) Security; (4) Infrastructure/Cyberinfrastructure; (5) Teaching and Learning with…

  8. Ten themes of viscous liquid dynamics

    DEFF Research Database (Denmark)

    Dyre, J. C.

    2007-01-01

    Ten 'themes' of viscous liquid physics are discussed with a focus on how they point to a general description of equilibrium viscous liquid dynamics (i.e., fluctuations) at a given temperature. This description is based on standard time-dependent Ginzburg-Landau equations for the density fields...

  9. Ten recommendations for software engineering in research.

    Science.gov (United States)

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.

  10. Supersymmetric R4-actions in ten dimensions

    NARCIS (Netherlands)

    Roo, M. de; Suelmann, H.; Wiedemann, A.

    1992-01-01

    We construct supersymmetric R + R^4 actions in ten dimensions. Two invariants, of which the bosonic parts are known from string amplitude and sigma model calculations, are obtained. One of these invariants can be generalized to an R + F^2 + F^4 invariant for supersymmetric Yang-Mills theory coupled to supergravity...

  11. Orbits of Ten Visual Binary Stars

    Institute of Scientific and Technical Information of China (English)

    B.Novakovi(c)

    2007-01-01

    We present the orbits of ten visual binary stars: WDS 01015+6922, WDS 01424-0645, WDS 01461+6349, WDS 04374-0951, WDS 04478+5318, WDS 05255-0033, WDS 05491+6248, WDS 06404+4058, WDS 07479-1212, and WDS 18384+0850. We have also determined their masses, dynamical parallaxes and ephemerides.

  12. Editor's Journal: Poor Elijah's Almanack Top Ten.

    Science.gov (United States)

    Berger, Peter N.

    1998-01-01

    Reports on and responds to a list from the American Association of School Administrators detailing the "Top Ten Changes Affecting Students Since the 1960s." Argues that the list is part common sense, part nonsense, and part obvious. Concludes that there is almost nothing the schools can do about these problems because they are societal problems,…

  13. Radiation fields and radiation qualities (Strahlungsfelder und Strahlungsqualitäten)

    Science.gov (United States)

    Krieger, Hanno

    The chapter begins with a presentation of the most important quantities used to describe radiation fields. These quantities can refer either to the particle number or to the particle energy. The second part presents in detail the methods for characterizing the radiation qualities of the various types of radiation used in radiology.

  14. A reduced-boron OPR1000 core based on the BigT burnable absorber

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Hwan Yeal; Yahya, Mohd-Syukri; Kim, Yong Hee [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2016-04-15

    Reducing the critical boron concentration in a commercial pressurized water reactor core offers many advantages in view of safety and economics. This paper presents a preliminary investigation of a reduced-boron pressurized water reactor core to achieve a clearly negative moderator temperature coefficient at hot zero power, using the newly-proposed 'Burnable absorber-Integrated Guide Thimble' (BigT) absorbers. The reference core is based on a commercial OPR1000 equilibrium configuration. The reduced-boron OPR1000 configuration was determined by simply replacing commercial gadolinia-based burnable absorbers with the optimized BigT-loaded design. The equilibrium cores in this study were searched directly via repetitive Monte Carlo depletion calculations until convergence. The results demonstrate that, with the same fuel management scheme as in the reference core, application of the BigT absorbers can effectively reduce the critical boron concentration at the beginning of cycle by about 65 ppm. More crucially, the analyses indicate the promising potential of the reduced-boron OPR1000 core with the BigT absorbers, as its moderator temperature coefficient at the beginning of cycle is clearly more negative and all other vital neutronic parameters are within practical safety limits. All simulations were completed using the Monte Carlo Serpent code with the ENDF/B-VII.0 library.
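
    To put the ~65 ppm reduction in perspective, it can be converted to a reactivity equivalent using a differential boron worth. The worth used below is a typical PWR magnitude assumed for illustration; it is not a figure reported for the OPR1000/BigT core.

        # Reactivity now held by the BigT absorbers instead of soluble boron.
        delta_cbc_ppm = 65.0                # CBC reduction quoted in the abstract
        boron_worth_pcm_per_ppm = -8.0      # assumed typical PWR value

        delta_rho_pcm = -delta_cbc_ppm * boron_worth_pcm_per_ppm
        print(f"~{delta_rho_pcm:.0f} pcm shifted from soluble boron to the BigT design")
        # ~520 pcm with these assumptions.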

  15. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  16. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  17. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  18. Perspectives on Big Data and Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Elena Geanina ULARU

    2012-12-01

    Full Text Available Nowadays companies are starting to realize the importance of using more data in order to support decisions about their strategies. It has been said, and proved through case studies, that "more data usually beats better algorithms". With this in mind, companies have started to realize that they can choose to invest in processing larger sets of data rather than in expensive algorithms. A large quantity of data is better used as a whole because of the possible correlations across a larger amount, correlations that can never be found if the data is analyzed in separate or smaller sets. A larger amount of data gives better output, but working with it can become a challenge due to processing limitations. This article intends to define the concept of Big Data and stress the importance of Big Data Analytics.

  20. Solution of a Braneworld Big Crunch/Big Bang Cosmology

    CERN Document Server

    McFadden, Paul; Steinhardt, Paul J.; Turok, Neil

    2005-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly-separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  1. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  2. The Obstacles in Big Data Process

    Directory of Open Access Journals (Sweden)

    Rasim M. Alguliyev

    2017-04-01

    Full Text Available The increasing amount of data, and the need to analyze it in a timely manner for multiple purposes, has created a serious barrier in the big data analysis process. This article describes the challenges that big data creates at each step of the big data analysis process. These include typical analytical problems as well as less common challenges specific to big data. The article breaks down the problems at each step of the big data analysis process and discusses them separately at each stage. It also offers some simple ways to solve these problems.

  3. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    Directory of Open Access Journals (Sweden)

    Jaseena K.U.

    2014-12-01

    Full Text Available Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This paper presents a literature review of big data mining, and the issues and challenges, with emphasis on the distinguishing features of Big Data. It also discusses some methods to deal with big data.

  4. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  5. SNTP program reactor design

    Science.gov (United States)

    Walton, Lewis A.; Sapyta, Joseph J.

    1993-06-01

    The Space Nuclear Thermal Propulsion (SNTP) program is evaluating the feasibility of a particle bed reactor for a high-performance nuclear thermal rocket engine. Reactors operating between 500 MW and 2,000 MW will produce engine thrusts ranging from 20,000 pounds to 80,000 pounds. The optimum reactor arrangement depends on the power level desired and the intended application. The key components of the reactor have been developed and are being tested. Flow-to-power matching considerations dominate the thermal-hydraulic design of the reactor. Optimal propellant management during decay heat cooling requires a three-pronged approach. Adequate computational methods exist to perform the neutronics analysis of the reactor core. These methods have been benchmarked to critical experiment data.
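
    The quoted thrust and power figures are roughly consistent with the ideal jet-power relation P = F·v_e/2, i.e., F = 2ηP/(g0·Isp). The specific impulse and efficiency below are assumptions typical of particle bed reactor concepts, not SNTP design values.

        G0 = 9.80665                          # standard gravity, m/s^2

        def thrust_newtons(power_w, isp_s=900.0, eta=1.0):
            """Ideal thrust from jet power: P = 0.5 * F * v_e => F = 2*eta*P/v_e.
            Isp and efficiency are assumed, not SNTP design numbers."""
            v_exhaust = isp_s * G0
            return 2.0 * eta * power_w / v_exhaust

        for mw in (500, 2000):
            f_n = thrust_newtons(mw * 1e6)
            print(f"{mw:5d} MW -> {f_n / 1e3:6.0f} kN ~ {f_n / 4.448:8.0f} lbf")
        # ~25,000 lbf at 500 MW and ~100,000 lbf at 2,000 MW: the same order
        # as the 20,000-80,000 lbf range quoted above.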

  6. Fast Spectrum Reactors

    CERN Document Server

    Todd, Donald; Tsvetkov, Pavel

    2012-01-01

    Fast Spectrum Reactors presents a detailed overview of world-wide technology contributing to the development of fast spectrum reactors. With a unique focus on the capabilities of fast spectrum reactors to address nuclear waste transmutation issues, in addition to the well-known capabilities of breeding new fuel, this volume describes how fast spectrum reactors contribute to the wide application of nuclear power systems to serve the global nuclear renaissance while minimizing nuclear proliferation concerns. Readers will find an introduction to the sustainable development of nuclear energy and the role of fast reactors, in addition to an economic analysis of nuclear reactors. A section devoted to neutronics offers the current trends in nuclear design, such as performance parameters and the optimization of advanced power systems. The latest findings on fuel management, partitioning and transmutation include the physics, efficiency and strategies of transmutation, homogeneous and heterogeneous recycling, in addit...

  7. Hybrid reactors. [Fuel cycle

    Energy Technology Data Exchange (ETDEWEB)

    Moir, R.W.

    1980-09-09

    The rationale for hybrid fusion-fission reactors is the production of fissile fuel for fission reactors. A new class of reactor, the fission-suppressed hybrid, promises unusually good safety features as well as the ability to support 25 light-water reactors of the same nuclear power rating, or even more high-conversion-ratio reactors such as the heavy-water type. One 4000-MW nuclear hybrid can produce 7200 kg of ²³³U per year. To obtain good economics, injector efficiency times plasma gain (η_i Q) should be greater than 2, the wall load should be greater than 1 MW/m², and the hybrid should cost less than 6 times the cost of a light-water reactor. Introduction rates for the fission-suppressed hybrid are usually rapid.

  8. The big wheels of ATLAS

    CERN Multimedia

    2006-01-01

    The ATLAS cavern is filling up at an impressive rate. The installation of the first of the big wheels of the muon spectrometer, a thin gap chamber (TGC) wheel, was completed in September. The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons (see Bulletin No. 13/2006). The installation of the 688 muon chambers in the barrel is progressing well, with three-quarters of them already installed between the coils of the toroid magnet.

  9. Big Numbers in String Theory

    CERN Document Server

    Schellekens, A N

    2016-01-01

    This paper contains some personal reflections on several computational contributions to what is now known as the "String Theory Landscape". It consists of two parts. The first part concerns the origin of big numbers, and especially the number $10^{1500}$ that appeared in work on the covariant lattice construction (with W. Lerche and D. Luest). This part contains some new results. I correct a huge but inconsequential error, discuss some more accurate estimates, and compare with the counting for free fermion constructions. In particular I prove that the latter only provide an exponentially small fraction of all even self-dual lattices for large lattice dimensions. The second part of the paper concerns dealing with big numbers, and contains some lessons learned from various vacuum scanning projects.

  10. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  11. Big data and ophthalmic research.

    Science.gov (United States)

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research, including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research.

  12. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  13. Statistical Inference: The Big Picture.

    Science.gov (United States)

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  14. Black Hole Blows Big Bubble

    Science.gov (United States)

    2010-07-01

    Combining observations made with ESO's Very Large Telescope and NASA's Chandra X-ray telescope, astronomers have uncovered the most powerful pair of jets ever seen from a stellar black hole. This object, also known as a microquasar, blows a huge bubble of hot gas, 1000 light-years across, twice as large and tens of times more powerful than other known microquasars. The discovery is reported this week in the journal Nature. "We have been astonished by how much energy is injected into the gas by the black hole," says lead author Manfred Pakull. "This black hole is just a few solar masses, but is a real miniature version of the most powerful quasars and radio galaxies, which contain black holes with masses of a few million times that of the Sun." Black holes are known to release a prodigious amount of energy when they swallow matter. It was thought that most of the energy came out in the form of radiation, predominantly X-rays. However, the new findings show that some black holes can release at least as much energy, and perhaps much more, in the form of collimated jets of fast moving particles. The fast jets slam into the surrounding interstellar gas, heating it and triggering an expansion. The inflating bubble contains a mixture of hot gas and ultra-fast particles at different temperatures. Observations in several energy bands (optical, radio, X-rays) help astronomers calculate the total rate at which the black hole is heating its surroundings. The astronomers could observe the spots where the jets smash into the interstellar gas located around the black hole, and reveal that the bubble of hot gas is inflating at a speed of almost one million kilometres per hour. "The length of the jets in NGC 7793 is amazing, compared to the size of the black hole from which they are launched," says co-author Robert Soria [1]. "If the black hole were shrunk to the size of a soccer ball, each jet would extend from the Earth to beyond the orbit of Pluto." This research will help...
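
    The soccer-ball analogy can be sanity-checked with an order-of-magnitude scaling. The black hole mass and jet length below are assumptions consistent with the text ("a few solar masses", a 1000-light-year bubble), not values from the paper.

        G, C = 6.674e-11, 2.998e8          # SI gravitational constant, light speed
        M_SUN, LY = 1.989e30, 9.461e15     # solar mass (kg), light-year (m)

        m_bh = 5 * M_SUN                   # "a few solar masses" (assumed 5)
        d_bh = 2 * (2 * G * m_bh / C**2)   # Schwarzschild diameter, ~30 km
        jet_len = 500 * LY                 # half of the 1000 ly bubble

        ball = 0.22                        # soccer ball diameter, m
        print(f"scaled jet ~ {jet_len * ball / d_bh:.1e} m; Pluto's orbit ~ 5.9e12 m")
        # ~3.5e13 m: tens of times beyond Pluto, consistent with the quotation.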

  15. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  16. The BigBOSS Experiment

    CERN Document Server

    Schlegel, D; Abraham, T; Ahn, C; Prieto, C Allende; Annis, J; Aubourg, E; Azzaro, M; Baltay, S Bailey C; Baugh, C; Bebek, C; Becerril, S; Blanton, M; Bolton, A; Bromley, B; Cahn, R; Carton, P -H; Cervantes-Cota, J L; Chu, Y; Cortes, M; Dawson, K; Dey, A; Dickinson, M; Diehl, H T; Doel, P; Ealet, A; Edelstein, J; Eppelle, D; Escoffier, S; Evrard, A; Faccioli, L; Frenk, C; Geha, M; Gerdes, D; Gondolo, P; Gonzalez-Arroyo, A; Grossan, B; Heckman, T; Heetderks, H; Ho, S; Honscheid, K; Huterer, D; Ilbert, O; Ivans, I; Jelinsky, P; Jing, Y; Joyce, D; Kennedy, R; Kent, S; Kieda, D; Kim, A; Kim, C; Kneib, J -P; Kong, X; Kosowsky, A; Krishnan, K; Lahav, O; Lampton, M; LeBohec, S; Brun, V Le; Levi, M; Li, C; Liang, M; Lim, H; Lin, W; Linder, E; Lorenzon, W; de la Macorra, A; Magneville, Ch; Malina, R; Marinoni, C; Martinez, V; Majewski, S; Matheson, T; McCloskey, R; McDonald, P; McKay, T; McMahon, J; Menard, B; Miralda-Escude, J; Modjaz, M; Montero-Dorta, A; Morales, I; Mostek, N; Newman, J; Nichol, R; Nugent, P; Olsen, K; Padmanabhan, N; Palanque-Delabrouille, N; Park, I; Peacock, J; Percival, W; Perlmutter, S; Peroux, C; Petitjean, P; Prada, F; Prieto, E; Prochaska, J; Reil, K; Rockosi, C; Roe, N; Rollinde, E; Roodman, A; Ross, N; Rudnick, G; Ruhlmann-Kleider, V; Sanchez, J; Sawyer, D; Schimd, C; Schubnell, M; Scoccimaro, R; Seljak, U; Seo, H; Sheldon, E; Sholl, M; Shulte-Ladbeck, R; Slosar, A; Smith, D S; Smoot, G; Springer, W; Stril, A; Szalay, A S; Tao, C; Tarle, G; Taylor, E; Tilquin, A; Tinker, J; Valdes, F; Wang, J; Wang, T; Weaver, B A; Weinberg, D; White, M; Wood-Vasey, M; Yang, J; Yeche, X Yang Ch; Zakamska, N; Zentner, A; Zhai, C; Zhang, P

    2011-01-01

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy red...

  17. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  18. Multi purpose research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Raina, V.K. [Research Reactor Design and Projects Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)]. E-mail: vkrain@magnum.barc.ernet.in; Sasidharan, K. [Research Reactor Design and Projects Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Sengupta, Samiran [Research Reactor Design and Projects Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Singh, Tej [Research Reactor Services Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)

    2006-04-15

    At present, the Dhruva and Cirus reactors provide the majority of research reactor based facilities to cater to the various needs of a vast pool of researchers in the fields of materials science, physics, chemistry, and the biosciences, research and development work for nuclear power plants, and the production of radioisotopes. With a view to further consolidating and expanding the scope of research and development in nuclear and allied sciences, a new 20 MWt multipurpose research reactor is being designed. This paper describes some of the design features and safety aspects of this reactor.

  19. INVAP's Research Reactor Designs

    Directory of Open Access Journals (Sweden)

    Eduardo Villarino

    2011-01-01

    Full Text Available INVAP, an Argentine company founded more than three decades ago, is today recognized as one of the leaders within the research reactor industry. INVAP has participated in several projects covering a wide range of facilities, designed in accordance with the requirements of our different clients. For complying with these requirements, INVAP developed special skills and capabilities to deal with different fuel assemblies, different core cooling systems, and different reactor layouts. This paper summarizes the general features and utilization of several INVAP research reactor designs, from subcritical and critical assemblies to high-power reactors.

  20. LMFBR type reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kanbe, Mitsuru

    1997-04-04

    An LMFBR type reactor comprises a plurality of reactor cores in a reactor container. Namely, a plurality of pot-containing vessels are disposed in the reactor vessel, a plurality of reactor cores are formed by inserting an integrated-type fuel assembly into each pot, and a coolant pipeline is connected to each pot-containing vessel to cool each reactor core. Coolant supplied through the pipelines circulates through each pot-containing vessel, cooling the integrated-type fuel assembly of each pot. When fuels are exchanged, the integrated-type fuel assembly is lifted out of the reactor vessel together with its pot, remaining immersed in the coolant inside the pot throughout. Neutron economy is thereby improved, which improves reactor power and the breeding ratio. (N.H.)

  1. Ten years of monetary union in retrospect

    OpenAIRE

    2008-01-01

    1 January 1999 saw the start of the third and final phase of European Economic and Monetary Union (EMU). Ten years on, membership has expanded from the initial 11 members to reach 16 countries by January 2009. This article reviews the first decade of monetary union from a number of angles. Monetary policy under EMU managed to secure historically low inflation, thereby creating the conditions for sustainable economic growth. Despite large relative price movements stemming from globalisation, i...

  2. Ten new withanolides from Physalis peruviana.

    Science.gov (United States)

    Fang, Sheng-Tao; Liu, Ji-Kai; Li, Bo

    2012-01-01

    Ten new withanolides, including four perulactone-type withanolides, perulactones E-H (1-4), three 28-hydroxy-withanolides, withaperuvins I-K (5-7), and three other withanolides, withaperuvins L-N (8-10), together with six known compounds (11-16) were isolated from the aerial parts of Physalis peruviana. The structures of these compounds were elucidated on the basis of extensive spectroscopic analyses (1D and 2D NMR, IR, HR-MS) and chemical methods.

  3. CNPC's Ten Major Technological Events in 2004

    Institute of Scientific and Technical Information of China (English)

    Technological Development Department of CNPC

    2005-01-01

    Editor's note: To make a timely introduction of the latest technologies developed by CNPC, the Technological Development Department of CNPC entrusted the Petroleum Economic & Technological Research Center of CNPC to appraise the oil company's major technological developments. Based on three rounds of voting by nearly 100 oil experts, ten major technological events of 2004 were selected from more than 100 technological projects of CNPC according to the measurement standards of innovation, technological maturity, function and scientific value.

  4. Identification of ten new Galactic HII regions

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    We discovered ten large HII regions in the Sino-German λ6 cm polarization survey of the Galactic plane. They were identified by their flat spectral indices and the high ratio between the 60 μm infrared emission and the λ6 cm emission. The integrated flux densities as well as the sizes of these sources are given at 4800 MHz. Cross-identifications are made with other major radio catalogs.
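    The two selection criteria in this record reduce to simple arithmetic on catalogued flux densities: a roughly flat radio spectral index marks thermal (free-free) emission, and a high 60 μm-to-λ6 cm ratio marks an HII region rather than a synchrotron source. A minimal Python sketch with invented flux values and illustrative thresholds (the survey's actual cuts are not quoted here):

        import math

        def spectral_index(s1_jy, nu1_ghz, s2_jy, nu2_ghz):
            # Radio spectral index alpha, defined by S ~ nu**alpha.
            return math.log(s1_jy / s2_jy) / math.log(nu1_ghz / nu2_ghz)

        # Hypothetical flux densities for one candidate source.
        s_6cm = 1.20    # Jy at 4.8 GHz (lambda ~ 6 cm)
        s_11cm = 1.25   # Jy at 2.7 GHz (lambda ~ 11 cm)
        s_60um = 900.0  # Jy at 60 micron (infrared)

        alpha = spectral_index(s_6cm, 4.8, s_11cm, 2.7)
        ir_to_radio = s_60um / s_6cm

        # Thermal emission is nearly flat (alpha ~ -0.1), while synchrotron
        # sources fall steeply (alpha ~ -0.5 or less). The thresholds below
        # are illustrative only.
        is_hii_candidate = abs(alpha) < 0.3 and ir_to_radio > 100.0
        print(f"alpha = {alpha:.2f}, S(60um)/S(6cm) = {ir_to_radio:.0f}, "
              f"HII candidate: {is_hii_candidate}")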

  5. EEG Correlates of Ten Positive Emotions

    Science.gov (United States)

    Hu, Xin; Yu, Jianwen; Song, Mengdi; Yu, Chun; Wang, Fei; Sun, Pei; Wang, Daifa; Zhang, Dan

    2017-01-01

    Compared with the well documented neurophysiological findings on negative emotions, much less is known about positive emotions. In the present study, we explored the EEG correlates of ten different positive emotions (joy, gratitude, serenity, interest, hope, pride, amusement, inspiration, awe, and love). A group of 20 participants were invited to watch 30 short film clips with their EEGs simultaneously recorded. Distinct topographical patterns for different positive emotions were found for the correlation coefficients between the subjective ratings on the ten positive emotions per film clip and the corresponding EEG spectral powers in different frequency bands. Based on the similarities of the participants’ ratings on the ten positive emotions, these emotions were further clustered into three representative clusters, as ‘encouragement’ for awe, gratitude, hope, inspiration, pride, ‘playfulness’ for amusement, joy, interest, and ‘harmony’ for love, serenity. Using the EEG spectral powers as features, both the binary classification on the higher and lower ratings on these positive emotions and the binary classification between the three positive emotion clusters, achieved accuracies of approximately 80% and above. To our knowledge, our study provides the first piece of evidence on the EEG correlates of different positive emotions. PMID:28184194
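    The classification step described above lends itself to a compact sketch: spectral band-power features per trial, a linear classifier, and cross-validation. The following Python sketch uses synthetic data in place of real EEG; the feature layout and classifier choice are assumptions, not the authors' exact pipeline:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Synthetic stand-in: 600 trials of (62 channels x 5 bands)
        # spectral-power features; real features would come from EEG.
        n_trials, n_features = 600, 62 * 5
        X = rng.normal(size=(n_trials, n_features))
        y = rng.integers(0, 2, size=n_trials)  # higher vs. lower rating
        X[y == 1, :10] += 0.5                  # inject a weak class signal

        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.2f}")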

  6. Big Bang Nucleosynthesis: Probing the First 20 Minutes

    CERN Document Server

    Steigman, G

    2003-01-01

    Within the first 20 minutes of the evolution of the hot, dense, early Universe, astrophysically interesting abundances of deuterium, helium-3, helium-4, and lithium-7 were synthesized by the cosmic nuclear reactor. The primordial abundances of these light nuclides produced during Big Bang Nucleosynthesis (BBN) are sensitive to the universal density of baryons and to the early-Universe expansion rate, which at early epochs is governed by the energy density in relativistic particles ("radiation") such as photons and neutrinos. Some 380 kyr later, when the cosmic background radiation (CBR) was freed from the embrace of the ionized plasma of protons and electrons, the spectrum of temperature fluctuations imprinted on the CBR also depended on the baryon and radiation densities. The comparison between the constraints imposed by BBN and those from the CBR reveals a remarkably consistent picture of the Universe at two widely separated epochs in its evolution. Combining these two probes leads to new and tig...

  7. Light water reactor program

    Energy Technology Data Exchange (ETDEWEB)

    Franks, S.M.

    1994-12-31

    The US Department of Energy's Light Water Reactor Program is outlined. The scope of the program consists of: design certification of evolutionary plants; design, development, and design certification of simplified passive plants; first-of-a-kind engineering to achieve commercial standardization; plant lifetime improvement; and the advanced reactor severe accident program. These program activities of the Office of Nuclear Energy are discussed.

  8. Space Nuclear Reactor Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Poston, David Irvin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-06

    We needed to find a space reactor concept that could be attractive to NASA for flight and proven with a rapid-turnaround, low-cost nuclear test. Heat-pipe-cooled reactors coupled to Stirling engines have long been identified as the easiest path to a near-term, low-cost concept.

  9. Reactor Materials Research

    Energy Technology Data Exchange (ETDEWEB)

    Van Walle, E

    2001-04-01

    The activities of the Reactor Materials Research Department of the Belgian Nuclear Research Centre SCK-CEN in 2000 are summarised. The programmes within the department are focussed on studies concerning (1) fusion, in particular mechanical testing; (2) Irradiation Assisted Stress Corrosion Cracking (IASCC); (3) nuclear fuel; and (4) Reactor Pressure Vessel Steel (RPVS)

  10. Nuclear reactor design

    CERN Document Server

    2014-01-01

    This book focuses on core design and methods for design and analysis. It is based on advances made in nuclear power utilization and computational methods over the past 40 years, covering core design of boiling water reactors and pressurized water reactors, as well as fast reactors and high-temperature gas-cooled reactors. The objectives of this book are to help graduate and advanced undergraduate students to understand core design and analysis, and to serve as a background reference for engineers actively working in light water reactors. Methodologies for core design and analysis, together with physical descriptions, are emphasized. The book also covers coupled thermal hydraulic core calculations, plant dynamics, and safety analysis, allowing readers to understand core design in relation to plant control and safety.

  11. Status of French reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ballagny, A. [Commissariat a l`Energie Atomique, Saclay (France)

    1997-08-01

    The status of French reactors is reviewed. The ORPHEE and RHF reactors cannot be operated with a LEU fuel, which would be limited to 4.8 g U/cm³. The OSIRIS reactor has already been converted to LEU. It will use U₃Si₂ as soon as its present stock of UO₂ fuel is used up, at the end of 1994. The decision to close down the SILOE reactor in the near future is not propitious for the start of a conversion process. The REX 2000 reactor, which is expected to be commissioned in 2005, will use LEU (except if the fast neutron core option is selected). Concerning the end of the HEU fuel cycle, the best option is reprocessing followed by conversion of the reprocessed uranium to LEU.

  12. ATLAS: Big Data in a Small Package?

    Science.gov (United States)

    Denneau, Larry

    2016-01-01

    For even small astronomy projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (Tonry 2011) will survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards - two 0.5 m F/2.0 telescopes - each night the ATLAS system will measure nearly 10⁹ astronomical sources to a photometric accuracy of <5%, totaling 10¹² individual observations over its initial 3-year mission. This ever-growing dataset must be searched in real time for moving objects and transients, then archived for further analysis, with alerts for newly discovered near-Earth asteroids (NEAs) disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many 'rifle shot' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of low-Earth orbit (LEO) and geosynchronous orbit (GEO) satellites ATLAS will see each night. Additional interrogation will identify interesting phenomena from millions of transient sources per night beyond the solar system. The data processing and storage requirements for ATLAS demand a 'big data' approach typical of commercial internet enterprises. We describe our experience in deploying a nimble, scalable and reliable data processing infrastructure, and suggest ATLAS as a stepping stone to the data processing capability needed as we enter the era of LSST.

  13. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  14. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Corridor Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography Digital Elevation Model (DEM) mosaic for the Big Sandy Creek Corridor Unit of Big Thicket National Preserve in Texas was produced from...

  15. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A first-surface topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  16. EAARL-B Topography-Big Thicket National Preserve: Big Sandy Creek Unit, Texas, 2014

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A bare-earth topography digital elevation model (DEM) mosaic for the Big Sandy Creek Unit of Big Thicket National Preserve in Texas, was produced from remotely...

  17. TENS (transcutaneous electrical nerve stimulation) for labour pain.

    Science.gov (United States)

    Francis, Richard

    2012-05-01

    TENS is applied inconsistently and not always in line with optimal TENS application theory, which may explain why TENS for labour pain appears to be effective in some individuals and not in others. This article reviews TENS theory, advises on optimal TENS application for labour pain and discusses some of the limitations of TENS research on labour pain. TENS application for labour pain may include TENS applied to either side of the lower spine, set to a 200 μs pulse duration and 100 pulses per second. As pain increases, TENS intensity should be increased, and as pain decreases, TENS intensity should be reduced, maintaining a strong but pain-free intensity of stimulation. This application may particularly reduce back pain during labour.

  18. Ten steps to successful poster presentation.

    Science.gov (United States)

    Hardicre, Jayne; Devitt, Patric; Coad, Jane

    Receiving a letter confirming that your poster has been accepted for presentation at a conference can evoke mixed emotions. Joy, panic, fear and dread are among the many possibilities, and this is not exclusive to first-time presenters. Developing an effective poster presentation is a skill that you can learn, and it can provide a rewarding way to present your work in a manner less intimidating than oral presentation (Shelledy, 2004). The key to successful poster presentation is meticulous, timely, well-informed preparation. This article outlines ten steps to help guide you through the process and maximize your success.

  19. Ten essential skills for electrical engineers

    CERN Document Server

    Dorr, Barry

    2014-01-01

    Engineers know that, as in any other discipline, getting a good job requires practical, up-to-date skills. An engineering degree provides a broad set of fundamentals. Ten Essential Skills applies those fundamentals to practical tasks required by employers. Written in a user-friendly, no-nonsense format, the book reviews practical skills using the latest tools and techniques, and features a companion website with interview practice problems and advanced material for readers wishing to pursue additional skills. With this book, aspiring and current engineers may approach job interviews confident

  20. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as introducing some applications in big data optimization, for academics and practitioners alike, to the benefit of society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data are explored in this book, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more.

  1. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationship and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observation study, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.
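    Of the remedies named above, propensity score analysis is the easiest to illustrate: model treatment assignment from the confounders, then reweight (or match) so the groups become comparable. A minimal inverse-probability-weighting sketch on synthetic data; all variable names and effect sizes are invented:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 5000

        # Synthetic confounder, treatment, and outcome (illustrative only):
        # older patients are more likely to be treated AND have higher
        # outcome scores, so the naive group difference is confounded.
        age = rng.normal(60, 10, n)
        p_treat = 1 / (1 + np.exp(-(age - 60) / 10))
        treated = rng.random(n) < p_treat
        outcome = 0.05 * age + 2.0 * treated + rng.normal(0, 1, n)

        # Propensity score: P(treated | confounders).
        model = LogisticRegression().fit(age.reshape(-1, 1), treated)
        ps = model.predict_proba(age.reshape(-1, 1))[:, 1]

        # Inverse-probability-of-treatment weights balance the groups.
        w = np.where(treated, 1 / ps, 1 / (1 - ps))
        ate = (np.average(outcome[treated], weights=w[treated])
               - np.average(outcome[~treated], weights=w[~treated]))
        print(f"weighted effect estimate: {ate:.2f} (true effect 2.0)")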

  2. Organizational Design Challenges Resulting From Big Data

    Directory of Open Access Journals (Sweden)

    Jay R. Galbraith

    2014-04-01

    Full Text Available Business firms and other types of organizations are feverishly exploring ways of taking advantage of the big data phenomenon. This article discusses firms that are at the leading edge of developing a big data analytics capability. Firms that are currently enjoying the most success in this area are able to use big data not only to improve their existing businesses but to create new businesses as well. Putting a strategic emphasis on big data requires adding an analytics capability to the existing organization. This transformation process results in power shifting to analytics experts and in decisions being made in real time.

  3. The big de Rham–Witt complex

    DEFF Research Database (Denmark)

    Hesselholt, Lars

    2015-01-01

    This paper gives a new and direct construction of the multi-prime big de Rham–Witt complex, which is defined for every commutative and unital ring; the original construction by Madsen and myself relied on the adjoint functor theorem and accordingly was very indirect. The construction given here… It is the existence of these divided Frobenius operators that makes the new construction of the big de Rham–Witt complex possible. It is further shown that the big de Rham–Witt complex behaves well with respect to étale maps, and finally, the big de Rham–Witt complex of the ring of integers is explicitly evaluated…

  4. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. Basic knowledge of R is helpful.
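    The book's examples are in R, but the map-reduce pattern it teaches is language-agnostic. Here is a self-contained Python sketch that simulates the Hadoop word-count flow locally: map emits key-value pairs, the framework sorts by key, and reduce aggregates each key. The input lines and function names are illustrative:

        import itertools
        from operator import itemgetter

        # Local simulation of the Hadoop map-reduce word-count flow.
        def mapper(line):
            for word in line.split():
                yield word.lower(), 1   # emit (key, value)

        def reducer(word, counts):
            yield word, sum(counts)     # aggregate per key

        lines = ["big data needs big tools",
                 "R and Hadoop process big data"]

        pairs = [kv for line in lines for kv in mapper(line)]
        pairs.sort(key=itemgetter(0))   # the shuffle/sort Hadoop performs
        for word, group in itertools.groupby(pairs, key=itemgetter(0)):
            for key, total in reducer(word, (v for _, v in group)):
                print(key, total)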

  5. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the characteristics of big data and of traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  6. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously out of reach. Big data is generally characterized by three factors: volume, velocity and variety. These factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  7. Experimental reactor regulation: the nuclear safety authority's approach; Le controle des reacteurs experimentaux: la demarche de l'Autorite de surete nucleaire

    Energy Technology Data Exchange (ETDEWEB)

    Rieu, J.; Conte, D.; Chevalier, A. [Autorite de Surete Nucleaire, 75 - Paris (France)

    2007-07-15

    French research reactors can be classified into 6 categories: 1) critical scale models (Eole, Minerve and Masurca), whose purpose is the study of neutron production through the fission reaction; 2) reactors that produce neutron beams (Orphee and the high flux reactor in Grenoble); 3) reactors devoted to safety studies (Cabri, Phebus), whose purpose is to reproduce accidental configurations of power reactors at reduced scale; 4) experimental reactors (Osiris, Phenix), whose purpose is the carrying out of irradiation experiments on nuclear fuels or structural materials; 5) teaching reactors (Ulysse, Isis); and 6) reactors involved in defense programs (Caliban, Prospero, Apareillage-B). Note that 3 research reactors are currently being dismantled: Strasbourg University's reactor, Siloe and Siloette. Research reactors in France are of different types and present different hazards. Even as their methods of control become more and more similar to those of power reactors, the French Nuclear Safety Authority (ASN) works to allow the necessary flexibility in the ever-changing research reactor field while ensuring a high level of safety. Allowing internal authorizations for operations of minor safety significance, under certain conditions, is one example of this approach. Another challenge for ASN in the coming years is to monitor the ageing of the French research reactors; this includes periodic safety reviews of each facility every ten years. ASN must also regulate new research reactor projects that are about to be built, such as the Jules Horowitz Reactor and the International Thermonuclear Experimental Reactor.

  8. Big Brother Has Bigger Say

    Institute of Scientific and Technical Information of China (English)

    Yang Wei

    2009-01-01

    156 delegates from all walks of life in Guangdong province composed the Guangdong delegation to the NPC this year. The import and export value of Guangdong makes up one-third of the national total and accounts for one-eighth of national economic growth. Guangdong province has maintained its top spot in import and export value among China's many provinces and cities for several years, and is commonly referred to as "Big Brother". At the same time, it is the region where the global financial crisis has hit China hardest.

  9. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions and the various questions surrounding big data. In this first part I will try to set out the theory and terminology of Big Data.

  10. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study.

  11. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  12. Forget the hype or reality. Big data presents new opportunities in Earth Science.

    Science.gov (United States)

    Lee, T. J.

    2015-12-01

    Earth science is arguably one of the most mature science disciplines, constantly acquiring, curating, and utilizing a large volume of data with diverse variety. We dealt with big data before there was big data. For example, while developing the EOS program in the 1980s, the EOS Data and Information System (EOSDIS) was developed to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has continued to be a shining example of modern science data systems over the past two decades. With the explosion of the internet, the usage of social media, and the provision of sensors everywhere, the big data era has brought new challenges. First, Google developed the search algorithm and a distributed data management system. The open source communities quickly followed up and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day. There is a significant shortage of algorithms and knowledgeable manpower to mine the data. In response, the federal government developed big data programs that fund research and development projects and training programs to tackle these new challenges. Meanwhile, compared to the internet data explosion, the Earth science big data problem has become quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline to solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiry.

  13. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    Science.gov (United States)

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  14. Slurry reactor design studies

    Energy Technology Data Exchange (ETDEWEB)

    Fox, J.M.; Degen, B.D.; Cady, G.; Deslate, F.D.; Summers, R.L. (Bechtel Group, Inc., San Francisco, CA (USA)); Akgerman, A. (Texas A and M Univ., College Station, TX (USA)); Smith, J.M. (California Univ., Davis, CA (USA))

    1990-06-01

    The objective of these studies was to perform a realistic evaluation of the relative costs of tubular fixed-bed and slurry reactors for methanol, mixed alcohols and Fischer-Tropsch syntheses under conditions where they would realistically be expected to operate. The slurry Fischer-Tropsch reactor was, therefore, operated at low H₂/CO ratio on gas directly from a Shell gasifier. The fixed-bed reactor was operated on 2.0 H₂/CO ratio gas after adjustment by shift and CO₂ removal. Every attempt was made to give each reactor the benefit of its optimum design condition, and correlations were developed to extend the models beyond the range of the experimental pilot plant data. For the methanol design, comparisons were made for a recycle plant with high methanol yield, this being the standard design condition. It is recognized that this is not necessarily the optimum application for the slurry reactor, which is being proposed for a once-through operation, co-producing methanol and power. Consideration is also given to the applicability of the slurry reactor to mixed alcohols, based on conditions provided by Lurgi for an Octamix™ plant using their standard tubular fixed-bed reactor technology. 7 figs., 26 tabs.

  15. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures" a

  16. Analysis of High Temperature Reactor Control Rod Worth for the Initial and Full Core

    Science.gov (United States)

    Oktajianto, Hammam; Setiawati, Evi; Anam, Khoirul; Sugito, Heri

    2017-01-01

    Control rods are an important component of a nuclear reactor; in reactor operations they serve to shut the reactor down. This research analyses the worth of the ten control rods of an HTR (High Temperature Reactor) at initial and full core. The HTR in this research adopts China's HTR-10, a pebble-bed design. Core calculations are performed with the MCNPX code after modelling the entire core in three conditions: ten control rods fully withdrawn, all control rods inserted in 20 cm steps of depth, and a single control rod in use. Fuel pebbles and moderator balls are distributed in the core zone on a Body Centred Cubic (BCC) lattice in a ratio of 57:43. The results show that the use of one control rod decreases the reactor criticality by 2.04±0.12 %Δk/k at initial core and 1.57±0.10 %Δk/k at full core. The deeper the control rods are inserted, the lower the criticality of the reactor, with a total worth of the ten control rods of 16.41±0.11 %Δk/k at initial core and 15.43±0.11 %Δk/k at full core. The results show that the use of ten control rods at full core still achieves a subcritical condition, even though the reactivity is smaller than at initial core.
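    The quoted rod worths are reactivity differences between core states, which follow directly from the effective multiplication factors that a code such as MCNPX reports. A worked sketch of that arithmetic; the k-eff values below are invented for illustration, not taken from the paper:

        def reactivity(k_eff):
            # Reactivity in %dk/k from an effective multiplication factor.
            return (k_eff - 1.0) / k_eff * 100.0

        def bank_worth(k_rods_out, k_rods_in):
            # Worth of a rod bank: reactivity swing between the two states.
            return reactivity(k_rods_out) - reactivity(k_rods_in)

        k_out = 1.1200   # hypothetical: ten rods fully withdrawn
        k_in = 0.9400    # hypothetical: ten rods fully inserted

        print(f"total worth of the ten rods: "
              f"{bank_worth(k_out, k_in):.2f} %dk/k")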

  17. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2016-09-13

    Cancer is a complex disease and, unfortunately, understanding how the components of the cancer system work does not explain the behavior of the system as a whole. In the words of the Greek philosopher Aristotle, "the whole is greater than the sum of its parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand, and a large part of the data is in the form of medical images. The opportunity now is to draw insight from the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is at its beginning, but several useful applications can be envisaged, including the development of imaging biomarkers to predict disease outcome, assessment of the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and the tracking and optimization of patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact the diagnostic pathway of the oncologic patient.

  18. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess of one or the other type of matter as an initial condition; or (3) an extremely dense, high-temperature state with zero net baryon number, i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  19. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

    Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their supercomputer, and to the Large Hadron Collider created by Éric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is being hatched. Worse still, all of scientific research is in peril! Swept into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A fascinating plunge into the heart of the Big Bang, featuring the very latest theories of Stephen Hawking and today's greatest scientists.

  20. Gas cooled fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1972-06-01

    Although most of the development work on fast breeder reactors has been devoted to the use of liquid metal cooling, interest has been expressed for a number of years in alternative breeder concepts using other coolants. One of a number of concepts in which interest has been retained is the Gas-Cooled Fast Reactor (GCFR). As presently envisioned, it would operate on the uranium-plutonium mixed oxide fuel cycle, similar to that used in the Liquid Metal Fast Breeder Reactor (LMFBR), and would use helium gas as the coolant.

  1. Microfluidic electrochemical reactors

    Science.gov (United States)

    Nuzzo, Ralph G [Champaign, IL; Mitrovski, Svetlana M [Urbana, IL

    2011-03-22

    A microfluidic electrochemical reactor includes an electrode and one or more microfluidic channels on the electrode, where the microfluidic channels are covered with a membrane containing a gas permeable polymer. The distance between the electrode and the membrane is less than 500 micrometers. The microfluidic electrochemical reactor can provide for increased reaction rates in electrochemical reactions using a gaseous reactant, as compared to conventional electrochemical cells. Microfluidic electrochemical reactors can be incorporated into devices for applications such as fuel cells, electrochemical analysis, microfluidic actuation, pH gradient formation.

  2. Fast Breeder Reactor studies

    Energy Technology Data Exchange (ETDEWEB)

    Till, C.E.; Chang, Y.I.; Kittel, J.H.; Fauske, H.K.; Lineberry, M.J.; Stevenson, M.G.; Amundson, P.I.; Dance, K.D.

    1980-07-01

    This report is a compilation of Fast Breeder Reactor (FBR) resource documents prepared to provide the technical basis for the US contribution to the International Nuclear Fuel Cycle Evaluation. The eight separate parts deal with the alternative fast breeder reactor fuel cycles in terms of energy demand, resource base, technical potential and current status, safety, proliferation resistance, deployment, and nuclear safeguards. An Annex compares the cost of decommissioning light-water and fast breeder reactors. Separate abstracts are included for each of the parts.

  3. Why Big Data Is a Big Deal (Ⅱ)

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    A new group of data mining technologies promises to change forever the way we sift through our vast stores of data, making it faster and cheaper. Some of the technologies are actively being used by people on the bleeding edge who need the technology now, like those involved in creating Web-based services that are driven by social media. They're also heavily contributing to these projects. In other vertical industries, businesses are realizing that much more of their value proposition is information-based than they had previously thought, which will allow big data technologies to gain traction quickly, Olofson says. Couple that with affordable hardware and software, and enterprises find themselves in a perfect storm of business transformation opportunities.

  4. Big Data – Big Deal for Organization Design?

    Directory of Open Access Journals (Sweden)

    Janne J. Korhonen

    2014-04-01

    Full Text Available Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition of a new stratum in the organization, resulting in greater organizational complexity. Requisite organization could serve as an objective, verifiable criterion for what qualifies as a genuine new strategic emphasis. Such a criterion is necessary for research on the co-evolution of strategy and structure.

  5. Kansen voor Big data – WPA Vertrouwen

    NARCIS (Netherlands)

    Broek, T.A. van den; Roosendaal, A.P.C.; Veenstra, A.F.E. van; Nunen, A.M. van

    2014-01-01

    Big data is expected to become a driver for economic growth, but this can only be achieved when services based on (big) data are accepted by citizens and consumers. In a recent policy brief, the Cabinet Office mentions trust as one of the three pillars (the others being transparency and control) for

  6. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  7. The Big Sleep in the Woods

    Institute of Scientific and Technical Information of China (English)

    王玉峰

    2002-01-01

    Now it's the time of the big sleep for the bees and the bears. Even the buds of the plants whose leaves fall off share in it. But the intensity of this winter sleep, or hibernation, depends on who's doing it. The big sleep of the bears, for instance, would probably be thought of as a

  8. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  9. 革命者BIG BANG

    Institute of Scientific and Technical Information of China (English)

    刘岩

    2015-01-01

    In Ordos's boom years I met one of the city's "opinion leaders"; having returned from the United States and seen the outside world, he had a broad knowledge of luxury goods and a distinctive taste for them. He led the shopping trends of a small circle in that mysterious city of wealth, and its members bought Big Bang watches one after another. At the time I did not quite understand their obsession with this watch; only after visiting the Basel watch fair year after year did I come to appreciate the imagination behind the Big Bang. The Big Bang is indeed full of charm. An evolution of the Big Bang: the Big Bang collection was born in 2005; the all-black Big Bang followed in 2006, its "All Black" concept making the Big Bang purer and simpler. From case to dial, the watch's matte texture and its layers of black across different fused materials embody the Zen idea of "the invisible made visible".

  10. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  11. Structuring the Curriculum around Big Ideas

    Science.gov (United States)

    Alleman, Janet; Knighton, Barbara; Brophy, Jere

    2010-01-01

    This article provides an inside look at Barbara Knighton's classroom teaching. She uses big ideas to guide her planning and instruction and gives other teachers suggestions for adopting the big idea approach and ways for making the approach easier. This article also represents a "small slice" of a dozen years of collaborative research,…

  12. Periprosthetic Knee Infection: Ten Strategies That Work

    Science.gov (United States)

    Cavanaugh, Priscilla Ku; Diaz-Ledezma, Claudio

    2013-01-01

    Periprosthetic joint infection (PJI) is one of the most serious complications following total knee arthroplasty (TKA). The demand for TKA is rapidly increasing, resulting in a subsequent increase in infections involving knee prosthesis. Despite the existence of common management practices, the best approach for several aspects in the management of periprosthetic knee infection remains controversial. This review examines the current understanding in the management of the following aspects of PJI: preoperative risk stratification, preoperative antibiotics, preoperative skin preparation, outpatient diagnosis, assessing for infection in revision cases, improving culture utility, irrigation and debridement, one and two-stage revision, and patient prognostic information. Moreover, ten strategies for the management of periprosthetic knee infection based on available literature, and experience of the authors were reviewed. PMID:24368992

  13. The ten grand challenges of synthetic life.

    Science.gov (United States)

    Porcar, Manuel; Danchin, Antoine; de Lorenzo, Victor; Dos Santos, Vitor A; Krasnogor, Natalio; Rasmussen, Steen; Moya, Andrés

    2011-06-01

    The construction of artificial life is one of the main scientific challenges of the Synthetic Biology era. Advances in DNA synthesis and a better understanding of regulatory processes make the goal of constructing the first artificial cell a realistic possibility. This would be both a fundamental scientific milestone and a starting point of a vast range of applications, from biofuel production to drug design. However, several major issues might hamper the objective of achieving an artificial cell. From the bottom-up to the selection-based strategies, this work encompasses the ten grand challenges synthetic biologists will have to be aware of in order to cope with the task of creating life in the lab.

  14. A novel concept for CRIEC-driven subcritical research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Nieto, M.; Miley, G.H. [Illinois Univ., Fusion Studies Lab., Dept. of Nuclear, Plasma, and Radiological Engineering, Urbana, IL (United States)

    2001-07-01

    A novel scheme is proposed to drive a low-power subcritical fuel assembly by means of a long Cylindrical Radially-convergent Inertial Electrostatic Confinement (CRIEC) device used as a neutron source. The concept is inherently safe in the sense that the fuel assembly remains subcritical at all times. Previous work has examined the possible implementation of CRIEC as a subcritical assembly driver for power reactors. However, it has been found that the present technology and stage of development of IEC-based neutron sources cannot meet the neutron flux requirements to drive a system as big as a power reactor. Nevertheless, smaller systems, such as research and training reactors, could be successfully driven with levels of neutron flux that IEC devices can more plausibly achieve in the near future. The need for expensive custom-made nuclear fission fuel, as in the case of the TRIGA reactors, is eliminated, and the CRIEC presents substantial advantages with respect to accelerator-driven subcritical reactors in terms of simplicity and cost. In the present paper, a conceptual design for a research/training CRIEC-driven subcritical assembly is presented, emphasizing the description, principle of operation and performance of the CRIEC neutron source, highlighting its advantages and discussing some key issues that require study before this concept can be implemented. (author)
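    The inherent-safety claim rests on the standard source-multiplication relation: a subcritical assembly with multiplication factor k < 1 amplifies an external neutron source by the finite factor M = 1/(1 − k), and the chain reaction dies out when the source is switched off. A small sketch with assumed numbers (not the paper's design values):

        def source_multiplication(k_eff):
            # Steady-state amplification of an external source; needs k < 1.
            if k_eff >= 1.0:
                raise ValueError("assembly must remain subcritical")
            return 1.0 / (1.0 - k_eff)

        k_eff = 0.97         # assumed deeply subcritical core
        source_rate = 1.0e9  # assumed IEC source strength, n/s

        m = source_multiplication(k_eff)
        print(f"M = {m:.1f}, core neutron production "
              f"~ {source_rate * m:.2e} n/s")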

  15. Review - Scripture of the Ten Kings

    Directory of Open Access Journals (Sweden)

    Paul K. Nietupski

    2013-12-01

    Full Text Available Review of: Daniel Berounský (with Luboš Bělka, "Comparative Description of the Paintings"). 2012. The Tibetan Version of the Scripture on the Ten Kings and the Quest for Chinese Influence on the Tibetan Perception of the Afterlife. Prague: Triton Publishing House. This book is a study of the various influences on the complex Tibetan visions of the afterlife. It is based on new text-critical research and includes an introduction and translation of a rare Tibetan manuscript entitled Scripture on the Ten Kings, housed in the National Gallery, Prague. The book includes extensive references to secondary scholarship, as well as collaborative work by competent scholars and an appended study of the text illustrations by Luboš Bělka. From the outset, the book raises a range of interpretive questions of central importance to at least Tibetan and Asian studies. The author describes the plan of the book and its parameters in the Introduction, noting that the manuscript under study does not include tantric perspectives (12), and is instead more oriented to popular understanding and use. This is an important and controversial methodological position, consistent with the formalized one circulated in scholastic and monastic circles that emphasizes secrecy and limited access to fully developed tantric studies. This position can, however, be contrasted with, for example, the uses of texts noted by Cuevas (2003), and further emphasized by Prude (2005:1-3), who suggests more popular use of tantric texts. This view is discussed in detail by others, including Thurman (2006), in his obviously tantric-influenced introduction and translation of the...

  16. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  17. Reactor BR2. Introduction

    Energy Technology Data Exchange (ETDEWEB)

    Gubel, P

    2001-04-01

    The BR2 is a materials testing reactor and is still one of SCK-CEN's important nuclear facilities. After an extensive refurbishment to compensate for the ageing of the installation, the reactor was restarted in April 1997. During the last three years, the availability of the installation was maintained at an average level of 97.6 percent. In the year 2000, the reactor was operated for a total of 104 days at a mean power of 56 MW. In 2000, most irradiation experiments were performed in the CALLISTO PWR loop. The report describes irradiations achieved or under preparation in 2000, including the development of advanced facilities and concept studies for new programmes. An overview of the scientific irradiation programmes as well as of the R and D programme of the BR2 reactor in 2000 is given.

  18. Reactor Neutrino Spectra

    CERN Document Server

    Hayes, A C

    2016-01-01

    We present a review of the antineutrino spectra emitted from reactors. Knowledge of these spectra and their associated uncertainties is crucial for neutrino oscillation studies. The spectra used to date have been determined either by conversion of measured electron spectra to antineutrino spectra or by summing over all of the thousands of transitions that make up the spectra, using modern databases as input. The uncertainties in the subdominant corrections to beta-decay plague both methods, and we provide estimates of these uncertainties. Improving on current knowledge of the antineutrino spectra from reactors will require new experiments. Such experiments would also address the so-called reactor neutrino anomaly and the possible origin of the shoulder observed in the antineutrino spectra measured in recent high-statistics reactor neutrino experiments.
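    The summation method mentioned above is, at its core, bookkeeping: the total spectrum is the intensity-weighted sum of thousands of individual beta branches. A toy Python sketch with three invented branches and a bare allowed spectral shape (no Fermi function or subdominant corrections, so purely illustrative of the bookkeeping, not of reactor precision):

        import numpy as np

        M_E = 0.511  # electron mass, MeV

        def allowed_beta_shape(t_mev, q_mev):
            # Unnormalised allowed shape: p * E_total * (Q - T)^2.
            e_tot = t_mev + M_E
            p = np.sqrt(np.maximum(e_tot**2 - M_E**2, 0.0))
            return np.where(t_mev < q_mev,
                            p * e_tot * (q_mev - t_mev)**2, 0.0)

        # Invented branches: (endpoint Q in MeV, branch intensity).
        branches = [(8.0, 0.05), (5.5, 0.35), (3.0, 0.60)]

        t = np.linspace(0.0, 9.0, 901)   # electron kinetic energy grid
        dt = t[1] - t[0]
        total = sum(w * allowed_beta_shape(t, q) for q, w in branches)
        total /= total.sum() * dt        # normalise to unit area

        # Each branch's antineutrino carries E_nu ~ Q - T, so the same
        # sum, branch by branch, yields the antineutrino spectrum.
        print(f"mean electron energy: {(t * total).sum() * dt:.2f} MeV")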

  19. New reactor type proposed

    CERN Multimedia

    2003-01-01

    "Russian scientists at the Research Institute of Nuclear Power Engineering in Moscow are hoping to develop a new reactor that will use lead and bismuth as fuel instead of uranium and plutonium" (1/2 page).

  20. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  1. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-03-20

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  2. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  3. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data, such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  4. Review Study of Mining Big Data

    Directory of Open Access Journals (Sweden)

    Mohammad Misagh Javaherian

    2016-06-01

    Full Text Available Big data is a term for extensive and complex data sets that include both structured and unstructured information. Data can come from everywhere: sensors collecting environmental data, social networking sites, digital images and recordings, and so on. Valuable knowledge can be extracted from big data using data mining, a method for finding interesting patterns and logical models in data at large scale. This article presents the types of big data and the open problems in the field in the form of a chart, and analyses issues with data-centred models for big data.

  5. Helias reactor studies

    Energy Technology Data Exchange (ETDEWEB)

    Beidler, C.D. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Grieger, G. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Harmeyer, E. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Kisslinger, J. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Karulin, N. [Nuclear Fusion Institute, Moscow (Russian Federation); Maurer, W. [Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany); Nuehrenberg, J. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Rau, F. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Sapper, J. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany); Wobig, H. [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany)

    1995-10-01

    The present status of Helias reactor studies is characterised by the identification and investigation of specific issues which result from the particular properties of this type of stellarator. On the technical side these are issues related to the coil system, while physics studies have concentrated on confinement, alpha-particle behaviour and ignition conditions. The usual assumptions have been made in those fields which are common to all toroidal fusion reactors: blanket and shield, refuelling and exhaust, safety and economic aspects. For the blanket and shield sufficient space has been provided; a detailed concept will be developed in future. To date more emphasis has been placed on scoping and parameter studies as opposed to fixing a specific set of parameters and providing a detailed point study. One result of the Helias reactor studies is that the physical dimensions are of the same order as those of tokamak reactors. However, it should be noted that this comparison is difficult in view of the large spectrum of tokamak reactors, ranging from a small reactor like Aries to a large device such as SEAFP. The notion that the large aspect ratio of 10 or more in Helias configurations also leads to large reactors is misleading, since the large major radius of 22 m is compensated by the average plasma radius of 1.8 m and the average coil radius of 5 m. The plasma volume of 1400 m{sup 3} is about the same as that of the ITER reactor, and the magnetic energy of the coil system is about the same as or even slightly smaller than that envisaged in ITER. (orig.)
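
    As a quick plausibility check on the quoted dimensions (a back-of-envelope sketch using the standard torus volume formula; the cross-section of a Helias plasma is actually non-circular, so this is only approximate):

```latex
% Torus volume from the quoted Helias parameters (R = 22 m, <a> = 1.8 m):
V \approx 2\pi^2 R \langle a \rangle^2
  = 2\pi^2 \times 22\,\mathrm{m} \times (1.8\,\mathrm{m})^2
  \approx 1.4 \times 10^{3}\,\mathrm{m}^3
```

    This reproduces the quoted plasma volume of 1400 m{sup 3}.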

  6. Future Reactor Experiments

    OpenAIRE

    He, Miao

    2013-01-01

    The measurement of the neutrino mixing angle $\\theta_{13}$ opens a gateway for the next generation experiments to measure the neutrino mass hierarchy and the leptonic CP-violating phase. Future reactor experiments will focus on mass hierarchy determination and the precision measurement of mixing parameters. Mass hierarchy can be determined from the disappearance of reactor electron antineutrinos based on the interference effect of two separated oscillation modes. Relative and absolute measure...
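
    The "interference effect of two separated oscillation modes" can be made concrete with the standard three-flavor survival probability for reactor electron antineutrinos (a textbook expression, not a formula taken from this paper), with $\Delta_{ij} \equiv \Delta m^2_{ij} L / 4E$:

```latex
P(\bar\nu_e \to \bar\nu_e)
  = 1 - \cos^4\theta_{13}\,\sin^2 2\theta_{12}\,\sin^2\Delta_{21}
      - \sin^2 2\theta_{13}\left(\cos^2\theta_{12}\,\sin^2\Delta_{31}
      + \sin^2\theta_{12}\,\sin^2\Delta_{32}\right)
```

    The mass hierarchy enters through the small splitting between $\Delta_{31}$ and $\Delta_{32}$, which shifts the interference pattern of the two fast oscillation modes.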

  7. Reactor Neutrino Experiments

    OpenAIRE

    Cao, Jun

    2007-01-01

    Precisely measuring $\theta_{13}$ is one of the highest priorities in neutrino oscillation studies. Reactor experiments can cleanly determine $\theta_{13}$. Past reactor neutrino experiments are reviewed and the status of the next generation of precision $\theta_{13}$ experiments is presented. Daya Bay is designed to measure $\sin^2 2\theta_{13}$ to better than 0.01, while Double Chooz and RENO are designed to measure it to 0.02-0.03. All are heading towards full operation in 2010. Recent improvements in neutrino moment measu...

  8. Department of Reactor Technology

    DEFF Research Database (Denmark)

    Risø National Laboratory, Roskilde

    The general development of the Department of Reactor Technology at Risø during 1981 is presented, and the activities within the major subject fields are described in some detail. Lists of staff, publications, and computer programs are included.

  9. Moon base reactor system

    Science.gov (United States)

    Chavez, H.; Flores, J.; Nguyen, M.; Carsen, K.

    1989-01-01

    The objective of our reactor design is to supply a lunar-based research facility with 20 MW(e). The fundamental layout of this lunar-based system includes the reactor, power conversion devices, and a radiator. The additional aim of this reactor is a longevity of 12 to 15 years. The reactor is a liquid metal fast breeder that has a breeding ratio very close to 1.0. The geometry of the core is cylindrical. The metallic fuel rods are of beryllium oxide enriched with varying degrees of uranium, with a beryllium core reflector. The liquid metal coolant chosen was natural lithium. After the liquid metal coolant leaves the reactor, it goes directly into the power conversion devices. The power conversion devices are Stirling engines. The heated coolant acts as a hot reservoir to the device. It then enters the radiator to be cooled and reenters the Stirling engine acting as a cold reservoir. The engines' operating fluid is helium, a highly conductive gas. These Stirling engines are hermetically sealed. Although natural lithium produces a lower breeding ratio, it does have a larger temperature range than sodium. It is also corrosive to steel. This is why the container material must be carefully chosen. One option is to use an expensive alloy of cerbium and zirconium. The radiator must be made of a highly conductive material whose melting point temperature is not exceeded in the reactor and whose structural strength can withstand meteor showers.

  10. Simultaneous determination of ten preservatives in ten kinds of foods by micellar electrokinetic chromatography.

    Science.gov (United States)

    Ding, Xiao-Jing; Xie, Na; Zhao, Shan; Wu, Yu-Chen; Li, Jiang; Wang, Zhi

    2015-08-15

    An improved micellar electrokinetic capillary chromatography (MEKC) method for the simultaneous determination of ten preservatives in ten different kinds of food samples is reported. An uncoated fused-silica capillary with 50 μm i.d. and 70 cm total length was used. Under the optimized conditions, a linear response was observed in the range of 1.2-200 mg/L for the analytes. Limits of detection (LOD, S/N=3) ranging from 0.4 to 0.5 mg/L and limits of quantitation (LOQ, S/N=10) ranging from 1.2 to 1.5 mg/L were obtained. The method was used for the determination of sorbic and benzoic acids in two FAPAS® (Food Analysis Performance Assessment Scheme) proficiency test samples (jam and chocolate cake). The results showed that the current method, with simple sample pretreatment and small reagent consumption, could meet the needs for routine analysis of the ten preservatives in ten types of food products.
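
    The quoted limits follow the usual signal-to-noise convention, LOD = 3s/m and LOQ = 10s/m for baseline noise s and calibration slope m. A minimal Python sketch of that arithmetic (the calibration data and noise level below are invented for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical calibration: peak area vs. concentration (mg/L)
conc = np.array([1.5, 5.0, 20.0, 50.0, 100.0, 200.0])
area = np.array([3.1, 10.2, 40.8, 101.5, 203.0, 405.9])

slope, intercept = np.polyfit(conc, area, 1)  # linear calibration fit
noise_sd = 0.4                                # assumed baseline noise (area units)

lod = 3 * noise_sd / slope    # concentration giving S/N = 3
loq = 10 * noise_sd / slope   # concentration giving S/N = 10
print(f"LOD ~ {lod:.2f} mg/L, LOQ ~ {loq:.2f} mg/L")
```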

  11. Hierarchical relationship between the broad traits of the Big Five

    Directory of Open Access Journals (Sweden)

    Cristiano Mauro Assis Gomes

    2012-01-01

    Full Text Available The Big Five model sustains that human personality is composed of dozens of specific factors. Despite this diversity, the specific factors are integrated in five broad traits that sit at the same hierarchical level. The current study presents an alternative hypothesis, arguing that there are hierarchical levels between the broad traits of the model. Six hundred and eighty-four junior and high school students from 10 to 18 years old (M = 13.71, SD = 2.11) of a private school in the city of Belo Horizonte, Minas Gerais, Brazil, participated in the study. The Big Five was measured by the Inventory of Personality Traits, initially named the Personality Adjective Inventory, elaborated by Pinheiro, Gomes and Braga (2009). This instrument measures eight of the ten polarities present in the five broad traits of the Big Five. Two models were compared via path analysis: a model with four hierarchical levels and a non-hierarchical model. The hierarchical model showed an adequate fit to the data and proved superior to the non-hierarchical model, which did not fit the data. Implications for the Big Five model are discussed.

  12. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid-technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became the unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...
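
    The packet-retry analogy can be illustrated with a generic retry wrapper around a flaky job (a sketch only; `run_job`, the failure rate, and the retry limit are invented, not ATLAS production code):

```python
import random
import time

def run_job(job_id: int) -> str:
    """Stand-in for a Grid job that fails transiently ~30% of the time."""
    if random.random() < 0.3:
        raise RuntimeError(f"transient failure in job {job_id}")
    return f"output of job {job_id}"

def run_with_retries(job_id: int, max_retries: int = 5) -> str:
    """Re-submit a failed job, like re-sending a dropped TCP/IP packet."""
    for attempt in range(1, max_retries + 1):
        try:
            return run_job(job_id)
        except RuntimeError:
            time.sleep(0.1 * attempt)  # simple backoff before re-submission
    raise RuntimeError(f"job {job_id} failed {max_retries} times")

# The task (many jobs) succeeds as long as each job eventually succeeds.
results = [run_with_retries(job_id) for job_id in range(10)]
print(len(results), "jobs completed")
```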

  13. Evidence of the Big Fix

    CERN Document Server

    Hamada, Yuta; Kawana, Kiyoharu

    2014-01-01

    We give evidence of the Big Fix. The theory of wormholes and multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by the experimental data, and we show that it is indeed true for the Higgs vacuum expectation value $v_{h}$. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary $v_{h}$. It turns out that the existence of the atomic nuclei plays a crucial role to maximize the entropy. This is reminiscent of the anthropic principle; however, it is required by the fundamental law in our case.

  14. Evidence of the big fix

    Science.gov (United States)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

    We give evidence of the Big Fix. The theory of wormholes and multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by the experimental data, and we show that it is indeed true for the Higgs vacuum expectation value vh. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary vh. It turns out that the existence of the atomic nuclei plays a crucial role to maximize the entropy. This is reminiscent of the anthropic principle; however, it is required by the fundamental law in our case.

  15. Big Book of Apple Hacks

    CERN Document Server

    Seibold, Chris

    2008-01-01

    Bigger in size, longer in length, broader in scope, and even more useful than our original Mac OS X Hacks, the new Big Book of Apple Hacks offers a grab bag of tips, tricks and hacks to get the most out of Mac OS X Leopard, as well as the new line of iPods, iPhone, and Apple TV. With 125 entirely new hacks presented in step-by-step fashion, this practical book is for serious Apple computer and gadget users who really want to take control of these systems. Many of the hacks take you under the hood and show you how to tweak system preferences, alter or add keyboard shortcuts, mount drives and

  16. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.
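
    The dipole method relies on the first-order Doppler shift of a blackbody seen by a moving observer (standard results, not formulas quoted from the paper):

```latex
% Angular temperature pattern for observer velocity \beta = v/c:
T(\theta) \approx T_0\,(1 + \beta\cos\theta)
% Corresponding dipole intensity signal:
\Delta I_\nu(\theta) \approx
  \left.\frac{\partial B_\nu}{\partial T}\right|_{T_0} T_0\,\beta\cos\theta
```

    Because the dipole signal scales with $\partial B_\nu / \partial T$, its frequency dependence encodes the absolute spectrum, so a distortion like the Woody-Richards excess would leave a measurable imprint.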

  17. Big ideas for psychotherapy training.

    Science.gov (United States)

    Fauth, James; Gates, Sarah; Vinca, Maria Ann; Boles, Shawna; Hayes, Jeffrey A

    2007-12-01

    Research indicates that traditional psychotherapy training practices are ineffective in durably improving the effectiveness of psychotherapists. In addition, the quantity and quality of psychotherapy training research has also been limited in several ways. Thus, based on extant scholarship and personal experience, we offer several suggestions for improving on this state of affairs. Specifically, we propose that future psychotherapy trainings focus on a few "big ideas," target psychotherapist meta-cognitive skills, and attend more closely to the organizational/treatment context in which the training takes place. In terms of future training research, we recommend that researchers include a wider range of intermediate outcomes in their studies, examine the nature of trainee skill development, and investigate the role that organizational/treatment culture plays in terms of the retention of changes elicited by psychotherapy training. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  18. Microsystems - The next big thing

    Energy Technology Data Exchange (ETDEWEB)

    STINNETT,REGAN W.

    2000-05-11

    Micro-Electro-Mechanical Systems (MEMS) is a big name for tiny devices that will soon make big changes in everyday life and the workplace. These and other types of Microsystems range in size from a few millimeters to a few microns, much smaller than a human hair. These Microsystems have the capability to enable new ways to solve problems in commercial applications ranging from automotive, aerospace, telecommunications, manufacturing equipment, medical diagnostics to robotics, and in national security applications such as nuclear weapons safety and security, battlefield intelligence, and protection against chemical and biological weapons. This broad range of applications of Microsystems reflects the broad capabilities of future Microsystems to provide the ability to sense, think, act, and communicate, all in a single integrated package. Microsystems have been called the next silicon revolution, but like many revolutions, they incorporate more elements than their predecessors. Microsystems do include MEMS components fabricated from polycrystalline silicon processed using techniques similar to those used in the manufacture of integrated electrical circuits. They also include optoelectronic components made from gallium arsenide and other semiconducting compounds from the III-V groups of the periodic table. Microsystems components are also being made from pure metals and metal alloys using the LIGA process, which utilizes lithography, etching, and casting at the micron scale. Generically, Microsystems are micron scale, integrated systems that have the potential to combine the ability to sense light, heat, pressure, acceleration, vibration, and chemicals with the ability to process the collected data using CMOS circuitry, execute an electrical, mechanical, or photonic response, and communicate either optically or with microwaves.

  19. Limits on Cosmological Variation of Strong Interaction and Quark Masses from Big Bang Nucleosynthesis, Cosmic, Laboratory and Oklo Data

    CERN Document Server

    Flambaum, V V

    2002-01-01

    Recent data on cosmological variation of the electromagnetic fine structure constant from distant quasar (QSO) absorption spectra have inspired a more general discussion of possible variation of other constants. We discuss variation of strong scale and quark masses. We derive the limits on their relative change from (i) primordial Big-Bang Nucleosynthesis (BBN); (ii) Oklo natural nuclear reactor, (iii) quasar absorption spectra, and (iv) laboratory measurements of hyperfine intervals.

  20. What makes Big Data, Big Data? Exploring the ontological characteristics of 26 datasets

    Directory of Open Access Journals (Sweden)

    Rob Kitchin

    2016-02-01

    Full Text Available Big Data has been variously defined in the literature. In the main, definitions suggest that Big Data possess a suite of key traits: volume, velocity and variety (the 3Vs), but also exhaustivity, resolution, indexicality, relationality, extensionality and scalability. However, these definitions lack ontological clarity, with the term acting as an amorphous, catch-all label for a wide selection of data. In this paper, we consider the question ‘what makes Big Data, Big Data?’, applying Kitchin’s taxonomy of seven Big Data traits to 26 datasets drawn from seven domains, each of which is considered in the literature to constitute Big Data. The results demonstrate that only a handful of datasets possess all seven traits, and some do not possess volume and/or variety. Instead, there are multiple forms of Big Data. Our analysis reveals that the key definitional boundary markers are the traits of velocity and exhaustivity. We contend that Big Data as an analytical category needs to be unpacked, with the genus of Big Data further delineated and its various species identified. It is only through such ontological work that we will gain conceptual clarity about what constitutes Big Data, formulate how best to make sense of it, and identify how it might be best used to make sense of the world.
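
    Operationally, applying such a taxonomy is a simple checklist exercise; a toy sketch (the trait flags and datasets are invented, not Kitchin's actual 26):

```python
# Kitchin's seven Big Data traits, treated as boolean flags per dataset.
TRAITS = ["volume", "velocity", "variety", "exhaustivity",
          "resolution/indexicality", "relationality", "extensionality/scalability"]

datasets = {
    "mobile phone CDRs": dict.fromkeys(TRAITS, True),
    "national census": {**dict.fromkeys(TRAITS, True),
                        "velocity": False, "variety": False},
}

for name, flags in datasets.items():
    held = [t for t in TRAITS if flags[t]]
    label = "all seven traits" if len(held) == len(TRAITS) else f"{len(held)}/7 traits"
    print(f"{name}: {label}")
```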

  1. Reactor Safety Planning for Prometheus Project, for Naval Reactors Information

    Energy Technology Data Exchange (ETDEWEB)

    P. Delmolino

    2005-05-06

    The purpose of this letter is to submit to Naval Reactors the initial plan for the Prometheus project Reactor Safety work. The Prometheus project is currently developing plans for cold physics experiments and reactor prototype tests. These tests and facilities may require safety analysis and siting support. In addition to the ground facilities, the flight reactor units will require unique analyses to evaluate the risk to the public from normal operations and credible accident conditions. This letter outlines major safety documents that will be submitted with estimated deliverable dates. Included in this planning is the reactor servicing documentation and shipping analysis that will be submitted to Naval Reactors.

  2. Transcutaneous Electrical Nerve Stimulation (TENS) for fibromyalgia in adults

    OpenAIRE

    Johnson, MI; Claydon, LS; Herbison, GP; Paley, CA; Jones, G.

    2016-01-01

    This is the protocol for a review and there is no abstract. The objectives are as follows: To assess the analgesic efficacy and adverse events of TENS for fibromyalgia in adults. We will assess TENS on its own or added to usual care in comparisons with placebo (sham) TENS, usual care, or no treatment.

  3. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves - e.g. in social networks - and digitalization keeps increasing. Currently, several new trends towards new data sources and innovative data analysis appear in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  4. How to use Big Data technologies to optimize operations in Upstream Petroleum Industry

    Directory of Open Access Journals (Sweden)

    Abdelkader Baaziz

    2013-12-01

    Full Text Available “Big Data is the oil of the new economy” is the most famous citation of the past three years. It has even been adopted by the World Economic Forum in 2011. In fact, Big Data is like crude oil: it’s valuable, but if unrefined it cannot be used. It must be broken down and analyzed for it to have value. But what about the Big Data generated by the petroleum industry, and particularly its upstream segment? Upstream is no stranger to Big Data. Understanding and leveraging data in the upstream segment enables firms to remain competitive throughout planning, exploration, delineation, and field development. Oil & Gas companies conduct advanced geophysical modeling and simulation to support operations, where 2D, 3D and 4D seismic surveys generate significant data during exploration phases. They closely monitor the performance of their operational assets. To do this, they use tens of thousands of data-collecting sensors in subsurface wells and surface facilities to provide continuous and real-time monitoring of assets and environmental conditions. Unfortunately, this information comes in various and increasingly complex forms, making it a challenge to collect, interpret, and leverage the disparate data. As an example, Chevron’s internal IT traffic alone exceeds 1.5 terabytes a day. Big Data technologies integrate common and disparate data sets to deliver the right information at the appropriate time to the correct decision-maker. These capabilities help firms act on large volumes of data, transforming decision-making from reactive to proactive and optimizing all phases of exploration, development and production. Furthermore, Big Data offers multiple opportunities to ensure safer, more responsible operations. Another invaluable effect would be shared learning. The aim of this paper is to explain how to use Big Data technologies to optimize operations, and how Big Data can help experts make decisions that lead to the desired outcomes. Keywords: Big Data; Analytics
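
    As a toy illustration of the continuous monitoring described above, a rolling aggregate over simulated wellhead sensor readings (the field name, window size, and threshold are invented):

```python
from collections import deque
from statistics import mean

class RollingMonitor:
    """Keep a rolling window of sensor readings and flag drift past a limit."""

    def __init__(self, window: int = 5, limit: float = 250.0):
        self.readings = deque(maxlen=window)  # most recent pressure readings
        self.limit = limit                    # alert threshold (bar), assumed

    def ingest(self, value: float) -> None:
        self.readings.append(value)
        avg = mean(self.readings)
        if avg > self.limit:
            print(f"ALERT: rolling mean {avg:.1f} bar exceeds {self.limit} bar")

monitor = RollingMonitor()
for pressure in [240.0, 245.5, 251.2, 256.8, 262.3, 266.1]:  # simulated feed
    monitor.ingest(pressure)
```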

  5. Choledochal cysts: our ten year experience.

    LENUS (Irish Health Repository)

    Cianci, F

    2012-04-01

    We present our experience in the management of choledochal cysts from 1999 to 2009. A retrospective review of all charts with a diagnosis of choledochal cysts in our institution in this ten-year period. Data was collated using Excel. A total of 17 patients were diagnosed with choledochal cyst: 9 females and 8 males. The average age at diagnosis was 28 months (range from 0 to 9 years). The most common presenting symptoms were obstructive jaundice 6 (35%) and abdominal pain and vomiting 4 (23%). Ultrasound (US) was the initial diagnostic test in all cases with 4 patients requiring further investigations. All patients underwent Roux-en-Y Hepaticojejunostomy. The average length of stay was 11 days. Patients were followed up with Liver Function Tests (LFTS) and US 4-6 weeks post-operatively. Three patients developed complications including post-op collection, high drain output requiring blood transfusion and adhesive bowel obstruction. Our overall experience with choledochal cyst patients has been a positive one with effective management and low complication rates.

  6. Ten years for the public Web

    CERN Multimedia

    2003-01-01

    Ten years ago, CERN issued a statement declaring that a little-known piece of software called the World Wide Web was in the public domain. Nowadays, the Web is an indispensable part of modern communications. The idea for the Web goes back to March 1989, when CERN computer scientist Tim Berners-Lee wrote a proposal for a 'Distributed Information Management System' for the high-energy physics community. The Web was originally conceived and developed to meet the demand for information sharing between scientists working all over the world. There were many obstacles in the 1980s to the effective exchange of information. There was, for example, a great variety of computer and network systems, with hardly any common features. The main purpose of the Web was to allow scientists to access information from any source in a consistent and simple way. By Christmas 1990, Berners-Lee's idea had become the World Wide Web, with its first server and browser running at CERN. Through 1991, the Web spread to other particle physics ...

  7. Comparative Genomics of Ten Solanaceous Plastomes

    Directory of Open Access Journals (Sweden)

    Harpreet Kaur

    2014-01-01

    Full Text Available The availability of complete plastid genomes of ten solanaceous species, Atropa belladonna, Capsicum annuum, Datura stramonium, Nicotiana sylvestris, Nicotiana tabacum, Nicotiana tomentosiformis, Nicotiana undulata, Solanum bulbocastanum, Solanum lycopersicum, and Solanum tuberosum, provided us with an opportunity to conduct their in silico comparative analysis in depth. The size of the complete chloroplast genomes and the LSC and SSC regions of the three Solanum species is comparatively smaller than that of any other species studied to date (exception: the SSC region of A. belladonna). The AT content of coding regions was found to be less than that of noncoding regions. A duplicate copy of the trnH gene in C. annuum and two alternative tRNA genes for proline in D. stramonium were observed for the first time in this analysis. Further, homology searches revealed the presence of an rps19 pseudogene and infA genes in A. belladonna and D. stramonium, a region identical to the rps19 pseudogene in C. annuum, and orthologues of the sprA gene in another six species. Among the eighteen intron-containing genes, 3 genes have two introns and 15 genes have one intron. The longest insertion was found in the accD gene in C. annuum. Phylogenetic analysis using concatenated protein-coding sequences gave two clades, one for Nicotiana species and another for Solanum, Capsicum, Atropa, and Datura.

  8. Ten tips for authors of scientific articles.

    Science.gov (United States)

    Hong, Sung-Tae

    2014-08-01

    Writing a good-quality scientific article takes experience and skill. I propose 'Ten Tips' that may help to improve the quality of manuscripts for scholarly journals. It is advisable to draft the first version of the manuscript and revise it repeatedly for consistency and accuracy of the writing. During drafting and revising, the following tips can be considered: 1) focus on design to have proper content, conclusions, points compliant with the scope of the target journal, an appropriate authors and contributors list, and relevant references from widely visible sources; 2) format the manuscript in accordance with the instructions to authors of the target journal; 3) ensure consistency and logical flow of ideas and scientific facts; 4) provide scientific confidence; 5) make your story interesting for your readers; 6) write short, simple and attractive sentences; 7) bear in mind that properly composed and reflective titles increase the chances of attracting more readers; 8) do not forget that well-structured and readable abstracts improve the citability of your publications; 9) when revising, adhere to the rule of 'First and Last': open your text with a topic paragraph and close it with a resolution paragraph; 10) use connecting words linking sentences within a paragraph by repeating relevant keywords.

  9. Progress Towards Deployable Antineutrino Detectors for Reactor Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Bowden, N; Bernstein, A; Dazeley, S; Keefer, G; Reyna, D; Cabrera-Palmer, B; Kiff, S

    2010-04-05

    Fission reactors emit large numbers of antineutrinos, and this flux may be useful for the measurement of two quantities of interest for reactor safeguards: the reactor's power and its plutonium inventory throughout the cycle. The high antineutrino flux and relatively low background rates mean that simple cubic-meter-scale detectors at tens of meters standoff can record hundreds or thousands of antineutrino events per day. Such antineutrino detectors would add online, quasi-real-time bulk material accountancy to the set of reactor monitoring tools available to the IAEA and other safeguards agencies, with minimal impact on reactor operations. Between 2003 and 2008, our LLNL/SNL collaboration successfully deployed several prototype safeguards detectors at a commercial reactor in order to test both the method and the practicality of its implementation in the field. Partially on the strength of the results obtained from these deployments, an Experts Meeting was convened by the IAEA Novel Technologies Group in 2008 to assess current antineutrino detection technology and examine how it might be incorporated into the safeguards regime. Here we present a summary of our previous deployments and discuss current work that seeks to provide the expanded capabilities suggested by the Experts Panel, in particular aboveground detector operation.

  10. Effects of variation of fundamental constants from Big Bang to atomic clocks

    Science.gov (United States)

    Flambaum, Victor

    2004-05-01

    Theories unifying gravity with other interactions suggest temporal and spatial variation of the fundamental "constants" in the expanding Universe. I discuss effects of the variation of the fine structure constant, the strong interaction, the quark masses and the gravitational constant. The measurements of these variations cover the lifespan of the Universe from a few minutes after the Big Bang to the present time and give controversial results. There are some hints of variation in Big Bang nucleosynthesis, quasar absorption spectra and Oklo natural nuclear reactor data. A very promising method to search for variation of the fundamental constants consists in comparing different atomic clocks. A billion-fold enhancement of the variation effects occurs in transitions between accidentally degenerate atomic energy levels.

  11. Scaleable, High Efficiency Microchannel Sabatier Reactor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A Microchannel Sabatier Reactor System (MSRS) consisting of cross connected arrays of isothermal or graded temperature reactors is proposed. The reactor array...

  12. Modeling, simulation, and analysis of a reactor system for the generation of white liquor of a pulp and paper industry

    Directory of Open Access Journals (Sweden)

    Ricardo Andreola

    2011-02-01

    Full Text Available An industrial system for the production of white liquor at a pulp and paper mill, Klabin Paraná Papéis, formed by ten reactors, was modeled, simulated, and analyzed. The developed model considered possible water losses by evaporation and reaction, in addition to variations in the volumetric flow of lime mud across the reactors due to composition variations. The model predictions agreed well with the process measurements at the plant, and the results showed that the slaking reaction was nearly complete at the third causticizing reactor, while causticizing ended by the seventh reactor. Water loss due to the slaking reaction and evaporation occurred more pronouncedly in the slaker reactor than in the final causticizing reactors; nevertheless, the lime mud flow remained nearly constant across the reactors.
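
    A generic way to simulate such a reactor train is as continuous stirred-tank reactors (CSTRs) in series (a sketch only, not the authors' model: the first-order kinetics, volume, flow, and rate constant below are invented, and the sketch ignores the water losses and flow variations the paper accounts for):

```python
import numpy as np
from scipy.integrate import solve_ivp

N = 10        # reactors in series, as in the plant described above
V = 50.0      # reactor volume, m^3 (assumed)
Q = 5.0       # volumetric flow, m^3/min (assumed constant here)
k = 0.15      # first-order reaction rate constant, 1/min (assumed)
c_in = 1.0    # normalized feed concentration of unreacted lime

def dcdt(t, c):
    """Mole balance per CSTR: accumulation = inflow - outflow - reaction."""
    upstream = np.concatenate(([c_in], c[:-1]))
    return (Q / V) * (upstream - c) - k * c

sol = solve_ivp(dcdt, (0.0, 300.0), np.zeros(N))
print("steady-state exit concentrations:", np.round(sol.y[:, -1], 3))
```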

  13. Role of nanocrystalline silver dressings in the management of toxic epidermal necrolysis (TEN) and TEN/Stevens-Johnson syndrome overlap.

    Science.gov (United States)

    Smith, Saxon D; Dodds, Annabel; Dixit, Shreya; Cooper, Alan

    2015-11-01

    Toxic epidermal necrolysis (TEN) and Stevens-Johnson syndrome (SJS) are severe mucocutaneous eruptions. There is currently no defined optimal approach to wound care. The objective of this study was to evaluate silver dressings in the wound-care management of TEN and SJS/TEN syndrome overlap with a retrospective case review of nine patients with TEN and SJS/TEN overlap presenting to our institution. Nanocrystalline silver dressings appear to be useful in the rapid commencement of healing in these patients. TEN and SJS/TEN overlap are rare conditions. This contributed to a relatively small number of cases included in the study. The ease of application, antimicrobial properties and low frequency of change make nanocrystalline silver dressings ideal in TEN/SJS.

  14. Big data as governmentality in international development

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    2017-01-01

    Statistics have long shaped the field of visibility for the governance of development projects. The introduction of big data has altered the field of visibility. Employing Dean's "analytics of government" framework, we analyze two cases—malaria tracking in Kenya and monitoring of food prices in Indonesia. Our analysis shows that big data introduces a bias toward particular types of visualizations. What problems are being made visible through big data depends to some degree on how the underlying data is visualized and who is captured in the visualizations. It is also influenced by technical factors...

  15. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
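
    The MapReduce schema mentioned above fits in a few lines of plain Python (a didactic sketch of the programming model, not Hadoop itself):

```python
from collections import defaultdict
from itertools import chain

documents = ["big data needs big storage", "data beats intuition"]

# Map: emit (word, 1) pairs from every document.
mapped = chain.from_iterable(((w, 1) for w in doc.split()) for doc in documents)

# Shuffle: group the intermediate pairs by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate the values for each key.
counts = {key: sum(values) for key, values in groups.items()}
print(counts)  # e.g. {'big': 2, 'data': 2, 'needs': 1, ...}
```

    In Hadoop or Spark the same map and reduce functions run in parallel across a cluster; the framework handles the shuffle, storage, and fault tolerance.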

  16. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  17. BLENDING IOT AND BIG DATA ANALYTICS

    OpenAIRE

    Tulasi.B*; Girish J Vemulkar

    2016-01-01

    The Internet is continuously evolving and changing. The Internet of Things (IoT) can be considered the future of Internet applications, which involves machine-to-machine (M2M) learning. Actionable intelligence can be derived through the fusion of Big Data and real-time analytics with IoT. Big Data and IoT can be viewed as two sides of a coin. With the connection between Big Data and the objects on the Internet, the benefits of IoT can be easily reaped. The applications of IoT spread across various domains l...

  18. Big data and the electronic health record.

    Science.gov (United States)

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization.
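
    As a toy illustration of pairing free-text reports with structured results (a regex sketch, far simpler than production clinical NLP; the note text and field names are invented):

```python
import re

note = "Patient reports chest pain. Echo shows ejection fraction 35%."
structured = {"patient_id": 1234, "troponin_ng_ml": 0.08}

# Pull a numeric finding out of the narrative report...
match = re.search(r"ejection fraction (\d+)\s*%", note)
ejection_fraction = int(match.group(1)) if match else None

# ...and merge it with the structured results for downstream analysis.
record = {**structured,
          "ejection_fraction_pct": ejection_fraction,
          "mentions_chest_pain": "chest pain" in note.lower()}
print(record)
```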

  19. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  20. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested, than can be collected by a single investigator's or a single group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  1. "Take ten minutes": a dedicated ten minute medication review reduces polypharmacy in the elderly.

    LENUS (Irish Health Repository)

    Walsh, E K

    2010-09-01

    Multiple and inappropriate medications are often the cause for poor health status in the elderly. Medication reviews can improve prescribing. This study aimed to determine if a ten minute medication review by a general practitioner could reduce polypharmacy and inappropriate prescribing in elderly patients. A prospective, randomised study was conducted. Patients over the age of 65 (n = 50) underwent a 10-minute medication review. Inappropriate medications, dosage errors, and discrepancies between prescribed versus actual medication being consumed were recorded. A questionnaire to assess satisfaction was completed following review. The mean number of medications taken by patients was reduced (p < 0.001). A medication was stopped in 35 (70%) patients. Inappropriate medications were detected in 27 (54%) patients and reduced (p < 0.001). Dose errors were detected in 16 (32%). A high level of patient satisfaction was reported. A ten minute medication review reduces polypharmacy, improves prescribing and is associated with high levels of patient satisfaction.

  2. "Take ten minutes": a dedicated ten minute medication review reduces polypharmacy in the elderly.

    LENUS (Irish Health Repository)

    Walsh, E K

    2012-02-01

    Multiple and inappropriate medications are often the cause for poor health status in the elderly. Medication reviews can improve prescribing. This study aimed to determine if a ten minute medication review by a general practitioner could reduce polypharmacy and inappropriate prescribing in elderly patients. A prospective, randomised study was conducted. Patients over the age of 65 (n = 50) underwent a 10-minute medication review. Inappropriate medications, dosage errors, and discrepancies between prescribed versus actual medication being consumed were recorded. A questionnaire to assess satisfaction was completed following review. The mean number of medications taken by patients was reduced (p < 0.001). A medication was stopped in 35 (70%) patients. Inappropriate medications were detected in 27 (54%) patients and reduced (p < 0.001). Dose errors were detected in 16 (32%). A high level of patient satisfaction was reported. A ten minute medication review reduces polypharmacy, improves prescribing and is associated with high levels of patient satisfaction.

  3. LMFBR type reactor

    Energy Technology Data Exchange (ETDEWEB)

    Shimizu, Takeshi; Iida, Masaaki; Moriki, Yasuyuki

    1994-10-18

    A reactor core is divided into a plurality of coolant flowrate regions, and electromagnetic pumps dedicated to each of the flowrate regions are disposed to distribute coolant flowrates in the reactor core. Further, the flowrate of each of the electromagnetic pumps is automatically controlled depending on signals from a temperature detector disposed at the exit of the reactor core, so that the flowrate of each region can be controlled optimally depending on the burnup of the reactor core fuels. Since the electromagnetic pumps disposed for each divided region are controlled individually, a coolant flowrate distribution suitable to each of the regions can be attained. Fuel design margins are reduced, fuel is used effectively, and operating efficiency is improved. Moreover, since an electromagnetic pump has less flow resistance than a mechanical pump, and the flow resistance of the reactor core flowrate control mechanism is eliminated, a greater circulating flowrate can be ensured after an accident under natural convection, with the buoyancy of the coolant available for after-heat removal serving as the driving force. (N.H.).

  4. A design study of reactor core optimization for direct nuclear heat-to-electricity conversion in a space power reactor

    Energy Technology Data Exchange (ETDEWEB)

    Yoshikawa, Hidekazu; Takahashi, Makoto; Shimoda, Hiroshi; Takeoka, Satoshi [Kyoto Univ. (Japan); Nakagawa, Masayuki; Kugo, Teruhiko

    1998-01-01

    To propose a new design concept for a nuclear reactor to be used in space, research has been conducted on the conceptual design of a new nuclear reactor on the basis of the following three main concepts: (1) thermionic generation by thermionic fuel elements (TFE), (2) reactivity control by a rotary reflector, and (3) reactor cooling by liquid metal. The outcomes of the research are: (1) A calculation algorithm was derived for obtaining converged conditions by iterating between the nuclear characteristics calculation and the thermal flow characteristics calculation for the space nuclear reactor. (2) Use of this algorithm and a parametric study established that a space nuclear reactor using 97% enriched uranium nitride as the fuel and lithium as the coolant, having a core with a radius of about 25 cm, a height of about 50 cm and a generation efficiency of about 7%, can probably be operated continuously for more than ten years at 100 kW solely by reactivity control with the rotary reflector. (3) A new CAD/CAE system was developed to support the comprehensive optimization of the core characteristics of the space nuclear reactor. It is composed of the integrated design support system VINDS, which uses virtual reality, and the distributed system WINDS, which supports collaborative design work over the Internet. (N.H.)
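
    Outcome (1) describes a classic coupled-physics fixed-point iteration; schematically (the two solver functions below are crude placeholders, not the actual neutronics and thermal-flow codes):

```python
def neutronics(power, temps):
    """Placeholder: updated power shape given temperatures (toy feedback)."""
    return [0.5 * p + 0.5 * (2.0 - 0.001 * t) for p, t in zip(power, temps)]

def thermal_flow(power):
    """Placeholder: coolant/fuel temperatures given the power shape."""
    return [600.0 + 200.0 * p for p in power]

power = [1.0] * 10  # initial guess over 10 core nodes
for iteration in range(1, 101):
    temps = thermal_flow(power)
    new_power = neutronics(power, temps)
    residual = max(abs(a - b) for a, b in zip(new_power, power))
    power = new_power
    if residual < 1e-6:  # converged: the two fields are mutually consistent
        print(f"converged after {iteration} iterations")
        break
```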

  5. The TEN-T core network and the Fehmarnbelt region

    DEFF Research Database (Denmark)

    Guasco, Clement Nicolas

    This note is a snapshot picture, taken in early 2014, that places the Green STRING corridor project within the context of the TEN-T strategy and gives a summarized overview of the impact of this strategy in the region. Chapter 1 contains a summary of the TEN-T strategy today, chapter 2 presents the sources used for this note, chapter 3 presents all the relevant EU regulations with direct impact on the development of TEN-T corridors, chapter 4 gives practical examples of the challenges for the development of TEN-T corridors, chapter 5 presents the national initiatives related to the TEN-T corridor...

  6. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  7. Reactor Structural Materials: Reactor Pressure Vessel Steels

    Energy Technology Data Exchange (ETDEWEB)

    Chaouadi, R

    2000-07-01

    The objectives of SCK-CEN's R and D programme on Reactor Pressure Vessel (RPV) Steels are: (1) to complete the fracture toughness data bank of various reactor pressure vessel steels by using precracked Charpy specimens that were tested statically as well as dynamically; (2) to implement the enhanced surveillance approach in a user-friendly software package; (3) to improve the existing reconstitution technology by reducing the input energy (short-cycle welding) and modifying the stud geometry. Progress and achievements in 1999 are reported.

  8. Big Bang–Big Crunch Optimization Algorithm for Linear Phase Fir Digital Filter Design

    Directory of Open Access Journals (Sweden)

    Ms. Rashmi Singh Dr. H. K. Verma

    2012-02-01

    Full Text Available The Big Bang–Big Crunch (BB–BC) optimization algorithm is a new optimization method that relies on the Big Bang and Big Crunch theory, one of the theories of the evolution of the universe. In this paper, a Big Bang–Big Crunch algorithm has been used for the design of linear-phase finite impulse response (FIR) filters. The fitness function employed is based on the mean squared error between the actual and the ideal filter response. This paper presents plots of the magnitude response of the FIR filters and error graphs. The BB–BC approach seems to be a promising tool for FIR filter design, especially in a dynamic environment where filter coefficients have to be adapted and fast convergence is of importance.
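
    A compact sketch of the BB–BC loop applied to a linear-phase FIR design (illustrative only: the filter length, band edge, population size, and shrinkage schedule are arbitrary choices, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)
N_TAPS, POP, GENS = 21, 40, 200
w = np.linspace(0.0, np.pi, 128)
desired = (w <= 0.4 * np.pi).astype(float)  # ideal lowpass magnitude response

def amplitude(h):
    # Even-symmetric linear-phase FIR: A(w) = h[0] + 2*sum_k h[k]*cos(k*w)
    ks = np.arange(1, len(h))
    return h[0] + 2.0 * np.cos(np.outer(w, ks)) @ h[1:]

def fitness(h):
    return np.mean((amplitude(h) - desired) ** 2)  # MSE vs. ideal response

dim = N_TAPS // 2 + 1                      # independent coefficients
pop = rng.uniform(-0.5, 0.5, (POP, dim))   # initial Big Bang population
for gen in range(1, GENS + 1):
    costs = np.array([fitness(p) for p in pop])
    weights = 1.0 / (costs + 1e-12)
    center = weights @ pop / weights.sum()  # Big Crunch: fitness-weighted center
    sigma = 0.5 / gen                       # search radius shrinks each cycle
    pop = center + sigma * rng.standard_normal((POP, dim))  # next Big Bang
    pop[0] = center                         # retain the current center

print("final MSE:", fitness(pop[0]))
```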

  9. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  10. NOAA Big Data Partnership RFI

    Science.gov (United States)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  11. Thermionic Reactor Design Studies

    Energy Technology Data Exchange (ETDEWEB)

    Schock, Alfred

    1994-08-01

    Paper presented at the 29th IECEC in Monterey, CA in August 1994. The present paper describes some of the author's conceptual designs and their rationale, and the special analytical techniques developed to analyze their (thermionic reactor) performance. The basic designs, first published in 1963, are based on single-cell converters, either double-ended diodes extending over the full height of the reactor core or single-ended diodes extending over half the core height. In that respect they are similar to the thermionic fuel elements employed in the Topaz-2 reactor subsequently developed in the Soviet Union, copies of which were recently imported by the U.S. As in the Topaz-2 case, electrically heated steady-state performance tests of the converters are possible before fueling.

  12. Nuclear Rocket Engine Reactor

    CERN Document Server

    Lanin, Anatoly

    2013-01-01

    The development of a nuclear rocket engine reactor (NRER) is presented in this book. The working capacity of an NRER active zone under mechanical and thermal load, intensive neutron fluxes, and high energy generation (up to 30 MW/l, assuming "MBT" transliterates the Russian abbreviation for megawatt) in a working medium (hydrogen) at temperatures up to 3100 K is displayed. Design principles and the bearing capacity of reactors are discussed on the basis of simulation experiments and test data from a prototype reactor. Property data of dense constructional, porous thermal-insulating and fuel materials like carbide and uranium carbide compounds in the temperature interval 300 - 3000 K are presented. Technological aspects of strength and thermal strength resistance of materials are considered. The design procedure for possible emergency processes in the NRER is developed and the risks of their occurrence are evaluated. Prospects of NRER development for pilotless space devices and piloted interplanetary ships are reviewed.

  13. Operation of Reactor

    Institute of Scientific and Technical Information of China (English)

    1996-01-01

3.1 Annual Report of SPR Operation (Chu Shaochu). Under oversight by the National Nuclear Safety Administration and specialists, the reactor was restarted successfully after its safety renovation on April 16, 1996. In August 1996 the normal operation of SPR was approved by the authorities of the National Nuclear Safety Administration. 1 Operation status: In 1996, the reactor operated safely for 40 d and the energy released was about 137.3 MW·d. The operation status of SPR is shown in table 1. The reactor started up to higher power (more than 1 MW) 4 times and to lower power (for physics experiments) 14 times. Measurements of control-rod efficiency and other measurement tasks were carried out 2 times and 5 times respectively.
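
    A quick sanity check of the reported figures (my arithmetic, not part of the report): releasing 137.3 MW·d over 40 d at power corresponds to an average power of roughly 3.4 MW.

        # Quick check (my arithmetic, not from the report):
        energy_mwd = 137.3        # energy released, MW*d
        days_at_power = 40.0      # days of safe operation
        print(f"average power ~ {energy_mwd / days_at_power:.2f} MW")  # ~3.43 MW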

  14. 6 Top Tools for Taming Big Data

    Institute of Scientific and Technical Information of China (English)

    Jakob Björklund

    2012-01-01

    The industry now has a buzzword, "big data," for how we're going to do something with the huge amount of information piling up. "Big data" is replacing "business intelligence," which subsumed "reporting," which put a nicer gloss on "spreadsheets," which beat out the old-fashioned "printouts." Managers who long ago studied printouts are now hiring mathematicians who claim to be big data specialists to help them solve the same old problem: what's selling and why?

  15. "Big Data" : big gaps of knowledge in the field of internet science

    OpenAIRE

    Snijders, CCP Chris; Matzat, U Uwe; Reips, UD

    2012-01-01

    Research on so-called 'Big Data' has received a considerable momentum and is expected to grow in the future. One very interesting stream of research on Big Data analyzes online networks. Many online networks are known to have some typical macro-characteristics, such as 'small world' properties. Much less is known about underlying micro-processes leading to these properties. The models used by Big Data researchers usually are inspired by mathematical ease of exposition. We propose to follow in...

  16. BDGS: A Scalable Big Data Generator Suite in Big Data Benchmarking

    OpenAIRE

    Ming, Zijian; Luo, Chunjie; Gao, Wanling; Han, Rui; Yang, Qiang; Wang, Lei; Zhan, Jianfeng

    2014-01-01

    Data generation is a key issue in big data benchmarking that aims to generate application-specific data sets to meet the 4V requirements of big data. Specifically, big data generators need to generate scalable data (Volume) of different types (Variety) under controllable generation rates (Velocity) while keeping the important characteristics of raw data (Veracity). This gives rise to various new challenges about how we design generators efficiently and successfully. To date, most existing tec...

  17. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Science.gov (United States)

    2011-02-11

    ... December 1, 2010, the date that Big Rivers integrated its transmission facilities with the Midwest... Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its...

  18. An Overview of Reactor Concepts, a Survey of Reactor Designs.

    Science.gov (United States)

    1985-02-01

    Public Affairs Office and is releasable to the National Technical Information Service (NTIS). At NTIS, it will be available to the general public... Reactors that use deuterium (heavy water) as a coolant can use natural uranium as a fuel. The Canadian reactor, CANDU, utilizes this concept... reactor core at the top and discharged at the bottom while the reactor is in operation. The discharged fuel can then be inspected to see if it can be used

  19. Oscillatory flow chemical reactors

    Directory of Open Access Journals (Sweden)

    Slavnić Danijela S.

    2014-01-01

    Full Text Available Global market competition, increases in energy and other production costs, and demands for high-quality products and reduction of waste are forcing the pharmaceutical, fine chemicals and biochemical industries to search for radical solutions. One of the most effective ways to improve overall production (cost reduction and better control of reactions) is a transition from batch to continuous processes. However, the reactions of interest to these industry sectors are often slow, so continuous tubular reactors would be impractically long for flow regimes which provide sufficient heat and mass transfer and a narrow residence time distribution. Oscillatory flow reactors (OFR) are a newer type of tubular reactor which can offer a solution by providing continuous operation with an approximately plug flow pattern, low shear rates and enhanced mass and heat transfer. These benefits are the result of very good mixing in the OFR achieved by vortex generation. An OFR consists of a cylindrical tube containing equally spaced orifice baffles. Fluid oscillations are superimposed on a net (laminar) flow. Eddies are generated when the oscillating fluid collides with the baffles and passes through the orifices. Generation and propagation of vortices create uniform mixing in each reactor cavity (between baffles), providing an overall flow pattern which is close to plug flow. Oscillations can be created by direct action of a piston or a diaphragm on the fluid (or alternatively on the baffles). This article provides an overview of oscillatory flow reactor technology, its operating principles and basic design and scale-up characteristics. Further, the article reviews the key research findings in heat and mass transfer, shear stress, and residence time distribution in OFR, presenting their advantages over conventional reactors. Finally, relevant process intensification examples from the pharmaceutical, polymer and biofuels industries are presented.
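
    To make the design variables concrete, here is a minimal sketch (my illustration, not from the article) of the dimensionless groups commonly used to characterize mixing in an OFR; the symbols follow the usual OFR literature and all numbers are illustrative.

        import math

        def oscillatory_reynolds(f, x0, rho, mu, d):
            """Oscillatory Reynolds number Re_o = 2*pi*f*x0*rho*d/mu."""
            return 2 * math.pi * f * x0 * rho * d / mu

        def net_flow_reynolds(u, rho, mu, d):
            """Net-flow Reynolds number Re_n = rho*u*d/mu."""
            return rho * u * d / mu

        def strouhal(d, x0):
            """Strouhal number St = d/(4*pi*x0): tube diameter vs. stroke."""
            return d / (4 * math.pi * x0)

        # Illustrative water-like mixture in a 25 mm tube, 2 Hz / 5 mm
        # oscillation, 1 cm/s net flow.
        Re_o = oscillatory_reynolds(2.0, 0.005, 1000.0, 1e-3, 0.025)
        Re_n = net_flow_reynolds(0.01, 1000.0, 1e-3, 0.025)
        print(f"Re_o = {Re_o:.0f}, Re_n = {Re_n:.0f}, ratio = {Re_o / Re_n:.1f}")
        # Good mixing is typically reported for Re_o above ~100 with a
        # velocity ratio Re_o/Re_n of roughly 2-10.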

  20. The Big Five personality dimensions and mental health: The mediating role of alexithymia.

    Science.gov (United States)

    Atari, Mohammad; Yaghoubirad, Mahsa

    2016-12-01

    The role of personality constructs on mental health has attracted research attention in the last few decades. The Big Five personality traits have been introduced as parsimonious dimensions of non-pathological traits. The five-factor model of personality includes neuroticism, agreeableness, conscientiousness, extraversion, and openness to experience. The present study aimed to examine the relationship between the Big Five dimensions and mental health considering the mediating role of alexithymia as an important emotional-processing construct. A total of 257 participants were recruited from non-clinical settings in the general population. All participants completed the Ten-Item Personality Inventory (TIPI), 20-item Toronto Alexithymia Scale (TAS-20), and General Health Questionnaire-28 (GHQ-28). Structural equation modeling was utilized to examine the hypothesized mediated model. Findings indicated that the Big Five personality dimensions could significantly predict scores of alexithymia. Moreover, alexithymia could predict mental health scores as measured by indices of depression, anxiety, social functioning, and somatic symptoms. The fit indices (GFI=0.94; CFI=0.91; TLI=0.90; RMSEA=0.071; CMIN/df=2.29) indicated that the model fits the data. Therefore, the relationship between the Big Five personality dimensions and mental health is mediated by alexithymia.
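
    A minimal sketch (my illustration, not the authors' SEM) of the mediation logic tested here - trait -> alexithymia -> distress - using ordinary least squares on synthetic placeholder data:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 257                                  # sample size reported above
        trait = rng.normal(size=n)               # e.g. a neuroticism score
        alexithymia = 0.5 * trait + rng.normal(size=n)
        distress = 0.6 * alexithymia + rng.normal(size=n)

        def ols_slope(x, y):
            # slope from a one-predictor least-squares fit with intercept
            X = np.column_stack([np.ones_like(x), x])
            return np.linalg.lstsq(X, y, rcond=None)[0][1]

        a = ols_slope(trait, alexithymia)        # path a: trait -> mediator
        b = ols_slope(alexithymia, distress)     # path b: mediator -> outcome
        print(f"indirect (mediated) effect a*b = {a * b:.2f}")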

  1. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world: the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data ranges over data mining, data analysis and decision making, drawing out statistical rules and mathematical patterns through systematic or automated reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing-system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering a big size of data, variety of data and frequent chan...

  2. Scaling big data with Hadoop and Solr

    CERN Document Server

    Karambelkar, Hrishikesh Vijay

    2015-01-01

    This book is aimed at developers, designers, and architects who would like to build big data enterprise search solutions for their customers or organizations. No prior knowledge of Apache Hadoop and Apache Solr/Lucene technologies is required.

  3. Neutrino oscillations and Big Bang Nucleosynthesis

    OpenAIRE

    Bell, Nicole F.

    2001-01-01

    We outline how relic neutrino asymmetries may be generated in the early universe via active-sterile neutrino oscillations. We discuss possible consequences for big bang nucleosynthesis, within the context of a particular 4-neutrino model.

  4. Tick-Borne Diseases: The Big Two

    Science.gov (United States)

    Feature: Ticks and Diseases. Tick-borne Diseases: The Big Two. ... on the skin where there has been a tick bite. (Photo: CDC/James Gathany) Lyme disease ...

  5. Big Fish and Prized Trees Gain Protection

    Institute of Scientific and Technical Information of China (English)

    Fred Pearce; 吴敏

    2004-01-01

    Decisions made at a key conservation meeting are good news for big and quirky fish and commercially prized trees. Several species will enjoy extra protection against trade following rulings made at the Convention on International Trade in Endangered Species (CITES).

  6. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    Full Text Available In this research, some of the most promising Big Data usage domains are connected with distinguished player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the business ecosystem's major player types disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players were explained through new Big Data opportunities and threats and by players' responsive strategies. System dynamics was used to visualize relationships in the provided model.

  7. ARC Code TI: BigView

    Data.gov (United States)

    National Aeronautics and Space Administration — BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running linux. Additionally, it can work in a multi-screen environment...

  8. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and allows for a focus on the material consequences of big data in society.

  9. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.
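
    For orientation (my addition, not part of the abstract): the loop-quantum-cosmology bounce referred to here is commonly summarized by an effective Friedmann equation in which the Hubble rate vanishes when the matter density reaches a critical value,

        H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right),
        \qquad \rho_c \approx 0.41\,\rho_{\mathrm{Pl}},

    so the classical singularity at \rho \to \infty is replaced by a bounce at \rho = \rho_c.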

  10. Big Data Components for Business Process Optimization

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2016-01-01

    Full Text Available These days more and more people talk about Big Data, Hadoop, noSQL and so on, but very few technical people have the necessary expertise and knowledge to work with those concepts and technologies. The present issue explains one of the concepts that stands behind two of those keywords: the MapReduce model. MapReduce is what makes Big Data and Hadoop so powerful, fast, and diverse for business process optimization. MapReduce is a programming model, with an implementation built to process and generate large data sets. In addition, the benefits of integrating Hadoop in the context of Business Intelligence and Data Warehousing applications are presented. The concepts and technologies behind big data let organizations reach a variety of objectives. Like other new information technologies, the most important objective of big data technology is to bring dramatic cost reduction.
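
    A minimal word-count sketch (my illustration; real Hadoop jobs are written against the Hadoop API, e.g. in Java, or via Hadoop Streaming) of the map and reduce phases described above:

        from itertools import groupby
        from operator import itemgetter

        def mapper(record):
            # map phase: emit a (word, 1) pair for every word in a record
            for word in record.split():
                yield (word.lower(), 1)

        def reducer(key, values):
            # reduce phase: aggregate all counts emitted for one key
            return (key, sum(values))

        records = ["big data is big", "hadoop processes big data"]
        pairs = sorted(p for r in records for p in mapper(r))      # map + shuffle/sort
        counts = [reducer(k, [v for _, v in grp])
                  for k, grp in groupby(pairs, key=itemgetter(0))] # reduce
        print(counts)  # [('big', 3), ('data', 2), ('hadoop', 1), ('is', 1), ...]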

  11. Cosmic relics from the big bang

    Energy Technology Data Exchange (ETDEWEB)

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  12. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  13. Argentine physicists will reproduce the Big Bang (Físicos argentinos reproducirán el Big Bang)

    CERN Multimedia

    De Ambrosio, Martin

    2008-01-01

    Two groups of Argentine physicists from the Universities of La Plata and Buenos Aires are working on a series of experiments that recreate the conditions of the big explosion at the origin of the universe. (1 page)

  14. THE ABRIDGED BIG 5 CIRCUMPLEX (AB5C) MODEL OF TRAIT STRUCTURE - COMPARISONS WITH HEYMANS CUBE, KIESLERS INTERPERSONAL CIRCLE, AND PEABODY AND GOLDBERG DOUBLE CONE MODEL

    NARCIS (Netherlands)

    HOFSTEE, WKB

    1994-01-01

    The Abridged Big Five Circumplex (AB5C) model represents personality traits by their projections on one of the ten two-dimensional slices (circumplexes) of the space formed by the five broad trait dimensions. In retrospect, HEYMANS' Cube may be viewed as an early example of this kind of model. The A

  15. Perspectives on reactor safety

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [New Mexico Univ., Albuquerque, NM (United States). Dept. of Chemical and Nuclear Engineering; Camp, A.L. [Sandia National Labs., Albuquerque, NM (United States)

    1994-03-01

    The US Nuclear Regulatory Commission (NRC) maintains a technical training center at Chattanooga, Tennessee to provide appropriate training to both new and experienced NRC employees. This document describes a one-week course in reactor safety concepts. The course consists of five modules: (1) historical perspective; (2) accident sequences; (3) accident progression in the reactor vessel; (4) containment characteristics and design bases; and (5) source terms and offsite consequences. The course text is accompanied by slides and videos during the actual presentation of the course.

  16. Reactor Materials Research

    Energy Technology Data Exchange (ETDEWEB)

    Van Walle, E

    2002-04-01

    The activities of SCK-CEN's Reactor Materials Research Department for 2001 are summarised. The objectives of the department are: (1) to evaluate the integrity and behaviour of structural materials used in nuclear power industry; (2) to conduct research to unravel and understand the parameters that determine the material behaviour under or after irradiation; (3) to contribute to the interpretation, the modelling of the material behaviour and to develop and assess strategies for optimum life management of nuclear power plant components. The programmes within the department are focussed on studies concerning (1) Irradiation Assisted Stress Corrosion Cracking (IASCC); (2) nuclear fuel; and (3) Reactor Pressure Vessel Steel.

  17. Big bang nucleosynthesis: Present status

    Science.gov (United States)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and provides a 2σ upper limit on Nν, constraining possible new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.
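
    A toy sketch (my illustration, with placeholder numbers) of the kind of likelihood comparison described above: a predicted D/H abundance as a function of baryon density is scored against an observed value on a grid.

        import numpy as np

        def dh_prediction(omega_b_h2):
            # placeholder power-law stand-in for a BBN code's D/H output
            return 2.6e-5 * (0.022 / omega_b_h2) ** 1.6

        dh_obs, dh_err = 2.53e-5, 0.04e-5         # illustrative measurement
        grid = np.linspace(0.018, 0.026, 401)     # baryon density grid
        chi2 = ((dh_prediction(grid) - dh_obs) / dh_err) ** 2
        likelihood = np.exp(-0.5 * chi2)          # Gaussian likelihood on the grid
        print(f"best-fit omega_b h^2 ~ {grid[np.argmax(likelihood)]:.4f}")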

  18. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  19. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  20. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework; (referred to below as the Advanced Concepts component of the Phase I efforts) and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volume of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal productions as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. Overall every sedimentary formation investigated

  1. Astronomical Surveys and Big Data

    CERN Document Server

    Mickaelian, A M

    2015-01-01

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum are reviewed, from gamma-ray to radio: Fermi-GLAST and INTEGRAL in gamma-ray, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and II based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in radio, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS) and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era. Astrophysical Virtual Observatories and Computational Astrophysics play a...

  2. Big-bang nucleosynthesis revisited

    Science.gov (United States)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Yp, is less than or equal to 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio to 2.6 ≤ η₁₀ ≤ 4.3 (η in units of 10⁻¹⁰), which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Yp of 0.24 constrains the number of light neutrinos to Nν ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Yp ≤ 0.245.
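
    A quick check (my addition) of how the quoted η range maps onto the baryon density fraction, using the standard conversion Ω_b h² ≈ η₁₀/274 and a range of Hubble parameters h = H₀/(100 km/s/Mpc); the exact endpoints depend on the adopted conversion factor and H₀ range.

        # Bracketing the baryon density: low eta with high h, high eta with low h.
        for eta10, h in [(2.6, 0.7), (4.3, 0.4)]:
            omega_b = (eta10 / 274.0) / h**2
            print(f"eta_10 = {eta10}, h = {h}: Omega_b ~ {omega_b:.3f}")
        # -> roughly 0.02 and 0.10, consistent with the 0.02-0.11 range quoted.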

  3. Big Data Empowered Self Organized Networks

    OpenAIRE

    Baldo, Nicola; Giupponi, Lorenza; Mangues-Bafalluy, Josep

    2014-01-01

    Mobile networks are generating a huge amount of data in the form of network measurements as well as network control and management interactions, and 5G is expected to make it even bigger. In this paper, we discuss the different approaches according to which this information could be leveraged using a Big Data approach. In particular, we focus on Big Data Empowered Self Organized Networks, discussing its most peculiar traits, its potential, and the relevant related work, as well as analysing s...

  4. Congenital malalignment of the big toe nail.

    Science.gov (United States)

    Wagner, Gunnar; Sachse, Michael Max

    2012-05-01

    Congenital malalignment of the big toe nail is based on a lateral deviation of the nail plate. This longitudinal axis shift is due to a deviation of the nail matrix, possibly caused by increased traction of the hypertrophic extensor tendon of the hallux. Congenital malalignment of the big toe nail is typically present at birth. Ingrown toenails and onychogryphosis are among the most common complications. Depending on the degree of deviation, conservative or surgical treatment may be recommended.

  5. ISSUES, CHALLENGES, AND SOLUTIONS: BIG DATA MINING

    OpenAIRE

    Jaseena K.U,; Julie M. David

    2014-01-01

    Data has become an indispensable part of every economy, industry, organization, business function and individual. Big Data is a term used to identify datasets whose size is beyond the ability of typical database software tools to store, manage and analyze. Big Data introduces unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation and measurement errors. These challenges are distinguishe...

  6. Cincinnati Big Area Additive Manufacturing (BAAM)

    Energy Technology Data Exchange (ETDEWEB)

    Duty, Chad E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  7. Data Confidentiality Challenges in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize performance of such a system.
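
    The abstract does not specify the authors' scheme, so as a baseline illustration only: a minimal example of keeping records confidential before they are shared, using the symmetric Fernet recipe from the Python 'cryptography' package (the record contents are hypothetical).

        from cryptography.fernet import Fernet

        key = Fernet.generate_key()          # shared secret, distributed out of band
        box = Fernet(key)

        record = b'{"patient_id": 42, "marker": 3.14}'   # hypothetical record
        token = box.encrypt(record)          # ciphertext safe to store or share
        assert box.decrypt(token) == record  # only key holders can read the data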

  8. COBE looks back to the Big Bang

    Science.gov (United States)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  9. Harnessing the Heart of Big Data

    OpenAIRE

    Scruggs, Sarah B; Watson, Karol; Su, Andrew I.; Hermjakob, Henning; Yates, John R.; Lindsey, Merry L.; Ping, Peipei

    2015-01-01

    The exponential increase in Big Data generation, combined with limited capitalization on the wealth of information embedded within Big Data, has prompted us to revisit our scientific discovery paradigms. A successful transition into this digital era of medicine holds great promise for advancing fundamental knowledge in biology, innovating human health and driving personalized medicine; however, this will require a drastic shift of research culture in how we conceptualize science and use data. ...

  10. Teenage alcohol use from the perspective of psychological development (Tenåringsdrikking i utviklingspsykologisk perspektiv)

    Directory of Open Access Journals (Sweden)

    Hilde Pape

    2009-10-01

    Full Text Available SUMMARY: Why is alcohol so popular among young people? This important question has been the subject of few empirical studies, so research-based knowledge about the positive and reinforcing properties of alcohol is of limited extent. Numerous studies have instead focused on the various harms that follow from teenage drinking, and the results of that research have underscored the need for active preventive efforts. Insight into the perceived benefits of alcohol is nevertheless necessary in order to develop effective prevention strategies. Against this background, the article focuses on the psychosocial functions of young people's drinking habits; questions of peer pressure and model learning are also touched upon. The aim is to convey central findings from recent research in the field. In sum, the results suggest that alcohol has a particular appeal to adolescents who are well adjusted and socially inclined. At the same time, drinking appears to help promote the developmental process of adolescence, though primarily through indirect effects. The implications of these findings for prevention are sketched in the concluding section. Pape H. Teenage alcohol use from the perspective of psychological development. Nor J Epidemiol. Why is alcohol so popular among young people? So far, few studies have addressed this important question. The body of scientific research on the positive and reinforcing aspects of drinking is accordingly of limited extent. Numerous studies have focused on the harmful effects of teenage alcohol use and the findings clearly underscore the importance of primary prevention. Knowledge about the perceived advantages of alcohol use is needed to develop effective preventive programs, however. On this background, the article focuses on psychosocial functions of youthful drinking. Findings from recent research regarding the link between alcohol use and various indicators of adolescent

  11. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the capacity of conventional tools. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in the Big Data domain. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  12. Reactor operation environmental information document

    Energy Technology Data Exchange (ETDEWEB)

    Haselow, J.S.; Price, V.; Stephenson, D.E.; Bledsoe, H.W.; Looney, B.B.

    1989-12-01

    The Savannah River Site (SRS) produces nuclear materials, primarily plutonium and tritium, to meet the requirements of the Department of Defense. These products have been formed in nuclear reactors that were built during 1950--1955 at the SRS. K, L, and P reactors are three of five reactors that have been used in the past to produce the nuclear materials. All three of these reactors discontinued operation in 1988. Currently, intense efforts are being expended to prepare these three reactors for restart in a manner that protects human health and the environment. To document that restarting the reactors will have minimal impacts on human health and the environment, a three-volume Reactor Operations Environmental Impact Document has been prepared. The document focuses on the impacts of restarting the K, L, and P reactors on both the SRS and surrounding areas. This volume discusses the geology, seismology, and subsurface hydrology. 195 refs., 101 figs., 16 tabs.

  13. High Flux Isotope Reactor (HFIR)

    Data.gov (United States)

    Federal Laboratory Consortium — The HFIR at Oak Ridge National Laboratory is a light-water cooled and moderated reactor that is the United States’ highest flux reactor-based neutron source. HFIR...

  14. Reactor operation safety information document

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    The report contains a reactor facility description which includes K, P, and L reactor sites, structures, operating systems, engineered safety systems, support systems, and process and effluent monitoring systems; an accident analysis section which includes cooling system anomalies, radioactive materials releases, and anticipated transients without scram; a summary of onsite doses from design basis accidents; severe accident analysis (reactor core disruption); a description of operating contractor organization and emergency planning; and a summary of reactor safety evolution. (MB)

  15. Don’t miss the Passport to the Big Bang event this Sunday!

    CERN Multimedia

    CERN Bulletin

    2013-01-01

    Word has been going around for weeks now about the inauguration of the Passport to the Big Bang on Sunday 2 June. Ideal for a family day out or a day with friends, this is a CERN event not to be missed!   The Passport to the Big Bang is a 54-km scientific tourist trail comprising ten exhibition platforms in front of ten CERN sites in the Pays de Gex and the Canton of Geneva. Linked by cycle routes, these ten platforms will mark the same number of stages in the rally for competitive cyclists and the bicycle tour for families taking place this Sunday from 9 a.m. to 12 p.m. But that’s not all: from 2 p.m., you will also have the chance to take part in a huge range of activities provided by clubs and associations from CERN and the local region. Watch an oriental dance show, have a go at building detectors out of Kapla blocks and Lego, meet different reptile species, learn about wind instruments, try your hand at Nordic walking or Zumba fitness, get a better understanding of road safety...

  16. Lecture 10: The European Bioinformatics Institute - "Big data" for biomedical sciences

    CERN Document Server

    CERN. Geneva; Dana, Jose

    2013-01-01

    Part 1: Big data for biomedical sciences (Tom Hancocks). Ten years ago saw the completion of the first international 'Big Biology' project, which sequenced the human genome. The years since have seen a vast growth in biological data. In the coming years advances will come from the integration of experimental approaches and their translation into applied technologies in the hospital, the clinic and even at home. This talk will examine the development of infrastructure, physical and virtual, that will allow millions of life scientists across Europe better access to biological data. Tom studied Human Genetics at the University of Leeds and McMaster University, before completing an MSc in Analytical Genomics at the University of Birmingham. He has worked for the UK National Health Service in diagnostic genetics and in training healthcare scientists and clinicians in bioinformatics. Tom joined the EBI in 2012 and is responsible for the scientific development and delivery of training for the BioMedBridges pr...

  17. Thermal Reactor Safety

    Energy Technology Data Exchange (ETDEWEB)

    1980-06-01

    Information is presented concerning fire risk and protection; transient thermal-hydraulic analysis and experiments; class 9 accidents and containment; diagnostics and in-service inspection; risk and cost comparison of alternative electric energy sources; fuel behavior and experiments on core cooling in LOCAs; reactor event reporting analysis; equipment qualification; post facts analysis of the TMI-2 accident; and computational methods.

  18. Chromatographic and Related Reactors.

    Science.gov (United States)

    1988-01-07

    special information about effects of surface heterogeneity in the methanation reaction. Studies of an efficient multicolumn assembly for measuring... of organic basic catalysts such as pyridine and 4-methylpicoline. It was demonstrated that the chromatographic reactor gave special information about... Programmed Reaction to obtain special information about surface heterogeneity in the methanation reaction. Advantages of stopped flow over steady state

  19. Nuclear Reactors and Technology

    Energy Technology Data Exchange (ETDEWEB)

    Cason, D.L.; Hicks, S.C. [eds.

    1992-01-01

    This publication Nuclear Reactors and Technology (NRT) announces on a monthly basis the current worldwide information available from the open literature on nuclear reactors and technology, including all aspects of power reactors, components and accessories, fuel elements, control systems, and materials. This publication contains the abstracts of DOE reports, journal articles, conference papers, patents, theses, and monographs added to the Energy Science and Technology Database during the past month. Also included are US information obtained through acquisition programs or interagency agreements and international information obtained through the International Energy Agency's Energy Technology Data Exchange or government-to-government agreements. The digests in NRT and other citations to information on nuclear reactors back to 1948 are available for online searching and retrieval on the Energy Science and Technology Database and Nuclear Science Abstracts (NSA) database. Current information, added daily to the Energy Science and Technology Database, is available to DOE and its contractors through the DOE Integrated Technical Information System. Customized profiles can be developed to provide current information to meet each user's needs.

  20. Fusion reactor materials

    Energy Technology Data Exchange (ETDEWEB)

    none,

    1989-01-01

    This paper discusses the following topics on fusion reactor materials: irradiation, facilities, test matrices, and experimental methods; dosimetry, damage parameters, and activation calculations; materials engineering and design requirements; fundamental mechanical behavior; radiation effects; development of structural alloys; solid breeding materials; and ceramics.

  1. WATER BOILER REACTOR

    Science.gov (United States)

    King, L.D.P.

    1960-11-22

    As its name implies, this reactor utilizes an aqueous solution of a fissionable element salt, and is also conventional in that it contains a heat exchanger cooling coil immersed in the fuel. Its novelty lies in the utilization of a cylindrical reactor vessel to provide a critical region having a large and constant interface with a supernatant vapor region, and the use of a hollow sleeve coolant member suspended from the cover assembly in coaxial relation with the reactor vessel. Cool water is circulated inside this hollow coolant member, and a gap between its outer wall and the reactor vessel is used to carry off radiolytic gases for recombination in an external catalyst chamber. The central passage of the coolant member defines a reflux condenser passage into which the externally recombined gases are returned and condensed. The large and constant interface between fuel solution and vapor region prevents the formation of large bubbles and minimizes the amount of fuel salt carried off by water vapor, thus making possible higher flux densities, specific powers and power densities.

  2. The First Reactor.

    Science.gov (United States)

    Department of Energy, Washington, DC.

    On December 2, 1942, in a racquet court underneath the West Stands of Stagg Field at the University of Chicago, a team of scientists led by Enrico Fermi created the first controlled, self-sustaining nuclear chain reaction. This updated and revised story of the first reactor (or "pile") is based on postwar interviews (as told to Corbin…

  3. MULTISTAGE FLUIDIZED BED REACTOR

    Science.gov (United States)

    Jonke, A.A.; Graae, J.E.A.; Levitz, N.M.

    1959-11-01

    A multistage fluidized bed reactor is described in which each of a number of stages is arranged with respect to an associated baffle so that a fluidizing gas flows upward and a granular solid downward through the stages and baffles, whereas the granular solid stops flowing downward when the flow of fluidizing gas is shut off.

  4. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  5. Brazilian multipurpose reactor

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-07-01

    The Brazilian Multipurpose Reactor (RMB) Project is an action of the Federal Government, through the Ministry of Science, Technology and Innovation (MCTI), and its execution is under the responsibility of the Brazilian National Nuclear Energy Commission (CNEN). Within CNEN, the project is coordinated by the Research and Development Directorate (DPD) and developed through the research units of this board: the Institute of Nuclear Energy Research (IPEN); the Nuclear Engineering Institute (IEN); the Centre for Development of Nuclear Technology (CDTN); the Regional Center of Nuclear Sciences (CRCN-NE); and the Institute of Radiation Protection and Dosimetry (IRD). The Navy Technological Center in Sao Paulo (CTMSP) and other research centers, universities, laboratories and companies in the nuclear sector are important and strategic partners. The conceptual design and the safety analysis of the reactor and main facilities, related to nuclear and environmental licensing, are performed by technicians of the research units of DPD/CNEN. The basic design was contracted to engineering companies such as INTERTHECNE from Brazil and INVAP from Argentina. The research units from DPD/CNEN are also responsible for design verification on all engineering documents developed by the contracted companies. Construction and installation should be performed by specialized national companies and international partnerships. The RMB will be an open-pool reactor with a maximum power of 30 MW, with the 20 MW OPAL reactor, designed by INVAP and built in Australia, as its reference. The RMB reactor core will have a 5x5 configuration, consisting of 23 fuel elements (EC) of U{sub 3}Si{sub 2}-Al dispersion type with a density of up to 3.5 gU/cm{sup 3} and enrichment of 19.75 wt% {sup 235}U. Two positions will be available in the core for materials irradiation devices. The main objectives of the RMB Reactor and the other nuclear and radioactive

  6. Boosting Big National Lab Data

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world's most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research, for example, the validity of new drugs, the base cause of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, the optimal ingredient composition for chocolate, or how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the 'Eye of Gaia' billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  7. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  8. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    Energy Technology Data Exchange (ETDEWEB)

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting which will showcase the architecture of the GIS framework and initial results for sources and sinks, discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  9. Small government or big government?

    Directory of Open Access Journals (Sweden)

    MATEO SPAHO

    2015-03-01

    Full Text Available Since the beginning of the twentieth century, economists and philosophers have been polarized over the role that government should have in the economy. On one hand, John Maynard Keynes represented, within a market-economy framework, the position that the state should intervene in the economy to maintain aggregate demand and employment, without hesitating to create budget deficits and expand public debt. This applies especially in moments when the domestic economy and global economic trends show weak growth or recession. It means heavy interference in the economy, with higher revenue but also high expenditure relative to GDP. On the other side, liberals and neoliberals led by Friedrich Hayek advocated a withdrawal of the government from economic activity, not just in moments of economic growth but also during crises, believing that the market has self-regulating mechanisms of its own. The government, as a result, has a smaller dimension, with lower revenue and lower expenditure relative to the country's GDP. We examine the South-Eastern European countries, distinguishing those with a "Big Government" from those with a "Small Government", and analyze their economic performance during the global crisis (2007-2014). In which countries did public debt grow less? Which countries managed to attract more investment, and which preserved the purchasing power of their consumers? We shall see whether, during the economic crisis in Eastern Europe, the Big Government model or the liberal "Small" one was the more successful.

  10. Modeling Chemical Reactors I: Quiescent Reactors

    CERN Document Server

    Michoski, C E; Schmitz, P G

    2010-01-01

    We introduce a fully generalized quiescent chemical reactor system in arbitrary spatial dimension $d = 1, 2$ or $3$, with $n\in\mathbb{N}$ chemical constituents $\alpha_{i}$, where the character of the numerical solution is strongly determined by the relative scaling between the local reactivity of species $\alpha_{i}$ and the local functional diffusivity $\mathscr{D}_{ij}(\alpha)$ of the reaction mixture. We develop an operator time-splitting predictor multi-corrector RK-LDG scheme, and utilize $hp$-adaptivity relying only on the entropy $\mathscr{S}_{\mathfrak{R}}$ of the reactive system $\mathfrak{R}$. This condition preserves these bounded nonlinear entropy functionals as a necessarily enforced stability condition on the coupled system. We apply this scheme to a number of application problems in chemical kinetics, including a difficult classical problem arising in nonequilibrium thermodynamics known as the Belousov-Zhabotinskii reaction, where we utilize a concentration-dependent diffusivity tensor $\mathscr{D}_{ij}(...$
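
    As a point of reference, a quiescent (advection-free) multicomponent reactor of this kind is typically governed by coupled reaction-diffusion equations. The following is a sketch of the generic form under standard notation, not necessarily the exact system of the paper:

    ```latex
    % n coupled species \alpha_i in d = 1, 2, or 3 spatial dimensions, with a
    % concentration-dependent diffusivity tensor and a nonlinear reaction source:
    \partial_t \alpha_i
      = \nabla \cdot \bigl( \mathscr{D}_{ij}(\alpha)\, \nabla \alpha_j \bigr) + \mathfrak{R}_i(\alpha),
      \qquad i = 1, \dots, n,
    ```

    with the entropy $\mathscr{S}_{\mathfrak{R}}$ of the reactive system monitored to drive the $hp$-adaptivity mentioned in the abstract.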

  11. Transcutaneous electric nerve stimulation (TENS) in dentistry- A review.

    Science.gov (United States)

    Kasat, Vikrant; Gupta, Aditi; Ladda, Ruchi; Kathariya, Mitesh; Saluja, Harish; Farooqui, Anjum-Ara

    2014-12-01

    Transcutaneous electric nerve stimulation (TENS) is a non-pharmacological method which is widely used by medical and paramedical professionals for the management of acute and chronic pain in a variety of conditions. Similarly, it can be utilized for the management of pain during various dental procedures as well as pain due to various conditions affecting the maxillofacial region. This review aims to provide an insight into the clinical research evidence available for the analgesic and non-analgesic uses of TENS in pediatric as well as adult patients in the field of dentistry. An attempt is also made to briefly discuss the history of therapeutic electricity, the mechanism of action of TENS, the components of TENS equipment, its types, techniques of administration, advantages and contraindications. With this we hope to raise awareness among the dental fraternity regarding its dental applications, thereby increasing its use in dentistry. Key words: Dentistry, pain, TENS.

  13. Big Sky Carbon Sequestration Partnership

    Energy Technology Data Exchange (ETDEWEB)

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO2. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO2. Overall every sedimentary formation investigated

  14. Alternative approaches to fusion. [reactor design and reactor physics for Tokamak fusion reactors

    Science.gov (United States)

    Roth, R. J.

    1976-01-01

    The limitations of the Tokamak fusion reactor concept are discussed and various other fusion reactor concepts are considered that employ the containment of thermonuclear plasmas by magnetic fields (i.e., stellarators). Progress made in the containment of plasmas in toroidal devices is reported. Reactor design concepts are illustrated. The possibility of using fusion reactors as a power source in interplanetary space travel and electric power plants is briefly examined.

  15. Big Data, Big Changes

    Institute of Scientific and Technical Information of China (English)

    梁爽

    2014-01-01

    Big data is happening around us at every moment; the era of big data has arrived. By describing the characteristics of big data, this paper analyzes the current state of big data research at home and abroad and the future directions of its application. Only by re-examining big data, changing our thinking about it, adapting business models to its changes, innovating big data management models, strengthening institutional development, raising legal awareness, and safeguarding personal and national security can we continuously promote the healthy development of big data.

  16. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  17. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  18. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell.

  19. Reactor monitoring using antineutrino detectors

    Science.gov (United States)

    Bowden, N. S.

    2011-08-01

    Nuclear reactors have served as the antineutrino source for many fundamental physics experiments. The techniques developed by these experiments make it possible to use these weakly interacting particles for a practical purpose. The large flux of antineutrinos that leaves a reactor carries information about two quantities of interest for safeguards: the reactor power and fissile inventory. Measurements made with antineutrino detectors could therefore offer an alternative means for verifying the power history and fissile inventory of a reactor as part of International Atomic Energy Agency (IAEA) and/or other reactor safeguards regimes. Several efforts to develop this monitoring technique are underway worldwide.
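
    The scale of that flux can be sketched from two standard round numbers: each fission releases roughly 200 MeV of thermal energy and, through the beta decays of the fission fragments, about six antineutrinos. A back-of-the-envelope estimate (illustrative only, not the safeguards methodology itself):

    ```python
    # Rough antineutrino emission rate of a power reactor, assuming the usual
    # round numbers of ~200 MeV thermal energy and ~6 antineutrinos per fission.
    MEV_TO_J = 1.602e-13
    E_PER_FISSION_J = 200 * MEV_TO_J       # ~3.2e-11 J per fission
    NU_PER_FISSION = 6

    def emission_rate(thermal_power_gw):
        fissions_per_s = thermal_power_gw * 1e9 / E_PER_FISSION_J
        return fissions_per_s * NU_PER_FISSION

    print(f"{emission_rate(3.0):.1e} antineutrinos/s")   # ~5.6e20 for a 3 GWth core
    ```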

  20. Reactor vessel support system. [LMFBR

    Science.gov (United States)

    Golden, M.P.; Holley, J.C.

    1980-05-09

    A reactor vessel support system includes a support ring at the reactor top supported through a box ring on a ledge of the reactor containment. The box ring includes an annular space in the center of its cross-section to reduce heat flow and is keyed to the support ledge to transmit seismic forces from the reactor vessel to the containment structure. A coolant channel is provided at the outside circumference of the support ring to supply coolant gas through the keyways to channels between the reactor vessel and support ledge into the containment space.

  1. Migration and retention of elements at the Oklo natural reactor

    Science.gov (United States)

    Brookins, Douglas G.

    1982-09-01

    The Oklo natural reactor, Gabon, permits study of fission-produced elemental behavior in a natural geologic environment. The uranium ore that sustained fission reactions formed about 2 billion years before present (BYBP), and the reactor was operative for about 5 × 10⁵ yrs between about 1.95 and 2 BYBP. The many tons of fission products can, for the most part, be studied for their abundance and distribution today. Since reactor shutdown, many fissiogenic elements have not migrated from host pitchblende, and several others have migrated only a few tens of meters from the reactor ore. Only Xe and Kr have apparently been largely removed from the reactor zones. An element by element assessment of the Oklo rocks' ability to retain the fission products, and actinides and radiogenic Pb and Bi as well, leads to the conclusion that no widespread migration of the elements occurred. This suggests that rocks with more favorable geologic characteristics are indeed well suited for consideration for the storage of radioactive waste.

  2. Strategies and Principles of Distributed Machine Learning on Big Data

    Directory of Open Access Journals (Sweden)

    Eric P. Xing

    2016-06-01

    Full Text Available The rise of big data has led to new demands for machine learning (ML systems to learn complex models, with millions to billions of parameters, that promise adequate capacity to digest massive datasets and offer powerful predictive analytics (such as high-dimensional latent features, intermediate representations, and decision functions thereupon. In order to run ML algorithms at such scales, on a distributed cluster with tens to thousands of machines, it is often the case that significant engineering efforts are required—and one might fairly ask whether such engineering truly falls within the domain of ML research. Taking the view that “big” ML systems can benefit greatly from ML-rooted statistical and algorithmic insights—and that ML researchers should therefore not shy away from such systems design—we discuss a series of principles and strategies distilled from our recent efforts on industrial-scale ML solutions. These principles and strategies span a continuum from application, to engineering, and to theoretical research and development of big ML systems and architectures, with the goal of understanding how to make them efficient, generally applicable, and supported with convergence and scaling guarantees. They concern four key questions that traditionally receive little attention in ML research: How can an ML program be distributed over a cluster? How can ML computation be bridged with inter-machine communication? How can such communication be performed? What should be communicated between machines? By exposing underlying statistical and algorithmic characteristics unique to ML programs but not typically seen in traditional computer programs, and by dissecting successful cases to reveal how we have harnessed these principles to design and develop both high-performance distributed ML software as well as general-purpose ML frameworks, we present opportunities for ML researchers and practitioners to further shape and enlarge the area
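
    To make the first of those questions concrete: distributing an ML program over a cluster is commonly realized as data-parallel stochastic gradient descent coordinated through a parameter server. A single-process numpy sketch of that pattern (the systems discussed in the article add asynchrony, bounded staleness, and communication scheduling on top):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linear-regression data, sharded across four "workers" (data parallelism).
    X, true_w = rng.normal(size=(1000, 10)), rng.normal(size=10)
    y = X @ true_w
    shards = np.array_split(np.arange(1000), 4)

    w = np.zeros(10)                          # parameter-server state
    for step in range(200):
        grads = []
        for shard in shards:                  # each worker computes a local gradient
            Xs, ys = X[shard], y[shard]
            grads.append(2 * Xs.T @ (Xs @ w - ys) / len(shard))
        w -= 0.05 * np.mean(grads, axis=0)    # server aggregates and updates

    print("parameter error:", np.linalg.norm(w - true_w))   # close to zero
    ```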

  3. Methanogenesis in Thermophilic Biogas Reactors

    DEFF Research Database (Denmark)

    Ahring, Birgitte Kiær

    1995-01-01

    Methanogenesis in thermophilic biogas reactors fed with different wastes is examined. The specific methanogenic activity with acetate or hydrogen as substrate reflected the organic loading of the specific reactor examined. Increasing the loading of thermophilic reactors stabilized the process ... as indicated by a lower concentration of volatile fatty acids in the effluent from the reactors. The specific methanogenic activity in a thermophilic pilot-plant biogas reactor fed with a mixture of cow and pig manure reflected the stability of the reactor. The numbers of methanogens counted by the most ... against Methanothrix soehngenii or Methanothrix CALS-I in any of the thermophilic biogas reactors examined. Studies using 2-14C-labeled acetate showed that at high concentrations (more than approx. 1 mM) acetate was metabolized via the aceticlastic pathway, transforming the methyl-group of acetate

  4. Summary of space nuclear reactor power systems, 1983--1992

    Energy Technology Data Exchange (ETDEWEB)

    Buden, D.

    1993-08-11

    This report summarizes major developments in the last ten years which have greatly expanded the space nuclear reactor power systems technology base. In the SP-100 program, after a competition between liquid-metal, gas-cooled, thermionic, and heat pipe reactors integrated with various combinations of thermoelectric, thermionic, Brayton, Rankine, and Stirling energy conversion systems, three concepts were selected for further evaluation. In 1985, the high-temperature (1,350 K), lithium-cooled reactor with thermoelectric conversion was selected for full-scale development. Since then, significant progress has been achieved, including the demonstration of a 7-y-life uranium nitride fuel pin. Work on the lithium-cooled reactor with thermoelectrics has progressed from a concept, through a generic flight system design, to the design, development, and testing of specific components. Meanwhile, the USSR in 1987-88 orbited a new generation of nuclear power systems beyond the thermoelectric plants on the RORSAT satellites. The US has continued to advance its own thermionic fuel element development, concentrating on a multicell fuel element configuration. Experimental work has demonstrated a single cell operating time of about 1.5 y. Technology advances have also been made in the Stirling engine; an advanced engine that operates at 1,050 K is ready for testing. Additional concepts have been studied and experiments have been performed on a variety of systems to meet changing needs, such as powers of tens-to-hundreds of megawatts and highly survivable systems of tens-of-kilowatts power.

  5. Transcriptome marker diagnostics using big data.

    Science.gov (United States)

    Han, Henry; Liu, Ying

    2016-02-01

    Big omics data are challenging translational bioinformatics in an unprecedented way through their complexity and volume. How to employ big omics data to achieve a reproducible disease diagnosis that rivals clinical practice, from a systems approach, is an urgent problem to be solved in translational bioinformatics and machine learning. In this study, the authors propose a novel transcriptome marker diagnosis that tackles this problem using big RNA-seq data by systematically viewing the whole transcriptome as a profile marker. This systems diagnosis not only avoids the reproducibility issues of existing gene- and network-marker-based diagnostic methods, but also achieves diagnostic results rivalling clinical practice by extracting true signals from big RNA-seq data. By exploiting systems-level information, their method attains better diagnostic performance than competing methods, is better suited to personalised diagnostics, and is a good candidate for clinical usage. To the best of their knowledge, it is the first study on this topic, and it should inspire further investigations in big omics data diagnostics.
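
    A minimal sketch of the "whole transcriptome as a profile marker" idea, assuming a nearest-centroid rule over full expression vectors; the paper's actual method and preprocessing are more involved:

    ```python
    import numpy as np

    def profile_diagnosis(train, labels, sample):
        # Correlate the sample's whole expression profile against per-class
        # centroid profiles and return the best-matching class.
        labels = np.array(labels)
        centroids = {c: train[labels == c].mean(axis=0) for c in set(labels.tolist())}
        scores = {c: np.corrcoef(sample, m)[0, 1] for c, m in centroids.items()}
        return max(scores, key=scores.get)

    rng = np.random.default_rng(1)
    signature = rng.normal(size=5000)             # disease-specific expression pattern
    healthy = rng.normal(size=(20, 5000))         # 20 samples x 5000 genes
    disease = signature + rng.normal(size=(20, 5000))
    train = np.vstack([healthy, disease])
    labels = ["healthy"] * 20 + ["disease"] * 20
    test = signature + rng.normal(size=5000)      # new patient profile
    print(profile_diagnosis(train, labels, test))  # disease
    ```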

  6. Conceptualization and theorization of the Big Data

    Directory of Open Access Journals (Sweden)

    Marcos Mazzieri

    2016-06-01

    Full Text Available The term Big Data is being used widely by companies and researchers who consider its functionalities or applications relevant to creating value and business innovation. However, some questions arise about what this phenomenon is and, more precisely, how it occurs and under what conditions it can create value and innovation in business. In our view, the lack of depth regarding the principles involved in Big Data, and the very absence of a conceptual definition, have made it difficult to answer these questions, which form the basis of our research. To answer them we conducted a bibliometric study and an extensive literature review, the former based on articles and citations in the Web of Knowledge database. The main result of our research is a conceptual definition for the term Big Data. We also propose principles that can contribute to other research aimed at value creation through Big Data. Finally, we propose viewing value creation through Big Data through the lens of the Resource-Based View, the main theory used to discuss the theme.

  7. Volume and Value of Big Healthcare Data

    Science.gov (United States)

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309
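
    Both laws cited here are doubling laws; schematically, with the doubling time as an illustrative ballpark (commonly quoted as roughly 1.5 to 2 years):

    ```latex
    % Moore's law (compute) and Kryder's law (storage density) share the form
    C(t) = C_0 \, 2^{\,t / T_d}, \qquad T_d \approx 1.5\text{--}2~\text{years}.
    ```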

  8. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing necessity within practitioners to understand better the phenomenon, starting from a clear stated definition. This book aims to be a starting reading for executives who want (and need) to keep the pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data will be explained, and a series of different strategic approaches will be provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data will be discussed, where some of them are more general - such as ethics, privacy, and ownership – while others concern more specific business situations (e.g., initial public offering, growth st...

  9. Intra- and interspecific responses to Rafinesque’s big-eared bat (Corynorhinus rafinesquii) social calls.

    Energy Technology Data Exchange (ETDEWEB)

    Loeb, Susan, C.; Britzke, Eric, R.

    2010-07-01

    Bats respond to the calls of conspecifics as well as to calls of other species; however, few studies have attempted to quantify these responses or understand the functions of these calls. We tested the response of Rafinesque's big-eared bats (Corynorhinus rafinesquii) to social calls as a possible method to increase capture success and to understand the function of social calls. We also tested if calls of bats within the range of the previously designated subspecies differed, if the responses of Rafinesque's big-eared bats varied with geographic origin of the calls, and if other species responded to the calls of C. rafinesquii. We recorded calls of Rafinesque's big-eared bats at two colony roost sites in South Carolina, USA. Calls were recorded while bats were in the roosts and as they exited. Playback sequences for each site were created by copying typical pulses into the playback file. Two mist nets were placed approximately 50–500 m from known roost sites; the net with the playback equipment served as the Experimental net and the one without the equipment served as the Control net. Call structures differed significantly between the Mountain and Coastal Plains populations, with calls from the Mountains being of higher frequency and longer duration. Ten of 11 Rafinesque's big-eared bats were caught in the Control nets, and 13 of 19 bats of other species were captured at Experimental nets, even though overall bat activity did not differ significantly between Control and Experimental nets. Our results suggest that Rafinesque's big-eared bats are not attracted to conspecifics' calls and that these calls may act as an intraspecific spacing mechanism during foraging.

  10. New AB-Thermonuclear Reactor for Aerospace

    CERN Document Server

    Bolonkin, Alexander

    2007-01-01

    There are two main methods of nuclear fusion: inertial confinement fusion (ICF) and magnetic confinement fusion (MCF). Existing thermonuclear reactors are very complex, expensive, large, and heavy. They cannot achieve the Lawson criterion. The author offers an innovation. ICF has on the inside surface of the shell-shaped combustion chamber a covering of small Prism Reflectors (PR) and a plasma reflector. These prism reflectors have a noteworthy advantage, in comparison with a conventional mirror and especially with a conventional shell: they multi-reflect the heat and laser radiation exactly back into collision with the fuel target capsule (pellet). The plasma reflector reflects the Bremsstrahlung radiation. The offered innovation decreases radiation losses, creates significant radiation pressure and increases the reaction time. The Lawson criterion increases by hundreds of times. The size, cost, and weight of a typical installation will decrease by tens of times. The author is researching the efficiency of these i...
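
    For reference, the Lawson criterion invoked above is usually quoted as a triple-product condition for D-T ignition (a standard textbook figure, not a number from this paper):

    ```latex
    % Triple product of density, temperature, and energy confinement time:
    n \, T \, \tau_E \gtrsim 3 \times 10^{21}~\mathrm{keV\,s\,m^{-3}}
    ```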

  11. Big Data Solution for CTBT Monitoring Using Global Cross Correlation

    Science.gov (United States)

    Gaillard, P.; Bobrov, D.; Dupont, A.; Grenouille, A.; Kitov, I. O.; Rozhkov, M.

    2014-12-01

    Due to the mismatch between data volume and the performance of the Information Technology infrastructure used in seismic data centers, it becomes more and more difficult to process all the data with traditional applications in a reasonable elapsed time. To fulfill their missions, the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO/IDC) and the Département Analyse Surveillance Environnement of the Commissariat à l'Energie atomique et aux énergies alternatives (CEA/DASE) collect, process and produce complex data sets whose volume is growing exponentially. In the medium term, computer architectures, data management systems and application algorithms will require fundamental changes to meet the needs. This problem is well known and identified as a "Big Data" challenge. To tackle this major task, the CEA/DASE is taking part in the two-year "DataScale" project. Started in September 2013, DataScale gathers a large set of partners (research laboratories, SMEs and big companies). The common objective is to design efficient solutions using the synergy between Big Data solutions and High-Performance Computing (HPC). The project will evaluate the relevance of these technological solutions by implementing a demonstrator for seismic event detection based on massive waveform cross correlation. The IDC has developed expertise in such techniques, leading to an algorithm called "Master Event", and provides a high-quality dataset for an extensive cross correlation study. The objective of the project is to enhance the Master Event algorithm and to reanalyze 10 years of waveform data from the International Monitoring System (IMS) network using a dedicated HPC infrastructure operated by the "Centre de Calcul Recherche et Technologie" at the CEA site of Bruyères-le-Châtel. The dataset used for the demonstrator includes more than 300,000 seismic events, tens of millions of raw detections and more than 30 terabytes of continuous seismic data.
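
    The core operation behind such a "Master Event" approach is normalized cross correlation of a template waveform against continuous data, with detections declared above a threshold. A minimal numpy sketch (illustrative; a production pipeline handles multichannel data, overlapping windows, and network stacking):

    ```python
    import numpy as np

    def xcorr_detect(template, trace, threshold=0.8):
        # Slide a master-event template along a continuous trace; report sample
        # offsets where the normalized (Pearson) correlation exceeds the threshold.
        m = len(template)
        t = (template - template.mean()) / template.std()
        hits = []
        for i in range(len(trace) - m + 1):
            w = trace[i:i + m]
            c = float(t @ ((w - w.mean()) / w.std())) / m
            if c >= threshold:
                hits.append((i, round(c, 3)))
        return hits

    rng = np.random.default_rng(0)
    template = np.sin(np.linspace(0, 20, 200)) * np.exp(-np.linspace(0, 5, 200))
    trace = rng.normal(0.0, 0.1, 5000)
    trace[1200:1400] += template                   # buried repeat of the master event
    print(xcorr_detect(template, trace))           # detections clustered near offset 1200
    ```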

  12. Idaho National Laboratory Ten-Year Site Plan Project Description Document

    Energy Technology Data Exchange (ETDEWEB)

    Not Listed

    2012-03-01

    This document describes the currently active and proposed infrastructure projects listed in Appendix B of the Idaho National Laboratory 2013-2022 Ten Year Site Plan (DOE/ID-11449). It was produced in accordance with Contract Data Requirements List I.06. The projects delineated in this document support infrastructure needs at INL's Research and Education Campus, Materials and Fuels Complex, Advanced Test Reactor Complex and the greater site-wide area. The projects provide critical infrastructure needed to meet current and future INL operational and research needs. Execution of these projects will restore, rebuild, and revitalize INL's physical infrastructure; enhance program execution; and make a significant contribution toward reducing complex-wide deferred maintenance.

  13. MEANS FOR COOLING REACTORS

    Science.gov (United States)

    Wheeler, J.A.

    1957-11-01

    A design of a reactor is presented in which the fuel elements may be immersed in a liquid coolant when desired without the necessity of removing them from the reactor structure. The fuel elements, containing the fissionable material are in plate form and are disposed within spaced slots in a moderator material, such as graphite to form the core. Adjacent the core is a tank containing the liquid coolant. The fuel elements are mounted in spaced relationship on a rotatable shaft which is located between the core and the tank so that by rotation of the shaft the fuel elements may be either inserted in the slots in the core to sustain a chain reaction or immersed in the coolant.

  14. Compact fusion reactors

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Fusion research is currently to a large extent focused on tokamak (ITER) and inertial confinement (NIF) research. In addition to these large international or national efforts there are private companies performing fusion research using much smaller devices than ITER or NIF. The attempt to achieve fusion energy production through relatively small and compact devices compared to tokamaks decreases the costs and building time of the reactors and this has allowed some private companies to enter the field, like EMC2, General Fusion, Helion Energy, Lawrenceville Plasma Physics and Lockheed Martin. Some of these companies are trying to demonstrate net energy production within the next few years. If they are successful their next step is to attempt to commercialize their technology. In this presentation an overview of compact fusion reactor concepts is given.

  15. About Big Data and its Challenges and Benefits in Manufacturing

    OpenAIRE

    Bogdan NEDELCU

    2013-01-01

    The aim of this article is to show the importance of Big Data and its growing influence on companies. It also shows what kinds of big data are currently generated and how much big data is estimated to be generated. We can also see how much companies are willing to invest in big data and how much they are currently gaining from it. The article also shows some major influences that big data has on one major segment of industry (manufacturing) and the challenges that appear.

  16. SoBigData - VRE specification and software 1

    OpenAIRE

    Assante, Massimiliano; Candela, Leonardo; Frosini, Luca; Lelii, Lucio; Mangiacrapa, Francesco; Pagano, Pasquale

    2016-01-01

    This deliverable complements "D10.5 SoBigData e-Infrastructure software release 1" by describing how such a software has been deployed to serve the current needs of the SoBigData community. In particular, it describes how such a software has been exploited to make available the components envisaged in "D10.2 SoBigData e-Infrastructure release plan 1", i.e. the SoBigData portal (and the underlying Virtual Organisation), the SoBigData Catalogue, and the SoBigData Virtual Research Environments.

  17. Big Data Mining: Challenges, Technologies, Tools and Applications

    OpenAIRE

    Asha M. PAWAR

    2016-01-01

    Big data is data of large size, meaning it has high volume, velocity and variety. Nowadays big data is expanding in various science and engineering fields, and so there are many challenges in managing and analysing big data using various tools. This paper introduces big data and its characteristic concepts, and the next section elaborates on the challenges in big data. In particular, we discuss the technologies used in big data analysis and which tools are mainly used to analyse t...

  19. Reactor Neutrino Spectra

    OpenAIRE

    Hayes, A. C.; Vogel, Petr

    2016-01-01

    We present a review of the antineutrino spectra emitted from reactors. Knowledge of these spectra and their associated uncertainties is crucial for neutrino oscillation studies. The spectra used to date have been determined either by converting measured electron spectra to antineutrino spectra or by summing over all of the thousands of transitions that make up the spectra, using modern databases as input. The uncertainties in the subdominant corrections to β-decay plague both methods, and we ...

  20. REACTOR MODERATOR STRUCTURE

    Science.gov (United States)

    Greenstreet, B.L.

    1963-12-31

    A system for maintaining the alignment of moderator block structures in reactors is presented. Integral restraining grids are placed between each layer of blocks in the moderator structure, at the top of the uppermost layer, and at the bottom of the lowermost layer. Slots are provided in the top and bottom surfaces of the moderator blocks so as to provide a keying action with the grids. The grids are maintained in alignment by vertical guiding members disposed about their peripheries. (AEC)

  1. Benchmark Dose Software Development and Maintenance Ten Berge Cxt Models

    Science.gov (United States)

    This report is intended to provide an overview of beta version 1.0 of the implementation of a concentration-time (CxT) model originally programmed and provided by Wil ten Berge (referred to hereafter as the ten Berge model). The recoding and development described here represent ...

  2. Ten Things Every Professor Should Know about Assessment

    Science.gov (United States)

    Wolf, Kenneth; Dunlap, Joanna; Stevens, Ellen

    2012-01-01

    This article describes ten key assessment practices for advancing student learning that all professors should be familiar with and strategically incorporate in their classrooms and programs. Each practice or concept is explained with examples and guidance for putting it into practice. The ten are: learning outcomes, performance assessments,…

  3. TRANSCUTANEOUS ELECTRICAL NERVE-STIMULATION (TENS) IN RAYNAUDS-PHENOMENON

    NARCIS (Netherlands)

    MULDER, P; DOMPELING, EC; VANSLOCHTERENVANDERBOOR, JC; KUIPERS, WD; SMIT, AJ

    1991-01-01

    Transcutaneous nerve stimulation (TENS) has been described as resulting in vasodilatation. The effect of 2 Hz TENS of the right hand for forty-five minutes on skin temperature and plethysmography of the third digit of both hands and feet and on transcutaneous oxygen tension (TcpO2) of the right h

  4. BOILER-SUPERHEATED REACTOR

    Science.gov (United States)

    Heckman, T.P.

    1961-05-01

    A nuclear power reactor of the type in which a liquid moderator-coolant is transformed by nuclear heating into a vapor that may be used to drive a turbo-generator is described. The core of this reactor comprises a plurality of freely suspended tubular fuel elements, called fuel element trains, within which nonboiling pressurized liquid moderator-coolant is preheated and sprayed through orifices in the walls of the trains against the outer walls thereof to be converted into vapor. Passage of the vapor over other unwetted portions of the outside of the fuel elements causes the steam to be superheated. The moderator-coolant within the fuel elements remains in the liquid state, and that between the fuel elements remains substantially in the vapor state. A unique liquid neutron-absorber control system is used. Advantages expected from the reactor design include reduced fuel element failure, increased stability of operation, direct response to power demand, and circulation of a minimum amount of liquid moderator-coolant. (A.G.W.)

  5. Big data in food safety; an overview.

    Science.gov (United States)

    Marvin, Hans J P; Janssen, Esmée M; Bouzembrak, Yamine; Hendriksen, Peter J M; Staats, Martijn

    2016-11-07

    Technology is now being developed that is able to handle vast amounts of structured and unstructured data from diverse sources and origins. These technologies are often referred to as big data, and open new areas of research and applications that will have an increasing impact in all sectors of our society. In this paper we assessed to what extent big data is being applied in the food safety domain and identified several promising trends. In several parts of the world, governments stimulate the publication on the internet of all data generated in publicly funded research projects. This policy opens new opportunities for stakeholders dealing with food safety to address issues which were not possible before. The application of mobile phones as detection devices for food safety and the use of social media as an early warning of food safety problems are a few examples of the new developments that are possible due to big data.

  6. Adapting bioinformatics curricula for big data.

    Science.gov (United States)

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs.

  7. Big Data Analytics for Genomic Medicine.

    Science.gov (United States)

    He, Karen Y; Ge, Dongliang; He, Max M

    2017-02-15

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients' genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining large-scale various data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we also present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs.
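
    A toy sketch of the final step described above, surfacing clinically actionable variants by joining NGS calls with the patient's EHR problem list; the record shapes, thresholds, and field names are all illustrative assumptions:

    ```python
    # Hypothetical, simplified filter: a variant is "actionable" if it is rare,
    # predicted damaging, and linked to a condition on the patient's problem list.
    variants = [
        {"gene": "BRCA1", "pop_af": 0.0001, "impact": "HIGH", "conditions": {"breast cancer"}},
        {"gene": "TTN",   "pop_af": 0.0500, "impact": "LOW",  "conditions": set()},
        {"gene": "LDLR",  "pop_af": 0.0002, "impact": "HIGH", "conditions": {"hypercholesterolemia"}},
    ]
    ehr_problem_list = {"breast cancer"}

    actionable = [
        v for v in variants
        if v["pop_af"] < 0.01                     # rare in the population
        and v["impact"] == "HIGH"                 # predicted damaging
        and v["conditions"] & ehr_problem_list    # matches the patient's EHR
    ]
    print([v["gene"] for v in actionable])        # ['BRCA1']
    ```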

  8. Unsupervised Tensor Mining for Big Data Practitioners.

    Science.gov (United States)

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated to each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.
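
    The workhorse model behind such analyses is the CP (CANDECOMP/PARAFAC) decomposition, which factors a tensor into a sum of rank-one components. A minimal alternating-least-squares sketch in plain numpy (production tools add normalization, convergence checks, and sparse or streaming variants):

    ```python
    import numpy as np

    def unfold(T, mode):
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def khatri_rao(U, V):
        # Column-wise Kronecker product: (J,R) and (K,R) -> (J*K, R).
        return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

    def cp_als(T, rank, iters=100):
        # Fit T[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r] by alternating least squares.
        rng = np.random.default_rng(0)
        A, B, C = (rng.normal(size=(s, rank)) for s in T.shape)
        for _ in range(iters):
            A = unfold(T, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
            B = unfold(T, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
            C = unfold(T, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
        return A, B, C

    # e.g., a (user, communication type, time) interaction tensor
    rng = np.random.default_rng(1)
    A0, B0, C0 = rng.normal(size=(8, 2)), rng.normal(size=(9, 2)), rng.normal(size=(7, 2))
    T = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
    A, B, C = cp_als(T, rank=2)
    print(np.linalg.norm(T - np.einsum("ir,jr,kr->ijk", A, B, C)))   # ~0
    ```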

  9. Big Data Issues: Performance, Scalability, Availability

    Directory of Open Access Journals (Sweden)

    Laura Matei

    2014-03-01

    Full Text Available Nowadays, Big Data is probably one of the most discussed topics not only in the area of data analysis, but, I believe, in the whole realm of information technology. Simply typing the words "big data" into an online search engine like Google will retrieve approximately 1,660,000,000 results. Given the buzz gathered around this term, I could not help but wonder what this phenomenon means. The ever greater portion that the combination of the Internet, cloud computing and mobile devices has been occupying in our lives leads to an ever increasing amount of data that must be captured, communicated, aggregated, stored, and analyzed. These sets of data that we are generating are called Big Data.

  10. One Second After the Big Bang

    CERN Document Server

    CERN. Geneva

    2014-01-01

    A new experiment called PTOLEMY (Princeton Tritium Observatory for Light, Early-Universe, Massive-Neutrino Yield) is under development at the Princeton Plasma Physics Laboratory with the goal of challenging one of the most fundamental predictions of the Big Bang – the present-day existence of relic neutrinos produced less than one second after the Big Bang. Using a gigantic graphene surface to hold 100 grams of a single-atomic layer of tritium, low noise antennas that sense the radio waves of individual electrons undergoing cyclotron motion, and a massive array of cryogenic sensors that sit at the transition between normal and superconducting states, the PTOLEMY project has the potential to challenge one of the most fundamental predictions of the Big Bang, to potentially uncover new interactions and properties of the neutrinos, and to search for the existence of a species of light dark matter known as sterile neutrinos.
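
    The relic neutrinos PTOLEMY targets have a sharply predicted present-day temperature and abundance in standard Big Bang cosmology (textbook values, quoted for orientation):

    ```latex
    T_\nu = \left(\tfrac{4}{11}\right)^{1/3} T_\gamma \approx 1.95~\mathrm{K},
    \qquad
    n_{\nu+\bar{\nu}} \approx 112~\mathrm{cm^{-3}}\ \text{per flavour}.
    ```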

  11. Big Bang riddles and their revelations

    CERN Document Server

    Magueijo, J; Magueijo, Joao; Baskerville, Kim

    1999-01-01

    We describe how cosmology has converged towards a beautiful model of the Universe: the Big Bang Universe. We praise this model, but show there is a dark side to it. This dark side is usually called "the cosmological problems": a set of coincidences and fine tuning features required for the Big Bang Universe to be possible. After reviewing these "riddles" we show how they have acted as windows into the very early Universe, revealing new physics and new cosmology just as the Universe came into being. We describe inflation, pre-Big Bang, and varying speed of light theories. At the end of the millennium, these proposals are seen respectively as a paradigm, a tentative idea, and outright speculation.

  12. Big Data Analytics for Genomic Medicine

    Science.gov (United States)

    He, Karen Y.; Ge, Dongliang; He, Max M.

    2017-01-01

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients’ genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining large-scale various data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we also present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs. PMID:28212287

  13. FMDP Reactor Alternative Summary Report: Volume 3 - partially complete LWR alternative

    Energy Technology Data Exchange (ETDEWEB)

    Greene, S.R.; Fisher, S.E.; Bevard, B.B. [and others]

    1996-09-01

    The Department of Energy Office of Fissile Materials Disposition (DOE/MD) initiated a detailed analysis activity to evaluate each of ten plutonium disposition alternatives that survived an initial screening process. This document, Volume 3 of a four-volume report, summarizes the results of these analyses for the partially complete LWR (PCLWR) reactor-based plutonium disposition alternative.

  14. Using TENS for pain control: the state of the evidence.

    Science.gov (United States)

    Vance, Carol G T; Dailey, Dana L; Rakel, Barbara A; Sluka, Kathleen A

    2014-05-01

    Transcutaneous electrical nerve stimulation (TENS) is a nonpharmacological intervention that activates a complex neuronal network to reduce pain by activating descending inhibitory systems in the central nervous system to reduce hyperalgesia. The evidence for TENS efficacy is conflicting and requires not only description but also critique. Population-specific systematic reviews and meta-analyses are emerging, indicating that both high-frequency (HF) and low-frequency (LF) TENS provide analgesia, specifically when applied at a strong, nonpainful intensity. The purpose of this article is to provide a critical review of the latest basic science and clinical evidence for TENS. Additional research is necessary to determine whether TENS has effects specific to mechanical stimuli and/or effects beyond pain reduction that improve activity levels, function and quality of life.

  15. A Critical Axiology for Big Data Studies

    Directory of Open Access Journals (Sweden)

    Saif Shahin

    2016-01-01

    Full Text Available Big Data has had a major impact on journalism and communication studies, while also raising a host of social concerns, ranging from mass surveillance to the legitimation of prejudices such as racism. In this article, an agenda for critical Big Data research is developed, discussing what the purpose of such research should be, what pitfalls to guard against, and the possibility of adapting Big Data methods to conduct empirical research from a critical standpoint. Such a research program will not only allow critical scholarship to meaningfully challenge Big Data as a hegemonic tool, but will also allow scholars to use Big Data resources to address a range of social problems in previously impossible ways. The article calls for methodological innovation that combines emerging Big Data techniques with critical, qualitative research methods, such as ethnography and discourse analysis, in such a way that they complement one another.

  16. Nuclear research reactors in Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Cota, Anna Paula Leite; Mesquita, Amir Zacarias, E-mail: aplc@cdtn.b, E-mail: amir@cdtn.b [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-07-01

    The rising concerns about global warming and energy security have spurred a revival of interest in nuclear energy, giving birth to a 'nuclear power renaissance' in several countries of the world. In Brazil in particular, the nuclear power renaissance can be seen in recent years in the actions that comprise its nuclear program, notably the increase of investment in nuclear research institutes and the government target to design and build the Brazilian Multipurpose research Reactor (BMR). In the last 50 years, Brazilian research reactors have been used for training, for producing radioisotopes to meet demands in industry and nuclear medicine, for miscellaneous irradiation services and for academic research. Moreover, research reactors are used as laboratories to develop technologies for power reactors, of which around 450 operate worldwide today. In this role, research reactors are more viable than power reactors because of their lower cost, their operation at low temperatures and, furthermore, their lower demand for nuclear fuel. In Brazil, four research reactors have been installed: the IEA-R1 and the MB-01 reactors, both at the Instituto de Pesquisas Energeticas Nucleares (IPEN, Sao Paulo); the Argonauta, at the Instituto de Engenharia Nuclear (IEN, Rio de Janeiro); and the IPR-R1 TRIGA reactor, at the Centro de Desenvolvimento da Tecnologia Nuclear (CDTN, Belo Horizonte). The present paper describes the characteristics of these reactors, their utilization and current academic research. Through this paper, we intend to contribute to the BMR project. (author)

  17. How do we identify big rivers? And how big is big?

    Science.gov (United States)

    Miall, Andrew D.

    2006-04-01

    "Big rivers" are the trunk rivers that carry the water and sediment load from major orogens, or that drain large areas of a continent. Identifying such rivers in the ancient record is a challenge. Some guidance may be provided by tectonic setting and sedimentological evidence, including the scale of architectural elements, and clues from provenance studies, but such data are not infallible guides to river magnitude. The scale of depositional elements is the most obvious clue to channel size, but evidence is typically sparse and inadequate, and may be misleading. For example, thick fining-upward successions may be tectonic cyclothems. Two examples of the analysis of large ancient river systems are discussed here in order to highlight problems of methodology and interpretation. The Hawkesbury Sandstone (Triassic) of the Sydney Basin, Australia, is commonly cited as the deposit of a large river, on the basis of abundant very large-scale crossbedding. An examination of very large outcrops of this unit, including a coastal cliff section 6 km long near Sydney, showed that even with 100% exposure there are ambiguities in the determination of channel scale. It was concluded in this case that the channel dimensions of the Hawkesbury rivers were about half the size of the modern Brahmaputra River. The tectonic setting of a major ancient fluvial system is commonly not a useful clue to river scale. The Hawkesbury Sandstone is a system draining transversely from a cratonic source into a foreland basin, whereas most large rivers in foreland basins flow axially and are derived mainly from the orogenic uplifts (e.g., the large tidally influenced rivers of the Athabasca Oil Sands, Alberta). Epeirogenic tilting of a continent by the dynamic topography process may generate drainages in unexpected directions. For example, analyses of detrital zircons in Upper Paleozoic-Mesozoic nonmarine successions in the SW United States suggests significant derivation from the Appalachian orogen

  18. From big data to smart data

    CERN Document Server

    Iafrate, Fernando

    2015-01-01

    A pragmatic approach to Big Data that takes the reader on a journey between Big Data (what it is) and Smart Data (what it is for). Today's decision making can be reached via information (related to the data), knowledge (related to people and processes), and timing (the capacity to decide, act and react at the right time). The huge increase in the volume of data traffic, and in its format (unstructured data such as blogs, logs, and video), generated by the "digitalization" of our world radically modifies our relationship to space (in motion) and time and, by capillarity, to the enterprise ...

  19. Probing Big Bounce with Dark Matter

    CERN Document Server

    Li, Changhong

    2014-01-01

    We investigate the production of dark matter in a generic bouncing universe framework. Our result shows that, if the future-experimentally-measured cross section and mass of the dark matter particle satisfy the cosmological constraint $\langle \sigma v \rangle m_\chi^2 < 1.82\times 10^{-26}$, it becomes a strong indication that our universe went through a Big Bounce, instead of the inflationary phase postulated in Standard Big Bang Cosmology, at the early stage of the cosmological evolution.
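    The quoted bound is a simple inequality, so a candidate dark-matter parameter point can be checked directly. Below is a minimal Python sketch; the function name and the example values are hypothetical, and the units are taken as quoted in the abstract:

    ```python
    # Check a candidate (cross section, mass) pair against the bouncing-universe
    # bound quoted above: <sigma v> * m_chi^2 < 1.82e-26. Units as quoted in the
    # abstract; the example numbers below are purely illustrative.
    BOUNCE_BOUND = 1.82e-26

    def indicates_big_bounce(sigma_v: float, m_chi: float) -> bool:
        """Return True if the measured point satisfies the quoted constraint."""
        return sigma_v * m_chi ** 2 < BOUNCE_BOUND

    print(indicates_big_bounce(sigma_v=3.0e-26, m_chi=0.01))  # hypothetical point
    ```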

  20. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    Master the ability to apply big data analytics to massive amounts of structured and unstructured data. Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a ...

  1. Some notes on the big trip

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Diaz, Pedro F. [Colina de los Chopos, Centro de Fisica 'Miguel A. Catalan', Instituto de Matematicas y Fisica Fundamental, Consejo Superior de Investigaciones Cientificas, Serrano 121, 28006 Madrid (Spain)]. E-mail: pedrogonzalez@mi.madritel.es

    2006-03-30

    The big trip is a cosmological process thought to occur in the future by which the entire universe would be engulfed inside a gigantic wormhole and might travel through it along space and time. In this Letter we discuss different arguments that have been raised against the viability of that process, reaching the conclusions that the process can actually occur by accretion of phantom energy onto the wormholes and that it is stable and might occur in the global context of a multiverse model. We finally argue that the big trip does not contradict any holographic bounds on entropy and information.

  2. The Big Idea. Dynamic Stakeholder Management

    Science.gov (United States)

    2014-12-01

    Defense AT&L: November–December 2014 8 The Big IDEA Dynamic Stakeholder Management Lt. Col. Franklin D. Gaillard II, USAF Frank Gaillard, Ph.D...currently valid OMB control number. 1. REPORT DATE DEC 2014 2. REPORT TYPE 3. DATES COVERED 00-00-2014 to 00-00-2014 4. TITLE AND SUBTITLE The... Big Idea. Dynamic Stakeholder Management 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) 5d. PROJECT NUMBER 5e. TASK

  3. How quantum is the big bang?

    Science.gov (United States)

    Bojowald, Martin

    2008-06-06

    When quantum gravity is used to discuss the big bang singularity, the most important, though rarely addressed, question is what role genuine quantum degrees of freedom play. Here, complete effective equations are derived for isotropic models with an interacting scalar to all orders in the expansions involved. The resulting coupling terms show that quantum fluctuations do not affect the bounce much. Quantum correlations, however, do have an important role and could even eliminate the bounce. How quantum gravity regularizes the big bang depends crucially on properties of the quantum state.

  4. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data were collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.
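    The comparison logic at the heart of such a study, regressing a life outcome on each trait set and comparing variance explained, can be sketched in a few lines. This is a minimal illustration on synthetic data, not the authors' analysis pipeline:

    ```python
    import numpy as np

    # Synthetic stand-ins for questionnaire scores and an outcome (e.g., GPA);
    # the real study used actual trait scores and college transcripts.
    rng = np.random.default_rng(0)
    n = 227                                  # sample size from the abstract
    big5 = rng.normal(size=(n, 5))           # five trait scores per person
    honesty = rng.normal(size=(n, 1))        # added sixth factor
    big6 = np.hstack([big5, honesty])
    gpa = 0.4 * big5[:, 4] + 0.3 * honesty[:, 0] + rng.normal(size=n)

    def r_squared(X: np.ndarray, y: np.ndarray) -> float:
        """Ordinary-least-squares R^2 of y on X (with intercept)."""
        X1 = np.hstack([np.ones((len(y), 1)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        return 1.0 - resid.var() / y.var()

    print(f"Big Five R^2: {r_squared(big5, gpa):.3f}")
    print(f"Big Six  R^2: {r_squared(big6, gpa):.3f}")
    ```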

  5. Slow pyrolysis of wood barks from Pinus brutia Ten. and product compositions.

    Science.gov (United States)

    Sensöz, Sevgi

    2003-09-01

    Biomass in the form of pine bark (Pinus brutia Ten.) was pyrolysed in an externally heated fixed-bed reactor. The effects of temperature and heating rate on the yields and compositions of the products were investigated. Pyrolysis runs were performed using reactor temperatures between 300 and 500 °C with heating rates of 7 and 40 °C min$^{-1}$. The product yields were significantly influenced by the process conditions. The bio-oil obtained at 450 °C, at which the liquid product yield was maximum, was analysed. It was characterized by Fourier transform infrared spectroscopy. In addition, the solid and liquid products were analysed to determine their elemental composition and calorific value. Chemical fractionation of the bio-oil showed that only low quantities of hydrocarbons were present, while oxygenated and polar fractions dominated. The empirical formula of the bio-oil, with a heating value of 31.03 MJ kg$^{-1}$, was established as CH$_{1.43}$O$_{0.332}$N$_{0.0013}$.
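    An empirical formula of this kind follows from the elemental analysis by dividing each element's mass fraction by its atomic weight and normalising to carbon. A minimal sketch; the mass fractions below are illustrative values chosen to be consistent with the quoted formula, not the paper's reported data:

    ```python
    # Derive an empirical formula CHxOyNz from elemental mass fractions:
    # convert each fraction to moles, then normalise to carbon.
    ATOMIC_WEIGHT = {"C": 12.011, "H": 1.008, "O": 15.999, "N": 14.007}

    def empirical_formula(mass_fractions: dict[str, float]) -> str:
        moles = {el: f / ATOMIC_WEIGHT[el] for el, f in mass_fractions.items()}
        c = moles["C"]
        return "C" + "".join(f"{el}{moles[el] / c:.3g}" for el in ("H", "O", "N"))

    # Hypothetical bio-oil analysis (weight fractions), for illustration only:
    print(empirical_formula({"C": 0.64, "H": 0.077, "O": 0.283, "N": 0.001}))
    # -> approximately CH1.43O0.332N0.0013
    ```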

  6. Thermionic Reactor Design Studies

    Energy Technology Data Exchange (ETDEWEB)

    Schock, Alfred

    1994-06-01

    During the 1960's and early 70's the author performed extensive design studies, analyses, and tests aimed at thermionic reactor concepts that differed significantly from those pursued by other investigators. Those studies, like most others under Atomic Energy Commission (AEC and DOE) and the National Aeronautics and Space Administration (NASA) sponsorship, were terminated in the early 1970's. Some of this work was previously published, but much of it was never made available in the open literature. U.S. interest in thermionic reactors resumed in the early 80's, and was greatly intensified by reports about Soviet ground and flight tests in the late 80's. This recent interest resulted in renewed U.S. thermionic reactor development programs, primarily under Department of Defense (DOD) and Department of Energy (DOE) sponsorship. Since most current investigators have not had an opportunity to study all of the author's previous work, a review of the highlights of that work may be of value to them. The present paper describes some of the author's conceptual designs and their rationale, and the special analytical techniques developed to analyze their performance. The basic designs, first published in 1963, are based on single-cell converters, either double-ended diodes extending over the full height of the reactor core or single-ended diodes extending over half the core height. In that respect they are similar to the thermionic fuel elements employed in the Topaz-2 reactor subsequently developed in the Soviet Union, copies of which were recently imported by the U.S. As in the Topaz-2 case, electrically heated steady-state performance tests of the converters are possible before fueling. Where the author's concepts differed from the later Topaz-2 design was in the relative location of the emitter and the collector. Placing the fueled emitter on the outside of the cylindrical diodes permits much higher axial conductances to reduce ohmic losses ...

  7. BIG DATA, BIG CONSEQUENCES? EEN VERKENNING NAAR PRIVACY EN BIG DATA GEBRUIK BINNEN DE OPSPORING, VERVOLGING EN RECHTSPRAAK

    NARCIS (Netherlands)

    Lodder, A.R.; Meulen, van der N.S.; Wisman, T.H.A.; Meij, Lisette; Zwinkels, C.M.M.

    2014-01-01

    This exploratory study examines the privacy aspects of Big Data analysis within the Security and Justice domain. Applications within the judiciary are discussed, such as the prediction of rulings and the use of Big Data in court cases. With regard to criminal investigation, topics addressed include predictive policing ...

  8. Characterization of a continuous agitated cell reactor for oxygen dependent biocatalysis

    DEFF Research Database (Denmark)

    Pedersen, Asbjørn Toftgaard; Teresa de Melo Machado Simoes Carvalho, Ana; Sutherland, Euan

    2017-01-01

    Biocatalytic oxidation reactions employing molecular oxygen as the electron acceptor are difficult to conduct in a continuous flow reactor because of the requirement for high oxygen transfer rates. In this paper, the oxidation of glucose to glucono-1,5-lactone by glucose oxidase was used as a model ... A model for the ACR was developed. The model consisted of ten tanks-in-series with back-mixing occurring within and between each cell. The back-mixing was a necessary addition to the model in order to explain the observed phenomenon that the ACR behaved as two continuous stirred tank reactors (CSTRs) at low flow rates, while at high flow rates it behaved as the expected ten CSTRs in series. The performance of the ACR was evaluated by comparing the steady-state conversion at varying residence times with the conversion observed in a stirred batch reactor of comparable size. It was found that the ACR could more ...
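    The tanks-in-series structure with back-mixing described here can be illustrated as a simple tracer simulation: neighbouring cells exchange a back-flow Qb on top of the net forward flow Q, so a large Qb blurs the ten cells toward fewer effective CSTRs, while a small Qb preserves ten-CSTR behaviour. A minimal sketch under assumed parameter values, not the authors' model:

    ```python
    import numpy as np

    # Ten cells in series; forward flow Q (net) and back-mixing flow Qb between
    # neighbours. Tracer step input at the inlet, explicit Euler integration.
    N, V = 10, 1.0          # number of cells, cell volume (arbitrary units)
    Q, Qb = 1.0, 5.0        # net forward flow; assumed back-mixing flow
    dt, t_end = 1e-3, 30.0

    c = np.zeros(N)         # tracer concentration in each cell
    c_in = 1.0              # step input at the inlet
    for _ in range(int(t_end / dt)):
        dc = np.empty(N)
        dc[0] = (Q * c_in + Qb * c[1] - (Q + Qb) * c[0]) / V
        for i in range(1, N - 1):
            dc[i] = ((Q + Qb) * c[i-1] + Qb * c[i+1] - (Q + 2*Qb) * c[i]) / V
        dc[-1] = ((Q + Qb) * c[-2] - (Q + Qb) * c[-1]) / V
        c += dt * dc

    print("outlet concentration after step input:", round(c[-1], 4))
    ```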

  9. Transcutaneous electrical nerve stimulation (TENS) in angina pectoris.

    Science.gov (United States)

    Mannheimer, C; Carlsson, C A; Vedin, A; Wilhelmsson, C

    1986-09-01

    The aim of this study was to determine the efficacy of transcutaneous electrical nerve stimulation (TENS) in the treatment of chronic stable severe angina pectoris. In a short-term study the effect of TENS was studied in 10 male patients with angina pectoris (functional class III and IV). All patients had previously been stabilized on long-term maximal oral treatment. The effects of the treatment were measured by means of repeated bicycle ergometer tests. All patients had an increased working capacity (16-85%), decreased ST segment depression and reduced recovery time during TENS. No adverse effects were observed. A long-term study of TENS in similarly selected patients showed beneficial effects in terms of pain reduction, reduced frequency of anginal attacks, increased physical activity and increased working capacity during bicycle ergometer tests. An invasive study was carried out with respect to systemic and coronary hemodynamics and myocardial metabolism during pacing-provoked myocardial ischemia in 13 patients. The results showed that TENS led to an increased tolerance to pacing, improved lactate metabolism, and less pronounced ST segment depression. A drop in systolic blood pressure during TENS treatment at identical pacing rates indicated a decreased afterload. An increased coronary flow to ischemic areas in the myocardium was supported by the fact that the rate pressure product during anginal pain increased during TENS.
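    The rate pressure product referred to here is the standard index of myocardial oxygen demand: the product of heart rate and systolic blood pressure. With illustrative numbers:

    ```latex
    \[
      \mathrm{RPP} = \mathrm{HR} \times \mathrm{SBP_{sys}},
      \qquad \text{e.g. } 110\;\mathrm{min^{-1}} \times 150\;\mathrm{mmHg}
      = 16\,500\;\mathrm{mmHg \cdot min^{-1}}
    \]
    ```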

  10. Turning points in reactor design

    Energy Technology Data Exchange (ETDEWEB)

    Beckjord, E.S.

    1995-09-01

    This article provides a historical perspective on nuclear reactor design, beginning with PWR development for naval propulsion and the first commercial application at Yankee Rowe. Five turning points in reactor design, and some safety problems associated with them, are reviewed: (1) stability of Dresden-1, (2) ECCS, (3) PRA, (4) TMI-2, and (5) advanced passive LWR designs. While the emphasis is on the thermal-hydraulic aspects, the discussion also covers reactor systems.

  11. Fast reactor programme in India

    Indian Academy of Sciences (India)

    P Chellapandi; P R Vasudeva Rao; Prabhat Kumar

    2015-09-01

    The role of the fast breeder reactor (FBR) in the Indian context is discussed with appropriate justification. The FBR programme from 1985 to 2030 is highlighted, focussing on the current status and future direction of the fast breeder test reactor (FBTR), the prototype fast breeder reactor (PFBR) and FBR-1 and 2. The design and technological challenges of the PFBR, and the design and safety targets together with the means to achieve them, are the major highlights of this paper.

  12. Acceptability of reactors in space

    Energy Technology Data Exchange (ETDEWEB)

    Buden, D.

    1981-04-01

    Reactors are the key to our future expansion into space. However, there has been some confusion among the public as to whether they are a safe and acceptable technology for use in space. The answers to these questions are explored here. The US position is that when reactors are the preferred technical choice, they can be used safely. In fact, it does not appear that reactors add measurably to the risk associated with the Space Transportation System.

  13. Spiral-shaped disinfection reactors

    KAUST Repository

    Ghaffour, Noreddine

    2015-08-20

    This disclosure includes disinfection reactors and processes for the disinfection of water. Some disinfection reactors include a body that defines an inlet, an outlet, and a spiral flow path between the inlet and the outlet, in which the body is configured to receive water and a disinfectant at the inlet such that the water is exposed to the disinfectant as the water flows through the spiral flow path. Also disclosed are processes for disinfecting water in such disinfection reactors.

  14. Hydrogen Production in Fusion Reactors

    OpenAIRE

    Sudo, S.; Tomita, Y.; Yamaguchi, S.; Iiyoshi, A.; Momota, H.; Motojima, O.; Okamoto, M.; Ohnishi, M.; Onozuka, M.; Uenosono, C.

    1993-01-01

    As one method of innovative energy production in fusion reactors that does not rely on a conventional turbine-type generator, the efficient use of the radiation produced in a fusion reactor by means of semiconductors, supplying clean fuel in the form of hydrogen gas, is studied. Taking candidate reactors such as a toroidal system and an open system for application of the new concepts, the expected efficiency and a concept for the plant system are investigated.

  15. Editorial for FGCS special issue: Big Data in the cloud

    OpenAIRE

    Chang, V.; Ramachandran, M.; Wills, G.; Walters, R. J.; Li, C.-S.; Watters, P.

    2016-01-01

    Research associated with Big Data in the Cloud will be an important topic over the next few years. The topic includes work on demonstrating architectures, applications, services, experiments and simulations in the Cloud that support cases related to the adoption of Big Data. A common approach to Big Data in the Cloud, intended to allow better access, performance and efficiency when analysing and understanding the data, is to deliver Everything as a Service. Organisations adopting Big Data this way find the b...

  16. STUDY OF FACTORS AFFECTING CUSTOMER BEHAVIOUR USING BIG DATA TECHNOLOGY

    OpenAIRE

    Prabin Sahoo; Dr. Nilay Yajnik

    2014-01-01

    Big data technology has been gaining momentum recently. Several articles, books and blogs discuss various facets of big data technology. The study in this paper focuses on big data as a concept, offers insights into the three Vs (Volume, Velocity and Variety), and demonstrates their significance with respect to the factors that can be processed using big data when studying the behaviour of online customers.

  17. Neutrino Oscillation Studies with Reactors

    CERN Document Server

    Vogel, Petr; Zhang, Chao

    2015-01-01

    Nuclear reactors are one of the most intense, pure, controllable, cost-effective, and well-understood sources of neutrinos. Reactors have played a major role in the study of neutrino oscillations, a phenomenon that indicates that neutrinos have mass and that neutrino flavors are quantum mechanical mixtures. Over the past several decades reactors were used in the discovery of neutrinos, were crucial in solving the solar neutrino puzzle, and allowed the determination of the smallest mixing angle $\theta_{13}$. In the near future, reactors will help to determine the neutrino mass hierarchy and to solve the puzzling issue of sterile neutrinos.
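    For context, the electron-antineutrino survival probability that underlies the reactor determination of $\theta_{13}$ is, to leading order for kilometer-scale baselines (a textbook expression, not specific to this review),

    ```latex
    \[
      P_{\bar{\nu}_e \rightarrow \bar{\nu}_e} \simeq 1
      - \sin^{2} 2\theta_{13}\, \sin^{2}\!\left( \frac{\Delta m^{2}_{31} L}{4E} \right),
    \]
    ```

    where $L$ is the baseline and $E$ the antineutrino energy.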

  18. Big data in food safety; an overview

    NARCIS (Netherlands)

    Marvin, H.J.P.; Janssen, E.M.; Bouzembrak, Y.; Hendriksen, P.J.M.; Staats, M.

    2016-01-01

    Technology is now being developed that is able to handle vast amounts of structured and unstructured data from diverse sources and origins. These technologies are often referred to as big data, and they open new areas of research and applications that will have an increasing impact in all sectors of our society ...

  19. A Big Problem for Magellan: Food Preservation

    Science.gov (United States)

    Galvao, Cecilia; Reis, Pedro; Freire, Sofia

    2008-01-01

    In this paper, we present data related to how a Portuguese teacher developed the module "A big problem for Magellan: Food preservation." Students were asked to plan an investigation in order to identify which were the best food preservation methods in the XV and XVI centuries of Portuguese overseas navigation, and then establish a…

  20. Functional connectomics from a "big data" perspective.

    Science.gov (United States)

    Xia, Mingrui; He, Yong

    2017-02-14

    In the last decade, explosive growth in functional connectome studies has been observed. Accumulating knowledge has significantly contributed to our understanding of the brain's functional network architectures in health and disease. With the development of innovative neuroimaging techniques, the establishment of large brain datasets and the increasing accumulation of published findings, functional connectomic research has begun to move into the era of "big data", which generates unprecedented opportunities for discovery in brain science and simultaneously encounters various challenging issues, such as data acquisition, management and analyses. Big data on the functional connectome exhibits several critical features: high spatial and/or temporal precision, large sample sizes, long-term recording of brain activity, multidimensional biological variables (e.g., imaging, genetic, demographic, cognitive and clinical) and/or vast quantities of existing findings. We review studies regarding functional connectomics from a big data perspective, with a focus on recent methodological advances in state-of-the-art image acquisition (e.g., multiband imaging), analysis approaches and statistical strategies (e.g., graph theoretical analysis, dynamic network analysis, independent component analysis, multivariate pattern analysis and machine learning), as well as reliability and reproducibility validations. We highlight the novel findings in the application of functional connectomic big data to the exploration of the biological mechanisms of cognitive functions, of normal development and aging, and of neurological and psychiatric disorders. We advocate the urgent need to expand efforts directed at the methodological challenges and discuss the direction of applications in this field.
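    As a concrete illustration of the graph-theoretical step mentioned above, a functional connectivity matrix can be estimated from regional time series and summarised with simple node-level metrics. A minimal sketch on random placeholder data, not any particular study's pipeline:

    ```python
    import numpy as np

    # Placeholder "BOLD" time series: 90 brain regions x 200 time points,
    # standing in for preprocessed fMRI signals.
    rng = np.random.default_rng(42)
    n_regions, n_timepoints = 90, 200
    bold = rng.normal(size=(n_regions, n_timepoints))

    fc = np.corrcoef(bold)                       # region-by-region correlations
    np.fill_diagonal(fc, 0.0)                    # ignore self-connections
    adjacency = (np.abs(fc) > 0.2).astype(int)   # simple absolute threshold

    degree = adjacency.sum(axis=1)               # connections per region
    density = adjacency.sum() / (n_regions * (n_regions - 1))
    print(f"mean degree: {degree.mean():.1f}, network density: {density:.3f}")
    ```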