WorldWideScience

Sample records for tiny science big

  1. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    "'Big science' is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  2. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  3. Making every gram count - Big measurements from tiny platforms (Invited)

    Science.gov (United States)

    Fish, C. S.; Neilsen, T. L.; Stromberg, E. M.

    2013-12-01

    The most significant advances in Earth, solar, and space physics over the next decades will originate from new, system-level observational techniques. The most promising technique yet to be fully developed and exploited requires conducting multi-point or distributed constellation-based observations. This system-level observational approach is required to understand the 'big picture' coupling between disparate regions such as the solar wind, magnetosphere, ionosphere, upper atmosphere, land, and ocean. The National Research Council, NASA Science Mission Directorate, and the larger heliophysics community have repeatedly identified the pressing need for multipoint scientific investigations to be implemented via satellite constellations. The NASA Solar Terrestrial Probes Magnetospheric Multiscale (MMS) mission and NASA Earth Science Division's 'A-train', consisting of the Aqua, CloudSat, CALIPSO, and Aura satellites, are examples of such constellations. However, the costs to date of these and other similar proposed constellations have been prohibitive, given the 'large satellite' architectures and the multiple launch vehicles required for implementing the constellations. Financially sustainable development and deployment of multi-spacecraft constellations can only be achieved through the use of small spacecraft that allow for multiple hostings per launch vehicle. The revolution in commercial mobile and other battery-powered consumer technology has in recent years helped enable researchers to build and fly very small yet capable satellites, principally CubeSats. A majority of the CubeSat activity and development to date has come from international academia and the amateur radio satellite community, but several of the typical large-satellite vendors have developed CubeSats as well. Recent government-sponsored CubeSat initiatives, such as the NRO Colony, NSF CubeSat Space Weather, NASA Office of Chief Technologist Edison and CubeSat Launch Initiative (CSLI) Educational

  4. A tiny tick can cause a big health problem

    Directory of Open Access Journals (Sweden)

    Manuel John

    2017-01-01

    Full Text Available Ticks are tiny crawling bugs in the spider family that feed by sucking blood from animals. They are second only to mosquitoes as vectors of human disease, both infectious and toxic. Infected ticks spread over a hundred diseases, some of which are fatal if undetected. They spread the spirochete (which multiplies in the insect's gut) with a subsequent bite to the next host. We describe the only reported cases of periocular tick bite from India, which presented to us within a span of 3 days, and their management. Due suspicion and magnification of the lesions revealed the ticks, which otherwise masqueraded as small skin tags/moles on gross examination. The ticks were firmly latched on to the skin, and careful removal prevented incarceration of the mouth parts. Rickettsial diseases that were believed to have disappeared from India are reemerging, and their presence has recently been documented in at least 11 states in the country. Among vector-borne diseases, the most common, Lyme disease, also known as the great mimicker, can present with rheumatoid arthritis, fibromyalgia, depression, attention deficit hyperactivity disorder, multiple sclerosis, chronic fatigue syndrome, cardiac manifestations, encephalitis, and mental illness, to name some of the many associations. Common ocular symptoms and signs include conjunctivitis, keratitis, uveitis, and retinitis. Early detection and treatment of tick-borne diseases is important to prevent multi-system complications that can develop later in life.

  5. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  6. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  7. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  8. The science of tiny things: physics at the nanoscale

    Energy Technology Data Exchange (ETDEWEB)

    Copp, Stacy Marla [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-07

    Nanoscience is the study of tiny objects that are only a billionth of a meter in size, or about 1,000 to 10,000 times smaller than a human hair. From the electronics in your smartphone to the molecular motors that are in your body’s cells, nanoscientists study and design materials that span a huge range of subjects, from physics to chemistry to biology. I will talk about some of what we do at LANL’s Center for Integrated Technologies, as well as how I first got interested in nanoscience and how I became a nanoscientist at LANL.

  9. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presents the basic concepts of Big Data and the new field it has given rise to, Data Science. Within Data Science, the notion of dimensionality reduction of data is discussed and illustrated with examples.

  10. Beyond Big Science

    CERN Multimedia

    Boyle, Alan

    2007-01-01

    "Billion-dollar science projects end up being about much more than the science, whether we're talking about particle physics, or fusion research, or the international space station, or missions to the moon and beyond, or the next-generation radio telescope." (3 pages)

  11. The faces of Big Science.

    Science.gov (United States)

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  12. Science: Big Bang comes to the Alps

    CERN Multimedia

    Cookson, Clive

    2008-01-01

    "The most extensive and expensive scientific instrument in history is due to start working this summer at CERN, the European particle physics laboratory near Geneva. Two beams of protons will accelerate in opposite directions around a 27 km tunnel under the Alpine foothills until they are travelling almost at the speed of light - and then smash together, reproducing on a tiny scale the intense energy of the new-born universe after the inaugural Big Bang 15bn years ago." (1 page)

  13. Science Big Bang comes to the Alps

    CERN Multimedia

    2008-01-01

    The most extensive and expensive scientific instrument in history is due to start working this summer at Cern, the European particle physics laboratory near Geneva. Two beams of protons will accelerate in opposite directions around a 27km tunnel under the Alpine foothills until they are travelling almost at the speed of light - and then smash together, reproducing on a tiny scale the intense energy of the new-born universe after the inaugural Big Bang 15bn years ago.

  14. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  15. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  16. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  17. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have acquired their own tribune on the topic. Perspectives and debates are flourishing, while a consensual definition for big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On one hand, techniques usually presented as specific to big data, such as machine learning techniques, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from being new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields, such as artificial intelligence, are often underestimated if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers, as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary approach in forensic science and medicine that is mainly based on evidence. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  18. Delivering Science from Big Data

    Science.gov (United States)

    Quinn, Peter Joseph

    2015-08-01

    The SKA will be capable of producing a stream of science data products that are Exa-scale in terms of their storage and processing requirements. This Google-scale enterprise is attracting considerable international interest and excitement from within the industrial and academic communities. In this paper we examine the data flow, storage and processing requirements of a number of key SKA survey science projects to be executed on the baseline SKA1 configuration. Based on a set of conservative assumptions about trends for HPC and storage costs, and the data flow process within the SKA Observatory, it is apparent that survey projects of the scale proposed will potentially drive construction and operations costs beyond the current anticipated SKA1 budget. This implies a sharing of the resources and costs to deliver SKA science between the community and what is contained within the SKA Observatory. A similar situation was apparent to the designers of the LHC more than 10 years ago. We propose that it is time for the SKA project and broader community to consider the effort and process needed to design and implement a distributed science data system that leans on the lessons of other projects and looks to recent developments in Cloud technologies to ensure an affordable, effective and global achievement of science goals.

  19. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost - the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context, and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  20. Can companies benefit from Big Science? Science and Industry

    CERN Document Server

    Autio, Erkko; Bianchi-Streit, M

    2003-01-01

    Several studies have indicated that there are significant returns on financial investment via "Big Science" centres. Financial multipliers ranging from 2.7 (ESA) to 3.7 (CERN) have been found, meaning that each Euro invested in industry by Big Science generates a two- to fourfold return for the supplier. Moreover, laboratories such as CERN are proud of their record in technology transfer, where research developments lead to applications in other fields - for example, with particle accelerators and detectors. Less well documented, however, is the effect of the experience that technological firms gain through working in the arena of Big Science. Indeed, up to now there has been no explicit empirical study of such benefits. Our findings reveal a variety of outcomes, which include technological learning, the development of new products and markets, and impact on the firm's organization. The study also demonstrates the importance of technologically challenging projects for staff at CERN. Together, these findings i...

  1. The Tiny Terminators

    Indian Academy of Sciences (India)

    The Tiny Terminators - Mosquitoes and Diseases. P K Sumodan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 5, May 2001, pp 48-55. Permanent link: https://www.ias.ac.in/article/fulltext/reso/006/05/0048-0055 ...

  2. Before big science the pursuit of modern chemistry and physics, 1800-1940

    CERN Document Server

    Nye, Mary Jo

    1999-01-01

    Today's vast multinational scientific monoliths bear little resemblance to the modest laboratories of the early nineteenth century. Yet early in the nineteenth century--when heat and electricity were still counted among the elements--changes were already under way that would revolutionize chemistry and physics into the "big science" of the late twentieth century, expanding tiny, makeshift laboratories into bustling research institutes and replacing the scientific amateurs and generalist savants of the early Victorian era with the professional specialists of contemporary physical science. Mary Jo Nye traces the social and intellectual history of the physical sciences from the early 1800s to the beginning of the Second World War, examining the sweeping transformation of scientific institutions and professions during the period and the groundbreaking experiments that fueled that change, from the earliest investigations of molecular chemistry and field dynamics to the revolutionary breakthroughs of quantum mecha...

  3. Semantic Web technologies for the big data in life sciences.

    Science.gov (United States)

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed around the world. Life sciences data generated by new technologies continue to grow rapidly, not only in size but also in variety and complexity. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources, and even across disciplines, is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are the two principal issues discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information that not only humans but also computers can semantically process at large scale. This paper presents a survey of big data in the life sciences, big data-related projects, and Semantic Web technologies. It introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to clarify the role of Semantic Web technologies in the big data era and how they provide a promising solution for big data in the life sciences.

  4. The Natural Science Underlying Big History

    Directory of Open Access Journals (Sweden)

    Eric J. Chaisson

    2014-01-01

    Full Text Available Nature’s many varied complex systems—including galaxies, stars, planets, life, and society—are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution’s major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density—contrasting with information content or entropy production—is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated.

  5. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  6. The Ethics of Big Data and Nursing Science.

    Science.gov (United States)

    Milton, Constance L

    2017-10-01

    Big data is a scientific, social, and technological trend referring to the process and size of datasets available for analysis. Ethical implications arise as healthcare disciplines, including nursing, struggle over questions of informed consent, privacy, ownership of data, and its possible use in epistemology. The author offers straight-thinking possibilities for the use of big data in nursing science.

  7. 'Big data' in pharmaceutical science: challenges and opportunities.

    Science.gov (United States)

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  8. Legitimizing ESS Big Science as a collaboration across boundaries

    CERN Document Server

    O'Dell, Tom

    2013-01-01

    'Big Science' is a broad epithet that can be associated with research projects as different as the Manhattan Project, the construction of the Hubble Telescope, and the establishment of CERN in Geneva. While the science produced by these projects is vastly different, they have in common the fact that they all involve huge budgets, big facilities, complex instrumentation, years of planning, and large multidis...

  9. Big data science: A literature review of nursing research exemplars.

    Science.gov (United States)

    Westra, Bonnie L; Sylvia, Martha; Weinfurter, Elizabeth F; Pruinelli, Lisiane; Park, Jung In; Dodd, Dianna; Keenan, Gail M; Senk, Patricia; Richesson, Rachel L; Baukner, Vicki; Cruz, Christopher; Gao, Grace; Whittenburg, Luann; Delaney, Connie W

    Big data and cutting-edge analytic methods in nursing research challenge nurse scientists to extend the data sources and analytic methods used for discovering and translating knowledge. The purpose of this study was to identify, analyze, and synthesize exemplars of big data nursing research applied to practice and disseminated in key nursing informatics, general biomedical informatics, and nursing research journals. A literature review of studies published between 2009 and 2015. There were 650 journal articles identified in 17 key nursing informatics, general biomedical informatics, and nursing research journals in the Web of Science database. After screening for inclusion and exclusion criteria, 17 studies published in 18 articles were identified as big data nursing research applied to practice. Nurses clearly are beginning to conduct big data research applied to practice. These studies represent multiple data sources and settings. Although numerous analytic methods were used, the fundamental issue remains to define the types of analyses consistent with big data analytic methods. There are needs to increase the visibility of big data and data science research conducted by nurse scientists, further examine the use of state of the science in data analytics, and continue to expand the availability and use of a variety of scientific, governmental, and industry data resources. A major implication of this literature review is whether nursing faculty and preparation of future scientists (PhD programs) are prepared for big data and data science. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Toward a Big Data Science: A challenge of "Science Cloud"

    Science.gov (United States)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    During the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental and/or observational (second) approaches. The variety of data yielded by the second approach has been growing, due to the progress of experimental and observational technologies. The amount of data generated by the third methodology has also been growing, because of the tremendous development of supercomputers and of programming techniques for them. Most of the data files created by both experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. The researchers (domain experts) are interested not only in how to make experiments and/or observations or perform numerical simulations, but in what information (new findings) to extract from the data. However, data does not usually tell anything about the science by itself; the science is implicitly hidden in the data. Researchers have to extract information from the data files to find new science. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments and/or observations and numerical simulations get larger, new techniques and facilities are required to extract information from large numbers of data files. This technique is called informatics, a fourth methodology for new sciences. Any methodology must work on its facilities: for example, in space science, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility of informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics, which has been developed at NICT (National Institute of Information and Communications Technology), Japan. The NICT science

  11. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    Science.gov (United States)

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: big data will give you new insights, allow you to become more efficient, and/or solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the silver bullet it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking to problem defining in order to address science challenges.

  12. Big Data: New science, new challenges, new dialogical opportunities

    OpenAIRE

    Fuller, Michael

    2015-01-01

    The advent of extremely large datasets, known as “big data”, has been heralded as the instantiation of a new science, requiring a new kind of practitioner: the “data scientist”. This paper explores the concept of big data, drawing attention to a number of new issues – not least ethical concerns, and questions surrounding interpretation – which big data sets present. It is observed that the skills required for data scientists are in some respects closer to those traditionally associated with t...

  13. The Human Genome Project: big science transforms biology and medicine

    OpenAIRE

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called ‘big science’ - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and a...

  14. From big data to deep insight in developmental science.

    Science.gov (United States)

    Gilmore, Rick O

    2016-01-01

    The use of the term 'big data' has grown substantially over the past several decades and is now widespread. In this review, I ask what makes data 'big' and what implications the size, density, or complexity of datasets have for the science of human development. A survey of existing datasets illustrates how existing large, complex, multilevel, and multimeasure data can reveal the complexities of developmental processes. At the same time, significant technical, policy, ethics, transparency, cultural, and conceptual issues associated with the use of big data must be addressed. Most big developmental science data are currently hard to find and cumbersome to access, the field lacks a culture of data sharing, and there is no consensus about who owns or should control research data. But, these barriers are dissolving. Developmental researchers are finding new ways to collect, manage, store, share, and enable others to reuse data. This promises a future in which big data can lead to deeper insights about some of the most profound questions in behavioral science. © 2016 The Authors. WIREs Cognitive Science published by Wiley Periodicals, Inc.

  15. Nursing Knowledge: Big Data Science-Implications for Nurse Leaders.

    Science.gov (United States)

    Westra, Bonnie L; Clancy, Thomas R; Sensmeier, Joyce; Warren, Judith J; Weaver, Charlotte; Delaney, Connie W

    2015-01-01

    The integration of Big Data from electronic health records and other information systems within and across health care enterprises provides an opportunity to develop actionable predictive models that can increase the confidence of nursing leaders' decisions to improve patient outcomes and safety and control costs. As health care shifts to the community, mobile health applications add to the Big Data available. There is an evolving national action plan that includes nursing data in Big Data science, spearheaded by the University of Minnesota School of Nursing. For the past 3 years, diverse stakeholders from practice, industry, education, research, and professional organizations have collaborated through the "Nursing Knowledge: Big Data Science" conferences to create and act on recommendations for inclusion of nursing data, integrated with patient-generated, interprofessional, and contextual data. It is critical for nursing leaders to understand the value of Big Data science and the ways to standardize data and workflow processes to take advantage of newer cutting edge analytics to support analytic methods to control costs and improve patient quality and safety.

  16. Science Fiction and the Big Questions

    Science.gov (United States)

    O'Keefe, M.

    Advocates of space science promote investment in science education and the development of new technologies necessary for space travel. Success in these areas requires an increase of interest and support among the general public. What role can entertainment media play in inspiring the public, especially young people, to support the development of space science? Such inspiration is badly needed. Science education and funding in the United States are in a state of crisis. This bleak situation exists during a boom in the popularity of science-oriented television shows and science fiction movies. This paper draws on interviews with professionals in science, technology, engineering and mathematics (STEM) fields, as well as students interested in those fields. The interviewees were asked about their lifelong media-viewing habits. Analysis of these interviews, along with examples from popular culture, suggests that science fiction can be a valuable tool for space advocates. Specifically, the aspects of character, story, and special effects can provide viewers with inspiration and a sense of wonder regarding space science and the prospect of long-term human space exploration.

  17. Decision Sciences, Economics, Finance, Business, Computing, and Big Data: Connections

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung)

    2018-01-01

    textabstractThis paper provides a review of some connecting literature in Decision Sciences, Economics, Finance, Business, Computing, and Big Data. We then discuss some research that is related to the six cognate disciplines. Academics could develop theoretical models and subsequent

  18. French environmental labs may get 'big science' funds

    CERN Multimedia

    2000-01-01

    France is considering expanding its network of environmental laboratories to study the long-term impacts of environmental change. It has been suggested that this could be funded using the 'big science' budget usually used for facilities such as particle accelerators (2 para).

  19. Big Data: Philosophy, Emergence, Crowdledge, and Science Education

    Science.gov (United States)

    dos Santos, Renato P.

    2015-01-01

    Big Data has already passed out of hype; it is now a field that deserves serious academic investigation, and natural scientists should also become familiar with Analytics. On the other hand, there is little empirical evidence that any science taught in school is helping people to lead happier, more prosperous, or more politically well-informed lives. In…

  20. Big science transformed science, politics and organization in Europe and the United States

    CERN Document Server

    Hallonsten, Olof

    2016-01-01

    This book analyses the emergence of a transformed Big Science in Europe and the United States, using both historical and sociological perspectives. It shows how technology-intensive natural sciences grew to a prominent position in Western societies during the post-World War II era, and how their development cohered with both technological and social developments. At the helm of post-war science are large-scale projects, primarily in physics, which receive substantial funds from the public purse. Big Science Transformed shows how these projects, popularly called 'Big Science', have become symbols of progress. It analyses changes to the political and sociological frameworks surrounding publicly-funded science, and their impact on a number of new accelerator and reactor-based facilities that have come to prominence in materials science and the life sciences. Interdisciplinary in scope, this book will be of great interest to historians, sociologists and philosophers of science.

  1. Data Management and Preservation Planning for Big Science

    Directory of Open Access Journals (Sweden)

    Juan Bicarregui

    2013-06-01

    Full Text Available ‘Big Science’ – that is, science which involves large collaborations with dedicated facilities, large data volumes and multinational investments – is often seen as different when it comes to data management and preservation planning. Big Science handles its data differently from other disciplines and has data management problems that are qualitatively different. In part, these differences arise from the quantities of data involved, but possibly more importantly from the cultural, organisational and technical distinctiveness of these academic communities. Consequently, the data management systems are typically and rationally bespoke, but this means that the planning for data management and preservation (DMP) must also be bespoke. These differences are such that ‘just read and implement the OAIS specification’ is reasonable DMP advice, but this bald prescription can and should be usefully supported by a methodological ‘toolkit’, including overviews, case studies and costing models, to provide guidance on developing best practice in DMP policy and infrastructure for these projects, as well as considering OAIS validation, audit and cost modelling. In this paper, we build on previous work with the LIGO collaboration to consider the role of DMP planning within these big science scenarios, and discuss how to apply current best practice. We discuss the results of the MaRDI-Gross project (Managing Research Data Infrastructures – Big Science), which has been developing a toolkit to provide guidelines on the application of best practice in DMP planning within big science projects. This is targeted primarily at projects’ engineering managers, but is also intended to help funders collaborate on DMP plans which satisfy the requirements imposed on them.

  2. Big Data and Data Science in Critical Care.

    Science.gov (United States)

    Sanchez-Pinto, L Nelson; Luo, Yuan; Churpek, Matthew M

    2018-05-09

    The digitalization of the healthcare system has resulted in a deluge of clinical Big Data and has prompted the rapid growth of data science in medicine. Data science, which is the field of study dedicated to the principled extraction of knowledge from complex data, is particularly relevant in the critical care setting. The availability of large amounts of data in the intensive care unit, the need for better evidence-based care, and the complexity of critical illness make the use of data science techniques and data-driven research particularly appealing to intensivists. Despite the increasing number of studies and publications in the field, so far there have been few examples of data science projects that have resulted in successful implementations of data-driven systems in the intensive care unit. However, given the expected growth in the field, intensivists should be familiar with the opportunities and challenges of Big Data and data science. In this paper, we review the definitions, types of algorithms, applications, challenges, and future of Big Data and data science in critical care. Copyright © 2018. Published by Elsevier Inc.

  3. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

    Full Text Available Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has also come of age, pushing forward the building of the digital Earth and the cyber city. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features (ubiquity, multi-dimensionality and dynamics, internet+ networking, full automation and real-time operation, from sensing to recognition, crowdsourcing and VGI, and service orientation) of geospatial information science in the big data era and addresses the key technical issues (a non-linear four-dimensional Earth reference frame system, space-based enhanced GNSS, unified space-air-land network communication techniques, on-board processing techniques for multi-source image data, smart interface service techniques for space-borne information, space-based resource scheduling and network security, and the design and development of a multi-functional payload-based satellite platform) that need to be resolved to provide a new definition of geospatial information science in the big data era. Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics): geomatics is a multidisciplinary science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovery, spatial sensing and recognition, and intelligent location-based services for any physical objects and human activities around the Earth and its environment. Starting from this new definition, geospatial information science will find many more opportunities and tasks in the big data era in the generation of the smart Earth and smart city. Our profession

  4. Britain's big science in a bind

    CERN Multimedia

    Williams, N

    1996-01-01

    The UK's 1994 science-administration reforms, which formed the Particle Physics and Astronomy Research Council (PPARC) to separate the two capital-intensive fields from other disciplines, have not been a success. Most of PPARC's funds go to CERN and ESA subscriptions, leaving little for other resources.

  5. Earth Science Data Analysis in the Era of Big Data

    Science.gov (United States)

    Kuo, K.-S.; Clune, T. L.; Ramachandran, R.

    2014-01-01

    Anyone with even a cursory interest in information technology cannot help but recognize that "Big Data" is one of the most fashionable catchphrases of late. From accurate voice and facial recognition, language translation, and airfare prediction and comparison, to monitoring the real-time spread of flu, Big Data techniques have been applied to many seemingly intractable problems with spectacular successes. They appear to be a rewarding way to approach many currently unsolved problems. Few fields of research can claim a longer history with problems involving voluminous data than Earth science. The problems we are facing today with our Earth's future are more complex and carry potentially graver consequences than the examples given above. How has our climate changed? Besides natural variations, what is causing these changes? What are the processes involved and through what mechanisms are these connected? How will they impact life as we know it? In attempts to answer these questions, we have resorted to observations and numerical simulations with ever-finer resolutions, which continue to feed the "data deluge." Plausibly, many Earth scientists are wondering: How will Big Data technologies benefit Earth science research? As an example from the global water cycle, one subdomain among many in Earth science, how would these technologies accelerate the analysis of decades of global precipitation to ascertain the changes in its characteristics, to validate these changes in predictive climate models, and to infer the implications of these changes to ecosystems, economies, and public health? Earth science researchers need a viable way to harness the power of Big Data technologies to analyze large volumes and varieties of data with velocity and veracity. Beyond providing speedy data analysis capabilities, Big Data technologies can also play a crucial, albeit indirect, role in boosting scientific productivity by facilitating effective collaboration within an analysis environment

  6. Processes meet big data : connecting data science with process science

    NARCIS (Netherlands)

    van der Aalst, W.; Damiani, E.

    2015-01-01

    As more and more companies are embracing Big data, it has become apparent that the ultimate challenge is to relate massive amounts of event data to processes that are highly dynamic. To unleash the value of event data, events need to be tightly connected to the control and management of operational

  7. Big data in medical science--a biostatistical view.

    Science.gov (United States)

    Binder, Harald; Blettner, Maria

    2015-02-27

    Inexpensive techniques for measurement and data storage now enable medical researchers to acquire far more data than can conveniently be analyzed by traditional methods. The expression "big data" refers to quantities on the order of magnitude of a terabyte (10¹² bytes); special techniques must be used to evaluate such huge quantities of data in a scientifically meaningful way. Whether data sets of this size are useful and important is an open question that currently confronts medical science. In this article, we give illustrative examples of the use of analytical techniques for big data and discuss them in the light of a selective literature review. We point out some critical aspects that should be considered to avoid errors when large amounts of data are analyzed. Machine learning techniques enable the recognition of potentially relevant patterns. When such techniques are used, certain additional steps should be taken that are unnecessary in more traditional analyses; for example, patient characteristics should be differentially weighted. If this is not done as a preliminary step before similarity detection, which is a component of many data analysis operations, characteristics such as age or sex will be weighted no higher than any one out of 10 000 gene expression values. Experience from the analysis of conventional observational data sets can be called upon to draw conclusions about potential causal effects from big data sets. Big data techniques can be used, for example, to evaluate observational data derived from the routine care of entire populations, with clustering methods used to analyze therapeutically relevant patient subgroups. Such analyses can provide complementary information to clinical trials of the classic type. As big data analyses become more popular, various statistical techniques for causality analysis in observational data are becoming more widely available. This is likely to be of benefit to medical science, but specific adaptations will
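
The weighting pitfall described in this abstract can be sketched in a few lines of Python. All patient data and variable names below are invented for illustration (they are not from the paper): without differential weights, two clinical characteristics drown among 10 000 gene-expression values, so similarity detection sees two clinically very different patients as near-identical.

```python
# Sketch of differential feature weighting before similarity detection.
# All data here are invented for illustration; names are hypothetical.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

n_genes = 10_000
# Two patients: [z-scored age, sex] followed by 10 000 gene-expression values.
# They differ sharply on the clinical fields but not on expression.
p1 = [-1.0, 0.0] + [1.0] * n_genes
p2 = [1.0, 1.0] + [1.0] * n_genes

unweighted = cosine(p1, p2)  # close to 1.0: the clinical difference is invisible

weights = [100.0, 100.0] + [1.0] * n_genes  # up-weight the clinical characteristics
wp1 = [w * x for w, x in zip(weights, p1)]
wp2 = [w * x for w, x in zip(weights, p2)]
weighted_sim = cosine(wp1, wp2)  # drops sharply once the clinical fields count
```

The weight values are arbitrary; in practice they would come from domain knowledge or a tuning step, but the point stands: without them the high-dimensional block dominates every distance computation.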

  8. TinyDebug

    DEFF Research Database (Denmark)

    Hansen, Morten Tranberg

    2011-01-01

    Debugging embedded wireless systems can be cumbersome due to low visibility. To ease the task of debugging, this paper presents TinyDebug, which is a multi-purpose passive debugging framework for developing embedded wireless systems. TinyDebug is designed to be used throughout the entire system...... logging to extraction and show how the framework improves upon existing message-based and event-logging debugging techniques while enabling distributed event processing. We also present a number of optional event analysis tools demonstrating the generality of the TinyDebug debug messages....

  9. The Human Genome Project: big science transforms biology and medicine.

    Science.gov (United States)

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  10. Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology.

    Science.gov (United States)

    Salazar, Brittany M; Balczewski, Emily A; Ung, Choong Yong; Zhu, Shizhen

    2016-12-27

    Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring "big data" applications in pediatric oncology. Computational strategies derived from big data science (network- and machine learning-based modeling and drug repositioning) hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which "big data" and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases.

  11. Think big: learning contexts, algorithms and data science

    Directory of Open Access Journals (Sweden)

    Baldassarre Michele

    2016-12-01

    Full Text Available Due to the rapid growth in available data in recent years, all areas of research, and the management of institutions and organisations, specifically schools and universities, feel the need to give meaning to this availability of data. This article, after a brief reference to the definition of big data, focuses attention and reflection on their types and proceeds to an extension of their characterisation. One of the keys to making the use of Big Data feasible in operational contexts is to give them a theoretical basis to which to refer. The Data, Information, Knowledge and Wisdom (DIKW) model correlates these four aspects, culminating in Data Science, which in many ways could revolutionise the established pattern of scientific investigation. Learning Analytics applications on online learning platforms can be tools for evaluating the quality of teaching. And that is where some problems arise: it becomes necessary to handle the available data with care. Finally, a criterion for deciding whether it makes sense to conduct an analysis based on Big Data can be to think about its interpretability and relevance in relation to both institutional and personal processes.

  12. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Directory of Open Access Journals (Sweden)

    De K.

    2016-01-01

    Full Text Available The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers, including OLCF's Titan and NRC-KI's HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  13. Applying science and mathematics to big data for smarter buildings.

    Science.gov (United States)

    Lee, Young M; An, Lianjun; Liu, Fei; Horesh, Raya; Chae, Young Tae; Zhang, Rui

    2013-08-01

    Many buildings are now collecting a large amount of data on operations, energy consumption, and activities through systems such as a building management system (BMS), sensors, and meters (e.g., submeters and smart meters). However, the majority of data are not utilized and are thrown away. Science and mathematics can play an important role in utilizing these big data and accurately assessing how energy is consumed in buildings and what can be done to save energy, make buildings energy efficient, and reduce greenhouse gas (GHG) emissions. This paper discusses an analytical tool that has been developed to assist building owners, facility managers, operators, and tenants of buildings in assessing, benchmarking, diagnosing, tracking, forecasting, and simulating energy consumption in building portfolios. © 2013 New York Academy of Sciences.
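
As a sketch of the kind of assessment and forecasting this abstract mentions, a classic weather-normalisation step regresses metered consumption against heating degree-days. This is a generic technique, not the paper's actual model; all data and names below are invented for illustration.

```python
# Hypothetical sketch: weather-normalised baseline for building energy use,
# via ordinary least squares on invented daily meter readings.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

hdd = [0, 2, 5, 8, 12, 15]            # heating degree-days per day (invented)
kwh = [210, 228, 252, 275, 309, 330]  # metered consumption in kWh (invented)

base_load, heating_slope = fit_line(hdd, kwh)
forecast = base_load + heating_slope * 10  # expected kWh on a 10-HDD day
```

The fitted intercept separates weather-independent base load from heating-driven use, which is what makes benchmarking across buildings and tracking against a baseline possible.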

  14. Data science and big data an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2017-01-01

    This book presents a comprehensive and up-to-date treatise of a range of methodological and algorithmic issues. It also discusses implementations and case studies, identifies the best design practices, and assesses data analytics business models and practices in industry, health care, administration and business. Data science and big data go hand in hand and constitute a rapidly growing area of research and have attracted the attention of industry and business alike. The area itself has opened up promising new directions of fundamental and applied research and has led to interesting applications, especially those addressing the immediate need to deal with large repositories of data and building tangible, user-centric models of relationships in data. Data is the lifeblood of today’s knowledge-driven economy. Numerous data science models are oriented towards end users and along with the regular requirements for accuracy (which are present in any modeling), come the requirements for ability to process huge and...

  15. Big Biomedical data as the key resource for discovery science

    Energy Technology Data Exchange (ETDEWEB)

    Toga, Arthur W.; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W.; Price, Nathan D.; Glusman, Gustavo; Heavner, Benjamin D.; Dinov, Ivo D.; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-07-21

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  16. "Big Science: the LHC in Pictures" in the Globe

    CERN Multimedia

    2008-01-01

    An exhibition of spectacular photographs of the LHC and its experiments is about to open in the Globe. The LHC and its four experiments are not only huge in size but also uniquely beautiful, as the exhibition "Big Science: the LHC in Pictures" in the Globe of Science and Innovation will show. The exhibition features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. These giant pictures reflecting the immense scale of the LHC and the mysteries of the Universe it is designed to uncover fill the Globe with shape and colour. The exhibition, which will open on 4 March, is divided into six different themes: CERN, the LHC and the four experiments ATLAS, LHCb, CMS and ALICE. Facts about all these subjects will be available at information points and in an explanatory booklet accompanying the exhibition (which visitors will be able to buy if they wish to take it home with them). Globe of Science and Innovatio...

  17. EDITORIAL: Big science at the nanoscale Big science at the nanoscale

    Science.gov (United States)

    Reed, Mark

    2009-10-01

    In 1990, the journal Nanotechnology was the first academic publication dedicated to disseminating the results of research in what was then a new field of scientific endeavour. To celebrate the 20th volume of Nanotechnology, we are publishing a special issue of top research papers covering all aspects of this multidisciplinary science, including biology, electronics and photonics, quantum phenomena, sensing and actuating, patterning and fabrication, material synthesis and the properties of nanomaterials. In the early 1980s, scanning probe microscopes brought the concepts of matter and interactions at the nanoscale into visual reality, and hastened a flurry of activity in the burgeoning new field of nanoscience. Twenty years on and nanotechnology has truly come of age. The ramifications are pervasive throughout daily life in communication, health care and entertainment technology. For example, DVDs have now consigned videotapes to the ark and mobile phones are as prevalent as house keys, and these technologies already look set to be superseded by internet phones and Blu-Ray discs. Nanotechnology has been in the unique position of following the explosive growth of this discipline from its outset. The surge of activity in the field is notable in the number of papers published by the journal each year, which has skyrocketed. The journal is now published weekly, publishing over 1400 articles a year. What is more, the quality of these articles is also constantly improving; the average number of citations to articles within two years of publication, quantified by the ISI impact factor, continues to increase every year. The rate of activity in the field shows no signs of slowing down, as is evident from the wealth of great research published each week. The aim of the 20th volume special issue is to present some of the very best and most recent research in many of the wide-ranging fields covered by the journal, a celebration of the present state of play in nanotechnology and

  18. TinyOS Alliance Structure

    DEFF Research Database (Denmark)

    Bonnet, Philippe; Culler, David; Estrin, Deborah

    2006-01-01

    This memo describes the goals and organization structure of the TinyOS Alliance. It covers membership, the working group forums for contribution, intellectual property, source licensing, and the TinyOS Steering Committee (TSC)....

  19. Opening the Big Black Box: European study reveals visitors' impressions of science laboratories

    CERN Multimedia

    2004-01-01

    "On 29 - 30 March the findings of 'Inside the Big Black Box'- a Europe-wide science and society project - will be revealed during a two-day seminar hosted by CERN*. The principle aim of Inside the Big Black Box (IN3B) is to determine whether a working scientific laboratory can capture the curiosity of the general public through visits" (1 page)

  20. Green data science : using big data in an "environmentally friendly" manner

    NARCIS (Netherlands)

    Van Der Aalst, W.M.P.

    2016-01-01

    The widespread use of "Big Data" is heavily impacting organizations and individuals for which these data are collected. Sophisticated data science techniques aim to extract as much value from data as possible. Powerful mixtures of Big Data and analytics are rapidly changing the way we do business,

  1. Big biomedical data as the key resource for discovery science.

    Science.gov (United States)

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Speaking sociologically with big data: symphonic social science and the future for big data research

    OpenAIRE

    Halford, Susan; Savage, Mike

    2017-01-01

    Recent years have seen persistent tension between proponents of big data analytics, using new forms of digital data to make computational and statistical claims about ‘the social’, and many sociologists sceptical about the value of big data, its associated methods and claims to knowledge. We seek to move beyond this, taking inspiration from a mode of argumentation pursued by Putnam (2000), Wilkinson and Pickett (2009) and Piketty (2014) that we label ‘symphonic social science’. This bears bot...

  3. The Whole Shebang: How Science Produced the Big Bang Model.

    Science.gov (United States)

    Ferris, Timothy

    2002-01-01

    Offers an account of the accumulation of evidence that has led scientists to have confidence in the big bang theory of the creation of the universe. Discusses the early work of Ptolemy, Copernicus, Kepler, Galileo, and Newton, noting the rise of astrophysics, and highlighting the birth of the big bang model (the cosmic microwave background theory…

  4. The Role of Big Data in the Social Sciences

    Science.gov (United States)

    Ovadia, Steven

    2013-01-01

Big Data is an increasingly popular term across scholarly and popular literature but lacks a formal definition (Lohr 2012). This is beneficial in that it keeps the term flexible. For librarians, Big Data represents a few important ideas. One is the need to balance accessibility with privacy. Librarians tend to want information to be as open…

  5. Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology: Connections

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung)

    2018-01-01

The paper provides a review of the literature that connects Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology, and discusses some research that is related to the seven disciplines. Academics could develop theoretical models and subsequent

  6. Revenge of tiny Miranda

    International Nuclear Information System (INIS)

    Goldreich, P.; Nicholson, P.

    1977-01-01

    Reference is made to Dermott and Gold (Nature 267: 590 (1977)) who proposed a resonance model for the rings of Uranus. They assumed that the rings are composed of small particles librating about stable resonances determined by pairs of satellites, either Ariel and Titania or Ariel and Oberon. They dismissed as insignificant resonances involving 'tiny Miranda'. It is reported here that, by a wide margin, the strongest resonances are all associated with Miranda. It is also shown that the hypothesis that the rings are made up of librating particles, whilst original and ingenious, is incorrect. (author)

  7. Perspectives on Policy and the Value of Nursing Science in a Big Data Era.

    Science.gov (United States)

    Gephart, Sheila M; Davis, Mary; Shea, Kimberly

    2018-01-01

As data volume explodes, nurse scientists grapple with ways to adapt to the big data movement without jeopardizing the epistemic values and theoretical focus that celebrate the authority and unity of nursing's body of knowledge. In this article, the authors describe big data and emphasize ways that nursing science brings value to its study. Collective nursing voices that call for more nursing engagement in the big data era are answered with ways to adapt and integrate theoretical and domain expertise from nursing into data science.

  8. Has the time come for big science in wildlife health?

    Science.gov (United States)

    Sleeman, Jonathan M.

    2013-01-01

    The consequences of wildlife emerging diseases are global and profound with increased burden on the public health system, negative impacts on the global economy, declines and extinctions of wildlife species, and subsequent loss of ecological integrity. Examples of health threats to wildlife include Batrachochytrium dendrobatidis, which causes a cutaneous fungal infection of amphibians and is linked to declines of amphibians globally; and the recently discovered Pseudogymnoascus (Geomyces) destructans, the etiologic agent of white nose syndrome which has caused precipitous declines of North American bat species. Of particular concern are the novel pathogens that have emerged as they are particularly devastating and challenging to manage. A big science approach to wildlife health research is needed if we are to make significant and enduring progress in managing these diseases. The advent of new analytical models and bench assays will provide us with the mathematical and molecular tools to identify and anticipate threats to wildlife, and understand the ecology and epidemiology of these diseases. Specifically, new molecular diagnostic techniques have opened up avenues for pathogen discovery, and the application of spatially referenced databases allows for risk assessments that can assist in targeting surveillance. Long-term, systematic collection of data for wildlife health and integration with other datasets is also essential. Multidisciplinary research programs should be expanded to increase our understanding of the drivers of emerging diseases and allow for the development of better disease prevention and management tools, such as vaccines. Finally, we need to create a National Fish and Wildlife Health Network that provides the operational framework (governance, policies, procedures, etc.) by which entities with a stake in wildlife health cooperate and collaborate to achieve optimal outcomes for human, animal, and ecosystem health.

  9. What science for what kind of society? Reflecting the development of big science

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Lecture will be in English– Translation available in French Without any doubt, CERN can be described as being among the most ambitious scientific enterprises ever undertaken. For 60 years, the Member States have not only invested considerable financial means into this institution, but have also supported the creation of a highly visionary research programme. And this has led to a change in the way science is done, as captured by the idea of "big science". Yet this naturally also raises a number of quite fundamental questions: How did the meaning of "doing science" change? What justifies societal engagement with and support for such a cost-intensive long-term scientific undertaking? And finally, in what ways does (and did) this research enterprise contribute to the development of contemporary societies? By focusing on some key examples, the talk will thus explore how the ways of doing research and scientific and societal relations have undergone change over the ...

  10. Forget the hype or reality. Big data presents new opportunities in Earth Science.

    Science.gov (United States)

    Lee, T. J.

    2015-12-01

Earth science is arguably one of the most mature science disciplines, constantly acquiring, curating, and utilizing a large volume of highly diverse data. We dealt with big data before there was big data. For example, while developing the EOS program in the 1980s, the EOS Data and Information System (EOSDIS) was built to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has remained a shining example of modern science data systems over the past two decades. With the explosion of the internet, the use of social media, and the deployment of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system. The open source communities quickly followed up and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day. There is a significant shortage of algorithms and knowledgeable manpower to mine the data. In response, the federal government developed big data programs that fund research and development projects and training programs to tackle these new challenges. Meanwhile, compared to the internet data explosion, the Earth science big data problem has become quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline and solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiry.
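The MapReduce pattern mentioned in the abstract above can be sketched in miniature. This is a purely illustrative word-count example (the canonical MapReduce demo), not code from the talk, with the map, shuffle, and reduce phases written as plain Python functions:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data in earth science", "earth science data systems"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["earth"])  # 2
print(counts["data"])   # 2
```

In a real Hadoop deployment the map and reduce functions run in parallel on many nodes and the shuffle is handled by the framework; the logic per record, however, is exactly this simple.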

  11. Big-Data-Driven Stem Cell Science and Tissue Engineering: Vision and Unique Opportunities.

    Science.gov (United States)

    Del Sol, Antonio; Thiesen, Hans J; Imitola, Jaime; Carazo Salas, Rafael E

    2017-02-02

    Achieving the promises of stem cell science to generate precise disease models and designer cell samples for personalized therapeutics will require harnessing pheno-genotypic cell-level data quantitatively and predictively in the lab and clinic. Those requirements could be met by developing a Big-Data-driven stem cell science strategy and community. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. A Big Data Guide to Understanding Climate Change: The Case for Theory-Guided Data Science.

    Science.gov (United States)

    Faghmous, James H; Kumar, Vipin

    2014-09-01

Global climate change and its impact on human life has become one of our era's greatest challenges. Despite the urgency and the abundance of climate data, data science has had little impact on furthering our understanding of our planet. This is a stark contrast to other fields such as advertising or electronic commerce, where big data has been a great success story. This discrepancy stems from the complex nature of climate data as well as the scientific questions climate science brings forth. This article introduces a data science audience to the challenges and opportunities of mining large climate datasets, with an emphasis on the nuanced difference between mining climate data and traditional big data approaches. We focus on data, methods, and application challenges that must be addressed in order for big data to fulfill their promise with regard to climate science applications. More importantly, we highlight research showing that solely relying on traditional big data techniques results in dubious findings, and we instead propose a theory-guided data science paradigm that uses scientific theory to constrain both the big data techniques as well as the results-interpretation process to extract accurate insight from large climate data.
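One common form of the theory-guided paradigm described above is adding a theory-based penalty to a model's loss function. The toy sketch below (entirely illustrative; the data, penalty, and all names are hypothetical, not from the article) fits a least-squares model by gradient descent while penalizing predictions that violate a known physical constraint, here nonnegativity of the target quantity:

```python
import numpy as np

def theory_guided_fit(X, y, lam=10.0, steps=2000, lr=0.05):
    """Gradient-descent least squares plus a penalty on physically
    impossible (negative) predictions -- a toy stand-in for using
    scientific theory to constrain a data-driven model."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        pred = X @ w
        grad = X.T @ (pred - y) / len(y)                     # data-fit term
        grad += lam * X.T @ np.minimum(pred, 0.0) / len(y)   # theory penalty term
        w -= lr * grad
    return w

# Toy data: a nonnegative physical quantity (think rainfall amount)
# that is flat at zero and then rises with one predictor.
x = np.linspace(0.0, 1.0, 50)
X = np.column_stack([np.ones_like(x), x])
y = np.maximum(0.0, 2.0 * x - 0.5)
w = theory_guided_fit(X, y)
```

With `lam=0` this reduces to ordinary least squares, whose fitted line dips below zero for small `x`; the penalty pulls those physically impossible predictions back toward the feasible region, which is the essence of letting theory constrain the data-driven result.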

  13. An overview of big data and data science education at South African universities

    Directory of Open Access Journals (Sweden)

    Eduan Kotzé

    2016-02-01

Man and machine are generating data electronically at an astronomical speed and in such a way that society is experiencing cognitive challenges to analyse this data meaningfully. Big data firms, such as Google and Facebook, identified this problem several years ago and are continuously developing new technologies or improving existing technologies in order to facilitate the cognitive analysis process of these large data sets. The purpose of this article is to contribute to our theoretical understanding of the role that big data might play in creating new training opportunities for South African universities. The article investigates emerging literature on the characteristics and main components of big data, together with the Hadoop application stack as an example of big data technology. Due to the rapid development of big data technology, a paradigm shift of human resources is required to analyse these data sets; therefore, this study examines the state of big data teaching at South African universities. This article also provides an overview of possible big data sources for South African universities, as well as relevant big data skills that data scientists need. The study also investigates existing academic programs in South Africa, where the focus is on teaching advanced database systems. The study found that big data and data science topics are introduced to students on a postgraduate level, but that the scope is very limited. This article contributes by proposing important theoretical topics that could be introduced as part of the existing academic programs. More research is required, however, to expand these programs in order to meet the growing demand for data scientists with big data skills.

  14. "small problems, Big Trouble": An Art and Science Collaborative Exhibition Reflecting Seemingly small problems Leading to Big Threats

    Science.gov (United States)

    Waller, J. L.; Brey, J. A.

    2014-12-01

    "small problems, Big Trouble" (spBT) is an exhibition of artist Judith Waller's paintings accompanied by text panels written by Earth scientist Dr. James A. Brey and several science researchers and educators. The text panels' message is as much the focus of the show as the art--true interdisciplinarity! Waller and Brey's history of art and earth science collaborations include the successful exhibition "Layers: Places in Peril". New in spBT is extended collaboration with other scientists in order to create awareness of geoscience and other subjects (i.e. soil, parasites, dust, pollutants, invasive species, carbon, ground water contaminants, solar wind) small in scale which pose significant threats. The paintings are the size of a mirror, a symbol suggesting the problems depicted are those we increasingly need to face, noting our collective reflections of shared current and future reality. Naturalistic rendering and abstract form in the art helps reach a broad audience including those familiar with art and those familiar with science. The goal is that gallery visitors gain greater appreciation and understanding of both—and of the sober content of the show as a whole. "small problems, Big Trouble" premiers in Wisconsin April, 2015. As in previous collaborations, Waller and Brey actively utilize art and science (specifically geoscience) as an educational vehicle for active student learning. Planned are interdisciplinary university and area high school activities linked through spBT. The exhibition in a public gallery offers a means to enhance community awareness of and action on scientific issues through art's power to engage people on an emotional level. This AGU presentation includes a description of past Waller and Brey activities: incorporating art and earth science in lab and studio classrooms, producing gallery and museum exhibitions and delivering workshops and other presentations. They also describe how walking the paths of several past earth science

  15. The big questions in science the quest to solve the great unknowns

    CERN Document Server

    Birch, Hayley; Stuart, Colin

    2016-01-01

What are the great scientific questions of our modern age and why don't we know the answers? The Big Questions in Science takes on the most fascinating and pressing mysteries we have yet to crack and explains how tantalizingly close science is to solving them (or how frustratingly out of reach they remain). Some, such as "Can we live forever?" and "What makes us human?", are eternal questions; others, such as "How do we solve the population problem?" and "How do we get more energy from the sun?", are essential to our future survival. Written by experienced science writers, adept at translating the complicated concepts of "hard science" into an engaging and insightful discussion for the general reader, The Big Questions in Science grapples with 20 hot topics across the disciplines of biology, chemistry, physics, astronomy and computer science to ignite the inquisitive scientist in all of us.

  16. The role of administrative data in the big data revolution in social science research.

    Science.gov (United States)

    Connelly, Roxanne; Playford, Christopher J; Gayle, Vernon; Dibben, Chris

    2016-09-01

The term big data is currently a buzzword in social science, but its precise meaning is ambiguous. In this paper we focus on administrative data, which is a distinctive form of big data. Exciting new opportunities for social science research will be afforded by new administrative data resources, but these are currently underappreciated by the research community. The central aim of this paper is to discuss the challenges associated with administrative data. We emphasise that it is critical for researchers to carefully consider how administrative data has been produced. We conclude that administrative datasets have the potential to contribute to the development of high-quality and impactful social science research, and should not be overlooked in the emerging field of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives

    Science.gov (United States)

    Miron-Shatz, T.; Lau, A. Y. S.; Paton, C.

    2014-01-01

Objectives: As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives on the evolving use of Big Data in science and healthcare and to examine some of the opportunities and challenges. Methods: A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Results: Scientists and healthcare providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts: The concept of Big Data and its associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and healthcare. Future exploration of issues surrounding data privacy, confidentiality, and education is needed. A greater focus on data from social media, the quantified-self movement, and the application of analytics to "small data" would also be useful. PMID:25123717

  18. Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology

    Directory of Open Access Journals (Sweden)

    Brittany M. Salazar

    2016-12-01

Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to the limited number of detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring "big data" applications in pediatric oncology. Computational strategies derived from big data science (network- and machine learning-based modeling and drug repositioning) hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which "big data" and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases.

  19. Big Data and Clinicians: A Review on the State of the Science

    Science.gov (United States)

    Wang, Weiqi

    2014-01-01

Background: In the past few decades, medically related data collection has seen a huge increase, referred to as big data. These huge datasets bring challenges in storage, processing, and analysis. In clinical medicine, big data is expected to play an important role in identifying causality of patient symptoms, in predicting hazards of disease incidence or recurrence, and in improving primary-care quality. Objective: The objective of this review was to provide an overview of the features of clinical big data, describe a few commonly employed computational algorithms, statistical methods, and software toolkits for data manipulation and analysis, and discuss the challenges and limitations in this realm. Methods: We conducted a literature review to identify studies on big data in medicine, especially clinical medicine. We used different combinations of keywords to search PubMed, Science Direct, Web of Knowledge, and Google Scholar for literature of interest from the past 10 years. Results: This paper reviewed studies that analyzed clinical big data and discussed issues related to storage and analysis of this type of data. Conclusions: Big data is becoming a common feature of biological and clinical studies. Researchers who use clinical big data face multiple challenges, and the data itself has limitations. It is imperative that methodologies for data analysis keep pace with our ability to collect and store data. PMID:25600256

  20. From Darwin to the Census of Marine Life: marine biology as big science.

    Directory of Open Access Journals (Sweden)

    Niki Vermeulen

With the development of the Human Genome Project, a heated debate emerged on biology becoming 'big science'. However, biology already has a long tradition of collaboration, as natural historians were part of the first collective scientific efforts: exploring the variety of life on earth. Such mappings of life still continue today, and if field biology is gradually becoming an important subject of studies into big science, research into life in the world's oceans is not taken into account yet. This paper therefore explores marine biology as big science, presenting the historical development of marine research towards the international 'Census of Marine Life' (CoML) making an inventory of life in the world's oceans. Discussing various aspects of collaboration--including size, internationalisation, research practice, technological developments, application, and public communication--I will ask if CoML still resembles traditional collaborations to collect life. While showing both continuity and change, I will argue that marine biology is a form of natural history: a specific way of working together in biology that has transformed substantially in interaction with recent developments in the life sciences and society. As a result, the paper does not only give an overview of transformations towards large scale research in marine biology, but also shines a new light on big biology, suggesting new ways to deepen the understanding of collaboration in the life sciences by distinguishing between different 'collective ways of knowing'.

  1. From Darwin to the Census of Marine Life: marine biology as big science.

    Science.gov (United States)

    Vermeulen, Niki

    2013-01-01

    With the development of the Human Genome Project, a heated debate emerged on biology becoming 'big science'. However, biology already has a long tradition of collaboration, as natural historians were part of the first collective scientific efforts: exploring the variety of life on earth. Such mappings of life still continue today, and if field biology is gradually becoming an important subject of studies into big science, research into life in the world's oceans is not taken into account yet. This paper therefore explores marine biology as big science, presenting the historical development of marine research towards the international 'Census of Marine Life' (CoML) making an inventory of life in the world's oceans. Discussing various aspects of collaboration--including size, internationalisation, research practice, technological developments, application, and public communication--I will ask if CoML still resembles traditional collaborations to collect life. While showing both continuity and change, I will argue that marine biology is a form of natural history: a specific way of working together in biology that has transformed substantially in interaction with recent developments in the life sciences and society. As a result, the paper does not only give an overview of transformations towards large scale research in marine biology, but also shines a new light on big biology, suggesting new ways to deepen the understanding of collaboration in the life sciences by distinguishing between different 'collective ways of knowing'.

  2. Detection and Characterisation of Meteors as a Big Data Citizen Science project

    Science.gov (United States)

    Gritsevich, M.

    2017-12-01

Out of a total of around 50,000 meteorites currently known to science, the atmospheric passage was recorded instrumentally in only 30 cases with the potential to derive their atmospheric trajectories and pre-impact heliocentric orbits. Similarly, while observations of meteors add thousands of new entries per month to existing databases, it is extremely rare that they lead to meteorite recovery. Meteor studies thus represent an excellent example of a Big Data citizen science project, where progress in the field largely depends on the prompt identification and characterisation of meteor events as well as on extensive and valuable contributions by amateur observers. Over the last couple of decades, technological advancements in observational techniques have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced scientific goals. We review some of the developments that push meteor science into the Big Data era, which requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of the micro-physics of meteor plasma and its interaction with the atmosphere. The recent increased interest in meteor science triggered by the Chelyabinsk fireball helps in building the case for technologically and logistically more ambitious meteor projects. This requires developing new methodological approaches in meteor research, with Big Data science and close collaboration between citizen science, geoscience and astronomy as critical elements. We discuss possibilities for improvements and promote an opportunity for collaboration in meteor science within the currently

  3. NOAA's Big Data Partnership and Applications to Ocean Sciences

    Science.gov (United States)

    Kearns, E. J.

    2016-02-01

    New opportunities for the distribution of NOAA's oceanographic and other environmental data are being explored through NOAA's Big Data Partnership (BDP) with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Corp. and the Open Cloud Consortium. This partnership was established in April 2015 through Cooperative Research and Development Agreements, and is seeking new, financially self-sustaining collaborations between the Partners and the federal government centered upon NOAA's data and their potential value in the information marketplace. We will discuss emerging opportunities for collaboration among businesses and NOAA, progress in making NOAA's ocean data more widely accessible through the Partnerships, and applications based upon this access to NOAA's data.

  4. The sociology of big science | Public Lecture by Ulrike Felt | 15 July

    CERN Multimedia

    2014-01-01

    "The sociology of big science" Public Lecture by Prof. Ulrike Felt Tuesday 15 July 2014 - 7.30 p.m. Globe of Science and Innovation Lecture in English, translated in French. Entrance free. Limited number of seats. Reservation essential: +41 22 767 76 76 or cern.reception@cern.ch What science for what kind of society? Reflecting the development of big science Without any doubt, CERN can be described as being among the most ambitious scientific enterprises ever undertaken. For 60 years, the Member States have not only invested considerable financial means into this institution, but have also supported the creation of a highly visionary research programme. And this has led to a change in the way science is done, as captured by the idea of "big science". Yet this naturally also raises a number of quite fundamental questions: How did the meaning of "doing science" change? What justifies societal engagement with and support for such a cost-intensive long-t...

  5. Big Data in Plant Science: Resources and Data Mining Tools for Plant Genomics and Proteomics.

    Science.gov (United States)

    Popescu, George V; Noutsos, Christos; Popescu, Sorina C

    2016-01-01

In modern plant biology, progress is increasingly defined by scientists' ability to gather and analyze data sets of high volume and complexity, otherwise known as "big data". Arguably, the largest increase in the volume of plant data sets over the last decade is a consequence of the application of next-generation sequencing and mass-spectrometry technologies to the study of experimental model and crop plants. The increase in the quantity and complexity of biological data brings challenges, mostly associated with data acquisition, processing, and sharing within the scientific community. Nonetheless, big data in plant science create unique opportunities for advancing our understanding of complex biological processes at a level of accuracy without precedent, and establish a base for plant systems biology. In this chapter, we summarize the major drivers of big data in plant science and big data initiatives in the life sciences, with a focus on the scope and impact of iPlant, a representative cyberinfrastructure platform for plant science.

  6. Legal dimensions of Big Data in the Health and Life Sciences

    DEFF Research Database (Denmark)

    Minssen, Timo

    2016-01-01

Please find below my welcome speech at last week's mini-symposium on “Legal dimensions of Big Data in the Health and Life Sciences – From Intellectual Property Rights and Global Pandemics to Privacy and Ethics” at the University of Copenhagen (UCPH). The event was organized by our Global Genes –Local

  7. Who Owns Educational Theory? Big Data, Algorithms and the Expert Power of Education Data Science

    Science.gov (United States)

    Williamson, Ben

    2017-01-01

    "Education data science" is an emerging methodological field which possesses the algorithm-driven technologies required to generate insights and knowledge from educational big data. This article consists of an analysis of the Lytics Lab, Stanford University's laboratory for research and development in learning analytics, and the Center…

  8. Communicating the Nature of Science through "The Big Bang Theory": Evidence from a Focus Group Study

    Science.gov (United States)

    Li, Rashel; Orthia, Lindy A.

    2016-01-01

    In this paper, we discuss a little-studied means of communicating about or teaching the nature of science (NOS)--through fiction television. We report some results of focus group research which suggest that the American sitcom "The Big Bang Theory" (2007-present), whose main characters are mostly working scientists, has influenced…

  9. Big Data Science Education: A Case Study of a Project-Focused Introductory Course

    Science.gov (United States)

    Saltz, Jeffrey; Heckman, Robert

    2015-01-01

    This paper reports on a case study of a project-focused introduction to big data science course. The pedagogy of the course leveraged boundary theory, where students were positioned to be at the boundary between a client's desire to understand their data and the academic class. The results of the case study demonstrate that using live clients…

  10. Earth science big data at users' fingertips: the EarthServer Science Gateway Mobile

    Science.gov (United States)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Fargetta, Marco; Pappalardo, Marco; Rundo, Francesco

    2014-05-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Programme, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as the client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data, in short: "Big Earth Data Analytics", based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and W3C XQuery. EarthServer combines both standards, thereby achieving tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with additional space-time coverage data types. On the server side, highly effective optimizations, such as parallel and distributed query processing, ensure scalability to Exabyte volumes. In this contribution we report on the EarthServer Science Gateway Mobile, an app for both iOS and Android devices that allows users to seamlessly access some of the EarthServer applications using SAML-based federated authentication and fine-grained authorisation mechanisms.
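As an illustration of the query-language interface this record describes, here is a minimal Python sketch that builds a WCPS-style request string. The coverage name (`AvgTemperature`), axis labels, and value ranges are hypothetical placeholders, not actual EarthServer identifiers.

```python
# Sketch: constructing a WCPS query of the kind an EarthServer-style service
# accepts. Coverage and axis names below are invented for illustration.
from urllib.parse import urlencode

def build_wcps_query(coverage, lat_range, lon_range, fmt="csv"):
    """Return a WCPS query averaging a coverage over a lat/lon subset."""
    (lat_lo, lat_hi), (lon_lo, lon_hi) = lat_range, lon_range
    return (
        f"for c in ({coverage}) "
        f"return encode(avg(c[Lat({lat_lo}:{lat_hi}), Long({lon_lo}:{lon_hi})]), \"{fmt}\")"
    )

query = build_wcps_query("AvgTemperature", (40, 50), (-10, 10))
# A client would typically send this as the 'query' parameter of a GET request:
params = urlencode({"query": query})
print(query)
```

The query string is what travels over the wire; the server evaluates it against the array database and returns the encoded result.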

  11. Big Data and Data Science: Opportunities and Challenges of iSchools

    Directory of Open Access Journals (Sweden)

    Il-Yeol Song

    2017-08-01

    Due to the recent explosion of big data, our society has been rapidly going through digital transformation and entering a new world with numerous eye-opening developments. These new trends impact society and future jobs, and thus student careers. At the heart of this digital transformation is data science, the discipline that makes sense of big data. With many rapidly emerging digital challenges ahead of us, this article discusses perspectives on iSchools’ opportunities and suggestions in data science education. We argue that iSchools should empower their students with “information computing” disciplines, which we define as the ability to solve problems and create values, information, and knowledge using tools in application domains. As specific approaches to enforcing information computing disciplines in data science education, we suggest three foci: user-based, tool-based, and application-based. These three foci will serve to differentiate the data science education of iSchools from that of computer science or business schools. We present a layered Data Science Education Framework (DSEF) with building blocks that include the three pillars of data science (people, technology, and data), computational thinking, data-driven paradigms, and data science lifecycles. Data science courses built on top of this framework should thus be executed with user-based, tool-based, and application-based approaches. This framework will help students think about data science problems from a big-picture perspective and foster appropriate problem-solving skills in conjunction with broad perspectives of data science lifecycles. We hope the DSEF discussed in this article will help fellow iSchools in their design of new data science curricula.

  12. Science on the streets of the Big Apple.

    Science.gov (United States)

    Greene, Brian; Nurse, Paul

    2008-05-30

    A five-day festival of science takes place this week at venues across New York City. The festival features not only leading researchers from New York and beyond but also actors, writers, musicians, and choreographers in a series of multimedia programs designed to reveal science to the general public in exciting new ways.

  13. Resonance – Journal of Science Education | Indian Academy of ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education. Balasubramanian Karthick. Articles written in Resonance – Journal of Science Education. Volume 20 Issue 10 October 2015 pp 919-930, General Article. The Diatoms: Big Significance of Tiny Glass Houses · Aditi Kale, Balasubramanian Karthick.

  14. Origins of tiny neutrino mass and large flavor mixings

    International Nuclear Information System (INIS)

    Haba, Naoyuki

    2015-01-01

    Active neutrino masses are far smaller than those of the other quarks and leptons, and there are large flavor mixings in the lepton sector, contrary to the quark sector. These are great mysteries in the standard model, but also excellent hints of new physics beyond it. Thus, the questions 'What is the origin of the tiny neutrino masses?' and 'What is the origin of the large lepton flavor mixings?' are very important. In this paper, we overview various attempts to solve these big questions. (author)

  15. Ontologies, methodologies, and new uses of Big Data in the social and cultural sciences

    Directory of Open Access Journals (Sweden)

    Robin Wagner-Pacifici

    2015-12-01

    In our Introduction to the Conceiving the Social with Big Data Special Issue of Big Data & Society, we survey the 18 contributions from scholars in the humanities and social sciences, and highlight several questions and themes that emerge within and across them. These emergent issues reflect the challenges, problems, and promises of working with Big Data to access and assess the social. They include puzzles about the locus and nature of human life, the nature of interpretation, the categorical constructions of individual entities and agents, the nature and relevance of contexts and temporalities, and the determinations of causality. As such, the Introduction reflects on the contributions along a series of binaries that capture the dualities and dynamisms of these themes: Life/Data; Mind/Machine; and Induction/Deduction.

  16. Big Science comes of age: talk of the demise of Big Science is premature, but its characteristics have changed significantly

    CERN Multimedia

    1999-01-01

    Large scale science research facilities are now not only multi-user and multi-experimental, they are also used by scientists from different scientific areas. This demonstrates a significant trend - the growing collaboration between separate scientific disciplines and hence a real interdisciplinary approach to scientific questions (1/2 page).

  17. Big Data and Intellectual Property Rights in the Health and Life Sciences

    DEFF Research Database (Denmark)

    Minssen, Timo

    The vast prospects of Big Data and the shift to more “personalized”, “open” and “transparent” innovation models highlight the importance of an effective governance, regulation and stimulation of high-quality data-uses in the health and life sciences. Intellectual Property Rights (IPRs) and related rights come into play when research is translated into safe and efficient “real world” uses. While the need to recalibrate IPRs to fully support Big Data advances is being intensely debated among multiple stakeholders, there seems to be much confusion about the availability of IPRs and their legal effects. In this very brief presentation I intend to provide an overview of the most relevant IPRs for data-based life science research. Realizing that the choice of how to address, use and interact with IPRs differs among various areas of application, I also intend to sketch out and discuss…

  18. Lecture 10: The European Bioinformatics Institute - "Big data" for biomedical sciences

    CERN Multimedia

    CERN. Geneva; Dana, Jose

    2013-01-01

    Part 1: Big data for biomedical sciences (Tom Hancocks). Ten years ago witnessed the completion of the first international 'Big Biology' project, which sequenced the human genome. In the years since, the biological sciences have seen a vast growth in data. In the coming years advances will come from the integration of experimental approaches and their translation into applied technologies in the hospital, the clinic and even at home. This talk will examine the development of the infrastructure, physical and virtual, that will allow millions of life scientists across Europe better access to biological data. Tom studied Human Genetics at the University of Leeds and McMaster University, before completing an MSc in Analytical Genomics at the University of Birmingham. He has worked for the UK National Health Service in diagnostic genetics and in training healthcare scientists and clinicians in bioinformatics. Tom joined the EBI in 2012 and is responsible for the scientific development and delivery of training for the BioMedBridges pr...

  19. Big names in science for the public at large

    CERN Multimedia

    2000-01-01

    The ninth Wright Science Colloquium will be held in Geneva between 13 and 17 November 2000. The purpose of these biennial colloquia, founded by Dr H. Dudley Wright in 1984, is to bring recent progress in science to the attention of the general public. Each Colloquium consists of a series of lectures given by eminent scientists, this year including two Nobel Prize Winners, all of which are open to the general public. The 2000 series of Colloquium lectures is entitled “Time, Matter, Energy: from stars to our genes”, three familiar notions which nevertheless remain intangible for us. This series of five lectures will examine these notions in original ways. Thus the notion of time will be viewed from the perspective of the astronomer who, with the aid of telescopes, is able to go back in time and watch time expanding with the universe. The biologist has a different viewpoint, since his notion of time is based on the biological clocks of the animal world. Matter will be addressed from the point of view of its ...

  20. Data Grid tools: enabling science on big distributed data

    Energy Technology Data Exchange (ETDEWEB)

    Allcock, Bill [Mathematics and Computer Science, Argonne National Laboratory, Argonne, IL 60439 (United States); Chervenak, Ann [Information Sciences Institute, University of Southern California, Marina del Rey, CA 90291 (United States); Foster, Ian [Mathematics and Computer Science, Argonne National Laboratory, Argonne, IL 60439 (United States); Department of Computer Science, University of Chicago, Chicago, IL 60615 (United States); Kesselman, Carl [Information Sciences Institute, University of Southern California, Marina del Rey, CA 90291 (United States); Livny, Miron [Department of Computer Science, University of Wisconsin, Madison, WI 53705 (United States)

    2005-01-01

    A particularly demanding and important challenge that we face as we attempt to construct the distributed computing machinery required to support SciDAC goals is the efficient, high-performance, reliable, secure, and policy-aware management of large-scale data movement. This problem is fundamental to diverse application domains including experimental physics (high energy physics, nuclear physics, light sources), simulation science (climate, computational chemistry, fusion, astrophysics), and large-scale collaboration. In each case, highly distributed user communities require high-speed access to valuable data, whether for visualization or analysis. The quantities of data involved (terabytes to petabytes), the scale of the demand (hundreds or thousands of users, data-intensive analyses, real-time constraints), and the complexity of the infrastructure that must be managed (networks, tertiary storage systems, network caches, computers, visualization systems) make the problem extremely challenging. Data management tools developed under the auspices of the SciDAC Data Grid Middleware project have become the de facto standard for data management in projects worldwide. Day in and day out, these tools provide the 'plumbing' that allows scientists to do more science on an unprecedented scale in production environments.

  1. Data Grid tools: enabling science on big distributed data

    International Nuclear Information System (INIS)

    Allcock, Bill; Chervenak, Ann; Foster, Ian; Kesselman, Carl; Livny, Miron

    2005-01-01

    A particularly demanding and important challenge that we face as we attempt to construct the distributed computing machinery required to support SciDAC goals is the efficient, high-performance, reliable, secure, and policy-aware management of large-scale data movement. This problem is fundamental to diverse application domains including experimental physics (high energy physics, nuclear physics, light sources), simulation science (climate, computational chemistry, fusion, astrophysics), and large-scale collaboration. In each case, highly distributed user communities require high-speed access to valuable data, whether for visualization or analysis. The quantities of data involved (terabytes to petabytes), the scale of the demand (hundreds or thousands of users, data-intensive analyses, real-time constraints), and the complexity of the infrastructure that must be managed (networks, tertiary storage systems, network caches, computers, visualization systems) make the problem extremely challenging. Data management tools developed under the auspices of the SciDAC Data Grid Middleware project have become the de facto standard for data management in projects worldwide. Day in and day out, these tools provide the 'plumbing' that allows scientists to do more science on an unprecedented scale in production environments

  2. Artificial intelligence and big data management: the dynamic duo for moving forward data centric sciences

    OpenAIRE

    Vargas Solar, Genoveva

    2017-01-01

    After vivid discussions led by the emergence of the buzzword “Big Data”, it seems that industry and academia have reached an objective understanding about data properties (volume, velocity, variety, veracity and value), the resources and “know how” it requires, and the opportunities it opens. Indeed, new applications promising fundamental changes in society, industry and science, include face recognition, machine translation, digital assistants, self-driving cars, ad-serving, chat-bots, perso...

  3. Opportunities and challenges of big data for the social sciences: The case of genomic data.

    Science.gov (United States)

    Liu, Hexuan; Guo, Guang

    2016-09-01

    In this paper, we draw attention to one unique and valuable source of big data, genomic data, by demonstrating the opportunities they provide to social scientists. We discuss different types of large-scale genomic data and recent advances in statistical methods and computational infrastructure used to address challenges in managing and analyzing such data. We highlight how these data and methods can be used to benefit social science research. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Big data, computational science, economics, finance, marketing, management, and psychology: connections

    OpenAIRE

    Chang, Chia-Lin; McAleer, Michael; Wong, Wing-Keung

    2018-01-01

    The paper provides a review of the literature that connects Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology, and discusses some research that is related to the seven disciplines. Academics could develop theoretical models and subsequent econometric and statistical models to estimate the parameters in the associated models, as well as conduct simulation to examine whether the estimators in their theories on estimation and hypothesis testin...

  5. Team Structure and Scientific Impact of "Big Science" Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn; Jeppesen, Jacob

    This paper summarizes preliminary results from a project studying how the organizational and cognitive features of research carried out in a Large Scale Research Facility (LSRF) affect scientific impact. The study is based on an exhaustive bibliometric mapping of the scientific publications of the Neutron Science Department of Oak Ridge National Laboratories in 2006-2009. Given the collaborative nature of research carried out at LSRFs, it is important to understand how its organization affects scientific impact. Diversity of teams along the institutional and cognitive dimensions affects both opportunities for the combination of knowledge and coordination costs. The way specific collaborative configurations strike the trade-off between these opportunities and costs has notable effects on research performance. The findings of the paper show that i.) scientists combining affiliations to both…

  6. Panel session: Part 1, In flux -- Science Policy and the social structure of Big Laboratories, 1964--1979

    Energy Technology Data Exchange (ETDEWEB)

    Westfall, C. [Michigan State Univ., East Lansing, MI (United States)]|[CEBAF, Newport News, VA (United States)]|[Fermilab History Collaboration, Batavia, IL (United States)

    1993-09-01

    This report discusses the flux in science policy and the social structure of big laboratories during the period 1964 to 1979, and some sociological consequences of high energy physicists' development of the standard model during the same period.

  7. Data science and big data analytics discovering, analyzing, visualizing and presenting data

    CERN Document Server

    2014-01-01

    Data Science and Big Data Analytics is about harnessing the power of data for new insights. The book covers the breadth of activities, methods and tools that Data Scientists use. The content focuses on concepts, principles and practical applications that are applicable to any industry and technology environment, and the learning is supported and explained with examples that you can replicate using open-source software. This book will help you: become a contributor on a data science team; deploy a structured lifecycle approach to data analytics problems; apply appropriate analytic techniques and…

  8. Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science

    Science.gov (United States)

    Baru, C.

    2014-12-01

    Big data technologies are evolving rapidly, driven by the need to manage ever increasing amounts of historical data; process relentless streams of human- and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the application domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as the development of new data technologies and systems. The area of interface between the geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.

  9. IBM Watson: How Cognitive Computing Can Be Applied to Big Data Challenges in Life Sciences Research.

    Science.gov (United States)

    Chen, Ying; Elenee Argentinis, J D; Weber, Griff

    2016-04-01

    Life sciences researchers are under pressure to innovate faster than ever. Big data offer the promise of unlocking novel insights and accelerating breakthroughs. Ironically, although more data are available than ever, only a fraction is being integrated, understood, and analyzed. The challenge lies in harnessing volumes of data, integrating the data from hundreds of sources, and understanding their various formats. New technologies such as cognitive computing offer promise for addressing this challenge because cognitive solutions are specifically designed to integrate and analyze big datasets. Cognitive solutions can understand different types of data such as lab values in a structured database or the text of a scientific publication. Cognitive solutions are trained to understand technical, industry-specific content and use advanced reasoning, predictive modeling, and machine learning techniques to advance research faster. Watson, a cognitive computing technology, has been configured to support life sciences research. This version of Watson includes medical literature, patents, genomics, and chemical and pharmacological data that researchers would typically use in their work. Watson has also been developed with specific comprehension of scientific terminology so it can make novel connections in millions of pages of text. Watson has been applied to a few pilot studies in the areas of drug target identification and drug repurposing. The pilot results suggest that Watson can accelerate identification of novel drug candidates and novel drug targets by harnessing the potential of big data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Big Science, co-publication and collaboration: getting to the core

    Energy Technology Data Exchange (ETDEWEB)

    Kahn, M.

    2016-07-01

    International collaboration in science has risen considerably in the last two decades (UNESCO, 2010). In the same period Big Science collaborations have proliferated in physics, astronomy, astrophysics, and medicine. Publications that use Big Science data draw on the expertise of those who design and build the equipment and software, as well as the scientific community. Over time a set of ‘rules of use’ has emerged that protects their intellectual property but that may have the unintended consequence of enhancing co-publication counts. This in turn distorts the use of co-publication data as a proxy for collaboration. The distorting effects are illustrated by means of a case study of the BRICS countries that recently issued a declaration on scientific and technological cooperation with specific fields allocated to each country. It is found that with a single exception the dominant research areas of collaboration are different to individual country specializations. The disjuncture between such ‘collaboration’ and the intent of the declaration raises questions of import to science policy, for the BRICS in particular and the measurement of scientific collaboration more generally. (Author)

  11. Complementary social science? Quali-quantitative experiments in a Big Data world

    Directory of Open Access Journals (Sweden)

    Anders Blok

    2014-08-01

    The rise of Big Data in the social realm poses significant questions at the intersection of science, technology, and society, including in terms of how new large-scale social databases are currently changing the methods, epistemologies, and politics of social science. In this commentary, we address such epochal (“large-scale”) questions by way of a (situated) experiment: at the Danish Technical University in Copenhagen, an interdisciplinary group of computer scientists, physicists, economists, sociologists, and anthropologists (including the authors) is setting up a large-scale data infrastructure, meant to continually record the digital traces of social relations among an entire freshman class of students (N > 1000). At the same time, fieldwork is carried out on friendship (and other) relations amongst the same group of students. On this basis, the question we pose is the following: what kind of knowledge is obtained on this social micro-cosmos via the Big (computational, quantitative) and Small (embodied, qualitative) Data, respectively? How do the two relate? Invoking Bohr’s principle of complementarity as an analogy, we hypothesize that social relations, as objects of knowledge, depend crucially on the type of measurement device deployed. At the same time, however, we also expect new interferences and polyphonies to arise at the intersection of Big and Small Data, provided that these are, so to speak, mixed with care. These questions, we stress, are important not only for the future of social science methods but also for the type of societal (self-)knowledge that may be expected from new large-scale social databases.

  12. Data Science and its Relationship to Big Data and Data-Driven Decision Making.

    Science.gov (United States)

    Provost, Foster; Fawcett, Tom

    2013-03-01

    Companies have realized they need to hire data scientists, academic institutions are scrambling to put together data science programs, and publications are touting data science as a hot, even "sexy", career choice. However, there is confusion about what exactly data science is, and this confusion could lead to disillusionment as the concept diffuses into meaningless buzz. In this article, we argue that there are good reasons why it has been hard to pin down exactly what data science is. One reason is that data science is intricately intertwined with other important concepts also of growing importance, such as big data and data-driven decision making. Another reason is the natural tendency to associate what a practitioner does with the definition of the practitioner's field; this can result in overlooking the fundamentals of the field. We believe that trying to define the boundaries of data science precisely is not of the utmost importance. We can debate the boundaries of the field in an academic setting, but in order for data science to serve business effectively, it is important (i) to understand its relationships to other important related concepts, and (ii) to begin to identify the fundamental principles underlying data science. Once we embrace (ii), we can much better understand and explain exactly what data science has to offer. Furthermore, only once we embrace (ii) should we be comfortable calling it data science. In this article, we present a perspective that addresses all these concepts. We close by offering, as examples, a partial list of fundamental principles underlying data science.

  13. Nanocellulose, a tiny fiber with huge applications.

    Science.gov (United States)

    Abitbol, Tiffany; Rivkin, Amit; Cao, Yifeng; Nevo, Yuval; Abraham, Eldho; Ben-Shalom, Tal; Lapidot, Shaul; Shoseyov, Oded

    2016-06-01

    Nanocellulose is of increasing interest for a range of applications relevant to the fields of material science and biomedical engineering due to its renewable nature, anisotropic shape, excellent mechanical properties, good biocompatibility, tailorable surface chemistry, and interesting optical properties. We discuss the main areas of nanocellulose research: photonics, films and foams, surface modifications, nanocomposites, and medical devices. These tiny nanocellulose fibers have huge potential in many applications, from flexible optoelectronics to scaffolds for tissue regeneration. We hope to impart to readers some of the excitement that currently surrounds nanocellulose research, which arises from the green nature of the particles, their fascinating physical and chemical properties, and the diversity of applications that can be impacted by this material. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology: Connections

    Directory of Open Access Journals (Sweden)

    Chia-Lin Chang

    2018-03-01

    The paper provides a review of the literature that connects Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology, and discusses research issues that are related to the various disciplines. Academics could develop theoretical models and subsequent econometric and statistical models to estimate the parameters in the associated models, as well as conduct simulation to examine whether the estimators in their theories on estimation and hypothesis testing have good size and high power. Thereafter, academics and practitioners could apply the theory to analyse some interesting issues in the seven disciplines and cognate areas.

  15. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    Science.gov (United States)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world, from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with a few use cases described in detail.
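As a concrete instance of one technique this record names, support vector machines for classification, here is a minimal, self-contained sketch: a linear SVM trained with a Pegasos-style sub-gradient method on tiny synthetic data. It is an illustration under invented data, not a tool produced or endorsed by the RDA groups.

```python
# Minimal linear SVM via Pegasos-style sub-gradient descent (stdlib only).
# Data, hyperparameters, and function names are illustrative choices.
import random

def train_linear_svm(data, lam=0.01, epochs=300, seed=0):
    """data: list of (features, label) with label in {-1, +1}; returns (w, b)."""
    rng = random.Random(seed)
    dim = len(data[0][0])
    w, b, t = [0.0] * dim, 0.0, 0
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            # Regularization shrinks w every step; margin violations push
            # the hyperplane towards correctly classifying (x, y).
            w = [(1 - eta * lam) * wi for wi in w]
            if margin < 1:
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Two separable clusters: positives near (2, 2), negatives near (-2, -2).
points = [([2 + dx, 2 + dy], 1) for dx in (0, 0.5) for dy in (0, 0.5)]
points += [([-2 + dx, -2 + dy], -1) for dx in (0, 0.5) for dy in (0, 0.5)]
w, b = train_linear_svm(points)
print(predict(w, b, [3, 3]), predict(w, b, [-3, -3]))
```

On separable data like this, the learned hyperplane cleanly splits the two clusters; real earth-science use would of course involve feature extraction and a production library rather than this sketch.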

  16. Big Data and Intellectual Property Rights in the Health and Life Sciences

    DEFF Research Database (Denmark)

    Minssen, Timo; Pierce, Justin

    2018-01-01

    Undeniably “Big Data” plays a crucial role in the ongoing evolution of health care and life science sector innovations. In recent years U.S. and European authorities have developed public platforms and infrastructures providing access to vast stores of health-care knowledge, including data from… especially in the life science sectors, where competitive innovation and research and development (R&D) resources are persistent considerations. For private actors, the likes of pharmaceutical companies, health care providers, laboratories and insurance companies, it is becoming common practice to accumulate R&D data, making it searchable through medical databases. This trend is advanced and supported by recent initiatives and legislation that are increasing the transparency of various forms of data, such as clinical trials data. As a result, researchers, companies, patients and health care providers gain…

  17. Taking a 'Big Data' approach to data quality in a citizen science project.

    Science.gov (United States)

    Kelling, Steve; Fink, Daniel; La Sorte, Frank A; Johnston, Alison; Bruns, Nicholas E; Hochachka, Wesley M

    2015-11-01

    Data from well-designed experiments provide the strongest evidence of causation in biodiversity studies. However, for many species the collection of these data is not scalable to the spatial and temporal extents required to understand patterns at the population level. Only citizen science projects can gather sufficient quantities of data, but data collected from volunteers are inherently noisy and heterogeneous. Here we describe a 'Big Data' approach to improving data quality in eBird, a global citizen science project that gathers bird observations. First, eBird's data submission design ensures that all data meet high standards of completeness and accuracy. Second, we take a 'sensor calibration' approach to measure individual variation in eBird participants' ability to detect and identify birds. Third, we use species distribution models to fill in data gaps. Finally, we provide examples of novel analyses exploring population-level patterns in bird distributions.

  18. Big Data Science: Opportunities and Challenges to Address Minority Health and Health Disparities in the 21st Century

    Science.gov (United States)

    Zhang, Xinzhi; Pérez-Stable, Eliseo J.; Bourne, Philip E.; Peprah, Emmanuel; Duru, O. Kenrik; Breen, Nancy; Berrigan, David; Wood, Fred; Jackson, James S.; Wong, David W.S.; Denny, Joshua

    2017-01-01

    Addressing minority health and health disparities has been a missing piece of the puzzle in Big Data science. This article focuses on three priority opportunities that Big Data science may offer to the reduction of health and health care disparities. One opportunity is to incorporate standardized information on demographic and social determinants in electronic health records in order to target ways to improve quality of care for the most disadvantaged populations over time. A second opportunity is to enhance public health surveillance by linking geographical variables and social determinants of health for geographically defined populations to clinical data and health outcomes. Third and most importantly, Big Data science may lead to a better understanding of the etiology of health disparities and understanding of minority health in order to guide intervention development. However, the promise of Big Data needs to be considered in light of significant challenges that threaten to widen health disparities. Care must be taken to incorporate diverse populations to realize the potential benefits. Specific recommendations include investing in data collection on small sample populations, building a diverse workforce pipeline for data science, actively seeking to reduce digital divides, developing novel ways to assure digital data privacy for small populations, and promoting widespread data sharing to benefit under-resourced minority-serving institutions and minority researchers. With deliberate efforts, Big Data presents a dramatic opportunity for reducing health disparities but without active engagement, it risks further widening them. PMID:28439179

  20. Tiny plastic lung mimics human pulmonary function

    Science.gov (United States)

    News release, April 2016: Tiny plastic lung mimics human pulmonary function.

  1. Preliminary investigations on TINI based distributed instrumentation systems

    International Nuclear Information System (INIS)

    Bezboruah, T.; Kalita, M.

    2006-04-01

    A prototype web-enabled distributed instrumentation system is being proposed in the Department of Electronics Science, Gauhati University, Assam, India. The distributed instrumentation system contains sensors, legacy hardware, a TCP/IP protocol converter, a TCP/IP Ethernet network, a database server, a web/application server and client PCs. As part of the proposed work, the Tiny Internet Interface (TINI, TBM390: Dallas Semiconductor) has been deployed as the TCP/IP stack, with the Java programming language as the software tool. A feature of Java that is particularly relevant to the distributed system is the applet. An applet is a Java class that can be downloaded from the web server and run in a context such as a web browser or an applet viewer. TINI has been chosen as the TCP/IP stack because it is an embedded system well suited to the Java programming language and is uniquely designed for communicating with One-Wire Devices (OWD) over a network. Here we discuss the hardware and software aspects of TINI with OWD for the present system. (author)

  2. Big data and tactical analysis in elite soccer: future challenges and opportunities for sports science.

    Science.gov (United States)

    Rein, Robert; Memmert, Daniel

    2016-01-01

    Until recently, tactical analysis in elite soccer was based on observational data using variables which discard most contextual information. Analyses of team tactics, however, require detailed data from various sources, including technical skill, individual physiological performance, and team formations, among others, to represent the complex processes underlying team tactical behavior. Accordingly, little is known about how these different factors influence team tactical behavior in elite soccer. In part, this has also been due to the lack of available data. Increasingly, however, detailed game logs obtained through next-generation tracking technologies, in addition to physiological training data collected through novel miniature sensor technologies, have become available for research. This leads to the opposite problem, where the sheer amount of data becomes an obstacle in itself, as methodological guidelines as well as theoretical models of tactical decision making in team sports are lacking. The present paper discusses how big data and modern machine learning technologies may help to address these issues and aid in developing a theoretical model for tactical decision making in team sports. As experience from medical applications shows, significant organizational obstacles regarding data governance and access to technologies must be overcome first. The present work discusses these issues with respect to tactical analyses in elite soccer and proposes a technological stack which aims to introduce big data technologies into elite soccer research. The proposed approach could also serve as a guideline for other sports science domains, as increasing data size is becoming a widespread phenomenon.

  3. How Big Science Came to Long Island: the Birth of Brookhaven Lab (429th Brookhaven Lecture)

    International Nuclear Information System (INIS)

    Crease, Robert P.

    2007-01-01

    Robert P. Crease, historian for the U.S. Department of Energy's Brookhaven National Laboratory and Chair of the Philosophy Department at Stony Brook University, will give two talks on the Laboratory's history on October 31 and December 12. Crease's October 31 talk, titled 'How Big Science Came to Long Island: The Birth of Brookhaven Lab,' will cover the founding of the Laboratory soon after World War II as a peacetime facility to construct and maintain basic research facilities, such as nuclear reactors and particle accelerators, that were too large for single institutions to build and operate. He will discuss the key figures involved in starting the Laboratory, including Nobel laureates I.I. Rabi and Norman Ramsey, as well as Donald Dexter Van Slyke, one of the most renowned medical researchers in American history. Crease also will focus on the many problems that had to be overcome in creating the Laboratory and designing its first big machines, as well as the evolving relations of the Laboratory with the surrounding Long Island community and news media. Throughout his talk, Crease will tell fascinating stories about Brookhaven's scientists and their research.

  4. Nanotechnology, Big things from a Tiny World: a Review

    OpenAIRE

    Debnath Bhattacharyya; Shashank Singh; Niraj Satnalika; Ankesh Khandelwal; Seung-Hwan Jeon

    2009-01-01

    The purpose of this paper is to look into the present aspects of “Nanotechnology”. The paper gives a brief description of what nanotechnology is and of its applications in various fields, viz. computing, medicine, food technology, robotics, solar cells, etc. It also deals with the future perspectives of nanotechnology and the risks involved in advanced nanotechnology.

  5. Nanotechnology: the revolution of the big future with tiny medicine.

    Science.gov (United States)

    Meetoo, Danny

    The historically unprecedented developments of nanoscience and nanotechnology (NT) promise to revolutionize the diagnosis, treatment and prevention of disease and traumatic injury, to relieve pain, and to preserve and improve human health, using molecular tools and molecular knowledge of the body. This article focuses on what is known as nanomedicine, providing a definition of NT, a historical overview of its development, and an account of its application to medicine. In revolutionizing the manufacturing process at the nanoscale, NT promises to resolve problems currently faced by the human race. However, in embracing this panacea, its implications, particularly within health care, cannot be ignored. This article, therefore, provides a thought-provoking consideration of how NT is likely to impact on nursing, together with the issues likely to be encountered during the care delivery process. Finally, some of the ethical questions nurses need to debate have been raised.

  6. Nano-remediation: tiny particles cleaning up big environmental problems

    DEFF Research Database (Denmark)

    Grieger, Khara; Hjorth, Rune; Rice, Jacelyn

    2015-01-01

    Over the past decade, there has been an increased use of engineered nanomaterials (ENMs) in everything from consumer products to industrial manufacturing. Nanomaterials are substances which are less than 100 nanometres in size (a nanometre is one billionth of a metre). Although natural nanomateri...

  7. Nanoclay minerals and plastics: tiny particles deliver big impact

    CSIR Research Space (South Africa)

    Sinah Ray, S

    2015-10-01

    Full Text Available A polymer nanocomposite is an advanced plastic material where the incorporation of nanostructures such as clay minerals and other nanoparticles into the polymer has been achieved on the nano-level so that the material exhibits improvements in colour...

  8. The Diatoms: Big Significance of Tiny Glass Houses

    Indian Academy of Sciences (India)

    Permanent link: https://www.ias.ac.in/article/fulltext/reso/020/10/0919-0930. Keywords: algae; primary production; frustule; nanotechnology; silica cell wall. Author affiliations: Aditi Kale and Balasubramanian Karthick, Biodiversity and Paleobiology Group, Agharkar Research Institute, G G Agarkar Road, Pune 411004, India ...

  9. Enabling a new Paradigm to Address Big Data and Open Science Challenges

    Science.gov (United States)

    Ramamurthy, Mohan; Fisher, Ward

    2017-04-01

    Data are not only the lifeblood of the geosciences, but they have become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies, along with concomitant advances in high-resolution modeling and ensemble and coupled-systems predictions of the Earth system, are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems like hyper-spectral satellite sensors and phased-array radars are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change, and NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on "Reproducibility or Replicability in Science" that has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges and for accelerating scientific discoveries. However, to successfully leverage the enormous potential of cloud technologies, data providers and the scientific communities will need to develop new paradigms that enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers

  10. Big Data Science Cafés: High School Students Experiencing Real Research with Scientists

    Science.gov (United States)

    Walker, C. E.; Pompea, S. M.

    2017-12-01

    The Education and Public Outreach group at the National Optical Astronomy Observatory has designed an outside-of-school education program to excite the interest of talented youth in future projects like the Large Synoptic Survey Telescope (LSST) and the NOAO (archival) Data Lab - their data approaches and key science projects. Originally funded by the LSST Corporation, the program cultivates talented youth to enter STEM disciplines and serves as a model to disseminate to the 40+ institutions involved in LSST. One Saturday a month during the academic year, high school students have the opportunity to interact with expert astronomers who work with large astronomical data sets in their scientific work. Students learn about killer asteroids, the birth and death of stars, colliding galaxies, the structure of the universe, gravitational waves, dark energy, dark matter, and more. The format for the Saturday science cafés has been a short presentation, discussion (plus food), computer lab activity and more discussion. They last about 2.5 hours and have been planned by a group of interested local high school students, an undergraduate student coordinator, the presenting astronomers, the program director and an evaluator. High school youth leaders help ensure an enjoyable and successful program for fellow students. They help their fellow students with the activities and help evaluate how well the science café went. Their remarks shape the next science café and improve the program. The experience offers youth leaders ownership of the program, opportunities to take on responsibilities and learn leadership and communication skills, as well as foster their continued interests in STEM. The prototype Big Data Science Academy was implemented successfully in the Spring 2017 and engaged almost 40 teens from greater Tucson in the fundamentals of astronomy concepts and research. As with any first implementation there were bumps. However, staff, scientists, and student leaders all

  11. PANGAEA® - Data Publisher for Earth & Environmental Science - Research data enters scholarly communication and big data analysis

    Science.gov (United States)

    Diepenbroek, Michael; Schindler, Uwe; Riedel, Morris; Huber, Robert

    2014-05-01

    The ICSU World Data Center PANGAEA is an information system for acquisition, processing, long-term storage, and publication of geo-referenced data related to earth science fields. Storing more than 350.000 data sets from all fields of geosciences, it belongs to the largest archives for observational earth science data. Standard-conform interfaces (ISO, OGC, W3C, OAI) enable access from a variety of data and information portals, among them the search engine of PANGAEA itself (www.pangaea.de) and, e.g., GBIF. All data sets in PANGAEA are citable, fully documented, and can be referenced via persistent identifiers (Digital Object Identifier - DOI) - a prerequisite for data publication. Together with other ICSU World Data Centers (www.icsu-wds.org) and the Technical Information Library in Germany (TIB), PANGAEA took part in the implementation of a DOI-based registry for scientific data, which by now is supported by a worldwide consortium of libraries (www.datacite.org). A further milestone was building up strong co-operations with science publishers such as Elsevier, Springer, Wiley, AGU, Nature and others. A common web service allows supplementary data in PANGAEA to be referenced directly from an article's abstract page (e.g. Science Direct). The next step with science publishers is to further integrate the editorial process for the publication of supplementary data with the publication procedures on the journal side. Data-centric research efforts such as environmental modelling or big data analysis approaches represent new challenges for PANGAEA. Integrated data warehouse technologies are used for highly efficient retrievals and compilations of time slices or surface data matrixes on any measurement parameters out of the whole data continuum. Further, new and emerging big data approaches are currently being investigated within PANGAEA to, e.g., evaluate their usability for quality control or data clustering.
PANGAEA is operated as a joint long term facility by MARUM at the University Bremen

  12. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energies, and low absolute temperatures are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high-accuracy analysis, experimental validation, and visualization. High performance computing support offers the possibility of running simulations at large scale, in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP) analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
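    The Monte Carlo methods the abstract lists all rest on the same sampling idea, which the classic hit-or-miss estimator illustrates. The example below (estimating pi, a stand-in rather than an actual HEP process simulation) is a minimal sketch:

    ```python
    import random

    def mc_pi(n_samples, seed=0):
        """Estimate pi by uniform sampling in the unit square (hit-or-miss MC).

        A point (x, y) lands inside the quarter-circle with probability pi/4,
        so 4 * (hits / n_samples) converges to pi as n_samples grows.
        """
        rng = random.Random(seed)
        hits = sum(1 for _ in range(n_samples)
                   if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
        return 4.0 * hits / n_samples

    estimate = mc_pi(100_000)
    ```

    The statistical error shrinks as 1/sqrt(n_samples), which is why HEP simulations of this kind become data-intensive at the precision frontier.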

  13. Discourse, Power, and Knowledge in the Management of "Big Science": The Production of Consensus in a Nuclear Fusion Research Laboratory.

    Science.gov (United States)

    Kinsella, William J.

    1999-01-01

    Extends a Foucauldian view of power/knowledge to the archetypical knowledge-intensive organization, the scientific research laboratory. Describes the discursive production of power/knowledge at the "big science" laboratory conducting nuclear fusion research and illuminates a critical incident in which the fusion research…

  14. Research in an emerging 'big science' discipline. The case of neutron scattering in Spain

    International Nuclear Information System (INIS)

    Borja Gonzalez-Albo; Maria Bordons; Pedro Gorria

    2010-01-01

    Neutron scattering (NS) is a 'big science' discipline whose research spans over a wide spectrum of fields, from fundamental or basic science to technological applications. The objective of this paper is to track the evolution of Spanish research in NS from a bibliometric perspective and to place it in the international context. Scientific publications of Spanish authors included in the Web of Science (WoS 1970-2006) are analysed with respect to five relevant dimensions: volume of research output, impact, disciplinary diversity, structural field features and internationalisation. NS emerges as a highly internationalised fast-growing field whose research is firmly rooted in Physics, Chemistry and Engineering, but with applications in a wide range of fields. International collaboration links (present in around 70% of the documents) and national links have largely contributed to mould the existing structure of research in the area, which evolves around major neutron scattering facilities abroad. The construction of a new European neutron source (ESS) would contribute to the consolidation of the field within the EU, since it will strengthen research and improve current activity. (author)

  15. Data Prospecting Framework - a new approach to explore "big data" in Earth Science

    Science.gov (United States)

    Ramachandran, R.; Rushing, J.; Lin, A.; Kuo, K.

    2012-12-01

    Due to advances in sensors, computation and storage, the cost and effort required to produce large datasets have been significantly reduced. As a result, we are seeing a proliferation of large-scale data sets being assembled in almost every science field, especially in the geosciences. Opportunities to exploit "big data" are enormous, as new hypotheses can be generated by combining and analyzing large amounts of data. However, such a data-driven approach to science discovery assumes that scientists can find and isolate relevant subsets from vast amounts of available data. Current Earth Science data systems only provide data discovery through simple metadata and keyword-based searches and are not designed to support data exploration capabilities based on the actual content. Consequently, scientists often find themselves downloading large volumes of data, struggling with large amounts of storage and learning new analysis technologies that will help them separate the wheat from the chaff. New mechanisms of data exploration are needed to help scientists discover the relevant subsets. We present data prospecting, a new content-based data analysis paradigm to support data-intensive science. Data prospecting allows researchers to explore big data when determining and isolating data subsets for further analysis. This is akin to geo-prospecting, in which mineral sites of interest are determined over the landscape through screening methods. The resulting "data prospects" only provide an interaction with and feel for the data through first-look analytics; researchers would still have to download the relevant datasets and analyze them deeply using their favorite analytical tools to determine if the datasets will yield new hypotheses. Data prospecting combines two traditional categories of data analysis, data exploration and data mining, within the discovery step. Data exploration utilizes manual/interactive methods for data analysis such as standard statistical analysis and

  16. Perspective: Materials Informatics and Big Data: Realization of the Fourth Paradigm of Science in Materials Science

    Science.gov (United States)

    2016-08-17

    A. Agrawal and A. Choudhary, APL Mater. 4, 053208 (2016). Extracted fragments include a table of data-mining techniques (e.g. Bagging [29], an ensembling method that builds multiple models on bootstrapped training-data subsets to improve model stability by reducing variance; Random subspace [30] ...) and applications in domains like business and marketing [51-53], healthcare [54-60], and climate science [61-63].
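    Bagging, mentioned in the fragment above, fits one base model per bootstrap resample of the training data and aggregates their predictions to reduce variance. A minimal sketch with decision stumps on invented 1-D data (not from the paper):

    ```python
    import random

    def fit_stump(X, y):
        """Best 1-D threshold classifier: predict sign if x >= t, else -sign."""
        best = None
        for t in X:
            for sign in (1, -1):
                preds = [sign if x >= t else -sign for x in X]
                acc = sum(p == yi for p, yi in zip(preds, y)) / len(y)
                if best is None or acc > best[0]:
                    best = (acc, t, sign)
        _, t, sign = best
        return lambda x: sign if x >= t else -sign

    def bagged_ensemble(X, y, n_models=25, seed=0):
        """Bagging: one stump per bootstrap resample, majority-vote prediction."""
        rng = random.Random(seed)
        models = []
        for _ in range(n_models):
            idx = [rng.randrange(len(X)) for _ in range(len(X))]
            models.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
        def predict(x):
            vote = sum(m(x) for m in models)
            return 1 if vote >= 0 else -1
        return predict

    # Invented, roughly separable 1-D data with labels in {-1, +1}.
    X = [0.1, 0.4, 0.9, 1.2, 3.0, 3.3, 3.9, 4.4]
    y = [-1, -1, -1, -1, 1, 1, 1, 1]
    predict = bagged_ensemble(X, y)
    ```

    Each individual stump overfits its resample's particular threshold; averaging the votes smooths these fluctuations out, which is the variance-reduction effect the table entry describes.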

  17. Evaluation of Big Data Containers for Popular Storage, Retrieval, and Computation Primitives in Earth Science Analysis

    Science.gov (United States)

    Das, K.; Clune, T.; Kuo, K. S.; Mattmann, C. A.; Huang, T.; Duffy, D.; Yang, C. P.; Habermann, T.

    2015-12-01

    Data containers are infrastructures that facilitate storage, retrieval, and analysis of data sets. Big data applications in Earth Science require a mix of processing techniques, data sources and storage formats that are supported by different data containers. Some of the most popular data containers used in Earth Science studies are Hadoop, Spark, SciDB, AsterixDB, and RasDaMan. These containers optimize different aspects of the data processing pipeline and are, therefore, suitable for different types of applications. These containers are expected to undergo rapid evolution and the ability to re-test, as they evolve, is very important to ensure the containers are up to date and ready to be deployed to handle large volumes of observational data and model output. Our goal is to develop an evaluation plan for these containers to assess their suitability for Earth Science data processing needs. We have identified a selection of test cases that are relevant to most data processing exercises in Earth Science applications and we aim to evaluate these systems for optimal performance against each of these test cases. The use cases identified as part of this study are (i) data fetching, (ii) data preparation for multivariate analysis, (iii) data normalization, (iv) distance (kernel) computation, and (v) optimization. In this study we develop a set of metrics for performance evaluation, define the specifics of governance, and test the plan on current versions of the data containers. The test plan and the design mechanism are expandable to allow repeated testing with both new containers and upgraded versions of the ones mentioned above, so that we can gauge their utility as they evolve.
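    The evaluation plan above calls for repeatable performance metrics across containers and versions. A hedged sketch of how such a metric could be measured (an assumption, not code from the study): the helper below times any container primitive, such as the data-fetching test case, and reports the median over repeated runs to dampen noise:

    ```python
    import time

    def time_primitive(fn, repeats=5):
        """Median wall-clock time (seconds) of a callable benchmark primitive.

        The median is more robust than the mean against one-off stalls
        (cache warm-up, GC pauses) when comparing containers.
        """
        samples = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            fn()
            samples.append(time.perf_counter() - t0)
        return sorted(samples)[len(samples) // 2]

    # Example: time a stand-in workload in place of a real container query.
    median_s = time_primitive(lambda: sum(range(100_000)))
    ```

    In a real harness, `fn` would wrap the container-specific call (e.g. a Spark job or SciDB query), so the same metric applies unchanged as each container evolves.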

  18. Big Data Challenges in Climate Science: Improving the Next-Generation Cyberinfrastructure

    Science.gov (United States)

    Schnase, John L.; Lee, Tsengdar J.; Mattmann, Chris A.; Lynnes, Christopher S.; Cinquini, Luca; Ramirez, Paul M.; Hart, Andre F.; Williams, Dean N.; Waliser, Duane; Rinsland, Pamela

    2016-01-01

    The knowledge we gain from research in climate science depends on the generation, dissemination, and analysis of high-quality data. This work comprises technical practice as well as social practice, both of which are distinguished by their massive scale and global reach. As a result, the amount of data involved in climate research is growing at an unprecedented rate. Climate model intercomparison (CMIP) experiments, the integration of observational data and climate reanalysis data with climate model outputs, as seen in the Obs4MIPs, Ana4MIPs, and CREATE-IP activities, and the collaborative work of the Intergovernmental Panel on Climate Change (IPCC) provide examples of the types of activities that increasingly require an improved cyberinfrastructure for dealing with large amounts of critical scientific data. This paper provides an overview of some of climate science's big data problems and the technical solutions being developed to advance data publication, climate analytics as a service, and interoperability within the Earth System Grid Federation (ESGF), the primary cyberinfrastructure currently supporting global climate research activities.

  19. Tiny Molybdenites Tell Diffusion Tales

    Science.gov (United States)

    Stein, H. J.; Hannah, J. L.

    2014-12-01

    Diffusion invokes micron-scale exchange during crystal growth and dissolution in magma chambers on short time-scales. Fundamental to interpreting such data are assumptions on magma-fluid dynamics at all scales. Nevertheless, elemental diffusion profiles are used to estimate time scales for magma storage, eruption, and recharge. An underutilized timepiece to evaluate diffusion and 3D mobility of magmatic fluids is high-precision Re-Os dating of molybdenite. With spatially unique molybdenite samples from a young ore system (e.g., 1 Ma) and a double Os spike, analytical errors of 1-3 ka unambiguously separate events in time. Re-Os ages show that hydrous shallow magma chambers locally recharge and expel Cu-Mo-Au-silica as superimposed stockwork vein networks at time scales less than a few thousand years [1]. Re-Os ages provide diffusion rates controlled by a dynamic crystal mush, accumulation and expulsion of metalliferous fluid, and magma reorganization after explosive crystallization events. Importantly, this approach has broad application far from ore deposits. Here, we use Re-Os dating of molybdenite to assess time scales for generating and diffusing metals through the deep crust. To maximize opportunity for chemical diffusion, we use a continental-scale Sveconorwegian mylonite zone for the study area. A geologically constrained suite of molybdenite samples was acquired from quarry exposures. Molybdenite, previously unreported, is extremely scarce. Tiny but telling molybdenites include samples from like occurrences to assure geologic accuracy in Re-Os ages. Ages range from mid-Mesoproterozoic to mid-Neoproterozoic, and correspond to early metamorphic dehydration of a regionally widespread biotite-rich gneiss, localized melting of gneiss to form cm-m-scale K-feldspar ± quartz pods, development of vapor-rich, vuggy mm stringers that serve as volatile collection surfaces in felsic leucosomes, and low-angle (relative to foliation) cross-cutting cm-scale quartz veins
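    The Re-Os molybdenite chronometer used throughout the abstract rests on the decay of 187Re to 187Os; since molybdenite incorporates essentially no initial Os, a model age follows from t = ln(187Os/187Re + 1)/lambda. A small sketch with the commonly used 187Re decay constant (the amounts in the round-trip check are illustrative, not the paper's data):

    ```python
    import math

    # Decay constant of 187Re in 1/yr (Smoliar et al., 1996).
    LAMBDA_RE187 = 1.666e-11

    def re_os_age(os187, re187):
        """Re-Os model age in years from radiogenic 187Os and parent 187Re.

        Assumes all 187Os is radiogenic (valid for molybdenite, which takes
        in essentially no common Os at crystallization). Inputs must be in
        the same units (e.g. mol or atoms per gram).
        """
        return math.log(os187 / re187 + 1.0) / LAMBDA_RE187

    # Round-trip check: synthesize the 187Os expected after 1 Myr of decay.
    t_true = 1.0e6
    re187 = 2.5
    os187 = re187 * (math.exp(LAMBDA_RE187 * t_true) - 1.0)
    recovered = re_os_age(os187, re187)
    ```

    At 1 Ma the ingrowth term is tiny, which is why the 1-3 ka precision the abstract cites requires the double Os spike and very clean chemistry rather than any change to this simple age equation.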

  20. Big Data and Regional Science: Opportunities, Challenges, and Directions for Future Research

    OpenAIRE

    Schintler, Laurie A.; Fischer, Manfred M.

    2018-01-01

    Recent technological, social, and economic trends and transformations are contributing to the production of what is usually referred to as Big Data. Big Data, which is typically defined by four dimensions -- Volume, Velocity, Veracity, and Variety -- changes the methods and tactics for using, analyzing, and interpreting data, requiring new approaches for data provenance, data processing, data analysis and modeling, and knowledge representation. The use and analysis of Big Data involves severa...

  1. John C. Mather, the Big Bang, and the COBE

    Science.gov (United States)

    Resources on John C. Mather, the Big Bang, and the COBE mission, with collaborative work on understanding the Big Bang, supporting Big Bang theory and showing that the Big Bang was complete in the first instants, with only a tiny fraction […]. Mather and Smoot analyzed data from NASA's Cosmic Background

  2. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys, using a Schmidt telescope with an objective prism, produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional, challenging requirements on data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  3. Linking Big and Small Data Across the Social, Engineering, and Earth Sciences

    Science.gov (United States)

    Chen, R. S.; de Sherbinin, A. M.; Levy, M. A.; Downs, R. R.

    2014-12-01

    The challenges of sustainable development cut across the social, health, ecological, engineering, and Earth sciences, across a wide range of spatial and temporal scales, and across the spectrum from basic to applied research and decision making. The rapidly increasing availability of data and information in digital form from a variety of data repositories, networks, and other sources provides new opportunities to link and integrate both traditional data holdings as well as emerging "big data" resources in ways that enable interdisciplinary research and facilitate the use of objective scientific data and information in society. Taking advantage of these opportunities not only requires improved technical and scientific data interoperability across disciplines, scales, and data types, but also concerted efforts to bridge gaps and barriers between key communities, institutions, and networks. Given the long time perspectives required in planning sustainable approaches to development, it is also imperative to address user requirements for long-term data continuity and stewardship by trustworthy repositories. We report here on lessons learned by CIESIN working on a range of sustainable development issues to integrate data across multiple repositories and networks. This includes CIESIN's roles in developing policy-relevant climate and environmental indicators, soil data for African agriculture, and exposure and risk measures for hazards, disease, and conflict, as well as CIESIN's participation in a range of national and international initiatives related both to sustainable development and to open data access, interoperability, and stewardship.

  4. The SAMI Galaxy Survey: A prototype data archive for Big Science exploration

    Science.gov (United States)

    Konstantopoulos, I. S.; Green, A. W.; Foster, C.; Scott, N.; Allen, J. T.; Fogarty, L. M. R.; Lorente, N. P. F.; Sweet, S. M.; Hopkins, A. M.; Bland-Hawthorn, J.; Bryant, J. J.; Croom, S. M.; Goodwin, M.; Lawrence, J. S.; Owers, M. S.; Richards, S. N.

    2015-11-01

    We describe the data archive and database for the SAMI Galaxy Survey, an ongoing observational program that will cover ≈3400 galaxies with integral-field (spatially-resolved) spectroscopy. Amounting to some three million spectra, this is the largest sample of its kind to date. The data archive and built-in query engine use the versatile Hierarchical Data Format (HDF5), which precludes the need for external metadata tables and hence the setup and maintenance overhead those carry. The code produces simple outputs that can easily be translated to plots and tables, and the combination of these tools makes for a light system that can handle heavy data. This article acts as a contextual companion to the SAMI Survey Database source code repository, samiDB, which is freely available online and written entirely in Python. We also discuss the decisions related to the selection of tools and the creation of data visualisation modules. It is our aim that the work presented in this article-descriptions, rationale, and source code-will be of use to scientists looking to set up a maintenance-light data archive for a Big Science data load.
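    The HDF5 design choice described above can be illustrated in a few lines: metadata rides along as attributes on the datasets themselves, so no external metadata tables are needed. This is a minimal sketch of the idea only, not samiDB's actual schema; the group layout, attribute names, and values here are hypothetical.

```python
import h5py

# Store one (toy) spectrum with its metadata as HDF5 attributes.
with h5py.File("sami_demo.h5", "w") as f:
    gal = f.create_group("galaxies/GAMA_12345")
    spec = gal.create_dataset("blue_spectrum", data=[0.8, 1.1, 0.9, 1.3])
    # Metadata lives alongside the data, inside the same file.
    spec.attrs["redshift"] = 0.05
    spec.attrs["instrument"] = "SAMI"

def find_low_z(path, zmax):
    """Toy 'query engine': walk the file and filter datasets on an attribute."""
    hits = []
    def visit(name, obj):
        if isinstance(obj, h5py.Dataset) and obj.attrs.get("redshift", float("inf")) < zmax:
            hits.append(name)
    with h5py.File(path, "r") as f:
        f.visititems(visit)
    return hits

print(find_low_z("sami_demo.h5", 0.1))  # ['galaxies/GAMA_12345/blue_spectrum']
```

    Because queries are just traversals of the file's own hierarchy, the archive stays "maintenance-light": there is no separate database server or metadata table to keep in sync with the data.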

  5. Automated protocols for spaceborne sub-meter resolution "Big Data" products for Earth Science

    Science.gov (United States)

    Neigh, C. S. R.; Carroll, M.; Montesano, P.; Slayback, D. A.; Wooten, M.; Lyapustin, A.; Shean, D. E.; Alexandrov, O.; Macander, M. J.; Tucker, C. J.

    2017-12-01

    The volume of available remotely sensed data has grown to exceed petabytes per year, while the costs of storage systems and compute power have dropped exponentially. This has opened the door for "Big Data" processing systems with high-end computing (HEC) such as the Google Earth Engine, NASA Earth Exchange (NEX), and NASA Center for Climate Simulation (NCCS). At the same time, commercial very high-resolution (VHR) satellites have grown into a constellation with global repeat coverage that can support existing NASA Earth observing missions with stereo and super-spectral capabilities. Through agreements with the National Geospatial-Intelligence Agency, NASA Goddard Space Flight Center is acquiring petabytes of global sub-meter to 4-meter resolution imagery from the WorldView-1/2/3, QuickBird-2, GeoEye-1, and IKONOS-2 satellites. These data come at no direct cost and are valuable for enhancing Earth observation research that supports US government interests. We are currently developing automated protocols for generating VHR products to support NASA's Earth observing missions. These include two primary foci: 1) on-demand VHR 1/2° ortho mosaics - process VHR to surface reflectance, orthorectify and co-register multi-temporal 2 m multispectral imagery compiled as user-defined regional mosaics. This will provide an easy-access dataset to investigate biodiversity, tree canopy closure, surface water fraction, and cropped area for smallholder agriculture; and 2) on-demand VHR digital elevation models (DEMs) - process stereo VHR to extract VHR DEMs with the NASA Ames Stereo Pipeline. This will benefit Earth surface studies of the cryosphere (glacier mass balance, flow rates, and snow depth), hydrology (lake/water-body levels, landslides, subsidence), and biosphere (forest structure, canopy height/cover), among others. Recent examples of products used in NASA Earth Science projects will be provided. This HEC API could foster surmounting prior spatial-temporal limitations while

  6. Managing globally distributed expertise with new competence management solutions a big-science collaboration as a pilot case.

    CERN Document Server

    Ferguson, J; Livan, M; Nordberg, M; Salmia, T; Vuola, O

    2003-01-01

    In today's global organisations and networks, a critical factor for effective innovation and project execution is appropriate competence and skills management. The challenges include selection of strategic competences, competence development, and leveraging the competences and skills to drive innovation and collaboration for shared goals. This paper presents a new industrial web-enabled competence management and networking solution and its implementation and piloting in a complex big-science environment of globally distributed competences.

  7. Managing globally distributed expertise with new competence management solutions: a big-science collaboration as a pilot case.

    OpenAIRE

    Ferguson, J; Koivula, T; Livan, M; Nordberg, M; Salmia, T; Vuola, O

    2003-01-01

    In today's global organisations and networks, a critical factor for effective innovation and project execution is appropriate competence and skills management. The challenges include selection of strategic competences, competence development, and leveraging the competences and skills to drive innovation and collaboration for shared goals. This paper presents a new industrial web-enabled competence management and networking solution and its implementation and piloting in a complex big-science ...

  8. Analysis of Big Data technologies for use in agro-environmental science

    NARCIS (Netherlands)

    Lokers, Rob; Knapen, Rob; Janssen, Sander; Randen, van Yke; Jansen, Jacques

    2016-01-01

    Recent developments like the movements of open access and open data and the unprecedented growth of data, which has come forward as Big Data, have shifted focus to methods to effectively handle such data for use in agro-environmental research. Big Data technologies, together with the increased

  9. The Big Challenge in Big Earth Science Data: Maturing to Transdisciplinary Data Platforms that are Relevant to Government, Research and Industry

    Science.gov (United States)

    Wyborn, Lesley; Evans, Ben

    2016-04-01

    Collecting data for the Earth Sciences has a particularly long history going back centuries. Initially, scientific data came only from simple human observations recorded by pen on paper. Scientific instruments soon supplemented data capture, and as these instruments became more capable (e.g., automation, more information captured, generation of digitally-born outputs), Earth Scientists entered the 'Big Data' era, where data progressively became too big to store and process locally in the old-style vaults. To date, most funding initiatives for collection and storage of large-volume data sets in the Earth Sciences have been specialised within a single discipline (e.g., climate, geophysics, and Earth Observation) or specific to an individual institution. To undertake interdisciplinary research, it is hard for users to integrate data from these individual repositories, mainly due to limitations on physical access to/movement of the data, and/or data being organised without enough information to make sense of it without discipline-specialised knowledge. Smaller repositories have also gradually been seen as inefficient in terms of the cost to manage and access (including scarce skills) and the effective implementation of new technology and techniques. Within the last decade, the trend is towards fewer and larger data repositories that are increasingly collocated with HPC/cloud resources. There has also been a growing recognition that digital data can be a valuable resource that can be reused and repurposed - publicly funded data from either the academic or government sector is seen as a shared resource, and efficiencies can be gained by co-location. These new, highly capable, 'transdisciplinary' data repositories are emerging as a fundamental 'infrastructure' both for research and other innovation. 
The sharing of academic and government data resources on the same infrastructures is enabling new research programmes that will enable integration beyond the traditional physical

  10. The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery.

    Science.gov (United States)

    Swan, Melanie

    2013-06-01

    A key contemporary trend emerging in big data science is the quantified self (QS)-individuals engaged in the self-tracking of any kind of biological, physical, behavioral, or environmental information as n=1 individuals or in groups. There are opportunities for big data scientists to develop new models to support QS data collection, integration, and analysis, and also to lead in defining open-access database resources and privacy standards for how personal data is used. Next-generation QS applications could include tools for rendering QS data meaningful in behavior change, establishing baselines and variability in objective metrics, applying new kinds of pattern recognition techniques, and aggregating multiple self-tracking data streams from wearable electronics, biosensors, mobile phones, genomic data, and cloud-based services. The long-term vision of QS activity is that of a systemic monitoring approach where an individual's continuous personal information climate provides real-time performance optimization suggestions. There are some potential limitations related to QS activity-barriers to widespread adoption and a critique regarding scientific soundness-but these may be overcome. One interesting aspect of QS activity is that it is fundamentally a quantitative and qualitative phenomenon since it includes both the collection of objective metrics data and the subjective experience of the impact of these data. Some of this dynamic is being explored as the quantified self is becoming the qualified self in two new ways: by applying QS methods to the tracking of qualitative phenomena such as mood, and by understanding that QS data collection is just the first step in creating qualitative feedback loops for behavior change. In the long-term future, the quantified self may become additionally transformed into the extended exoself as data quantification and self-tracking enable the development of new sense capabilities that are not possible with ordinary senses. The

  11. Opening the Black Box: Understanding the Science Behind Big Data and Predictive Analytics.

    Science.gov (United States)

    Hofer, Ira S; Halperin, Eran; Cannesson, Maxime

    2018-05-25

    Big data, smart data, predictive analytics, and other similar terms are ubiquitous in the lay and scientific literature. However, despite the frequency of usage, these terms are often poorly understood, and evidence of their disruption to clinical care is hard to find. This article aims to address these issues by first defining and elucidating the term big data, exploring the ways in which modern medical data, both inside and outside the electronic medical record, meet the established definitions of big data. We then define the term smart data and discuss the transformations necessary to make big data into smart data. Finally, we examine the ways in which this transition from big to smart data will affect what we do in research, retrospective work, and ultimately patient care.

  12. Research funding. Big names or big ideas: do peer-review panels select the best science proposals?

    Science.gov (United States)

    Li, Danielle; Agha, Leila

    2015-04-24

    This paper examines the success of peer-review panels in predicting the future quality of proposed research. We construct new data to track publication, citation, and patenting outcomes associated with more than 130,000 research project (R01) grants funded by the U.S. National Institutes of Health from 1980 to 2008. We find that better peer-review scores are consistently associated with better research outcomes and that this relationship persists even when we include detailed controls for an investigator's publication history, grant history, institutional affiliations, career stage, and degree types. A one-standard deviation worse peer-review score among awarded grants is associated with 15% fewer citations, 7% fewer publications, 19% fewer high-impact publications, and 14% fewer follow-on patents. Copyright © 2015, American Association for the Advancement of Science.
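    The reported effect sizes have a natural reading under a log-linear model of outcomes on standardized peer-review scores. The arithmetic below is only an illustration of that reading (the 2-SD extrapolation is hypothetical), not a re-analysis of the paper's data:

```python
import math

# If log E[citations] = a + b * score (score standardized, higher = worse),
# then "15% fewer citations per 1 SD worse score" pins down the slope b:
b = math.log(1 - 0.15)
print(round(b, 4))  # -0.1625

# The implied gap compounds multiplicatively over larger score differences,
# e.g. a hypothetical 2-SD difference: 0.85**2 - 1 = -27.75%.
print(round((math.exp(2 * b) - 1) * 100, 2))  # -27.75
```

    The same conversion applies to the publication and patent figures, which is why percentage effects rather than raw counts are the natural summary here.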

  13. Principales parámetros para el estudio de la colaboración científica en Big Science

    Directory of Open Access Journals (Sweden)

    Ortoll, Eva

    2014-12-01

    Full Text Available In several scientific disciplines, research has shifted from small-scale experiments to large and complex collaborations. Many recent scientific achievements, like the sequencing of the human genome or the discovery of the Higgs boson, have taken place within the "big science" paradigm. The study of scientific collaboration needs to take into account all the diverse factors that have an influence on it. In the case of big science experiments, some of those aspects are particularly important: the number of institutions involved, cultural differences, the diversity of spaces and infrastructures, or the conceptualization of research problems. By considering these specific factors we present a set of parameters for the analysis of scientific collaboration in big science projects. The utility of these parameters is illustrated through a comparative study of two large big science projects: the ATLAS experiment and the Human Genome Project.

  14. Molecular pathological epidemiology: new developing frontiers of big data science to study etiologies and pathogenesis.

    Science.gov (United States)

    Hamada, Tsuyoshi; Keum, NaNa; Nishihara, Reiko; Ogino, Shuji

    2017-03-01

    Molecular pathological epidemiology (MPE) is an integrative field that utilizes molecular pathology to incorporate interpersonal heterogeneity of a disease process into epidemiology. In each individual, the development and progression of a disease are determined by a unique combination of exogenous and endogenous factors, resulting in different molecular and pathological subtypes of the disease. Based on "the unique disease principle," the primary aim of MPE is to uncover an interactive relationship between a specific environmental exposure and disease subtypes in determining disease incidence and mortality. This MPE approach can provide etiologic and pathogenic insights, potentially contributing to precision medicine for personalized prevention and treatment. Although breast, prostate, lung, and colorectal cancers have been among the most commonly studied diseases, the MPE approach can be used to study any disease. In addition to molecular features, host immune status and microbiome profile likely affect a disease process, and thus serve as informative biomarkers. As such, further integration of several disciplines into MPE has been achieved (e.g., pharmaco-MPE, immuno-MPE, and microbial MPE), to provide novel insights into underlying etiologic mechanisms. With the advent of high-throughput sequencing technologies, available genomic and epigenomic data have expanded dramatically. The MPE approach can also provide a specific risk estimate for each disease subgroup, thereby enhancing the impact of genome-wide association studies on public health. In this article, we present recent progress of MPE, and discuss the importance of accounting for the disease heterogeneity in the era of big-data health science and precision medicine.

  15. ldentifying Episodes of Earth Science Phenomena Using a Big-Data Technology

    Science.gov (United States)

    Kuo, Kwo-Sen; Oloso, Amidu; Rushing, John; Lin, Amy; Fekete, Gyorgy; Ramachandran, Rahul; Clune, Thomas; Dunny, Daniel

    2014-01-01

    's intricate dynamics, we are continuously discovering novel ES phenomena. We generally gain understanding of a given phenomenon by observing and studying individual events. This process usually begins by identifying the occurrences of these events. Once representative events are identified or found, we must locate associated observed or simulated data prior to commencing analysis and concerted studies of the phenomenon. Knowledge concerning the phenomenon can accumulate only after analysis has started. However, as mentioned previously, comprehensive records only exist for a very limited set of high-impact phenomena; aside from these, finding events and locating associated data currently may take a prohibitive amount of time and effort on the part of an individual investigator. The lack of comprehensive records for most ES phenomena is mainly due to the perception that they do not pose an immediate and/or severe threat to life and property. Thus they are not consistently tracked, monitored, and catalogued. Many phenomena even lack precise and/or commonly accepted criteria for definitions. Moreover, various Earth Science observations and data have accumulated to a previously unfathomable volume; NASA Earth Observing System Data Information System (EOSDIS) alone archives several petabytes (PB) of satellite remote sensing data, which are steadily increasing. All of these factors contribute to the difficulty of methodically identifying events corresponding to a given phenomenon and significantly impede systematic investigations. We have not only envisioned AES as an environment for identifying custom-defined events but also aspired for it to be an interactive environment with quick turnaround time for revisions of query criteria and results, as well as a collaborative environment where geographically distributed experts may work together on the same phenomena. A Big Data technology is thus required for the realization of such a system. 
In the following, we first

  16. Physicists tackle questions of tiny dimensions

    CERN Multimedia

    Moran, Barbara

    2003-01-01

    Today's physicists have a dilemma: they are using two separate theories to describe the universe. General relativity, which describes gravity, works for large objects like planets. Quantum mechanics, which involves the other forces, works for tiny objects like atoms. Unfortunately, the two theories don't match up.

  17. A 'tiny-orange' spectrometer for electrons

    International Nuclear Information System (INIS)

    Silva, N.C. da.

    1990-01-01

    A tiny-orange electron spectrometer was designed and constructed using flat permanent magnets and a surface barrier detector. The transmission functions of different system configurations were determined for energies in the 200-1100 keV range. A mathematical model for the system was developed. (L.C.J.A.)

  18. Leros: A Tiny Microcontroller for FPGAs

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2011-01-01

    Leros is a tiny microcontroller that is optimized for current low-cost FPGAs. Leros is designed with a balanced relation between logic and on-chip memory. The design goal is a microcontroller that can be clocked at about half the speed of a pipelined on-chip memory and that consumes fewer than 300 logic cells...

  19. A Future Accelerated Cognitive Distributed Hybrid Testbed for Big Data Science Analytics

    Science.gov (United States)

    Halem, M.; Prathapan, S.; Golpayegani, N.; Huang, Y.; Blattner, T.; Dorband, J. E.

    2016-12-01

    As increased sensor spectral data volumes from current and future Earth Observing satellites are assimilated into high-resolution climate models, intensive cognitive machine learning technologies are needed to data mine, extract and intercompare model outputs. It is clear today that the next generation of computers and storage, beyond petascale cluster architectures, will be data centric. They will manage data movement and process data in place. Future cluster nodes have been announced that integrate multiple CPUs with high-speed links to GPUs and MICS on their backplanes with massive non-volatile RAM and access to active flash RAM disk storage. Active Ethernet connected key value store disk storage drives with 10Ge or higher are now available through the Kinetic Open Storage Alliance. At the UMBC Center for Hybrid Multicore Productivity Research, a future state-of-the-art Accelerated Cognitive Computer System (ACCS) for Big Data science is being integrated into the current IBM iDataplex computational system `bluewave'. Based on the next gen IBM 200 PF Sierra processor, an interim two node IBM Power S822 testbed is being integrated with dual Power 8 processors with 10 cores, 1TB Ram, a PCIe to a K80 GPU and an FPGA Coherent Accelerated Processor Interface card to 20TB Flash Ram. This system is to be updated to the Power 8+, an NVlink 1.0 with the Pascal GPU late in 2016. Moreover, the Seagate 96TB Kinetic Disk system with 24 Ethernet connected active disks is integrated into the ACCS storage system. A Lightweight Virtual File System developed at the NASA GSFC is installed on bluewave. Since remote access to publicly available quantum annealing computers is available at several govt labs, the ACCS will offer an in-line Restricted Boltzmann Machine optimization capability to the D-Wave 2X quantum annealing processor over the campus high speed 100 Gb network to Internet 2 for large files. As an evaluation test of the cognitive functionality of the architecture, the

  20. The phytotronist and the phenotype: plant physiology, Big Science, and a Cold War biology of the whole plant.

    Science.gov (United States)

    Munns, David P D

    2015-04-01

    This paper describes how, from the early twentieth century, and especially in the early Cold War era, the plant physiologists considered their discipline ideally suited among all the plant sciences to study and explain biological functions and processes, and ranked their discipline among the dominant forms of the biological sciences. At their apex in the late 1960s, the plant physiologists laid claim to having discovered nothing less than the "basic laws of physiology." This paper unwraps that claim, showing that it emerged from the construction of monumental big science laboratories known as phytotrons that gave control over the growing environment. Control meant that plant physiologists claimed to be able to produce a standard phenotype valid for experimental biology. Invoking the standards of the physical sciences, the plant physiologists heralded basic biological science from the phytotron-produced phenotype. In the context of the Cold War era, the ability to pursue basic science represented the highest pinnacle of standing within the scientific community. More broadly, I suggest that only by recovering the history of an underappreciated discipline, plant physiology, and by establishing the centrality of the plant sciences in the history of biology can historians understand the massive changes wrought to biology by the conceptual emergence of the molecular understanding of life, the dominance of the discipline of molecular biology, and the rise of biotechnology in the 1980s. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. 5th Annual Pan-European Science and Big Physics Symposium on March 5th, 2012, Zurich, Switzerland

    CERN Multimedia

    Balle, Ch

    2012-01-01

    The 5th Annual Pan-European Science and Big Physics Symposium on March 5th is a technical workshop that covers topics in the areas of control, measurement and diagnostics for accelerators, cyclotrons, tokamaks and telescopes. The symposium brings together over 60 scientists and engineers from major research labs around the world such as CERN, PSI, INFN, NPL, ESRF and other research institutions. Attend this event to share ideas and results and to learn from the presentations of your peers from different labs and experiments worldwide.

  2. Vectors into the Future of Mass and Interpersonal Communication Research: Big Data, Social Media, and Computational Social Science.

    Science.gov (United States)

    Cappella, Joseph N

    2017-10-01

    Simultaneous developments in big data, social media, and computational social science have set the stage for how we think about and understand interpersonal and mass communication. This article explores some of the ways that these developments generate 4 hypothetical "vectors" - directions - into the next generation of communication research. These vectors include developments in network analysis, modeling interpersonal and social influence, recommendation systems, and the blurring of distinctions between interpersonal and mass audiences through narrowcasting and broadcasting. The methods and research in these arenas are occurring in areas outside the typical boundaries of the communication discipline but engage classic, substantive questions in mass and interpersonal communication.

  3. "Air Toxics under the Big Sky": Examining the Effectiveness of Authentic Scientific Research on High School Students' Science Skills and Interest

    Science.gov (United States)

    Ward, Tony J.; Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij

    2016-01-01

    "Air Toxics Under the Big Sky" is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. This research explored: (1)…

  4. Examining the Big-Fish-Little-Pond Effect on Students' Self-Concept of Learning Science in Taiwan Based on the TIMSS Databases

    Science.gov (United States)

    Liou, Pey-Yan

    2014-01-01

    The purpose of this study is to examine the relationship between student self-concept and achievement in science in Taiwan based on the big-fish-little-pond effect (BFLPE) model using the Trends in International Mathematics and Science Study (TIMSS) 2003 and 2007 databases. Hierarchical linear modeling was used to examine the effects of the…

  5. Crowd-funded micro-grants for genomics and "big data": an actionable idea connecting small (artisan) science, infrastructure science, and citizen philanthropy.

    Science.gov (United States)

    Özdemir, Vural; Badr, Kamal F; Dove, Edward S; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N; Sabra, Ramzi; Sarkissian, Christineh N; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K; Kickbusch, Ilona

    2013-04-01

    Biomedical science in the 21st century is embedded in, and draws from, a digital commons and "Big Data" created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., "the lone genius" or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists-only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the "bottom one billion"-the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. 
Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while sharing similar disease

  6. Principales parámetros para el estudio de la colaboración científica en Big Science

    OpenAIRE

    Ortoll, Eva; Canals, Agustí; Garcia, Montserrat; Cobarsí, Josep

    2014-01-01

    In several scientific disciplines research has shifted from experiments of a reduced scale to large and complex collaborations. Many recent scientific achievements like the human genome sequencing or the discovery of the Higgs boson have taken place within the “big science” paradigm. The study of scientific collaboration needs to take into account all the diverse factors that have an influence on it. In the case of big science experiments, some of those aspects are particularly important: num...

  7. The scientific production on data quality in big data: a study in the Web of Science database

    Directory of Open Access Journals (Sweden)

    Priscila Basto Fagundes

    2017-11-01

Full Text Available More and more, the big data theme has attracted the interest of researchers from different areas of knowledge, among them information scientists, who need to understand its concepts and applications in order to contribute new proposals for managing the information generated from the data stored in these environments. The objective of this article is to present a survey of publications about data quality in big data in the Web of Science database up to the year 2016. We present the total number of publications indexed in the database, the number of publications per year, the geographical origin of the research, and a synthesis of the studies found. The survey of the database was conducted in July 2017 and resulted in a total of 23 publications. To summarize the publications in this article, we searched the Internet for their full texts and read those that were available. From this survey it was possible to conclude that studies on data quality in big data began to be published in 2013, and that most present literature reviews and few effective proposals for monitoring and managing data quality in environments with large volumes of data. This survey is therefore intended to contribute to and foster new research on data quality in big data environments.

  8. Advances in developing TiNi nanoparticles

    International Nuclear Information System (INIS)

    Castro, A. Torres; Cuellar, E. Lopez; Mendez, U. Ortiz; Yacaman, M. Jose

    2006-01-01

The elaboration of nanoparticles has become a field of great interest for many scientists. Nanoparticles possess properties different from those shown by bulk materials. Shape memory alloys have the exceptional ability to recover their original shape by simple heating after being 'plastically' deformed. When this process occurs, important changes in properties, such as mechanical and electrical ones, develop in the bulk material. If it is possible to obtain nanoparticles with shape memory effects, these nanoparticles could be used in the elaboration of nanofluids with the ability to change their electrical and thermal conductivity with temperature changes, i.e., smart nanofluids. In this work, some recent results and discussion of TiNi nanoparticles obtained by ion beam milling directly from a TiNi wire with shape memory are presented. The nanoparticles obtained by this process are about 2 nm in diameter with a composition of Ti-41.0 at.% Ni. Nanoparticles synthesized by this method have an ordered structure

  9. Tiny Devices Project Sharp, Colorful Images

    Science.gov (United States)

    2009-01-01

    Displaytech Inc., based in Longmont, Colorado and recently acquired by Micron Technology Inc. of Boise, Idaho, first received a Small Business Innovation Research contract in 1993 from Johnson Space Center to develop tiny, electronic, color displays, called microdisplays. Displaytech has since sold over 20 million microdisplays and was ranked one of the fastest growing technology companies by Deloitte and Touche in 2005. Customers currently incorporate the microdisplays in tiny pico-projectors, which weigh only a few ounces and attach to media players, cell phones, and other devices. The projectors can convert a digital image from the typical postage stamp size into a bright, clear, four-foot projection. The company believes sales of this type of pico-projector may exceed $1.1 billion within 5 years.

  10. Enhancing Teachers' Awareness About Relations Between Science and Religion. The Debate Between Steady State and Big Bang Theories

    Science.gov (United States)

    Bagdonas, Alexandre; Silva, Cibelle Celestino

    2015-11-01

    Educators advocate that science education can help the development of more responsible worldviews when students learn not only scientific concepts, but also about science, or "nature of science". Cosmology can help the formation of worldviews because this topic is embedded in socio-cultural and religious issues. Indeed, during the Cold War period, the cosmological controversy between Big Bang and Steady State theory was tied up with political and religious arguments. The present paper discusses a didactic sequence developed for and applied in a pre-service science teacher-training course on history of science. After studying the historical case, pre-service science teachers discussed how to deal with possible conflicts between scientific views and students' personal worldviews related to religion. The course focused on the study of primary and secondary sources about cosmology and religion written by cosmologists such as Georges Lemaître, Fred Hoyle and the Pope Pius XII. We used didactic strategies such as short seminars given by groups of pre-service teachers, videos, computer simulations, role-play, debates and preparation of written essays. Along the course, most pre-service teachers emphasized differences between science and religion and pointed out that they do not feel prepared to conduct classroom discussions about this topic. Discussing the relations between science and religion using the history of cosmology turned into an effective way to teach not only science concepts but also to stimulate reflections about nature of science. This topic may contribute to increasing students' critical stance on controversial issues, without the need to explicitly defend certain positions, or disapprove students' cultural traditions. Moreover, pre-service teachers practiced didactic strategies to deal with this kind of unusual content.

  11. From tiny microalgae to huge biorefineries

    OpenAIRE

    Gouveia, L.

    2014-01-01

Microalgae are an emerging research field due to their high potential as a source of several biofuels, in addition to the fact that they have a high nutritional value and contain compounds with health benefits. They are also widely used for water stream bioremediation and carbon dioxide mitigation. Therefore, the tiny microalgae could provide a huge source of compounds and products, giving a good example of a real biorefinery approach. This work shows and presents examples of experimental...

  12. Betsy Pugel, Tiny houses: Planetary protection-focused materials selection for spaceflight hardware surfaces

    OpenAIRE

    Schriml, Lynn

    2017-01-01

Betsy Pugel, National Aeronautics and Space Administration. Tiny houses: Planetary protection-focused materials selection for spaceflight hardware surfaces. On October 10-12th, 2017, the Alfred P. Sloan Foundation and The National Academies of Sciences, Engineering and Medicine co-hosted MoBE 2017 (Microbiology of the Built Environment Research and Applications Symposium) at the National Academy of Sciences Building to present the current state-of-the-science in understanding the formation and ...

  13. Towards efficient data exchange and sharing for big-data driven materials science: metadata and data formats

    Science.gov (United States)

    Ghiringhelli, Luca M.; Carbogno, Christian; Levchenko, Sergey; Mohamed, Fawzi; Huhs, Georg; Lüders, Martin; Oliveira, Micael; Scheffler, Matthias

    2017-11-01

With big-data driven materials research, the new paradigm of materials science, sharing and wide accessibility of data are becoming crucial aspects. Obviously, a prerequisite for data exchange and big-data analytics is standardization, which means using consistent and unique conventions for, e.g., units, zero base lines, and file formats. There are two main strategies to achieve this goal. One accepts the heterogeneous nature of the community, which comprises scientists from physics, chemistry, bio-physics, and materials science, by complying with the diverse ecosystem of computer codes and thus develops "converters" for the input and output files of all important codes. These converters then translate the data of each code into a standardized, code-independent format. The other strategy is to provide standardized open libraries that code developers can adopt for shaping their inputs, outputs, and restart files, directly into the same code-independent format. In this perspective paper, we present both strategies and argue that they can and should be regarded as complementary, if not even synergetic. The proposed format and conventions were agreed upon by two teams, the Electronic Structure Library (ESL) of the European Center for Atomic and Molecular Computations (CECAM) and the NOvel MAterials Discovery (NOMAD) Laboratory, a European Centre of Excellence (CoE). A key element of this work is the definition of hierarchical metadata describing state-of-the-art electronic-structure calculations.

  14. Lowering the barriers for accessing distributed geospatial big data to advance spatial data science: the PolarHub solution

    Science.gov (United States)

    Li, W.

    2017-12-01

    Data is the crux of science. The widespread availability of big data today is of particular importance for fostering new forms of geospatial innovation. This paper reports a state-of-the-art solution that addresses a key cyberinfrastructure research problem—providing ready access to big, distributed geospatial data resources on the Web. We first formulate this data-access problem and introduce its indispensable elements, including identifying the cyber-location, space and time coverage, theme, and quality of the dataset. We then propose strategies to tackle each data-access issue and make the data more discoverable and usable for geospatial data users and decision makers. Among these strategies is large-scale web crawling as a key technique to support automatic collection of online geospatial data that are highly distributed, intrinsically heterogeneous, and known to be dynamic. To better understand the content and scientific meanings of the data, methods including space-time filtering, ontology-based thematic classification, and service quality evaluation are incorporated. To serve a broad scientific user community, these techniques are integrated into an operational data crawling system, PolarHub, which is also an important cyberinfrastructure building block to support effective data discovery. A series of experiments were conducted to demonstrate the outstanding performance of the PolarHub system. We expect this work to contribute significantly in building the theoretical and methodological foundation for data-driven geography and the emerging spatial data science.
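The space-time filtering step described in this abstract can be sketched as follows; the `Record` fields and function names below are illustrative assumptions, not PolarHub's actual data model or API. A crawled dataset record is kept only if its bounding box and time range intersect the query:

```python
from dataclasses import dataclass

# Hypothetical record structure for a crawled geospatial dataset;
# the fields are illustrative, not PolarHub's actual schema.
@dataclass
class Record:
    west: float
    south: float
    east: float
    north: float
    start_year: int
    end_year: int

def intersects(r: Record, bbox, years) -> bool:
    """True if the record overlaps both the query bounding box and year range."""
    west, south, east, north = bbox
    first, last = years
    spatial = r.west <= east and r.east >= west and r.south <= north and r.north >= south
    temporal = r.start_year <= last and r.end_year >= first
    return spatial and temporal

records = [
    Record(-180, 60, 180, 90, 2000, 2010),   # pan-Arctic coverage, 2000s
    Record(-70, -40, -50, -20, 1990, 1995),  # South America, 1990s
]
# Query: Arctic region (above 66 N), years 2005-2015.
arctic = [r for r in records if intersects(r, (-180, 66, 180, 90), (2005, 2015))]
print(len(arctic))  # 1
```

A production crawler would combine this spatial-temporal test with the thematic (ontology-based) and quality filters the abstract mentions.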

  15. Limitations of constitutive relations for TiNi shape memory alloys

    International Nuclear Information System (INIS)

    Tang, W.; Sandstroem, R.

    1995-01-01

Phase transformation tensor Ω in the constitutive equation proposed by Tanaka has been evaluated by employing experimental data for TiNi alloys in a constrained recovery process. It demonstrates that the absolute value of Ω for the constrained recovery process is typically about 0.6 ~ 0.7 × 10^3 MPa, which is much smaller than that for the stress-induced martensitic transformation (typically 2.5 ~ 3.5 × 10^3). Based on the evaluated results for Ω, recovery stress-temperature relations predicted by the constitutive equation are compared with the experimental data for TiNi rods under different strains. A big discrepancy exists for large-strain conditions. Several transformation kinetic expressions are examined for the constitutive relation of the constrained recovery process. (orig.)
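For orientation, the Tanaka constitutive equation this abstract refers to is commonly written in one-dimensional form (standard shape-memory-alloy notation from the literature, not quoted from this record) as:

```latex
\sigma - \sigma_0 = E(\varepsilon - \varepsilon_0) + \Theta(T - T_0) + \Omega(\xi - \xi_0)
```

where \(\xi\) is the martensite volume fraction, \(E\) the elastic modulus, \(\Theta\) the thermoelastic tensor, and \(\Omega\) the phase transformation tensor evaluated above. In constrained recovery the strain is held fixed (\(\varepsilon = \varepsilon_0\)), so the recovery stress-temperature relation is governed by the \(\Theta\) and \(\Omega\) terms alone.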

  16. The Role of Distributed Computing in Big Data Science: Case Studies in Forensics and Bioinformatics

    OpenAIRE

    Roscigno, Gianluca

    2016-01-01

2014 - 2015 The era of Big Data is leading to the generation of large amounts of data, which require storage and analysis capabilities that can only be addressed by distributed computing systems. To facilitate large-scale distributed computing, many programming paradigms and frameworks have been proposed, such as MapReduce and Apache Hadoop, which transparently address some issues of distributed systems and hide most of their technical details. Hadoop is curren...
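The MapReduce paradigm mentioned above can be illustrated with a minimal single-process word count; all names here are illustrative, and frameworks such as Hadoop run the same map/shuffle/reduce phases distributed across a cluster:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Emit (word, 1) pairs for one input document."""
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    """Group intermediate pairs by key, as the framework would do."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Combine all values emitted for one key."""
    return key, sum(values)

def word_count(documents):
    pairs = chain.from_iterable(map_phase(d) for d in documents)
    return dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())

print(word_count(["big data big science", "big data"]))
# {'big': 3, 'data': 2, 'science': 1}
```

Because the map calls are independent per document and the reduce calls are independent per key, both phases parallelize naturally, which is what the frameworks exploit.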

  17. Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives. Contribution of the IMIA Social Media Working Group.

    Science.gov (United States)

    Hansen, M M; Miron-Shatz, T; Lau, A Y S; Paton, C

    2014-08-15

As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives of the evolving use of Big Data in science and healthcare and to examine some of the opportunities and challenges. A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Scientists and health-care providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts: The concept of Big Data and associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and health-care. Future exploration of issues surrounding data privacy, confidentiality, and education are needed. A greater focus on data from social media, the quantified self-movement, and the application of analytics to "small data" would also be useful.

  18. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors and an attempt is made to tackle the problem that participation in a public inquiry is far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  19. Thinking about information work of nuclear science and technology in the age of big data: speaking of the information analysis and research

    International Nuclear Information System (INIS)

    Chen Tieyong

    2014-01-01

Human society is entering a new era in which structured and unstructured data are measured in petabytes (1 PB = 1024 TB). In the network era, with the development of mobile communications and electronic commerce and the emergence and growth of social networks, an age of large-scale data production, sharing, and application is opening. How to explore the value of data, master big data, and extract useful information is an important task for our science and technology information workers. This paper analyzes the development of nuclear science and technology information work in terms of big data acquisition, analysis, and application. Our information analysis and research will increasingly be based on all the data rather than on random sampling, making it possible to let the data 'speak', and many results of information analysis and research can be expressed quantitatively. We should attach great importance to data collection and to the careful analysis of big data. Nuclear science and technology information analysis and research involves both professional division of labor and cooperation. In addition, we should strengthen the construction of nuclear science and technology information resources to improve information supply; strengthen the analysis and research of nuclear science and technology information to improve information services; strengthen the information management of nuclear science and technology, paying attention to security problems and intellectual property rights in information sharing; and strengthen personnel training to continuously improve the efficiency and performance of nuclear science and technology information work. 
In the age of big data, our nuclear science and technology information workers should take information analysis and research as the core, grasping information collection with one hand and information service with the other, forging ahead with innovation to continuously improve their working ability in nuclear science and technology information and improve the

  20. Nursing Management Minimum Data Set: Cost-Effective Tool To Demonstrate the Value of Nurse Staffing in the Big Data Science Era.

    Science.gov (United States)

    Pruinelli, Lisiane; Delaney, Connie W; Garciannie, Amy; Caspers, Barbara; Westra, Bonnie L

    2016-01-01

There is a growing body of evidence of the relationship of nurse staffing to patient, nurse, and financial outcomes. With the advent of big data science and developing big data analytics in nursing, data science with the reuse of big data is emerging as a timely and cost-effective approach to demonstrate nursing value. The Nursing Management Minimum Data Set (NMMDS) provides standard administrative data elements, definitions, and codes to measure the context where care is delivered and, consequently, the value of nursing. The integration of the NMMDS elements in the current health system provides evidence for nursing leaders to measure and manage decisions, leading to better patient, staffing, and financial outcomes. It also enables the reuse of data for clinical scholarship and research.

  1. Data Science as an Innovation Challenge: From Big Data to Value Proposition

    Directory of Open Access Journals (Sweden)

    Victoria Kayser

    2018-03-01

    Full Text Available Analyzing “big data” holds huge potential for generating business value. The ongoing advancement of tools and technology over recent years has created a new ecosystem full of opportunities for data-driven innovation. However, as the amount of available data rises to new heights, so too does complexity. Organizations are challenged to create the right contexts, by shaping interfaces and processes, and by asking the right questions to guide the data analysis. Lifting the innovation potential requires teaming and focus to efficiently assign available resources to the most promising initiatives. With reference to the innovation process, this article will concentrate on establishing a process for analytics projects from first ideas to realization (in most cases: a running application. The question we tackle is: what can the practical discourse on big data and analytics learn from innovation management? The insights presented in this article are built on our practical experiences in working with various clients. We will classify analytics projects as well as discuss common innovation barriers along this process.

  2. The PACA Project: Convergence of Scientific Research, Social Media and Citizen Science in the Era of Astronomical Big Data

    Science.gov (United States)

    Yanamandra-Fisher, Padma A.

    2015-08-01

    The Pro-Am Collaborative Astronomy (PACA) project promotes and supports the professional-amateur astronomer collaboration in scientific research via social media and has been implemented in several comet observing campaigns. In 2014, two comet observing campaigns involving pro-am collaborations were initiated: (1) C/2013 A1 (C/SidingSpring) and (2) 67P/Churyumov-Gerasimenko (CG), target for ESA/Rosetta mission. The evolving need for individual customized observing campaigns has been incorporated into the evolution of The PACA Project that currently is focused on comets: from supporting observing campaigns of current comets, legacy data, historical comets; interconnected with social media and a set of shareable documents addressing observational strategies; consistent standards for data; data access, use, and storage, to align with the needs of professional observers in the era of astronomical big data. The empowerment of amateur astronomers vis-à-vis their partnerships with the professional scientists creates a new demographic of data scientists, enabling citizen science of the integrated data from both the professional and amateur communities. While PACA identifies a consistent collaborative approach to pro-am collaborations, given the volume of data generated for each campaign, new ways of rapid data analysis, mining, access and storage are needed. Several interesting results emerged from the synergistic inclusion of both social media and amateur astronomers. The PACA Project is expanding to include pro-am collaborations on other solar system objects, allow for immersive outreach, and include various types of astronomical communities, ranging from individuals to astronomical societies and telescopic networks. Enabling citizen science research in the era of astronomical big data is a challenge which requires innovative approaches and integration of professional and amateur astronomers with data scientists, and some examples of recent projects will be highlighted.

  3. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  4. Crowd-Funded Micro-Grants for Genomics and “Big Data”: An Actionable Idea Connecting Small (Artisan) Science, Infrastructure Science, and Citizen Philanthropy

    Science.gov (United States)

    Badr, Kamal F.; Dove, Edward S.; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J.; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N.; Sabra, Ramzi; Sarkissian, Christineh N.; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K.; Kickbusch, Ilona

    2013-01-01

    Abstract Biomedical science in the 21st century is embedded in, and draws from, a digital commons and “Big Data” created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., “the lone genius” or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists—only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the “bottom one billion”—the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while

  5. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  6. Big Sib Students' Perceptions of the Educational Environment at the School of Medical Sciences, Universiti Sains Malaysia, using Dundee Ready Educational Environment Measure (DREEM) Inventory.

    Science.gov (United States)

    Arzuman, Hafiza; Yusoff, Muhamad Saiful Bahri; Chit, Som Phong

    2010-07-01

    A cross-sectional descriptive study was conducted among Big Sib students to explore their perceptions of the educational environment at the School of Medical Sciences, Universiti Sains Malaysia (USM) and its weak areas using the Dundee Ready Educational Environment Measure (DREEM) inventory. The DREEM inventory is a validated global instrument for measuring educational environments in undergraduate medical and health professional education. The English version of the DREEM inventory was administered to all Year 2 Big Sib students (n = 67) at a regular Big Sib session. The purpose of the study as well as confidentiality and ethical issues were explained to the students before the questionnaire was administered. The response rate was 62.7% (42 out of 67 students). The overall DREEM score was 117.9/200 (SD 14.6). The DREEM indicated that the Big Sib students' perception of educational environment of the medical school was more positive than negative. Nevertheless, the study also revealed some problem areas within the educational environment. This pilot study revealed that Big Sib students perceived a positive learning environment at the School of Medical Sciences, USM. It also identified some low-scored areas that require further exploration to pinpoint the exact problems. The relatively small study population selected from a particular group of students was the major limitation of the study. This small sample size also means that the study findings cannot be generalised.

  7. Bringing the Tools of Big Science to Bear on Local Environmental Challenges

    Science.gov (United States)

    Bronson, Scott; Jones, Keith W.; Brown, Maria

    2013-01-01

    We describe an interactive collaborative environmental education project that makes advanced laboratory facilities at Brookhaven National Laboratory accessible for one-year or multi-year science projects for the high school level. Cyber-enabled Environmental Science (CEES) utilizes web conferencing software to bring multi-disciplinary,…

  8. A Big Data Task Force Review of Advances in Data Access and Discovery Within the Science Disciplines of the NASA Science Mission Directorate (SMD)

    Science.gov (United States)

    Walker, R. J.; Beebe, R. F.

    2017-12-01

    One of the basic problems the NASA Science Mission Directorate (SMD) faces when dealing with preservation of scientific data is the variety of the data. This stems from the fact that NASA's involvement in the sciences spans a broad range of disciplines across the Science Mission Directorate: Astrophysics, Earth Sciences, Heliophysics and Planetary Science. As the ability of some missions to produce large data volumes has accelerated, the range of problems associated with providing adequate access to the data has demanded diverse approaches for data access. Although mission types, complexity and duration vary across the disciplines, the data can be characterized by four characteristics: velocity, veracity, volume, and variety. The rate of arrival of the data (velocity) must be addressed at the individual mission level, validation and documentation of the data (veracity), data volume and the wide variety of data products present huge challenges as the science disciplines strive to provide transparent access to their available data. Astrophysics, supports an integrated system of data archives based on frequencies covered (UV, visible, IR, etc.) or subject areas (extrasolar planets, extra galactic, etc.) and is accessed through the Astrophysics Data Center (https://science.nasa.gov/astrophysics/astrophysics-data-centers/). Earth Science supports the Earth Observing System (https://earthdata.nasa.gov/) that manages the earth science satellite data. The discipline supports 12 Distributed Active Archive Centers. Heliophysics provides the Space Physics Data Facility (https://spdf.gsfc.nasa.gov/) that supports the heliophysics community and Solar Data Analysis Center (https://umbra.nascom.nasa.gov/index.html) that allows access to the solar data. The Planetary Data System (https://pds.nasa.gov) is the main archive for planetary science data. It consists of science discipline nodes (Atmospheres, Geosciences, Cartography and Imaging Sciences, Planetary Plasma Interactions

  9. The Big Bang, COBE, and the Relic Radiation of Creation (LBNL Science at the Theater)

    Energy Technology Data Exchange (ETDEWEB)

    Smoot, George

    2007-03-05

    Berkeley Lab's George Smoot won the 2006 Physics Nobel Prize, together with John Mather of NASA Goddard Space Flight Center, for "the discovery of the blackbody form and anisotropy of the cosmic microwave background radiation." The anisotropy showed as small variations in the map of the early universe. This research looks back into the infant universe and provides a better understanding of the origin of galaxies and stars. The cosmic background radiation is a tool to understand the structure and history of the universe and the structure of space-time. These observations have provided increased support for the big bang theory of the universe's origin. The Cosmic Background Explorer (COBE) NASA satellite, launched in 1989, carries instruments that measured various aspects of cosmic microwave background radiation, and produced the data for these compelling scientific results, which opened up a field that continues very actively today.

  10. Big data, open science and the brain: lessons learned from genomics

    Directory of Open Access Journals (Sweden)

    Suparna eChoudhury

    2014-05-01

    Full Text Available The BRAIN Initiative aims to break new ground in the scale and speed of data collection in neuroscience, requiring tools to handle data on the order of yottabytes (10²⁴ bytes). The scale, investment and organization of it are being compared to the Human Genome Project (HGP), which has exemplified 'big science' for biology. In line with the trend towards Big Data in genomic research, the promise of the BRAIN Initiative, as well as the European Human Brain Project, rests on the possibility to amass vast quantities of data to model the complex interactions between the brain and behaviour and inform the diagnosis and prevention of neurological disorders and psychiatric disease. Advocates of this 'data driven' paradigm in neuroscience argue that harnessing the large quantities of data generated across laboratories worldwide has numerous methodological, ethical and economic advantages, but it requires the neuroscience community to adopt a culture of data sharing and open access to benefit from them. In this article, we examine the rationale for data sharing among advocates and briefly exemplify these in terms of new 'open neuroscience' projects. Then, drawing on the frequently invoked model of data sharing in genomics, we go on to demonstrate the complexities of data sharing, shedding light on the sociological and ethical challenges within the realms of institutions, researchers and participants, namely dilemmas around public/private interests in data, (lack of) motivation to share in the academic community, and potential loss of participant anonymity. Our paper serves to highlight some foreseeable tensions around data sharing relevant to the emergent 'open neuroscience' movement.

  11. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care in particular. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  12. 'Big science' forum gets a broader role and warns of a need for more neutron sources

    CERN Multimedia

    Dickson, D

    1998-01-01

    After a positive external review, the intergovernmental Megascience Forum set up by OECD to discuss issues concerning funding of major science facilities, is now likely to continue its work under a new name and with a wider mandate (1 page).

  13. The Big Science Questions About Mercury's Ice-Bearing Polar Deposits After MESSENGER

    Science.gov (United States)

    Chabot, N. L.; Lawrence, D. J.

    2018-05-01

    Mercury’s polar deposits provide many well-characterized locations that are known to have large expanses of exposed water ice and/or other volatile materials — presenting unique opportunities to address fundamental science questions.

  14. Examining the Big-Fish-Little-Pond Effect on Students' Self-Concept of Learning Science in Taiwan Based on the TIMSS Databases

    Science.gov (United States)

    Liou, Pey-Yan

    2014-08-01

    The purpose of this study is to examine the relationship between student self-concept and achievement in science in Taiwan based on the big-fish-little-pond effect (BFLPE) model using the Trends in International Mathematics and Science Study (TIMSS) 2003 and 2007 databases. Hierarchical linear modeling was used to examine the effects of the student-level and school-level science achievement on student self-concept of learning science. The results indicated that student science achievement was positively associated with individual self-concept of learning science in both TIMSS 2003 and 2007. On the contrary, while school-average science achievement was negatively related to student self-concept in TIMSS 2003, it had no statistically significant relationship with student self-concept in TIMSS 2007. The findings of this study shed light on possible explanations for the existence of BFLPE and also lead to an international discussion on the generalization of BFLPE.

  15. Big Data from Europe's Natural Science Collections through DiSSCo

    Science.gov (United States)

    Addink, Wouter; Koureas, Dimitris; Casino, Ana

    2017-04-01

    DiSSCo, a Distributed System of Scientific Collections, will be a Research Infrastructure delivering big data describing the history of Planet Earth. Approximately 1.5 billion biological and geological specimens, representing the last 300 years of scientific study of the natural world, reside in collections all over Europe. These span 4.5 billion years of history, from the formation of the solar system to the present day. In the European landscape of environmental Research Infrastructures, different projects and landmarks describe services that aim at aggregating, monitoring, analysing and modelling geo-diversity information. The effectiveness of these services, however, depends on the quality and availability of primary reference data that today is scattered and incomplete. DiSSCo provides the required bio-geographical, taxonomic and species trait data at the level of precision and accuracy required to enable and speed up research on the seven grand societal challenges that are priorities of the Europe 2020 strategy. DiSSCo enables better connections between collection data and observations in biodiversity observation networks, such as EU BON and GEO BON. This supports research areas like long-term ecological research, for which the continuity of biological collections is a particular strength.

  16. Citizen Science, Crowdsourcing and Big Data: A Scientific and Social Framework for Natural Resources and Environments

    Science.gov (United States)

    Glynn, P. D.; Jones, J. W.; Liu, S. B.; Shapiro, C. D.; Jenter, H. L.; Hogan, D. M.; Govoni, D. L.; Poore, B. S.

    2014-12-01

    We describe a conceptual framework for Citizen Science that can be applied to improve the understanding and management of natural resources and environments. For us, Citizen Science represents an engagement from members of the public, usually volunteers, in collaboration with paid professionals and technical experts to observe and understand natural resources and environments for the benefit of science and society. Our conceptual framework for Citizen Science includes crowdsourcing of observations (or sampling). It considers a wide range of activities, including volunteer and professional monitoring (e.g. weather and climate variables, water availability and quality, phenology, biota, image capture and remote sensing), as well as joint fact finding and analyses, and participatory mapping and modeling. Spatial distribution and temporal dynamics of the biophysical processes that control natural resources and environments are taken into account within this conceptual framework, as are the availability, scaling and diversity of tools and efforts that are needed to properly describe these biophysical processes. Opportunities are sought within the framework to properly describe, QA/QC, archive, and make readily accessible, the large amounts of information and traceable knowledge required to better understand and manage natural resources and environments. The framework also considers human motivational needs, primarily through a modern version of Maslow's hierarchy of needs. We examine several USGS-based Citizen Science efforts within the context of our framework, including the project called "iCoast - Did the Coast Change?", to understand the utility of the framework, its costs and benefits, and to offer concrete examples of how to expand and sustain specific projects. We make some recommendations that could aid its implementation on a national or larger scale. 
For example, implementation might be facilitated (1) through greater engagement of paid professionals, and (2

  17. The Big Crunch: A Hybrid Solution to Earth and Space Science Instruction for Elementary Education Majors

    Science.gov (United States)

    Cervato, Cinzia; Kerton, Charles; Peer, Andrea; Hassall, Lesya; Schmidt, Allan

    2013-01-01

    We describe the rationale and process for the development of a new hybrid Earth and Space Science course for elementary education majors. A five-step course design model, applicable to both online and traditional courses, is presented. Assessment of the course outcomes after two semesters indicates that the intensive time invested in the…

  18. Tiny galaxies help unravel dark matter mystery

    CERN Multimedia

    O'Hanlon, Larry

    2007-01-01

    "The 70-year effort to unravel the mysteries of dark matter just got a big boost from some very puny galaxies. In the past few years, a score of dwarf galaxies have been discovered hanging about the fringes of the Milky Way. Now new measurements of the few stars in these dwarfs reveal them to be dark matter distilleries, with upwards of 1,000 times more dark than normal matter." (3 pages)

  19. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing......, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  20. In Defense of the National Labs and Big-Budget Science

    Energy Technology Data Exchange (ETDEWEB)

    Goodwin, J R

    2008-07-29

    The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure. 
Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider

  1. The History of Radio Astronomy and the National Radio Astronomy Observatory: Evolution Toward Big Science

    Science.gov (United States)

    Malphrus, Benjamin Kevin

    1990-01-01

    The purpose of this study is to examine the sequence of events that led to the establishment of the NRAO, the construction and development of instrumentation, and the contributions and discovery events, and to relate the significance of these events to the evolution of the sciences of radio astronomy and cosmology. After an overview of the resources, a brief discussion of the early days of the science is given to set the stage for an examination of events that led to the establishment of the NRAO. The developmental and construction phases of the major instruments, including the 85-foot Tatel telescope, the 300-foot telescope, the 140-foot telescope, and the Green Bank Interferometer, are examined. The technical evolution of these instruments is traced and their relevance to scientific programs and discovery events is discussed. The history is told in a narrative format interspersed with technical and scientific explanations. Through the use of original data, technical and scientific information of historical concern is provided to elucidate major developments and events. An interpretive discussion of selected programs, events and technological developments that epitomize the contributions of the NRAO to the science of radio astronomy is provided. Scientific programs conducted with the NRAO instruments that were significant to galactic and extragalactic astronomy are presented. NRAO research programs presented include continuum and source surveys, mapping, a high-precision verification of general relativity, and SETI programs. Cosmic phenomena investigated in these programs include galactic and extragalactic HI and HII, emission nebulae, supernova remnants, cosmic masers, giant molecular clouds, radio stars, normal and radio galaxies, and quasars. Modern NRAO instruments, including the VLA and VLBA, and their scientific programs are presented in the final chapter, as well as plans for future NRAO instruments such as the GBT.

  2. Nobel prize winner returns home to tell a fascinating 'Big Science' story

    International Nuclear Information System (INIS)

    Angiolillo, C.; Dranga, R.

    2015-01-01

    This paper is about the Sudbury Neutrino Observatory (SNO) experiment. SNO achieved a major breakthrough in the study of the behavior of an elementary and enigmatic particle of the universe - the neutrino. The experiment was the result of the synthesis of over 30 years of work on particle physics, astrophysics and nuclear science that saw early germination at Chalk River Laboratories. Preliminary SNO results led to a major leap forward in how to measure sub-atomic phenomena that were never measured to this extent before and have also provided new insights into the Standard Model of physics, and indeed into our fundamental understanding of the entire universe.

  3. Nobel prize winner returns home to tell a fascinating 'Big Science' story

    Energy Technology Data Exchange (ETDEWEB)

    Angiolillo, C.; Dranga, R. [Canadian Nuclear Laboratories, Chalk River, Ontario (Canada)

    2015-12-15

    This paper is about the Sudbury Neutrino Observatory (SNO) experiment. SNO achieved a major breakthrough in the study of the behavior of an elementary and enigmatic particle of the universe - the neutrino. The experiment was the result of the synthesis of over 30 years of work on particle physics, astrophysics and nuclear science that saw early germination at Chalk River Laboratories. Preliminary SNO results led to a major leap forward in how to measure sub-atomic phenomena that were never measured to this extent before and have also provided new insights into the Standard Model of physics, and indeed into our fundamental understanding of the entire universe.

  4. A requirement for Australian research: access to 'big science' facilities, a report by the Australian National Committee for crystallography

    International Nuclear Information System (INIS)

    1989-03-01

    Two types of 'Big Science' research facility - synchrotron radiation sources and intense neutron beams - are now recognised as essential resources for a wide range of research activities in chemistry, physics and biology. The cost of such facilities and the lack of a sufficiently large user base will probably preclude their construction in Australia in the foreseeable future. The needs of Australian crystallographers for access to such facilities are assessed. In relation to synchrotron radiation sources, the Committee considered only the question of access to such facilities overseas. In relation to neutron beam sources, the Committee's inquiries included not only the question of access to powerful facilities overseas but also the special problems which confront Australian crystallographers as a result of the obsolescence of the HIFAR reactor. The arguments about, and options for, funding Australian use of facilities overseas are presented. The Committee concluded there is a strong case for the purchase of a beam-line at an overseas synchrotron radiation facility and a strong, though less urgent, case for substantial Australian involvement in an overseas neutron beam facility. The Committee recommended that the Australian HIFAR reactor be refurbished in its present shell, retaining the present flux and power levels, and that in the upgrading of the neutron scattering instrumentation at HIFAR special consideration be given to including items which are sufficiently specialised to attract the international neutron scattering community

  5. Multivariate methods for the analysis of complex and big data in forensic sciences. Application to age estimation in living persons.

    Science.gov (United States)

    Lefèvre, Thomas; Chariot, Patrick; Chauvin, Pierre

    2016-09-01

    Researchers handle increasingly high-dimensional datasets, with many variables to explore. Such datasets pose several problems, since they are difficult to handle and present unexpected features. As dimensionality increases, classical statistical analysis becomes inoperative. Variables can present redundancy, and the reduction of dataset dimensionality to its lowest possible value is often needed. Principal components analysis (PCA) has proven useful for reducing dimensionality but presents several shortcomings. Like other fields, the forensic sciences will face issues specifically related to an ever-growing quantity of data to be integrated. Age estimation in living persons, an unsolved problem so far, could benefit from the integration of various sources of data, e.g., clinical, dental and radiological data. We present here novel multivariate techniques (nonlinear dimensionality reduction techniques, NLDR), applied to a theoretical example. Results were compared to those of PCA. NLDR techniques were then applied to clinical, dental and radiological data (13 variables) used for age estimation. The correlation dimension of these data was estimated. NLDR techniques outperformed PCA. They showed that two living persons sharing similar characteristics may present rather different estimated ages. Moreover, the data presented very high informational redundancy, i.e., a correlation dimension of 2. NLDR techniques should be used with, or preferred to, PCA techniques to analyze complex and big data. Data routinely used for age estimation may not be suitable for this purpose. How integrating other data or approaches could improve age estimation in living persons is still uncertain. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
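    As a concrete illustration of the contrast this abstract draws between PCA and nonlinear dimensionality reduction, the minimal sketch below (an illustration on synthetic data, not the authors' actual analysis) embeds a curved 3-D dataset into two dimensions with both a linear method (PCA) and an NLDR method (Isomap) from scikit-learn:

```python
# Sketch: linear (PCA) vs nonlinear (Isomap) dimensionality reduction
# on a synthetic curved manifold; the dataset and parameters are
# illustrative assumptions, not the forensic data from the paper.
from sklearn.datasets import make_s_curve
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap

# 500 3-D points lying on a curved 2-D sheet (an "S"-shaped surface)
X, color = make_s_curve(n_samples=500, random_state=0)

# Linear reduction: PCA projects the points onto a flat plane
X_pca = PCA(n_components=2).fit_transform(X)

# Nonlinear reduction: Isomap preserves geodesic distances along the sheet
X_iso = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

print(X_pca.shape, X_iso.shape)  # (500, 2) (500, 2)
```

    On data lying on a curved manifold like this, the Isomap embedding effectively unrolls the surface, whereas PCA can only project it onto a flat plane; this is the kind of shortcoming of PCA the abstract alludes to.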

  6. Rethinking Approaches to Exploration and Analysis of Big Data in Earth Science

    Science.gov (United States)

    Graves, S. J.; Maskey, M.

    2015-12-01

    With increasing amounts of data available for exploration and analysis, there are increasing numbers of users that need information extracted from the data for very specific purposes. Many of these specific purposes may not even have been considered yet, so how do computational and data scientists plan for such a diverse and ill-defined set of possible users? There are challenges to be considered in the computational architectures, as well as in the organizational structures for the data, to allow for the best possible exploration and analytical capabilities. Data analytics needs to be a key consideration in thinking about the data structures and types of storage for these large amounts of data, coming from a variety of sensing platforms that may be space-based, airborne, in situ or social media. How do we provide better capabilities for exploration and analysis at the point of collection for real-time or near real-time requirements? This presentation will address some of the approaches being considered and the challenges the computational and data science communities are facing in collaboration with the Earth science research and application communities.

  7. Convergence in France facing Big Data era and Exascale challenges for Climate Sciences

    Science.gov (United States)

    Denvil, Sébastien; Dufresne, Jean-Louis; Salas, David; Meurdesoif, Yann; Valcke, Sophie; Caubel, Arnaud; Foujols, Marie-Alice; Servonnat, Jérôme; Sénési, Stéphane; Derouillat, Julien; Voury, Pascal

    2014-05-01

    The presentation will introduce CONVERGENCE, a French national project funded for four years. This project will tackle the big data and computational challenges faced by the climate modeling community in an HPC context. Model simulations are central to the study of complex mechanisms and feedbacks in the climate system and to providing estimates of future and past climate changes. Recent trends in climate modelling are to add more physical components to the modelled system, to increase the resolution of each individual component, and to make more systematic use of large suites of simulations to address many scientific questions. Climate simulations may therefore differ in their initial state, parameter values, representation of physical processes, spatial resolution, model complexity, and degree of realism or degree of idealisation. In addition, there is a strong need for evaluating, improving and monitoring the performance of climate models using a large ensemble of diagnostics, and for better integration of model outputs and observational data. High performance computing is currently reaching the exascale and has the potential to produce this exponential increase in the size and number of simulations. However, post-processing, analysis, and exploration of the generated data have stalled, and there is a strong need for new tools to cope with the growing size and complexity of the underlying simulations and datasets. Exascale simulations require new scalable software tools to generate, manage and mine those simulations and data, to extract the relevant information and to support correct decisions. The primary purpose of this project is to develop a platform capable of running large ensembles of simulations with a suite of models, to handle the complex and voluminous datasets generated, to facilitate the evaluation and validation of the models and the use of higher resolution models. We propose to gather interdisciplinary skills to design, using a component-based approach, a

  8. Cultural Values of the So-Called Marian Triangle - Sloup, Vranov, Křtiny

    OpenAIRE

    Bezděková, Veronika

    2009-01-01

    In the Moravian Karst there are three big churches consecrated to the Virgin Mary, in Vranov, Křtiny and Sloup. These are visited by many pilgrims and have their own calendars of pilgrimages. The church in Vranov commemorates the birth of the Virgin Mary, Křtiny commemorates the name of the Virgin Mary, and Sloup commemorates the sufferings of the Virgin Mary; hence we speak of the triangle of the Virgin Mary. This term is the main point of my ...

  9. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  10. Modernizing Earth and Space Science Modeling Workflows in the Big Data Era

    Science.gov (United States)

    Kinter, J. L.; Feigelson, E.; Walker, R. J.; Tino, C.

    2017-12-01

    Modeling is a major aspect of Earth and space science research. The development of numerical models of the Earth system, planetary systems or astrophysical systems is essential to linking theory with observations. Optimal use of observations that are quite expensive to obtain and maintain typically requires data assimilation that involves numerical models. In the Earth sciences, models of the physical climate system are typically used for data assimilation, climate projection, and inter-disciplinary research, spanning applications from analysis of multi-sensor data sets to decision-making in climate-sensitive sectors, with applications to ecosystems, hazards, and various biogeochemical processes. In space physics, most models are built from first principles, require considerable expertise to run, and are frequently modified significantly for each case study. The volume and variety of model output data from modeling Earth and space systems are rapidly increasing and have reached a scale where human interaction with the data is prohibitively inefficient. A major barrier to progress is that modeling workflows are not treated by practitioners as a design problem. Existing workflows have been created by a slow accretion of software, typically based on undocumented, inflexible scripts haphazardly modified by a succession of scientists and students not trained in modern software engineering methods. As a result, existing modeling workflows suffer from an inability to onboard new datasets into models; an inability to keep pace with accelerating data production rates; and irreproducibility, among other problems. These factors are creating an untenable situation for those conducting and supporting Earth system and space science. Improving modeling workflows requires investments in hardware, software and human resources. This paper describes the critical path issues that must be targeted to accelerate modeling workflows, including script modularization, parallelization, and
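    The script modularization and parallelization the abstract recommends can be sketched in miniature: a monolithic post-processing script is split into small, testable functions, and independent model-output files are processed in parallel. The file names and statistics below are hypothetical stand-ins for real model output:

```python
# Sketch of a modular, parallel post-processing workflow; the "files"
# and the mean statistic are illustrative placeholders.
from concurrent.futures import ProcessPoolExecutor

# Stand-in for real model output files (assumption for the sketch)
path_data = {"run1.dat": [1.0, 2.0, 3.0], "run2.dat": [4.0, 5.0, 6.0]}

def load(path):
    # Each stage is a small, separately testable unit
    return [float(x) for x in path_data[path]]

def summarize(values):
    return sum(values) / len(values)

def process(path):
    # One output file = one independent task, so files can run in parallel
    return path, summarize(load(path))

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(process, path_data))
    print(results)  # {'run1.dat': 2.0, 'run2.dat': 5.0}
```

    Because each file is an independent task, adding a new dataset means adding one entry and, if needed, one new loader function, rather than editing a monolithic script end to end.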

  11. III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.

    Science.gov (United States)

    Davis-Kean, Pamela E; Jager, Justin

    2017-06-01

    For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited both in power to detect differences and in the demographic diversity needed to generalize clearly and broadly. Thus, in this chapter we discuss the value of using existing large-scale data sets to test the complex questions of child development, and how to develop future large-scale data sets that are both representative and able to answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.

  12. Opening up to Big Data: Computer-Assisted Analysis of Textual Data in Social Sciences

    Directory of Open Access Journals (Sweden)

    Gregor Wiedemann

    2013-05-01

    Full Text Available Two developments in computational text analysis may change the way qualitative data analysis in the social sciences is performed: 1. the amount of digital text worth investigating is growing rapidly, and 2. the improvement of algorithmic information extraction approaches, also called text mining, allows for further bridging the gap between qualitative and quantitative text analysis. The key factor here is the inclusion of context in computational linguistic models, which extends conventional computational content analysis towards the extraction of meaning. To clarify the methodological differences among various computer-assisted text analysis approaches, the article suggests a typology from the perspective of a qualitative researcher. This typology shows compatibilities between manual qualitative data analysis methods and computational, rather quantitative, approaches for large-scale mixed-method text analysis designs. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1302231
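    A minimal example of the quantitative end of the spectrum described above, computational content analysis via term weighting, can be sketched with scikit-learn; the three documents are invented placeholders, not data from the article:

```python
# Sketch: turning raw text into a term-weight matrix (TF-IDF), the basic
# building block of quantitative computer-assisted text analysis.
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical mini-corpus standing in for a real document collection
docs = [
    "qualitative analysis of interview transcripts",
    "quantitative text mining at large scale",
    "text mining bridges qualitative and quantitative analysis",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)  # rows = documents, columns = weighted terms

print(X.shape)  # (3, number of distinct non-stopword terms)
```

    Each document becomes a weighted term vector, so documents can be compared, clustered, or fed into topic models; richer context-aware models, as the abstract notes, go beyond such bag-of-words counts toward extracting meaning.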

  13. Structural analysis of an off-grid tiny house

    Science.gov (United States)

    Calluari, Karina Arias; Alonso-Marroquín, Fernando

    2017-06-01

    Off-grid technologies and the tiny house movement have experienced unprecedented growth in recent years. Putting the two together, we are trying to achieve an economical and environmentally friendly solution to the high cost of residential properties. This solution is the construction of off-grid tiny houses. This article presents a design for a small modular off-grid house made of pine timber. A numerical analysis of the proposed tiny house was performed to ensure its structural stability. The results were compared with the suggested serviceability limit state criteria contained in the Australian guidelines and standards, making this design reliable for construction.

  14. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness.

    Science.gov (United States)

    Dove, Edward S; Özdemir, Vural

    2015-09-01

    The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, "extreme centrism", and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; it relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics, separately and together, have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.

  15. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness

    Science.gov (United States)

    Dove, Edward S.; Özdemir, Vural

    2015-01-01

The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, “extreme centrism”, and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics—separate and together—have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit. PMID:26345196

  16. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness

    Directory of Open Access Journals (Sweden)

    Edward S. Dove

    2015-08-01

Full Text Available The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, “extreme centrism”, and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics—separate and together—have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.

  17. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in the lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  18. Tiny Ultraviolet Polarimeter for Earth Stratosphere from Space Investigation

    Science.gov (United States)

    Nevodovskyi, P. V.; Morozhenko, O. V.; Vidmachenko, A. P.; Ivakhiv, O.; Geraimchuk, M.; Zbrutskyi, O.

    2015-09-01

One of the factors in climate change (via stratospheric ozone concentrations) is connected with the variations in optical thickness of aerosols in the upper layers of the atmosphere (at altitudes over 30 km). Therefore, the aerosol and gas components of the atmosphere are crucial in the study of the ultraviolet (UV) radiation reaching the Earth. Moreover, a scrupulous study of the aerosol component of the Earth's atmosphere at an altitude of 30 km (i.e., stratospheric aerosol), such as the size of particles, the real part of the refractive index, the optical thickness and its horizontal structure, the concentration of ozone, or the upper border of the stratospheric ozone layer, is an important task in the research of the Earth's climate change. At present, the Main Astronomical Observatory of the National Academy of Sciences (NAS) of Ukraine, the National Technical University of Ukraine "KPI" and the Lviv Polytechnic National University are engaged in the development of methodologies for the study of stratospheric aerosol by means of an ultraviolet polarimeter on a microsatellite. So far, a sample of a tiny ultraviolet polarimeter (UVP) has been created, which is considered to be a basic model for carrying out space experiments regarding the impact of the changes in stratospheric aerosols on both global and local climate.

  19. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness

    Science.gov (United States)

    Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen

    2016-01-01

The rapid development of technology has made enormous volumes of data available and accessible anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding of what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing. PMID:27763525

  20. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness

    Directory of Open Access Journals (Sweden)

    Ho Ting Wong

    2016-10-01

Full Text Available The rapid development of technology has made enormous volumes of data available and accessible anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term “Big Data”, which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding of what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.

  1. The Need for a Definition of Big Data for Nursing Science: A Case Study of Disaster Preparedness.

    Science.gov (United States)

    Wong, Ho Ting; Chiang, Vico Chung Lim; Choi, Kup Sze; Loke, Alice Yuen

    2016-10-17

The rapid development of technology has made enormous volumes of data available and accessible anytime and anywhere around the world. Data scientists call this change a data era and have introduced the term "Big Data", which has drawn the attention of nursing scholars. Nevertheless, the concept of Big Data is quite fuzzy and there is no agreement on its definition among researchers of different disciplines. Without a clear consensus on this issue, nursing scholars who are relatively new to the concept may consider Big Data to be merely a dataset of a bigger size. Having a suitable definition for nurse researchers in their context of research and practice is essential for the advancement of nursing research. In view of the need for a better understanding of what Big Data is, the aim in this paper is to explore and discuss the concept. Furthermore, an example of a Big Data research study on disaster nursing preparedness involving six million patient records is used for discussion. The example demonstrates that a Big Data analysis can be conducted from many more perspectives than would be possible in traditional sampling, and is superior to traditional sampling. Experience gained from the process of using Big Data in this study will shed light on future opportunities for conducting evidence-based nursing research to achieve competence in disaster nursing.

  2. Struggling to Hear? Tiny Devices Can Keep You Connected

    Science.gov (United States)

From the May 2018 issue: "Struggling to Hear? Tiny Devices Can Keep You Connected" (also available en español).

  3. Reviews Book: Extended Project Student Guide Book: My Inventions Book: ASE Guide to Research in Science Education Classroom Video: The Science of Starlight Software: SPARKvue Book: The Geek Manifesto Ebook: A Big Ball of Fire Apps

    Science.gov (United States)

    2014-05-01

    WE RECOMMEND Level 3 Extended Project Student Guide A non-specialist, generally useful and nicely put together guide to project work ASE Guide to Research in Science Education Few words wasted in this handy introduction and reference The Science of Starlight Slow but steady DVD covers useful ground SPARKvue Impressive software now available as an app WORTH A LOOK My Inventions and Other Writings Science, engineering, autobiography, visions and psychic phenomena mixed in a strange but revealing concoction The Geek Manifesto: Why Science Matters More enthusiasm than science, but a good motivator and interesting A Big Ball of Fire: Your questions about the Sun answered Free iTunes download made by and for students goes down well APPS Collider visualises LHC experiments ... Science Museum app enhances school trips ... useful information for the Cambridge Science Festival

  4. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  5. Small wormholes change our picture of the big bang

    CERN Multimedia

    1990-01-01

    Matt Visser has studied tiny wormholes, which may be produced on a subatomic scale by quantum fluctuations in the energy of the vacuum. He believes these quantum wormholes could change our picture of the origin of the Universe in the big bang (1/2 p)

  6. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science through Cloud-Enabled Climate Analytics-as-a-Service

    Science.gov (United States)

    Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.

    2013-12-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produce societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. 
In our experience, Cloud Computing lowers the barriers and risk to

  7. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science Through Cloud-enabled Climate Analytics-as-a-service

    Science.gov (United States)

    Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.

    2014-01-01

Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produce societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to

  8. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.

  9. Biomedical Big Data Training Collaborative (BBDTC): An effort to bridge the talent gap in biomedical science and research.

    Science.gov (United States)

    Purawat, Shweta; Cowart, Charles; Amaro, Rommie E; Altintas, Ilkay

    2016-06-01

The BBDTC (https://biobigdata.ucsd.edu) is a community-oriented platform to encourage high-quality knowledge dissemination with the aim of growing a well-informed biomedical big data community through collaborative efforts on training and education. The BBDTC collaborative is an e-learning platform that supports the biomedical community to access, develop and deploy open training materials. The BBDTC supports Big Data skill training for biomedical scientists at all levels, and from varied backgrounds. The natural hierarchy of courses allows them to be broken into and handled as modules. Modules can be reused in the context of multiple courses and reshuffled, producing a new and different, dynamic course called a playlist. Users may create playlists to suit their learning requirements and share them with individual users or the wider public. BBDTC leverages the maturity and design of the HUBzero content-management platform for delivering educational content. To facilitate the migration of existing content, the BBDTC supports importing and exporting course material from the edX platform. Migration tools will be extended in the future to support other platforms. Hands-on training software packages, i.e., toolboxes, are supported through Amazon EC2 and Virtualbox virtualization technologies, and they are available as: (i) downloadable lightweight Virtualbox images providing a standardized software tool environment with software packages and test data on users' personal machines, and (ii) remotely accessible Amazon EC2 virtual machines for accessing biomedical big data tools and scalable big data experiments. At the moment, the BBDTC site contains three open biomedical big data training courses with lecture contents, videos and hands-on training utilizing VM toolboxes, covering diverse topics. The courses have enhanced the hands-on learning environment by providing structured content that users can use at their own pace.
A four course biomedical big data series is

  10. Air Toxics Under the Big Sky: examining the effectiveness of authentic scientific research on high school students' science skills and interest

    Science.gov (United States)

    Ward, Tony J.; Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij

    2016-04-01

    Air Toxics Under the Big Sky is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. This research explored: (1) how the program affects student understanding of scientific inquiry and research and (2) how the open-inquiry learning opportunities provided by the program increase student interest in science as a career path. Treatment students received instruction related to air pollution (airborne particulate matter), associated health concerns, and training on how to operate air quality testing equipment. They then participated in a yearlong scientific research project in which they developed and tested hypotheses through research of their own design regarding the sources and concentrations of air pollution in their homes and communities. Results from an external evaluation revealed that treatment students developed a deeper understanding of scientific research than did comparison students, as measured by their ability to generate good hypotheses and research designs, and equally expressed an increased interest in pursuing a career in science. These results emphasize the value of and need for authentic science learning opportunities in the modern science classroom.

  11. Air Toxics Under the Big Sky: Examining the Effectiveness of Authentic Scientific Research on High School Students’ Science Skills and Interest

    Science.gov (United States)

    Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij

    2016-01-01

    Air Toxics Under the Big Sky is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. A quasi-experimental design was used in order to understand: 1) how the program affects student understanding of scientific inquiry and research and 2) how the open inquiry learning opportunities provided by the program increase student interest in science as a career path. Treatment students received instruction related to air pollution (airborne particulate matter), associated health concerns, and training on how to operate air quality testing equipment. They then participated in a yearlong scientific research project in which they developed and tested hypotheses through research of their own design regarding the sources and concentrations of air pollution in their homes and communities. Results from an external evaluation revealed that treatment students developed a deeper understanding of scientific research than did comparison students, as measured by their ability to generate good hypotheses and research designs, and equally expressed an increased interest in pursuing a career in science. These results emphasize the value of and need for authentic science learning opportunities in the modern science classroom. PMID:28286375

  12. Air Toxics Under the Big Sky: Examining the Effectiveness of Authentic Scientific Research on High School Students' Science Skills and Interest.

    Science.gov (United States)

    Ward, Tony J; Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij

    2016-01-01

Air Toxics Under the Big Sky is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. A quasi-experimental design was used in order to understand: 1) how the program affects student understanding of scientific inquiry and research and 2) how the open inquiry learning opportunities provided by the program increase student interest in science as a career path. Treatment students received instruction related to air pollution (airborne particulate matter), associated health concerns, and training on how to operate air quality testing equipment. They then participated in a yearlong scientific research project in which they developed and tested hypotheses through research of their own design regarding the sources and concentrations of air pollution in their homes and communities. Results from an external evaluation revealed that treatment students developed a deeper understanding of scientific research than did comparison students, as measured by their ability to generate good hypotheses and research designs, and equally expressed an increased interest in pursuing a career in science. These results emphasize the value of and need for authentic science learning opportunities in the modern science classroom.

  13. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  14. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  15. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

Full Text Available Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation before Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  16. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data…

  17. Virginia Tech researchers find tiny bubbles a storehouse of knowledge

    OpenAIRE

    Trulove, Susan

    2005-01-01

Fluid inclusions -- tiny bubbles of fluid or vapor trapped inside rock as it forms -- are clues to the location of ores and even petroleum, and they are time capsules that contain insights on the power of volcanoes and hints of life in the universe.

  18. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  19. Big Data, IPRs & Competition Law in the Pharma & Life Sciences- future issues in a rapidly evolving field

    DEFF Research Database (Denmark)

    Minssen, Timo

    the merger on the condition that the merged firm would make copies of its database available for purchase by existing and new potential competitors. The previous decision of the European Court of Justice in the IMS Health case has already set out that there are limitations to the extent IPRs can be used...... is an area that is very much in flux. There remains no consensus on the application of antitrust law to Big Data much less as to how it applies. Disagreement aside there is a growing number of decisions, which highlight the use of antitrust rules to Big Data cases. Historically the European and the U...... to pharmaceutical laboratories using Euris software while selling to laboratories using Cegedim’s own and other competing CRM management software. Following the decision from the French Authority finding that the refusal was unjustified it is clear that refusal to sell may in certain circumstances give rise...

  20. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Tiny molecule, big power: Multi-target approach for curcumin in diabetic cardiomyopathy.

    Science.gov (United States)

    Karuppagounder, Vengadeshprabhu; Arumugam, Somasundaram; Giridharan, Vijayasree V; Sreedhar, Remya; Bose, Rajendran J C; Vanama, Jyothi; Palaniyandi, Suresh S; Konishi, Tetsuya; Watanabe, Kenichi; Thandavarayan, Rajarajan A

    2017-02-01

    Diabetic cardiomyopathy (DCM) is characterized by impaired cardiac diastolic and systolic function. DCM, a cardiovascular complication of diabetes mellitus (DM), has become one of the major causes of death in DM patients. Mortality in these patients is 2 to 3 times higher than in non-DM patients with cardiovascular disease. The progression of DCM and the cellular and molecular perturbations associated with its pathogenesis are complex and multifactorial. Although considerable progress has been achieved, the molecular etiologies of DCM remain poorly understood. There is an expanding need for natural antidiabetic medicines that do not cause the side effects of modern drugs. Curcumin, a pleiotropic molecule from Curcuma longa, is known to possess numerous activities, including free radical scavenging, antioxidant, antitumor, and antiinflammatory effects. Reports from preclinical and clinical studies reveal that curcumin can reverse insulin resistance, hyperglycemia, obesity, and obesity-related metabolic diseases. The current review provides an updated overview of the possible molecular mechanisms of DCM and the multitarget approach of curcumin in alleviating DCM and diabetic complications. Additionally, we discuss the approaches currently being implemented to improve the bioavailability of this promising natural product in diabetes therapeutics. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Molecular entomology: analyzing tiny molecules to answer big questions about disease vectors and their biology

    Science.gov (United States)

    The entomologists at the Arthropod-Borne Animal Diseases Research Unit at USDA-Agricultural Research Service are tasked with protecting the nation’s livestock from domestic, foreign and emerging vector-borne diseases. To accomplish this task, a vast array of molecular techniques are being used in pr...

  3. Tiny cells meet big questions: a closer look at bacterial cell biology.

    Science.gov (United States)

    Goley, Erin D

    2013-04-01

    While studying actin assembly as a graduate student with Matt Welch at the University of California at Berkeley, my interest was piqued by reports of surprising observations in bacteria: the identification of numerous cytoskeletal proteins, actin homologues fulfilling spindle-like functions, and even the presence of membrane-bound organelles. Curiosity about these phenomena drew me to Lucy Shapiro's lab at Stanford University for my postdoctoral research. In the Shapiro lab, and now in my lab at Johns Hopkins, I have focused on investigating the mechanisms of bacterial cytokinesis. Spending time as both a eukaryotic cell biologist and a bacterial cell biologist has convinced me that bacterial cells present the same questions as eukaryotic cells: How are chromosomes organized and accurately segregated? How is force generated for cytokinesis? How is polarity established? How are signals transduced within and between cells? These problems are conceptually similar between eukaryotes and bacteria, although their solutions can differ significantly in specifics. In this Perspective, I provide a broad view of cell biological phenomena in bacteria, the technical challenges facing those of us who peer into bacterial cells, and areas of common ground as research in eukaryotic and bacterial cell biology moves forward.

  4. Tiny symbols tell big stories : Naming and concealing masturbation in diaries (1660-1940)

    NARCIS (Netherlands)

    Vermeer, Leonieke

    2017-01-01

    Symbols, encryptions and codes are a way to hide sensitive or highly personal content in diaries. This kind of private language is an important feature of diary practise, regardless of time and place, but it has barely been studied yet. This article highlights symbols that designate masturbation in

  5. Tiny cells meet big questions: a closer look at bacterial cell biology

    OpenAIRE

    Goley, Erin D.

    2013-01-01

    While studying actin assembly as a graduate student with Matt Welch at the University of California at Berkeley, my interest was piqued by reports of surprising observations in bacteria: the identification of numerous cytoskeletal proteins, actin homologues fulfilling spindle-like functions, and even the presence of membrane-bound organelles. Curiosity about these phenomena drew me to Lucy Shapiro's lab at Stanford University for my postdoctoral research. In the Shapiro lab, and now in my lab...

  6. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  7. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  8. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  9. Biomedical Big Data Training Collaborative (BBDTC): An effort to bridge the talent gap in biomedical science and research.

    Science.gov (United States)

    Purawat, Shweta; Cowart, Charles; Amaro, Rommie E; Altintas, Ilkay

    2017-05-01

    The BBDTC (https://biobigdata.ucsd.edu) is a community-oriented platform to encourage high-quality knowledge dissemination with the aim of growing a well-informed biomedical big data community through collaborative efforts on training and education. The BBDTC is an e-learning platform that empowers the biomedical community to develop, launch and share open training materials. It deploys hands-on software training toolboxes through virtualization technologies such as Amazon EC2 and Virtualbox. The BBDTC facilitates migration of courses across other course management platforms. The framework encourages knowledge sharing and content personalization through the playlist functionality that enables unique learning experiences and accelerates information dissemination to a wider community.

  10. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    The article analyzes 'Big Data', an issue that has been discussed over the last 10 years, and reveals the reasons and factors behind it. It shows that the factors creating the 'Big Data' issue have existed for quite a long time and have periodically caused informational barriers. Such barriers were successfully overcome through science and technology. The analysis presented classifies the 'Big Data' issue as a form of informational barrier, one that may be solved correctly and that encourages the development of scientific and computational methods.

  11. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  12. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  13. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  14. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  15. Laser welding of Ti-Ni type shape memory alloy

    International Nuclear Information System (INIS)

    Hirose, Akio; Araki, Takao; Uchihara, Masato; Honda, Keizoh; Kondoh, Mitsuaki.

    1990-01-01

    The present study was undertaken to apply laser welding to the joining of a shape memory alloy. Butt welding of a Ti-Ni type shape memory alloy was performed using a 10 kW CO2 laser. The laser welded specimens successfully showed the shape memory effect and super elasticity. These properties were approximately identical with those of the base metal. The change in super elasticity of the welded specimen during tension cycling was investigated. Significant changes in stress-strain curves and residual strain were not observed in the laser welded specimen after the 50-time cyclic test. The weld metal exhibited a cellular dendritic structure. It was revealed by electron diffraction analysis that the phase of the weld metal was the TiNi phase of B2 structure, which is the same as the parent phase of the base metal, with oxide inclusions crystallized at the dendrite boundary. However, oxygen contamination of the weld metal by laser welding did not occur, because there was almost no difference in oxygen content between the base metal and the weld metal. The transformation temperatures of the weld metal were almost the same as those of the base metal. From these results, laser welding is applicable to the joining of the Ti-Ni type shape memory alloy. As an application of laser welding to new shape memory devices, a multiplex shape memory device of welded Ti-50.5 at % Ni and Ti-51.0 at % Ni was produced. The device showed two-stage shape memory effects due to the difference in transformation temperature between the two shape memory alloys. (author)

  16. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  17. The Big Bang Theory--Coping with Multi-Religious Beliefs in the Super-Diverse Science Classroom

    Science.gov (United States)

    De Carvalho, Roussel

    2013-01-01

    Large urban schools have to cope with a "super-diverse" population with a multireligious background in their classrooms. The job of the science teacher within this environment requires an ultra-sensitive pedagogical approach, and a deeper understanding of students' backgrounds and of scientific epistemology. Teachers must create a safe…

  18. Big Data, Biostatistics and Complexity Reduction

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2018-01-01

    Roč. 14, č. 2 (2018), s. 24-32 ISSN 1801-5603 R&D Projects: GA MZd(CZ) NV15-29835A Institutional support: RVO:67985807 Keywords : Biostatistics * Big data * Multivariate statistics * Dimensionality * Variable selection Subject RIV: IN - Informatics, Computer Science OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) https://www.ejbi.org/scholarly-articles/big-data-biostatistics-and-complexity-reduction.pdf

  19. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of the bachelor thesis is to describe the Big Data issue and the OLAP aggregation operations for decision support that are applied to such data using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and with the issues involved in their implementation. An overall evaluation of the work and the possibilities for future use of the resulting system follow.
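    The thesis abstract above describes applying OLAP aggregation operations to Big Data via Apache Hadoop. As a purely illustrative sketch (not taken from the thesis), a roll-up aggregation can be expressed as the map and reduce phases that Hadoop parallelizes, shown here in plain Python with hypothetical sales records:

```python
from collections import defaultdict

# Hypothetical fact rows: (region, city, amount) -- illustrative data only.
records = [
    ("EU", "Prague", 100),
    ("EU", "Brno", 50),
    ("US", "Boston", 70),
    ("EU", "Prague", 30),
]

def map_phase(rows):
    # Roll the city dimension up to the region level by emitting
    # (region, amount) key-value pairs, as a Hadoop mapper would.
    for region, _city, amount in rows:
        yield region, amount

def reduce_phase(pairs):
    # Sum the values for each key, as a Hadoop reducer would.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

print(reduce_phase(map_phase(records)))  # e.g. {'EU': 180, 'US': 70}
```

    In a real Hadoop job the map and reduce functions run distributed across the cluster, with the framework handling the shuffle of keys between phases; the logic of the aggregation is the same.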

  20. Big Data & Datamining: Using APIs to computationally determine who follows space science, & what do they care about?

    Science.gov (United States)

    Gay, Pamela L.; Bakerman, Maya; Graziano, Nancy; Murph, Susan; Reiheld, Alison; CosmoQuest

    2017-10-01

    In today's connected world, scientists & space science projects are turning to social media outlets like Twitter to share our achievements, request aid, & discuss the issues of our profession. Maintaining these disparate feeds requires time & resources that are already in short supply. To justify these efforts, we must examine the data to determine: are we speaking to our intended audiences; are our varied efforts needed; & what types of messages achieve the greatest interactions. The software used to support this project is available on GitHub.Previously, it has been unclear if our day-to-day social media efforts have been merely preaching to one homogeneous choir from which we have all drawn our audiences, or if our individual efforts have been able to reach into different communities to multiply our impact. In this preliminary study, we examine the social media audiences of several space science Twitter feeds that relate to: podcasting; professional societies; individual programs; & individuals. This study directly measures the overlap in audiences & the diversity of interests held by these audiences. Through statistical analysis, we can discern if these audiences are all drawn from one single population, or if we are sampling different base populations with different feeds.The data generated in this project allow us to look beyond how our audiences interact with space science, with the added benefit of revealing their other interests. These interests are reflected by the non-space science accounts they follow on Twitter. This information will allow us to effectively recruit new people from space science adjacent interests.After applying large data analytics & statistics to social media interactions, we can model online communications, audience population types, & the causal relationships between how we tweet & how our audiences interact. With this knowledge, we are then able to institute reliable communications & effective interactions with our target audience.
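    The study above directly measures the overlap between the follower audiences of different Twitter feeds. As a hypothetical illustration of that core measurement (the study's actual software is on GitHub; the follower IDs below are invented), the overlap of two follower-ID sets can be quantified with a Jaccard index:

```python
# Hypothetical follower-ID sets for two Twitter feeds (illustrative only;
# real IDs would come from the Twitter API's follower listings).
feed_a = {101, 102, 103, 104}
feed_b = {103, 104, 105}

def jaccard(a, b):
    """Fraction of the combined audience that follows both feeds."""
    return len(a & b) / len(a | b)

print(f"audience overlap: {jaccard(feed_a, feed_b):.2f}")  # 2 shared of 5 total -> 0.40
```

    A value near 1 would suggest the feeds are preaching to one homogeneous choir, while a value near 0 would suggest they reach largely distinct communities.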

  1. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative methods and technologies to capture, store, distribute, manage and analyze petabyte- or larger-sized sets of data arriving at high speed and in diverse structures. Big data can be structured, semi-structured or unstructured, rendering conventional data management techniques inadequate. Data are generated from many different sources and can arrive in the system at various rates. In order to handle this...

  2. Ultrahigh Sensitivity Piezoresistive Pressure Sensors for Detection of Tiny Pressure.

    Science.gov (United States)

    Li, Hongwei; Wu, Kunjie; Xu, Zeyang; Wang, Zhongwu; Meng, Yancheng; Li, Liqiang

    2018-05-31

    High-sensitivity pressure sensors are crucial for ultra-sensitive touch technology and E-skin, especially in the tiny pressure range below 100 Pa. However, it is highly challenging to substantially promote sensitivity beyond the current level of several to two hundred kPa⁻¹, and to improve the detection limit below 0.1 Pa, which is significant for the development of pressure sensors toward ultrasensitive and highly precise detection. Here, we develop an efficient strategy to greatly improve the sensitivity to nearly 2000 kPa⁻¹ by using a short-channel coplanar device structure and a sharp microstructure, which is systematically proposed for the first time and rationalized by mathematical calculation and analysis. Significantly, benefiting from the ultrahigh sensitivity, the detection limit is improved to be as small as 0.075 Pa. The sensitivity and detection limit are both superior to the current levels, and far surpass the function of human skin. Furthermore, the sensor shows fast response time (50 μs), excellent reproducibility and stability, and low power consumption. Remarkably, the sensor shows excellent detection capacity in the tiny pressure range, including LED switching with a pressure of 7 Pa, ringtone (2-20 Pa) recognition, and an ultrasensitive (0.1 Pa) electronic glove. This work represents a performance and strategic progress in the field of pressure sensing.
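    The abstract above reports sensitivities in kPa⁻¹, the conventional unit for piezoresistive response: the relative change in output signal per unit of applied pressure. A minimal sketch of that calculation, with hypothetical current values chosen only to reproduce the reported order of magnitude:

```python
# Illustrative piezoresistive sensitivity estimate (values hypothetical).
# Sensitivity S = (delta_I / I0) / delta_P, conventionally reported in kPa^-1.
I0 = 1.0e-9        # baseline current (A) at zero load
I = 15.0e-9        # current (A) under load
delta_P = 7.0      # applied pressure (Pa)

# Convert the pressure to kPa so the sensitivity comes out in kPa^-1.
S = ((I - I0) / I0) / (delta_P / 1000.0)
print(f"S = {S:.0f} kPa^-1")
```

    A sensor with S on the order of 2000 kPa⁻¹ thus produces a large relative signal change even for pascal-scale loads, which is what enables the sub-0.1 Pa detection limit the abstract describes.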

  3. Swimming of a Tiny Subtropical Sea Butterfly with Coiled Shell

    Science.gov (United States)

    Murphy, David; Karakas, Ferhat; Maas, Amy

    2017-11-01

    Sea butterflies, also known as pteropods, include a variety of small, zooplanktonic marine snails. Thecosomatous pteropods possess a shell and swim at low Reynolds numbers by beating their wing-like parapodia in a manner reminiscent of insect flight. In fact, previous studies of the pteropod Limacina helicina have shown that pteropod swimming hydrodynamics and tiny insect flight aerodynamics are dynamically similar. Studies of L. helicina swimming have been performed in polar (0 degrees C) and temperate conditions (12 degrees C). Here we present measurements of the swimming of Heliconoides inflatus, a smaller yet morphologically similar pteropod that lives in warm Bermuda seawater (21 degrees C) with a viscosity almost half that of the polar seawater. The collected H. inflatus have shell sizes less than 1.5 mm in diameter, beat their wings at frequencies up to 11 Hz, and swim upwards in sawtooth trajectories at speeds up to approximately 25 mm/s. Using three-dimensional wing and body kinematics collected with two orthogonal high speed cameras and time-resolved, 2D flow measurements collected with a micro-PIV system, we compare the effects of smaller body size and lower water viscosity on the flow physics underlying flapping-based swimming by pteropods and flight by tiny insects.

  4. Bringing soil science to society after catastrophic events such as big forest fires. Some examples of field approaches in Spanish Mediterranean areas

    Science.gov (United States)

    Mataix-Solera, Jorge; Arcenegui, Vicky; Cerdà, Artemi; García-Orenes, Fuensanta; Moltó, Jorge; Chrenkovà, Katerina; Torres, Pilar; Lozano, Elena; Jimenez-Pinilla, Patricia; Jara-Navarro, Ana B.

    2015-04-01

    Forest fires must be considered a natural factor in Mediterranean ecosystems, but the changes in land use in the last six decades have altered their natural regime, making them an ongoing environmental problem. Some big forest fires (> 500 ha) also have a great socio-economic impact on the human population. Our research team has 20 years of experience studying the effects of forest fires on soil properties, their recovery after fire, and the impact of some post-fire management treatments. In this work we want to show our experience of how to transfer part of our knowledge to society after two catastrophic forest fire events in the Alicante Province (E Spain). Two big forest fires, one in "Sierra de Mariola (Alcoi)" and the other in "Montgó Natural Park (Javea-Denia)", occurred in July 2012 and September 2014 respectively, and as a consequence had a great impact on the populations of the nearby affected villages. Immediately, some groups were formed through social networks with the aim of trying to help recover the affected areas as soon as possible. Usually, society calls for early reforestation, and this pressure on forest managers and politicians can produce a response with a greater impact on the fire-affected area than the actual fire. The soil is a fragile ecosystem after a forest fire, and the situation after fire can vary greatly depending on many factors such as fire severity, previous fire history of the area, soil type, topography, etc. An evaluation of the site is necessary to make the best decision for recovery of the area, protecting the soil and avoiding degradation of the ecosystem. In these two cases we organized field activities and conferences to give society knowledge of how soil is affected by forest fires and what the best post-fire management would be, depending on how healthy the soil is, the resilience of the vegetation after fire, and our expectations for a natural recovery. The application of different types of mulch in vulnerable areas, the…

  5. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R&D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  6. A big bang in a little room the quest to create new universes

    CERN Document Server

    Merali, Zeeya

    2017-01-01

    What if you could become God, with the ability to build a whole new universe? As startling as it sounds, modern physics suggests that within the next two decades, scientists may be able to perform this seemingly divine feat-to concoct an entirely new baby universe, complete with its own physical laws, star systems, galaxies, and even intelligent life. A Big Bang in a Little Room takes the reader on a journey through the history of cosmology and unravels-particle by particle, theory by theory, and experiment by experiment-the ideas behind this provocative claim made by some of the most respected physicists alive today. Beyond simply explaining the science, A Big Bang in a Little Room also tells the story of the people who have been laboring for more than thirty years to make this seemingly impossible dream a reality. What has driven them to continue on what would seem, at first glance, to be a quixotic quest? This mind-boggling book reveals that we can nurse other worlds in the tiny confines of a lab, raising...

  7. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  8. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  9. Data-Intensive Science meets Inquiry-Driven Pedagogy: Interactive Big Data Exploration, Threshold Concepts, and Liminality

    Science.gov (United States)

    Ramachandran, Rahul; Word, Andrea; Nair, Udasysankar

    2014-01-01

    Threshold concepts in any discipline are the core concepts an individual must understand in order to master a discipline. By their very nature, these concepts are troublesome, irreversible, integrative, bounded, discursive, and reconstitutive. Although grasping threshold concepts can be extremely challenging for each learner as s/he moves through stages of cognitive development relative to a given discipline, the learner's grasp of these concepts determines the extent to which s/he is prepared to work competently and creatively within the field itself. The movement of individuals from a state of ignorance of these core concepts to one of mastery occurs not along a linear path but in iterative cycles of knowledge creation and adjustment in liminal spaces - conceptual spaces through which learners move from the vaguest awareness of concepts to mastery, accompanied by understanding of their relevance, connectivity, and usefulness relative to questions and constructs in a given discipline. For example, challenges in the teaching and learning of atmospheric science can be traced to threshold concepts in fluid dynamics. In particular, Dynamic Meteorology is one of the most challenging courses for graduate students and undergraduates majoring in Atmospheric Science. Dynamic Meteorology introduces threshold concepts - those that prove troublesome for the majority of students but that are essential, associated with fundamental relationships between forces and motion in the atmosphere and requiring the application of basic classical statics, dynamics, and thermodynamic principles to the three-dimensionally varying atmospheric structure. With the explosive growth of data available in atmospheric science, driven largely by satellite Earth observations and high-resolution numerical simulations, paradigms such as that of data-intensive science have emerged. These paradigm shifts are based on the growing realization that current infrastructure, tools and processes will not allow…

  10. Unsolved Mysteries of Science: A Mind-Expanding Journey through a Universe of Big Bangs, Particle Waves, and Other Perplexing Concepts

    Science.gov (United States)

    Malone, John

    2001-08-01

    A LIVELY EXPLORATION OF THE BIGGEST QUESTIONS IN SCIENCE How Did the Universe Begin? The Big Bang has been the accepted theory for decades, but does it explain everything? How Did Life on Earth Get Started? What triggered the cell division that started the evolutionary chain? Did life come from outer space, buried in a chunk of rock? What is Gravity? Newton's apple just got the arguments started, Einstein made things more complicated. Just how does gravity fit in with quantum theory? What Is the Inside of the Earth Like? What exactly is happening beneath our feet, and can we learn enough to help predict earthquakes and volcanic eruptions? How Do We Learn Language? Is language acquisition an inborn biological ability, or does every child have to start from scratch? Is There a Missing Link? The story of human evolution is not complete. In addition to hoaxes such as "Piltdown Man" and extraordinary finds such as "Lucy," many puzzles remain. What, in the end, do we mean by a "missing link"?

  11. Embodied Genetics in Science-Fiction, Big-Budget to Low-Budget: from Jeunet’s Alien: Resurrection (1997) to Piccinini’s Workshop (2011)

    Directory of Open Access Journals (Sweden)

    Virginás Andrea

    2014-09-01

    Full Text Available The article uses and revises to some extent Vivian Sobchack’s categorization of (basically American) science-fiction output as “optimistic big-budget,” “wondrous middle-ground” and “pessimistic low-budget,” seen as such in relation to what Sobchack calls the “double view” of alien beings in filmic diegesis (Screening Space, 2001). The argument is advanced that based on how diegetic encounters are constructed between “genetically classical” human agents and beings only partially “genetically classical” and/or human (due to genetic diseases, mutations, splicing, and cloning), we may differentiate between various methods of visualization (nicknamed “the museum,” “the lookalike,” and “incest”) that are correlated to Sobchack’s mentioned categories, while also displaying changes in tone. Possibilities of revision appear thanks to the later timeframe (the late 1990s/2000s) and the different national-canonical belongings (American, Icelandic-German-Danish, Hungarian-German, Canadian-French-American, and Australian) that characterize the filmic and artistic examples chosen for analysis as compared to Sobchack’s work in Screening Space.

  12. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  13. Stress transmission through Ti-Ni alloy, titanium and stainless steel in impact compression test.

    Science.gov (United States)

    Yoneyama, T; Doi, H; Kobayashi, E; Hamanaka, H; Tanabe, Y; Bonfield, W

    2000-06-01

    Impact stress transmission of Ti-Ni alloy was evaluated for biomedical stress shielding. Transformation temperatures of the alloy were investigated by means of DSC. An impact compression test was carried out with the use of the split-Hopkinson pressure-bar technique with cylindrical specimens of Ti-Ni alloy, titanium and stainless steel. As a result, the transmitted pulse through Ti-Ni alloy was considerably depressed as compared with those through titanium and stainless steel. The initial stress reduction was large through Ti-Ni alloy and titanium, but the stress reduction through Ti-Ni alloy was more continuous than through titanium. The maximum value in the stress difference between incident and transmitted pulses through Ti-Ni alloy or titanium was higher than that through stainless steel, while the stress reduction in the maximum stress through Ti-Ni alloy was statistically larger than that through titanium or stainless steel. Ti-Ni alloy transmitted less impact stress than titanium or stainless steel, which suggested that the loading stress to adjacent tissues could be decreased with the use of Ti-Ni alloy as a component material in an implant system. Copyright 2000 Kluwer Academic Publishers

  14. SCADA SYSTEM SIMULATION USING THE TINY TIGER 2 DEVELOPMENT BOARD

    Directory of Open Access Journals (Sweden)

    AGAPE C.P.

    2015-12-01

    Full Text Available This paper presents a new design for a surveillance and control system for a medium-voltage cell. The emphasis is on acquiring information about the consumer's state: the instantaneous current consumption and the apparent power and voltage at the consumer. The proposed design is based on a Wilke Technology development board built around a Tiny-Tiger 2 multitasking microcontroller. This computer has 2 MByte or 4 MByte of flash memory for program storage and 1 MByte of SRAM with a backup input for data. On the software side, we created a Delphi interface that communicates with the serial port of the development board. The interface retrieves information about the consumer and its voltage-loading capacity.

  15. What¿s the deal with the web/blogs/the next big technology: a key role for information science in e-social science research?

    NARCIS (Netherlands)

    Thelwall, M.; Wouters, P.

    2005-01-01

    Since many nations have provided substantial funding for new e-social science and humanities investigations, there is now an opportunity for information scientists to adopt an enabling role for this new kind of research. Logically, a more information-centred environment should be more conducive to

  16. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic pronouncements are many, research on Big Data in the public sector is so far limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large quantities of health data in the Danish General Practice Database (DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted balancing act, not only between economic rationales and quality considerations, but also involving control over sensitive personal data and the ethical implications for the citizen. In the DAMD case, data are used on the one hand "in the service of a good cause" to...

  17. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  18. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  19. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  20. Data-Intensive Science Meets Inquiry-Driven Pedagogy: Interactive Big Data Exploration, Threshold Concepts, and Liminality

    Science.gov (United States)

    Ramachandran, R.; Nair, U. S.; Word, A.

    2014-12-01

    Threshold concepts in any discipline are the core concepts an individual must understand in order to master a discipline. By their very nature, these concepts are troublesome, irreversible, integrative, bounded, discursive, and reconstitutive. Although grasping threshold concepts can be extremely challenging for each learner as s/he moves through stages of cognitive development relative to a given discipline, the learner's grasp of these concepts determines the extent to which s/he is prepared to work competently and creatively within the field itself. The movement of individuals from a state of ignorance of these core concepts to one of mastery occurs not along a linear path but in iterative cycles of knowledge creation and adjustment in liminal spaces - conceptual spaces through which learners move from the vaguest awareness of concepts to mastery, accompanied by understanding of their relevance, connectivity, and usefulness relative to questions and constructs in a given discipline. With the explosive growth of data available in atmospheric science, driven largely by satellite Earth observations and high-resolution numerical simulations, paradigms such as that of data-intensive science have emerged. These paradigm shifts are based on the growing realization that current infrastructure, tools and processes will not allow us to analyze and fully utilize the complex and voluminous data that is being gathered. In this emerging paradigm, the scientific discovery process is driven by knowledge extracted from large volumes of data. In this presentation, we contend that this paradigm naturally lends to inquiry-driven pedagogy where knowledge is discovered through inductive engagement with large volumes of data rather than reached through traditional, deductive, hypothesis-driven analyses. In particular, data-intensive techniques married with an inductive methodology allow for exploration on a scale that is not possible in the traditional classroom with its typical

  1. Addressing big data issues in Scientific Data Infrastructure

    NARCIS (Netherlands)

    Demchenko, Y.; Membrey, P.; Grosso, P.; de Laat, C.; Smari, W.W.; Fox, G.C.

    2013-01-01

    Big Data are becoming a new technology focus both in science and in industry. This paper discusses the challenges that are imposed by Big Data on the modern and future Scientific Data Infrastructure (SDI). The paper discusses a nature and definition of Big Data that include such features as Volume,

  2. Making big communities small: using network science to understand the ecological and behavioral requirements for community social capital.

    Science.gov (United States)

    Neal, Zachary

    2015-06-01

    The concept of social capital is becoming increasingly common in community psychology and elsewhere. However, the multiple conceptual and operational definitions of social capital challenge its utility as a theoretical tool. The goals of this paper are to clarify two forms of social capital (bridging and bonding), explicitly link them to the structural characteristics of small world networks, and explore the behavioral and ecological prerequisites of its formation. First, I use the tools of network science and specifically the concept of small-world networks to clarify what patterns of social relationships are likely to facilitate social capital formation. Second, I use an agent-based model to explore how different ecological characteristics (diversity and segregation) and behavioral tendencies (homophily and proximity) impact communities' potential for developing social capital. The results suggest diverse communities have the greatest potential to develop community social capital, and that segregation moderates the effects that the behavioral tendencies of homophily and proximity have on community social capital. The discussion highlights how these findings provide community-based researchers with both a deeper understanding of the contextual constraints with which they must contend, and a useful tool for targeting their efforts in communities with the greatest need or greatest potential.

  3. Advancing stroke genomic research in the age of Trans-Omics big data science: Emerging priorities and opportunities.

    Science.gov (United States)

    Owolabi, Mayowa; Peprah, Emmanuel; Xu, Huichun; Akinyemi, Rufus; Tiwari, Hemant K; Irvin, Marguerite R; Wahab, Kolawole Wasiu; Arnett, Donna K; Ovbiagele, Bruce

    2017-11-15

    We systematically reviewed the genetic variants associated with stroke in genome-wide association studies (GWAS) and examined the emerging priorities and opportunities for rapidly advancing stroke research in the era of Trans-Omics science. Using the PRISMA guideline, we searched PubMed and the NHGRI-EBI GWAS catalog for stroke studies from 2007 until May 2017. We included 31 studies. The major challenge is that the few validated variants could not account for the full genetic risk of stroke and have not been translated for clinical use. None of the studies included continental Africans. Genomic study of stroke among Africans presents a unique opportunity for the discovery, validation, functional annotation, Trans-Omics study and translation of genomic determinants of stroke with implications for global populations. This is because all humans originated from Africa, a continent with a unique genomic architecture and a distinctive epidemiology of stroke; as well as substantially higher heritability and resolution of fine mapping of stroke genes. Understanding the genomic determinants of stroke and the corresponding molecular mechanisms will revolutionize the development of a new set of precise biomarkers for stroke prediction, diagnosis and prognostic estimates as well as personalized interventions for reducing the global burden of stroke. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Prospect of Ti-Ni shape memory alloy applied in reactor structures

    International Nuclear Information System (INIS)

    Duan Yuangang

    1995-01-01

    The shape memory effect mechanism, physical properties, composition, manufacturing process and applications in mechanical structures of Ti-Ni shape memory alloy are introduced. Prospective applications of Ti-Ni shape memory alloy in reactor structures are considered, and some necessary technical conditions for applying shape memory alloy in reactor structures are put forward

  5. A Multidisciplinary Perspective of Big Data in Management Research

    OpenAIRE

    Sheng, Jie; Amankwah-Amoah, J.; Wang, X.

    2017-01-01

    In recent years, big data has emerged as one of the prominent buzzwords in business and management. In spite of the mounting body of research on big data across the social science disciplines, scholars have offered little synthesis on the current state of knowledge. To take stock of academic research that contributes to the big data revolution, this paper tracks scholarly work's perspectives on big data in the management domain over the past decade. We identify key themes emerging in manageme...

  6. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through an online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
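    The geometry behind the measurement reduces to a single proportion: the difference between the two noon shadow angles is the fraction of a full circle separating the two sites along the meridian. A minimal sketch of the calculation the students perform (the 7.2 degree angle and 800 km distance are the classical Syene-Alexandria figures, used here purely for illustration):

    ```python
    def circumference_from_shadows(angle_a_deg, angle_b_deg, distance_km):
        """Estimate Earth's circumference from two local-noon shadow angles
        (in degrees) and the north-south distance between the sites (in km).

        The angle difference is the slice of the 360-degree circle spanned
        by the two sites, so the full circumference scales accordingly.
        """
        angle_diff = abs(angle_a_deg - angle_b_deg)
        return 360.0 / angle_diff * distance_km

    # Eratosthenes' classical inputs: sun directly overhead at Syene (0 deg),
    # a 7.2 degree shadow at Alexandria, roughly 800 km to the north.
    print(circumference_from_shadows(0.0, 7.2, 800))  # -> 40000.0 km
    ```

    Students in the collaboration substitute their own paired angle measurements and site separation for the classical values.
    
    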

  7. SMS Security System on Mobile Devices Using Tiny Encryption Algorithm

    Science.gov (United States)

    Novelan, M. S.; Husein, A. M.; Harahap, M.; Aisyah, S.

    2018-04-01

    The rapid development of telecommunications technology has brought great benefits: distance and time are no longer significant obstacles. One well-known product of this technology is the Short Message Service (SMS). In this study, a mobile-phone application was developed that modifies an SMS message into ciphertext so that the information content of the SMS is not known by others. On the sending side, the application encrypts the message with a key entered by the sender and transmits the ciphertext to the destination number; on the receiving side, it decrypts the ciphertext back into plaintext using the key entered by the receiver and displays the original message to the recipient. The method used to encrypt and decrypt the messages is the Tiny Encryption Algorithm, implemented in the Java programming language with JDK 1.7 as the Java compiler, using Eclipse as the source-code editor together with the Java SDK and the Android SDK. This application can be used by anyone who wants to send confidential information via SMS without fear that the information in the message will be known by others.
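    The Tiny Encryption Algorithm referenced above is compact enough to sketch in full: 32 rounds of mixing over a 64-bit block (two 32-bit words) with a 128-bit key (four 32-bit words) and the constant delta = 0x9E3779B9. The study implemented it in Java; the sketch below uses Python for illustration, and the function names are mine:

    ```python
    DELTA = 0x9E3779B9  # TEA's key-schedule constant
    MASK = 0xFFFFFFFF   # keep arithmetic in 32 bits

    def tea_encrypt(block, key):
        """Encrypt a 64-bit block (v0, v1) with a 128-bit key (k0..k3)."""
        v0, v1 = block
        k0, k1, k2, k3 = key
        s = 0
        for _ in range(32):
            s = (s + DELTA) & MASK
            v0 = (v0 + (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & MASK
            v1 = (v1 + (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & MASK
        return v0, v1

    def tea_decrypt(block, key):
        """Invert tea_encrypt by running the rounds in reverse."""
        v0, v1 = block
        k0, k1, k2, k3 = key
        s = (DELTA * 32) & MASK
        for _ in range(32):
            v1 = (v1 - (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & MASK
            v0 = (v0 - (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & MASK
            s = (s - DELTA) & MASK
        return v0, v1

    # Round-trip check with an arbitrary key and block.
    key = (0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210)
    ct = tea_encrypt((0xDEADBEEF, 0xCAFEBABE), key)
    assert tea_decrypt(ct, key) == (0xDEADBEEF, 0xCAFEBABE)
    ```

    An SMS application would additionally need to pack the message text into 64-bit blocks and encode the ciphertext for transport, details the abstract does not specify.
    
    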

  8. Tiny timekeepers witnessing high-rate exhumation processes.

    Science.gov (United States)

    Zhong, Xin; Moulas, Evangelos; Tajčmanová, Lucie

    2018-02-02

    Tectonic forces and surface erosion lead to the exhumation of rocks from the Earth's interior. Those rocks can be characterized by many variables including peak pressure and temperature, composition and exhumation duration. Among them, the duration of exhumation in different geological settings can vary by more than ten orders of magnitude (from hours to billion years). Constraining the duration is critical and often challenging in geological studies particularly for rapid magma ascent. Here, we show that the time information can be reconstructed using a simple combination of laser Raman spectroscopic data from mineral inclusions with mechanical solutions for viscous relaxation of the host. The application of our model to several representative geological settings yields best results for short events such as kimberlite magma ascent (less than ~4,500 hours) and a decompression lasting up to ~17 million years for high-pressure metamorphic rocks. This is the first precise time information obtained from direct microstructural observations applying a purely mechanical perspective. We show an unprecedented geological value of tiny mineral inclusions as timekeepers that contributes to a better understanding on the large-scale tectonic history and thus has significant implications for a new generation of geodynamic models.

  9. Purple Salt and Tiny Drops of Water in Meteorites

    Science.gov (United States)

    Taylor, G. J.

    1999-12-01

    Some meteorites, especially those called carbonaceous chondrites, have been greatly affected by reaction with water on the asteroids in which they formed. These reactions, which took place during the first 10 million years of the Solar System's history, formed assorted water-bearing minerals, but nobody has found any of the water that caused the alteration. Nobody, that is, until now. Michael Zolensky and a team of scientists from the Johnson Space Center in Houston and Virginia Tech (Blacksburg, Virginia) discovered strikingly purple sodium chloride (table salt) crystals in two meteorites. The salt contains tiny droplets of salt water (with some other elements dissolved in it). The salt is as old as the Solar System, so the water trapped inside the salt is also ancient. It might give us clues to the nature of the water that so pervasively altered carbonaceous chondrites and formed oceans on Europa and perhaps other icy satellites. However, how the salt got into the two meteorites and how it trapped the water remains a mystery - at least for now.

  10. 'Big science' and public trust

    CERN Multimedia

    2001-01-01

    The LHC project now faces cost overruns of several hundred million dollars. It has been suggested that poor financial controls and inadequate accounting procedures are at least partly to blame (1/2 page).

  11. Spark plasma sintering of TiNi nano-powders for biological application

    International Nuclear Information System (INIS)

    Fu, Y Q; Gu, Y W; Shearwood, C; Luo, J K; Flewitt, A J; Milne, W I

    2006-01-01

    Nano-sized TiNi powder with an average size of 50 nm was consolidated using spark plasma sintering (SPS) at 800 deg. C for 5 min. A layer of anatase TiO2 coating was formed on the sintered TiNi by chemical reaction with a hydrogen peroxide (H2O2) solution at 60 deg. C followed by heat treatment at 400 deg. C to enhance the bioactivity of the metal surface. Cell culture using osteoblast cells and a biomimetic test in simulated body fluid proved the biocompatibility of the chemically treated SPS TiNi

  12. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  13. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  14. XPS characterization of surface and interfacial structure of sputtered TiNi films on Si substrate

    International Nuclear Information System (INIS)

    Fu Yongqing; Du Hejun; Zhang, Sam; Huang Weimin

    2005-01-01

    TiNi films were prepared by co-sputtering TiNi and Ti targets. X-ray photoelectron spectroscopy (XPS) was employed to study the surface chemistry of the films and the interfacial structure of the Si/TiNi system. Exposure of the TiNi film to the ambient atmosphere (23 deg. C and 80% relative humidity) facilitated quick adsorption of oxygen and carbon on the surface. With time, carbon and oxygen content increased drastically at the surface, while oxygen diffused further into the layer. After a year, the carbon content at the surface became as high as 65.57% and Ni dropped below the detection limit of XPS. Depth profiling revealed that significant inter-diffusion occurred between the TiNi film and the Si substrate over a layer of 90-100 nm. The detailed bond changes of different elements with depth were obtained using XPS and the formation of titanium silicides at the interface was identified

  15. Crystal structure of TiNi nanoparticles obtained by Ar ion beam deposition

    International Nuclear Information System (INIS)

    Castro, A. Torres; Cuellar, E. Lopez; Mendez, U. Ortiz; Yacaman, M. Jose

    2008-01-01

    Nanoparticles are a state of matter that have properties different from either molecules or bulk solids, turning them into a very interesting class of materials to study. In the present work, the crystal structure of TiNi nanoparticles obtained by ion beam deposition is characterized. TiNi nanoparticles were obtained from TiNi wire samples by sputtering with Ar ions using a Gatan precision ion polishing system. The TiNi nanoparticles were deposited on a Lacey carbon film that was used for characterization by transmission electron microscopy. The nanoparticles were characterized by high-resolution transmission electron microscopy, high-angle annular dark-field imaging, electron diffraction, scanning transmission electron microscopy and energy-dispersive X-ray spectroscopy. Results of nanodiffraction seem to indicate that the nanoparticles keep the same B2 crystal structure as the bulk material but with a decreased lattice parameter

  16. Galled by the Gallbladder?: Your Tiny, Hard-Working Digestive Organ

    Science.gov (United States)

    ... among the most common and costly of all digestive system diseases. By some estimates, up to 20 ...

  17. The kinetics of Cr layer coated on TiNi films for hydrogen absorption

    Indian Academy of Sciences (India)

    The effect of hydrogen absorption on electrical resistance with temperature ... pressure by thermal evaporation on the glass substrate at room temperature. ... and charging rate becomes faster in comparison to FeTi and TiNi thin films.

  18. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, has opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  19. Big physics quartet win government backing

    Science.gov (United States)

    Banks, Michael

    2014-09-01

    Four major physics-based projects are among 10 to have been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology for funding in the coming decade as part of its “roadmap” of big-science projects.

  20. The kinetics of Cr layer coated on TiNi films for hydrogen absorption

    Indian Academy of Sciences (India)

    The effect of hydrogen absorption on electrical resistance with temperature for TiNi and TiNi–Cr thin films was investigated. The TiNi thin films of thickness 800 Å were deposited at different angles (0°, 30°, 45°, 60° and 75°) under 10^-5 Torr pressure by thermal evaporation on the glass substrate at room temperature.

  1. TinyOS-based quality of service management in wireless sensor networks

    Science.gov (United States)

    Peterson, N.; Anusuya-Rangappa, L.; Shirazi, B.A.; Huang, R.; Song, W.-Z.; Miceli, M.; McBride, D.; Hurson, A.; LaHusen, R.

    2009-01-01

    Previously the cost and extremely limited capabilities of sensors prohibited Quality of Service (QoS) implementations in wireless sensor networks. With advances in technology, sensors are becoming significantly less expensive and the increases in computational and storage capabilities are opening the door for new, sophisticated algorithms to be implemented. Newer sensor network applications require higher data rates with more stringent priority requirements. We introduce a dynamic scheduling algorithm to improve bandwidth for high priority data in sensor networks, called Tiny-DWFQ. Our Tiny-Dynamic Weighted Fair Queuing scheduling algorithm allows for dynamic QoS for prioritized communications by continually adjusting the treatment of communication packages according to their priorities and the current level of network congestion. For performance evaluation, we tested Tiny-DWFQ, Tiny-WFQ (traditional WFQ algorithm implemented in TinyOS), and FIFO queues on an Imote2-based wireless sensor network and report their throughput and packet loss. Our results show that Tiny-DWFQ performs better in all test cases. © 2009 IEEE.
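    The abstract does not spell out Tiny-DWFQ's internals, but the weighted fair queuing it builds on can be sketched with virtual finish times: each packet of size L arriving on flow f with weight w_f is stamped F = max(V, F_f) + L / w_f and transmitted in increasing order of F. The minimal, illustrative Python version below uses names of my own choosing; the "dynamic" part of Tiny-DWFQ would additionally adjust the per-flow weights with priority and congestion level:

    ```python
    import heapq

    class WeightedFairQueue:
        """Minimal WFQ sketch: packets are dequeued in order of virtual
        finish time F = max(virtual_time, flow's last finish) + size/weight,
        so a flow's share of bandwidth is proportional to its weight."""

        def __init__(self):
            self.heap = []           # entries: (finish_time, seq, flow, packet)
            self.last_finish = {}    # per-flow last virtual finish time
            self.vtime = 0.0         # system virtual time
            self.seq = 0             # tie-breaker for equal finish times

        def enqueue(self, flow, weight, size, packet):
            start = max(self.vtime, self.last_finish.get(flow, 0.0))
            finish = start + size / weight
            self.last_finish[flow] = finish
            heapq.heappush(self.heap, (finish, self.seq, flow, packet))
            self.seq += 1

        def dequeue(self):
            finish, _, flow, packet = heapq.heappop(self.heap)
            self.vtime = finish      # advance virtual time to the served packet
            return flow, packet
    ```

    With equal packet sizes and weights 2:1, the higher-weight flow is served twice per lower-weight service early in the schedule, which is the bandwidth-for-priority behavior the abstract describes.
    
    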

  2. The martensitic transformation in Ti-rich TiNi shape memory alloys

    International Nuclear Information System (INIS)

    Lin, H.C.; Wu, S.K.; Lin, J.C.

    1994-01-01

    The martensitic transformation start (Ms) temperatures and the associated ΔH values of Ti51Ni49 and Ti50.5Ni49.5 alloys are higher than those of equiatomic or Ni-rich TiNi alloys. The Ti-rich TiNi alloys exhibit good shape recovery despite a great deal of second-phase Ti2Ni or Ti4Ni2O existing around B2 grain boundaries. The nearly identical transformation temperatures indicate that the absorbed oxygen in Ti-rich TiNi alloys may react with Ti2Ni particles, instead of the TiNi matrix, to form Ti4Ni2O. Martensite stabilization can be induced by cold rolling at room temperature. Thermal cycling can depress the transformation temperatures significantly, especially in the initial 20 cycles. The R-phase transformation can be promoted by both cold rolling and thermal cycling in Ti-rich TiNi alloys because the introduced dislocations depress the Ms temperature. The strengthening effects of cold rolling and thermal cycling on the Ms temperature of Ti-rich TiNi alloys are found to follow the expression Ms = T0 − K·Δσ_y. The K values are affected by the different strengthening processes and are related to the as-annealed transformation temperatures: the higher the as-annealed Ms (or As), the larger the K value. (orig.)
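    The strengthening relation quoted in the abstract is a one-line computation. The sketch below uses invented placeholder numbers (T0, K, and Δσ_y are not values from the paper) purely to show how a strengthening-induced rise in yield stress depresses Ms.

```python
def martensite_start(T0, K, delta_sigma_y):
    """Ms = T0 - K * Δσ_y: a larger strengthening-induced increase in
    yield stress (Δσ_y) depresses the martensitic transformation start
    temperature Ms; K depends on the strengthening process."""
    return T0 - K * delta_sigma_y

# Illustrative numbers only: T0 = 340 K, K = 0.2 K/MPa, Δσ_y = 150 MPa
Ms = martensite_start(340.0, 0.2, 150.0)   # ≈ 310 K
```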

  3. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  4. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  5. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  6. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  7. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  8. The role of atmospheric diagnosis and Big Data science in improving hydroclimatic extreme prediction and the merits of climate informed prediction for future water resources management

    Science.gov (United States)

    Lu, Mengqian; Lall, Upmanu

    2017-04-01

    The threats that hydroclimatic extremes pose to sustainable development, safety and operation of infrastructure are both severe and growing. Recent heavy precipitation triggered flood events in many regions and increasing frequency and intensity of extreme precipitation suggested by various climate projections highlight the importance of understanding the associated hydrometeorological patterns and space-time variability of such extreme events, and developing a new approach to improve predictability with a better estimation of uncertainty. This clear objective requires the optimal utility of Big Data analytics on multi-source datasets to extract informative predictors from the complex ocean-atmosphere coupled system and develop a statistical and physical based framework. The proposed presentation includes the essence of our selected works in the past two years, as part of our Global Floods Initiatives. Our approach for an improved extreme prediction begins with a better understanding of the associated atmospheric circulation patterns, under the influence and regulation of slowly changing oceanic boundary conditions [Lu et al., 2013, 2016a; Lu and Lall, 2016]. The study of the associated atmospheric circulation pattern and the regulation of teleconnected climate signals adopted data science techniques and statistical modeling recognizing the nonstationarity and nonlinearity of the system, as the underlying statistical assumptions of the classical extreme value frequency analysis are challenged in hydroclimatic studies. There are two main factors that are considered important for understanding how future flood risk will change. One is the consideration of moisture holding capacity as a function of temperature, as suggested by Clausius-Clapeyron equation. The other is the strength of the convergence or convection associated with extreme precipitation. As convergence or convection gets stronger, rain rates can be expected to increase if the moisture is available. 
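    The Clausius-Clapeyron scaling of moisture-holding capacity mentioned above can be made concrete with a standard constant-latent-heat integration. This is textbook physics, not code from the study; the constants are the usual approximate values for water vapor.

```python
import math

def saturation_vapor_pressure(T_kelvin):
    """Clausius-Clapeyron estimate of saturation vapor pressure (Pa),
    integrated assuming constant latent heat from the reference point
    e_s(273.15 K) = 611 Pa. Shown only to illustrate the roughly 7 %/K
    growth of the atmosphere's moisture-holding capacity."""
    L = 2.5e6      # latent heat of vaporization, J/kg
    Rv = 461.5     # specific gas constant for water vapor, J/(kg K)
    T0, e0 = 273.15, 611.0
    return e0 * math.exp((L / Rv) * (1.0 / T0 - 1.0 / T_kelvin))

# One kelvin of warming near 15 °C raises e_s by roughly 7 %:
ratio = saturation_vapor_pressure(288.15) / saturation_vapor_pressure(287.15)
```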

  9. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  10. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  11. Is Pluto a planet? Student powered video rap 'battle' over tiny Pluto's embattled planetary standing

    Science.gov (United States)

    Beisser, K.; Cruikshank, D. P.; McFadden, T.

    2013-12-01

    Is Pluto a planet? Some creative low-income Bay Area middle-schoolers put a musical spin on this hot science debate with a video rap 'battle' over tiny Pluto's embattled planetary standing. The students' timing was perfect, with NASA's New Horizons mission set to conduct the first reconnaissance of Pluto and its moons in July 2015. Pluto - the last of the nine original planets to be explored by spacecraft - has been the subject of scientific study and speculation since Clyde Tombaugh discovered it in 1930, orbiting the Sun far beyond Neptune. Produced by the students and a very creative educator, the video features students 'battling' back and forth over the idea of Pluto being a planet. The group collaborated with actual space scientists to gather information and shot their video before a 'green screen' that was eventually filled with animations and visuals supplied by the New Horizons mission team. The video debuted at the Pluto Science Conference in Maryland in July 2013 - to a rousing response from researchers in attendance. The video marks a nontraditional approach to the ongoing 'great planet debate' while educating viewers on a recently discovered region of the solar system. By the 1990s, researchers had learned that Pluto possessed multiple exotic ices on its surface, a complex atmosphere and seasonal cycles, and a large moon (Charon) that likely resulted from a giant impact on Pluto itself. It also became clear that Pluto was no misfit among the planets - as had long been thought - but the largest and brightest body in a newly discovered 'third zone' of our planetary system called the Kuiper Belt. More recent observations have revealed that Pluto has a rich system of satellites - five known moons - and a surface that changes over time. Scientists even speculate that Pluto may possess an internal ocean.
For these and other reasons, the 2003 Planetary Decadal Survey ranked a Pluto/Kuiper Belt mission as the highest priority mission for NASA's newly created

  12. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, related to storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which approaches big data management from both a technological and a business perspective.

  13. Small Artifacts - Big Technologies

    DEFF Research Database (Denmark)

    Kreiner, Kristian

    2005-01-01

    The computer IC is the heart of the information and telecommunication technology. It is a tiny artifact, but with incredible organizing powers. We use this physical artifact as the location for studying central problems of the knowledge economy. First, the paper describes the history of chip design...

  14. Tiny Microbes with a Big Impact: The Role of Cyanobacteria and Their Metabolites in Shaping Our Future

    Directory of Open Access Journals (Sweden)

    Sophie Mazard

    2016-05-01

    Full Text Available Cyanobacteria are among the first microorganisms to have inhabited the Earth. Throughout the last few billion years, they have played a major role in shaping the Earth as the planet we live in, and they continue to play a significant role in our everyday lives. Besides being an essential source of atmospheric oxygen, marine cyanobacteria are prolific secondary metabolite producers, often despite their exceptionally small genomes. Secondary metabolites produced by these organisms are diverse and complex; these include compounds such as pigments and fluorescent dyes, as well as biologically active compounds of particular interest to the pharmaceutical industry. Cyanobacteria are currently regarded as an important source of nutrients and biofuels and form an integral part of novel, innovative, energy-efficient designs. Being autotrophic organisms, cyanobacteria are well suited for large-scale biotechnological applications due to their low requirements for organic nutrients. Recent advances in molecular biology techniques have considerably enhanced the potential for industries to optimize the production of cyanobacteria secondary metabolites with desired functions. This manuscript reviews the environmental role of marine cyanobacteria with a particular focus on their secondary metabolites and discusses current and future developments in both the production of desired cyanobacterial metabolites and their potential uses in future innovative projects.

  15. Tiny Microbes with a Big Impact: The Role of Cyanobacteria and Their Metabolites in Shaping Our Future.

    Science.gov (United States)

    Mazard, Sophie; Penesyan, Anahit; Ostrowski, Martin; Paulsen, Ian T; Egan, Suhelen

    2016-05-17

    Cyanobacteria are among the first microorganisms to have inhabited the Earth. Throughout the last few billion years, they have played a major role in shaping the Earth as the planet we live in, and they continue to play a significant role in our everyday lives. Besides being an essential source of atmospheric oxygen, marine cyanobacteria are prolific secondary metabolite producers, often despite their exceptionally small genomes. Secondary metabolites produced by these organisms are diverse and complex; these include compounds such as pigments and fluorescent dyes, as well as biologically active compounds of particular interest to the pharmaceutical industry. Cyanobacteria are currently regarded as an important source of nutrients and biofuels and form an integral part of novel, innovative, energy-efficient designs. Being autotrophic organisms, cyanobacteria are well suited for large-scale biotechnological applications due to their low requirements for organic nutrients. Recent advances in molecular biology techniques have considerably enhanced the potential for industries to optimize the production of cyanobacteria secondary metabolites with desired functions. This manuscript reviews the environmental role of marine cyanobacteria with a particular focus on their secondary metabolites and discusses current and future developments in both the production of desired cyanobacterial metabolites and their potential uses in future innovative projects.

  16. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-α forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-α forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
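    The quoted resolving power translates directly into the size of a resolution element, Δλ = λ/R. The arithmetic below uses only the numbers from the abstract (wavelength range 340-1060 nm, R = 3000-4800).

```python
def delta_lambda(lam_nm, R):
    """Spectral resolution element Δλ = λ / R (in nm) for a spectrograph
    with resolving power R."""
    return lam_nm / R

# At the blue end: 340 nm with R = 3000 gives Δλ ≈ 0.11 nm
# At the red end: 1060 nm with R = 4800 gives Δλ ≈ 0.22 nm
blue = delta_lambda(340.0, 3000)
red = delta_lambda(1060.0, 4800)
```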

  18. Aplicación de técnicas de Big Data Science para la gestión de crisis

    OpenAIRE

    Fernandes Moreno, Caio

    2016-01-01

    Despite the existence of a multitude of studies on sentiment analysis, few works address its practical, real-world deployment and its integration with business intelligence and big data, such that the sentiment analyses are embedded in an architecture (one supporting the whole process, from data acquisition to exploitation with BI tools) applied to crisis management. This work seeks, by these means, to ...

  19. Victoria Stodden: Scholarly Communication in the Era of Big Data and Big Computation

    OpenAIRE

    Stodden, Victoria

    2015-01-01

    Victoria Stodden gave the keynote address for Open Access Week 2015. "Scholarly communication in the era of big data and big computation" was sponsored by the University Libraries, Computational Modeling and Data Analytics, the Department of Computer Science, the Department of Statistics, the Laboratory for Interdisciplinary Statistical Analysis (LISA), and the Virginia Bioinformatics Institute. Victoria Stodden is an associate professor in the Graduate School of Library and Information Scien...

  20. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply nonexistent. Large volumes of data allow organizations to tap, in real time, the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  1. Stable isotope deltas: Tiny, yet robust signatures in nature

    Science.gov (United States)

    Brand, Willi A.; Coplen, Tyler B.

    2012-01-01

    Although most of them are relatively small, stable isotope deltas of naturally occurring substances are robust and enable workers in anthropology, atmospheric sciences, biology, chemistry, environmental sciences, food and drug authentication, forensic science, geochemistry, geology, oceanography, and paleoclimatology to study a variety of topics. Two fundamental processes explain the stable isotope deltas measured in most terrestrial systems: isotopic fractionation and isotope mixing. Isotopic fractionation is the result of equilibrium or kinetic physicochemical processes that fractionate isotopes because of small differences in the physical or chemical properties of molecular species having different isotopes. It is shown that the mixing of radioactive and stable isotope end members can be modelled to provide information on many natural processes, including ¹⁴C abundances in the modern atmosphere and the stable hydrogen and oxygen isotopic compositions of the oceans during glacial and interglacial times. The calculation of mixing fractions using isotope-balance equations with isotope deltas can be substantially in error when substances with high concentrations of heavy isotopes (e.g. ¹³C, ²H, and ¹⁸O) are mixed. In such cases, calculations using mole fractions are preferred as they produce accurate mixing fractions. Isotope deltas are dimensionless quantities. In the International System of Units (SI), these quantities have the unit 1 and the usual list of prefixes is not applicable. To overcome traditional limitations in expressing orders-of-magnitude differences in isotope deltas, we propose the term urey (symbol Ur), after Harold C. Urey, for the unit 1. In such a manner, an isotope delta value expressed traditionally as −25 per mil can be written as −25 mUr (or −2.5 cUr or −0.25 dUr; the use of any SI prefix is possible). Likewise, very small isotopic differences often expressed in per meg ‘units’ are easily included (e.g. either +0.015 ‰ or +15 per meg
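    The delta notation and the proposed urey unit reduce to simple arithmetic. A minimal sketch (the isotope-ratio numbers below are invented for illustration; the unit relationships follow from the abstract):

```python
def delta_value(R_sample, R_standard):
    """Isotope delta: δ = R_sample / R_standard - 1, a dimensionless
    quantity (SI unit 1), conventionally reported in per mil (‰)."""
    return R_sample / R_standard - 1.0

# An illustrative isotope ratio 2.5 % below the standard:
d = delta_value(0.975, 1.0)        # δ ≈ -0.025
as_per_mil = d * 1e3               # ≈ -25 ‰
as_milli_urey = d * 1e3            # ≈ -25 mUr (1 mUr = 1 ‰)
as_per_meg = d * 1e6               # ≈ -25000 per meg (1 per meg = 1 μUr)
```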

  2. The SIKS/BiGGrid Big Data Tutorial

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Lammerts, Evert; de Vries, A.P.

    2011-01-01

    The School for Information and Knowledge Systems SIKS and the Dutch e-science grid BiG Grid organized a new two-day tutorial on Big Data at the University of Twente on 30 November and 1 December 2011, just preceding the Dutch-Belgian Database Day. The tutorial is on top of some exciting new

  3. Toward a manifesto for the 'public understanding of big data'.

    Science.gov (United States)

    Michael, Mike; Lupton, Deborah

    2016-01-01

    In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article. © The Author(s) 2015.

  4. Big Data: You Are Adding to . . . and Using It

    Science.gov (United States)

    Makela, Carole J.

    2016-01-01

    "Big data" prompts a whole lexicon of terms--data flow; analytics; data mining; data science; smart you name it (cars, houses, cities, wearables, etc.); algorithms; learning analytics; predictive analytics; data aggregation; data dashboards; digital tracks; and big data brokers. New terms are being coined frequently. Are we paying…

  5. The Role of Social Responsibility in Big Business Practices

    OpenAIRE

    V A Gurinov

    2010-01-01

    The study of corporate social responsibility has become especially relevant in Russian scholarship in the context of the development of big business able to assume significant social responsibilities. The article focuses on the nature and specificity of the social responsibility of big business in Russia. The levels of social responsibility and the arrangements for implementing social programmes are also highlighted.

  6. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence for an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second later), and the fate of the Universe are all discussed. (U.K.)

  7. Fast Response Shape Memory Effect Titanium Nickel (TiNi) Foam Torque Tubes

    Science.gov (United States)

    Jardine, Peter

    2014-01-01

    Shape Change Technologies has developed a process to manufacture net-shaped TiNi foam torque tubes that demonstrate the shape memory effect. The torque tubes dramatically reduce response time by a factor of 10. This Phase II project matured the actuator technology by rigorously characterizing the process to optimize the quality of the TiNi and developing a set of metrics to provide ISO 9002 quality assurance. A laboratory virtual instrument engineering workbench (LabVIEW™)-based, real-time control of the torsional actuators was developed. These actuators were developed with The Boeing Company for aerospace applications.

  8. Surface characterization of TiNi deformed by high-pressure torsion

    Energy Technology Data Exchange (ETDEWEB)

    Awang Shri, Dayangku Noorfazidah [Graduate School of Pure and Applied Sciences, University of Tsukuba, Tsukuba, Ibaraki 305-8577 (Japan); Structural Materials Unit, National Institute for Materials Science, Tsukuba, Ibaraki 305-0047 (Japan); Tsuchiya, Koichi, E-mail: tsuchiya.koichi@nims.go.jp [Graduate School of Pure and Applied Sciences, University of Tsukuba, Tsukuba, Ibaraki 305-8577 (Japan); Structural Materials Unit, National Institute for Materials Science, Tsukuba, Ibaraki 305-0047 (Japan); Yamamoto, Akiko [Biomaterials Unit, International Center for Material Nanoarchitectonics (WPI-MANA), National Institute for Materials Science, Namiki 1-1, Tsukuba, Ibaraki 305-0044 (Japan)

    2014-01-15

    The effect of grain refinement and amorphization by high-pressure torsion (HPT) on surface chemistry was investigated for TiNi. X-ray diffraction and micro-Vickers tests were used to check the phase changes and hardness before and after HPT. X-ray photoelectron spectroscopy was used to observe changes in the natural passive film formed on the surface. Phase analysis reveals that crystalline TiNi becomes nanostructured, with hardness increasing with the strain imposed by HPT. Grain refinement and amorphization caused by HPT reduce the amount of metallic Ni in the passive films and also increase the thickness of the film.

  9. Fabrication, microstructure and stress effects in sputtered TiNi thin films

    International Nuclear Information System (INIS)

    Grummon, D.S.

    2000-01-01

    Sputtered thin films of equiatomic TiNi and TiNiX ternary alloys have excellent mechanical properties and exhibit robust shape-memory and transformational superelasticity. Furthermore, the energetic nature of the sputter deposition process allows the creation of highly refined microstructures that are difficult to achieve by melt-solidification. The present paper will present recent work on the relationship between processing, microstructure and properties of binary TiNi thin films, focusing primarily on residual stresses, kinetics of stress-relaxation and crystallization, and fine grain sizes achievable using hot-substrate direct crystallization. (orig.)

  10. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data means using huge quantities of data to make better predictions by identifying patterns in the data, rather than trying to understand the underlying causes in more detail. It highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  11. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  12. Nanomedicine: tiny particles and machines give huge gains.

    Science.gov (United States)

    Tong, Sheng; Fine, Eli J; Lin, Yanni; Cradick, Thomas J; Bao, Gang

    2014-02-01

    Nanomedicine is an emerging field that integrates nanotechnology, biomolecular engineering, life sciences and medicine; it is expected to produce major breakthroughs in medical diagnostics and therapeutics. Nano-scale structures and devices are compatible in size with proteins and nucleic acids in living cells. Therefore, the design, characterization and application of nano-scale probes, carriers and machines may provide unprecedented opportunities for achieving a better control of biological processes, and drastic improvements in disease detection, therapy, and prevention. Recent advances in nanomedicine include the development of nanoparticle (NP)-based probes for molecular imaging, nano-carriers for drug/gene delivery, multifunctional NPs for theranostics, and molecular machines for biological and medical studies. This article provides an overview of the nanomedicine field, with an emphasis on NPs for imaging and therapy, as well as engineered nucleases for genome editing. The challenges in translating nanomedicine approaches to clinical applications are discussed.

  13. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  14. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  15. A Brief Review on Leading Big Data Models

    Directory of Open Access Journals (Sweden)

    Sugam Sharma

    2014-11-01

    Full Text Available Today, science is passing through an era of transformation, where the inundation of data, dubbed data deluge is influencing the decision making process. The science is driven by the data and is being termed as data science. In this internet age, the volume of the data has grown up to petabytes, and this large, complex, structured or unstructured, and heterogeneous data in the form of “Big Data” has gained significant attention. The rapid pace of data growth through various disparate sources, especially social media such as Facebook, has seriously challenged the data analytic capabilities of traditional relational databases. The velocity of the expansion of the amount of data gives rise to a complete paradigm shift in how new age data is processed. Confidence in the data engineering of the existing data processing systems is gradually fading whereas the capabilities of the new techniques for capturing, storing, visualizing, and analyzing data are evolving. In this review paper, we discuss some of the modern Big Data models that are leading contributors in the NoSQL era and claim to address Big Data challenges in reliable and efficient ways. Also, we take the potential of Big Data into consideration and try to reshape the original operation-oriented definition of “Big Science” (Furner, 2003) into a new data-driven definition and rephrase it as “The science that deals with Big Data is Big Science.”

  16. The Milky Way's Tiny but Tough Galactic Neighbour

    Science.gov (United States)

    2009-10-01

    Today ESO announces the release of a stunning new image of one of our nearest galactic neighbours, Barnard's Galaxy, also known as NGC 6822. The galaxy contains regions of rich star formation and curious nebulae, such as the bubble clearly visible in the upper left of this remarkable vista. Astronomers classify NGC 6822 as an irregular dwarf galaxy because of its odd shape and relatively diminutive size by galactic standards. The strange shapes of these cosmic misfits help researchers understand how galaxies interact, evolve and occasionally "cannibalise" each other, leaving behind radiant, star-filled scraps. In the new ESO image, Barnard's Galaxy glows beneath a sea of foreground stars in the direction of the constellation of Sagittarius (the Archer). At the relatively close distance of about 1.6 million light-years, Barnard's Galaxy is a member of the Local Group, the archipelago of galaxies that includes our home, the Milky Way. The nickname of NGC 6822 comes from its discoverer, the American astronomer Edward Emerson Barnard, who first spied this visually elusive cosmic islet using a 125-millimetre aperture refractor in 1884. Astronomers obtained this latest portrait using the Wide Field Imager (WFI) attached to the 2.2-metre MPG/ESO telescope at ESO's La Silla Observatory in northern Chile. Even though Barnard's Galaxy lacks the majestic spiral arms and glowing, central bulge that grace its big galactic neighbours, the Milky Way, the Andromeda and the Triangulum galaxies, this dwarf galaxy has no shortage of stellar splendour and pyrotechnics. Reddish nebulae in this image reveal regions of active star formation, where young, hot stars heat up nearby gas clouds. Also prominent in the upper left of this new image is a striking bubble-shaped nebula. At the nebula's centre, a clutch of massive, scorching stars send waves of matter smashing into the surrounding interstellar material, generating a glowing structure that appears ring-like from our perspective

  17. Is big data risk assessment a novelty?

    NARCIS (Netherlands)

    Swuste, P.H.J.J.

    2016-01-01

    Objective: What metaphors, models and theories were developed in the safety science domain? And which research was based upon ‘big data’? Method: The study was confined to original articles and documents, written in English or Dutch from the period under consideration. Results and conclusions: From

  18. Challenges of Big Data in Educational Assessment

    Science.gov (United States)

    Gibson, David C.; Webb, Mary; Ifenthaler, Dirk

    2015-01-01

    This paper briefly discusses four measurement challenges of data science or "big data" in educational assessments that are enabled by technology: 1. Dealing with change over time via time-based data. 2. How a digital performance space's relationships interact with learner actions, communications and products. 3. How layers of…

  19. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of registration, and IT-competent employees and customers that enable a leading position, but only if companies get ready for the next big data wave.

  20. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  1. A Proposed Concentration Curriculum Design for Big Data Analytics for Information Systems Students

    Science.gov (United States)

    Molluzzo, John C.; Lawler, James P.

    2015-01-01

    Big Data is becoming a critical component of the Information Systems curriculum. Educators are enhancing gradually the concentration curriculum for Big Data in schools of computer science and information systems. This paper proposes a creative curriculum design for Big Data Analytics for a program at a major metropolitan university. The design…

  2. A study on the shape memory characteristics of Ti-Ni50-x-Pdx alloys

    International Nuclear Information System (INIS)

    Lee, H. W.; Chun, B. S.; Oh, S. J.; Kuk, I.H.

    1991-01-01

    The shape memory characteristics in TiNi alloys are greatly affected by the alloy composition and heat treatment condition. The present work was aimed to investigate the effect of Pd (x=5,10,15,20) addition on the shape memory characteristics of TiNi alloys by means of electrical resistance measurement, X-ray diffraction, differential scanning calorimetry and energy dispersive X-ray analysis. The results obtained from this study are as follows; 1. The martensitic transformation start temperature, Ms, of Ti-Ni50-x-Pdx alloys decreased considerably with the increase of Pd content up to 10 at%, whereas it increased largely with the increase of Pd content in the alloys with Pd content of more than 15 at%. 2. The Ms temperature of Ti-Ni50-x-Pdx alloys with cold working was significantly lower than that of the fully annealed alloys, because high-density dislocations introduced by the cold working suppressed the martensitic transformation. (Author)

  3. Homogenization of stationary Navier–Stokes equations in domains with tiny holes

    Czech Academy of Sciences Publication Activity Database

    Feireisl, Eduard; Lu, Y.

    2015-01-01

    Roč. 17, č. 2 (2015), s. 381-392 ISSN 1422-6928 Keywords: compressible Navier-Stokes system * homogenization * tiny holes Subject RIV: BA - General Mathematics Impact factor: 1.023, year: 2015 http://link.springer.com/article/10.1007%2Fs00021-015-0200-2

  4. Micromechanical Analysis of Crack Closure Mechanism for Intelligent Material Containing TiNi Fibers

    Science.gov (United States)

    Araki, Shigetoshi; Ono, Hiroyuki; Saito, Kenji

    In our previous study, the micromechanical modeling of an intelligent material containing TiNi fibers was performed and the stress intensity factor KI at the tip of the crack in the material was expressed in terms of the magnitude of the shape memory shrinkage of the fibers and the thermal expansion strain in the material. In this study, the value of KI at the tip of the crack in the TiNi/epoxy material is calculated numerically by using analytical expressions obtained in our first report. As a result, we find that the KI value decreases with increasing shrink strain of the fibers, and this tendency agrees with that of the experimental result obtained by Shimamoto et al. (Trans. Jpn. Soc. Mech. Eng., Vol. 65, No. 634 (1999), pp. 1282-1286). Moreover, there exists an optimal value of the shrink strain of the fibers to make the KI value zero. The change in KI with temperature during the heating process from the reference temperature to the inverse austenitic finishing temperature of TiNi fiber is also consistent with the experimental result. These results can be explained by the changes in the shrink strain, the thermal expansion strain, and the elastic moduli of TiNi fiber with temperature. These results may be useful in designing intelligent materials containing TiNi fibers from the viewpoint of crack closure.

  5. Tiny Integrated Network Analyzer for Noninvasive Measurements of Electrically Small Antennas

    DEFF Research Database (Denmark)

    Buskgaard, Emil Feldborg; Krøyer, Ben; Tatomirescu, Alexandru

    2016-01-01

    the system. The tiny integrated network analyzer is a stand-alone Arduino-based measurement system that utilizes the transmit signal of the system under test as its reference. It features a power meter with triggering ability, on-board memory, universal serial bus, and easy extendibility with general...

  6. Long the fixation of physicists worldwide, a tiny particle is found

    CERN Multimedia

    2006-01-01

    "After decades of intensive effort by both experimental and theoretical physicists worldwide, a tiny particle with no charge, a very low mass and a lifetime much shorter than a nanosecond, dubbed the "axion", has now been detected by the University at Buffalo physicist who first suggested its existence in a little-read paper as early as 194." (2 pages)

  7. Tiny bubbles challenge giant turbines: Three Gorges puzzle.

    Science.gov (United States)

    Li, Shengcai

    2015-10-06

    Since the birth of the first prototype of the modern reaction turbine, cavitation as conjectured by Euler in 1754 always presents as a challenge. Following his theory, the evolution of modern reaction (Francis and Kaplan) turbines has been completed by adding the final piece of the element 'draft-tube' that enables turbines to explore water energy at efficiencies of almost 100%. However, during the last two and a half centuries, with increasing unit capacity and specific speed, the problem of cavitation has been manifested and complicated by the draft-tube surges rather than being solved. Particularly, during the last 20 years, the fierce competition in the international market for extremely large turbines with compact design has encouraged the development of giant Francis turbines of 700-1000 MW. The first group (24 units) of such giant turbines of 700 MW each was installed in the Three Gorges project. Immediately after commission, a strange erosion phenomenon appeared on the guide vane of the machines that has puzzled professionals. From a multi-disciplinary analysis, this Three Gorges puzzle could reflect an unknown type of cavitation inception presumably triggered by turbulence production from the boundary-layer streak transitional process. It thus presents a fresh challenge not only to this old turbine industry, but also to the fundamental sciences.

  8. Phosphoinositides: Tiny Lipids With Giant Impact on Cell Regulation

    Science.gov (United States)

    2013-01-01

    Phosphoinositides (PIs) make up only a small fraction of cellular phospholipids, yet they control almost all aspects of a cell's life and death. These lipids gained tremendous research interest as plasma membrane signaling molecules when discovered in the 1970s and 1980s. Research in the last 15 years has added a wide range of biological processes regulated by PIs, turning these lipids into one of the most universal signaling entities in eukaryotic cells. PIs control organelle biology by regulating vesicular trafficking, but they also modulate lipid distribution and metabolism via their close relationship with lipid transfer proteins. PIs regulate ion channels, pumps, and transporters and control both endocytic and exocytic processes. The nuclear phosphoinositides have grown from being an epiphenomenon to a research area of its own. As expected from such pleiotropic regulators, derangements of phosphoinositide metabolism are responsible for a number of human diseases ranging from rare genetic disorders to the most common ones such as cancer, obesity, and diabetes. Moreover, it is increasingly evident that a number of infectious agents hijack the PI regulatory systems of host cells for their intracellular movements, replication, and assembly. As a result, PI converting enzymes began to be noticed by pharmaceutical companies as potential therapeutic targets. This review is an attempt to give an overview of this enormous research field focusing on major developments in diverse areas of basic science linked to cellular physiology and disease. PMID:23899561

  9. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on the basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to
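    The "classical statistical and bootstrap methods" named in the book title above can be sketched minimally. The percentile method shown here and the sample data are illustrative assumptions, not taken from the record:

    ```python
    import random
    import statistics

    def bootstrap_ci(sample, stat=statistics.mean, n_resamples=2000, alpha=0.05, seed=42):
        """Percentile-bootstrap confidence interval for a statistic of a sample."""
        rng = random.Random(seed)
        n = len(sample)
        # Draw n_resamples resamples with replacement; recompute the statistic on each
        estimates = sorted(
            stat([sample[rng.randrange(n)] for _ in range(n)])
            for _ in range(n_resamples)
        )
        # Take the alpha/2 and 1 - alpha/2 quantiles of the resampled estimates
        lo = estimates[int(n_resamples * alpha / 2)]
        hi = estimates[int(n_resamples * (1 - alpha / 2)) - 1]
        return lo, hi

    # Hypothetical temperature-anomaly sample (degrees C), for illustration only
    data = [0.1, 0.3, -0.2, 0.4, 0.2, 0.5, 0.0, 0.3, 0.6, 0.1]
    low, high = bootstrap_ci(data)
    print(f"95% CI for the mean: [{low:.3f}, {high:.3f}]")
    ```

    The bootstrap avoids distributional assumptions, which is why it suits proxy and archive data whose error structure is poorly known; the trade-off is the computing cost the abstract alludes to, since every resample re-evaluates the estimator.
    
    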

  10. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing necessity within practitioners to understand better the phenomenon, starting from a clear stated definition. This book aims to be a starting reading for executives who want (and need) to keep the pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data will be explained, and a series of different strategic approaches will be provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data will be discussed, where some of them are more general - such as ethics, privacy, and ownership – while others concern more specific business situations (e.g., initial public offering, growth st...

  11. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  12. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  13. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  14. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  15. Cavitation erosion of Ti-Ni shape memory alloy deposited coatings and Fe base shape memory alloy solid

    International Nuclear Information System (INIS)

    Hattori, Shuji; Fujisawa, Seiji; Owa, Tomonobu

    2007-01-01

    In this study, cavitation erosion tests were carried out by using thermal spraying and deposition of Ti-Ni shape memory alloy for the surface coating. The results show that the test specimen of Ti-Ni thermal spraying has many initial defects, so that the erosion resistance is very low. The erosion resistance of Ti-Ni deposit is about 5-10 times higher than that of SUS 304; thus the erosion resistance of Ti-Ni deposit is better than that of Ti-Ni thermal spraying. The cavitation erosion tests were carried out by using Fe-Mn-Si with shape memory and gunmetal with low elastic modulus. The erosion resistance of Fe-Mn-Si shape memory alloy solid is about 9 times higher than that of SUS 304. The erosion resistance of gunmetal is almost the same as SUS 304, because the test specimen of gunmetal has many small defects on the original surface. (author)

  16. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomena, we have a need for a new unit of measure like exabyte, zettabyte, and yottabyte as the last unit measures the amount of data. The growth of data gives a situation where the classic systems for the collection, storage, processing, and visualization of data losing the battle with a large amount, speed, and variety of data that is generated continuously. Many of data that is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data is a hot topic in recent years in IT circles. However, Big Data is recognized in the business world, and increasingly in the public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  17. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  18. Implications of Big Data for cell biology

    OpenAIRE

    Dolinski, Kara; Troyanskaya, Olga G.

    2015-01-01

    “Big Data” has surpassed “systems biology” and “omics” as the hottest buzzword in the biological sciences, but is there any substance behind the hype? Certainly, we have learned about various aspects of cell and molecular biology from the many individual high-throughput data sets that have been published in the past 15–20 years. These data, although useful as individual data sets, can provide much more knowledge when interrogated with Big Data approaches, such as applying integrative methods ...

  19. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  20. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  1. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  2. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  3. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is and will be used more in the future as a tool for everything that happens both online and offline. Of course, Big Data is found above all in the online medium, offering many advantages and being a real help for all consumers. In this paper we talked about Big Data as being a plus in developing new applications, by gathering useful information about the users and their behaviour. We've also presented the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit brought to this paper is presented in the cloud section.

  4. Alloying process of sputter-deposited Ti/Ni multilayer thin films

    International Nuclear Information System (INIS)

    Cho, H.; Kim, H.Y.; Miyazaki, S.

    2006-01-01

    The alloying process of a Ti/Ni multilayer thin film was investigated in detail by differential scanning calorimetry (DSC), X-ray diffractometry (XRD) and transmission electron microscopy (TEM). The Ti/Ni multilayer thin film was prepared by depositing Ti and Ni layers alternately on a SiO₂/Si substrate. The number of each metal layer was 100, and the total thickness was 3 μm. The alloy composition was determined as Ti-51 at.%Ni by electron probe micro analysis (EPMA). The DSC curve exhibited three exothermic peaks at 621, 680 and 701 K during heating of the as-sputtered multilayer thin film. In order to investigate the alloying process, XRD and TEM observations were carried out on specimens heated up to various temperatures at the same heating rate as the DSC measurement. The XRD profile of the as-sputtered film revealed only diffraction peaks of Ti and Ni. However, reaction layers 3 nm in thickness were observed at the interfaces of the Ti and Ni layers in cross-sectional TEM images. The reaction layer was confirmed to be an amorphous phase by nano-beam diffraction analysis. The XRD profiles showed that the intensity of the Ti diffraction peak decreased in the specimen heat-treated above 600 K. The peak from Ni became broad and shifted to a lower diffraction angle. The amorphous layer thickened up to 6 nm in the specimen heated up to 640 K. The diffraction peak corresponding to the Ti-Ni B2 phase appeared and the peak from Ni disappeared for the specimen heated up to 675 K. The Ti-Ni B2 phase crystallized from the amorphous reaction layer. After further heating above the third exothermic peak, the intensity of the peak from the Ti-Ni B2 phase increased, the peak from Ti disappeared and the peaks corresponding to Ti₂Ni appeared. The Ti₂Ni phase was formed by the reaction of the Ti-Ni B2 phase and Ti.

  5. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  6. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  7. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Ariel Hamlin, Nabil... Email: arkady@ll.mit.edu. Chapter 1, Cryptography for Big Data Security, 1.1 Introduction: With the amount

  8. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  9. The BIG Data Center: from deposition to integration to translation.

    Science.gov (United States)

    2017-01-04

    Biological data are generated at an unprecedented, exponentially growing rate, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. Thinking Through Computational Exposure as an Evolving Paradigm Shift for Exposure Science: Development and Application of Predictive Models from Big Data

    Science.gov (United States)

    Symposium Abstract: Exposure science has evolved from a time when the primary focus was on measurements of environmental and biological media and the development of enabling field and laboratory methods. The Total Exposure Assessment Method (TEAM) studies of the 1980s were class...

  11. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  12. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution...... and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim...... that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data....

  13. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.
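    The symmetry argument sketched in the abstract (homogeneity and isotropy forcing the FRW form, which in turn predicts a singularity) rests on the standard Friedmann equations; written here in textbook form as a hedged illustration, not taken from the thesis itself:

```latex
% FRW metric implied by spatial homogeneity and isotropy
ds^2 = -dt^2 + a(t)^2 \left[ \frac{dr^2}{1 - k r^2} + r^2 \, d\Omega^2 \right]

% First Friedmann equation governing the scale factor a(t)
\left( \frac{\dot{a}}{a} \right)^2 = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2}

% Second Friedmann equation: for ordinary matter (\rho + 3p > 0),
% \ddot{a} < 0, so a(t) \to 0 at a finite time in the past --
% the big bang singularity of the FRW models.
\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\,(\rho + 3p)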

  14. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing takes these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized joins operators. Experimental results on both synthetic

  15. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. This value is defined not only by extracting value from huge data sets, as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on processes worldwide.

  16. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  17. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  18. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data......’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD......) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  19. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this study set out to examine and analyze exhaustively the state of the art of Big Data; as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; and finally, it sought to identify the most relevant characteristics in the management of Big Data, so as to cover everything concerning the central topic of the research. The methodology consisted of reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; introducing some NoSQL databases, which are the ones that make it possible to process data in unstructured formats; and describing data models and the technologies for analyzing them, concluding with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables were manipulated, and exploratory, since this research begins to explore the Big Data landscape.

  20. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, as well as the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  1. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called the "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  2. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  3. Solid-state reaction in Ti/Ni multilayered films studied by using magneto-optical spectroscopy

    CERN Document Server

    Lee, Y P; Kim, K W; Kim, C G; Kudryavtsev, Y V; Nemoshkalenko, V V; Szymanski, B

    2000-01-01

    A comparative study of the solid-state reaction (SSR) in a series of Ti/Ni multilayered films (MLFs) with bilayer periods of 0.65-22.2 nm and a constant Ti to Ni sublayer thickness ratio was performed by using experimental and computer-simulated magneto-optical (MO) spectroscopy based on different models of MLFs, as well as x-ray diffraction (XRD). The spectral and sublayer-thickness dependences of the MO properties of the Ti/Ni MLFs were explained on the basis of electromagnetic theory. The existence of a threshold nominal Ni-sublayer thickness of about 3 nm in the as-deposited Ti/Ni MLFs for observation of the equatorial Kerr effect was explained by a solid-state reaction which formed nonmagnetic alloyed regions between the pure components during MLF deposition. The SSR in the Ti/Ni MLFs caused by low-temperature annealing led to the formation of an amorphous Ti-Ni alloy and took place mainly in the Ti/Ni MLFs with ''thick'' sublayers. For the case of Ti/Ni MLFs, the MO approach turned out to...

  4. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
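    The bias pitfall described above can be made concrete with a toy simulation (invented numbers, Python standard library only): a "big" sample collected with a selection bias yields a confidently wrong mean, while a far smaller simple random sample lands near the truth. More data does not dilute a systematic error.

```python
import random
import statistics

random.seed(0)

# population whose true mean is 0
population = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# "big data": huge, but collected with a selection bias
# (only events above zero ever get recorded)
big_biased = [x for x in population if x > 0]

# "small data": a modest simple random sample of the same population
small_random = random.sample(population, 100)

print(len(big_biased), statistics.fmean(big_biased))      # large n, mean far from 0
print(len(small_random), statistics.fmean(small_random))  # small n, mean near 0
```

Despite being hundreds of times larger, the biased sample's estimate sits far from the true mean, illustrating the paper's point that bias produces a systematic error no volume of data can fix.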

  5. El síndrome «big science» y su influencia en el proceso de maduración de la Física mexicana de partículas elementales

    Directory of Open Access Journals (Sweden)

    Luna Morales, Mª Elena

    2002-12-01

    Full Text Available The influence of international circumstances on the evolution and maturation of research in elementary particle physics in Mexico is identified. For this purpose a bibliometric study of the scientific production and impact, as well as of the human resources, of Mexican elementary particle physics (FMPE) was carried out. The study draws on the following three information sources: (1) an international information system specialized in the area of high energy physics (SLAC-SPIRES-HEP); (2) the Latin American Catalogs of Programs and Human Resources in Physics 1985-2001; and (3) a locally developed database on the production and impact of physics in Mexico during the period 1971-2000. Big science research has given rise to a new organizational structure of the FMPE and a new dynamic of growth in resources and products. This was quantified in the present work in terms of input and output bibliometric indicators, mainly the increase in human resources, research programs and groups, and the growth of scientific production and impact.


  6. Detection of tiny amounts of fissile materials in large-sized containers with radioactive waste

    Science.gov (United States)

    Batyaev, V. F.; Skliarov, S. V.

    2018-01-01

    The paper is devoted to non-destructive control of tiny amounts of fissile materials in large-sized containers filled with radioactive waste (RAW). The aim of this work is to model an active neutron interrogation facility for detection of fissile materials inside NZK type containers with RAW and to determine the minimal detectable mass of U-235 as a function of various parameters: matrix type, nonuniformity of container filling, neutron generator parameters (flux, pulse frequency, pulse duration), and measurement time. As a result, the dependence of the minimal detectable mass on the location of the fissile materials inside the container is shown. Nonuniformity of the thermal neutron flux inside a container is the main reason for the spatial heterogeneity of the minimal detectable mass inside a large-sized container. Our experiments with tiny amounts of uranium-235 (<1 g) confirm the detection of fissile materials in NZK containers by using the active neutron interrogation technique.

  7. Analysis of the transformations temperatures of helicoidal Ti-Ni actuators using computational numerical methods

    Directory of Open Access Journals (Sweden)

    Carlos Augusto do N. Oliveira

    2013-01-01

    Full Text Available The development of shape memory actuators has enabled noteworthy applications in mechanical engineering, robotics, aerospace, the oil industry and medicine. These applications have targeted miniaturization and taking full advantage of available space. This article analyses a Ti-Ni shape memory actuator used as part of a flow control system. A Ti-Ni spring actuator was subjected to thermomechanical training, and parameters such as transformation temperature, thermal hysteresis and shape memory effect performance were investigated. These parameters are important for understanding the behavior of the actuator in relation to the martensitic phase transformation during the heating and cooling cycles it undergoes in service. Multiple regression was used as a computational tool for analysing the data in order to simulate and predict stress and cycle results where experimental data were not available. The results obtained from the training cycles enable the actuators to be characterized and the numerical simulation to be validated.
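    The abstract does not give the regression model itself, so as a hedged illustration, here is a minimal ordinary least squares fit via the normal equations, applied to an invented two-predictor model (stress and training cycle as inputs, transformation temperature as response). All column names, units, and the linear relation are assumptions made for the sketch, not values from the article.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(rows, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y.
    Each row is a tuple of predictors; an intercept column is prepended."""
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    n = len(X)
    XtX = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    Xty = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    return solve(XtX, Xty)

# hypothetical training data: (stress in MPa, cycle number) -> transformation temperature
# the linear relation below is invented purely to exercise the fit
rows = [(100, 1), (150, 2), (200, 3), (120, 5), (180, 4), (90, 6)]
temps = [60.0 - 0.02 * s - 0.5 * c for s, c in rows]
intercept, b_stress, b_cycles = fit_ols(rows, temps)
```

The fitted coefficients can then be used to predict transformation temperatures at stress/cycle combinations that were never measured, which is how the article describes using regression to fill gaps in the experimental data.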

  8. The application of Tiny Triplet Finder (TTF) in BTeV pixel trigger

    International Nuclear Information System (INIS)

    Wu, Jin-Yuan; Wang, M.; Gottschalk, E.; Shi, Z.; Fermilab

    2006-01-01

    We describe a track segment recognition scheme called the Tiny Triplet Finder (TTF) that involves grouping of three hits satisfying a constraint such as forming a straight line. The TTF performs this O(n³) function in O(n) time, where n is the number of hits in each detector plane. The word "tiny" reflects the fact that the FPGA resource usage is small. The number of logic elements needed for the TTF is O(N log N), where N is the number of bins in the coordinate considered, which for large N is significantly smaller than the O(N²) needed for typical implementations of similar functions. The TTF is also suitable for software implementations as well as many other pattern recognition problems.
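    The abstract does not spell out the bit-shifting hardware trick behind the O(n) behavior, but the triplet constraint it enforces — three hits collinear across equally spaced coordinate planes, i.e. bins (b − s, b, b + s) for some slope s — can be sketched in software. Here Python integers stand in for the FPGA bit arrays that are shifted and ANDed in parallel; the bin count and slope window are illustrative choices, not parameters from the paper.

```python
def find_triplets(plane1, plane2, plane3, nbins, max_slope):
    """Find hit triplets (b1, b2, b3) lying on a straight line across three
    equally spaced planes: b1 = b - s, b2 = b, b3 = b + s for integer slope s."""
    bitmaps = []
    for hits in (plane1, plane2, plane3):
        bm = 0
        for b in hits:
            bm |= 1 << b        # bit b set iff a hit fell in bin b of that plane
        bitmaps.append(bm)
    mask = (1 << nbins) - 1
    triplets = []
    for s in range(-max_slope, max_slope + 1):
        # align plane-1 bin (b - s) and plane-3 bin (b + s) onto plane-2 bin b
        sh1 = bitmaps[0] << s if s >= 0 else bitmaps[0] >> -s
        sh3 = bitmaps[2] >> s if s >= 0 else bitmaps[2] << -s
        coinc = bitmaps[1] & sh1 & sh3 & mask
        b = 0
        while coinc:            # scan the coincidence word for set bits
            if coinc & 1:
                triplets.append((b - s, b, b + s))
            coinc >>= 1
            b += 1
    return triplets
```

For example, find_triplets([2, 5], [3, 5], [4, 5], nbins=8, max_slope=2) recovers the sloped track (2, 3, 4) and the perpendicular one (5, 5, 5). The bitwise AND per slope is what the FPGA evaluates for all bins simultaneously.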

  9. The application of Tiny Triplet Finder (TTF) in BTeV pixel trigger

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jin-Yuan; Wang, M.; Gottschalk, E.; Shi, Z.; /Fermilab

    2006-03-01

    We describe a track segment recognition scheme called the Tiny Triplet Finder (TTF) that involves grouping of three hits satisfying a constraint such as forming a straight line. The TTF performs this O(n³) function in O(n) time, where n is the number of hits in each detector plane. The word "tiny" reflects the fact that the FPGA resource usage is small. The number of logic elements needed for the TTF is O(N log N), where N is the number of bins in the coordinate considered, which for large N is significantly smaller than the O(N²) needed for typical implementations of similar functions. The TTF is also suitable for software implementations as well as many other pattern recognition problems.

  10. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    Full Text Available In everyday terms we call the current era the Modern Era, which in the field of Information Technology can also be named the era of Big Data. Our daily lives in today's world are advancing rapidly, never quenching one's thirst. The fields of science, engineering and technology are producing data at an exponential rate, leading to exabytes of data every day. Big data helps us explore and re-invent many areas, not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis of the area of healthcare using big data and analytics. The main purpose is to emphasize that the big data being stored all the time does more than let us look back at history; it is time to emphasize its analysis to improve medication and services. Although many big data implementations happen to be in-house developments, the implementation proposed here aims at a broader scope using Hadoop, which just happens to be the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also focuses on the strengths and drawbacks compared to the conventional techniques available.

  11. (Updated) Nanotechnology: Understanding the Tiny Particles That May Save a Life | Poster

    Science.gov (United States)

    By Nathalie Walker, Guest Writer Could nanotechnology—the study of tiny matter ranging in size from 1 to 200 nanometers—be the future of cancer treatment? Although it is a relatively new field in cancer research, nanotechnology is not new to everyday life. Have you ever thought about the tennis ball you’ve thrown with your dog at the park and wondered what it is made of?

  12. Heterogeneous tiny energy: An appealing opportunity to power wireless sensor motes in a corrosive environment

    International Nuclear Information System (INIS)

    Qiao, Guofu; Sun, Guodong; Li, Hui; Ou, Jinping

    2014-01-01

    Highlights: • Ultra-low ambient energy was scavenged to power first-of-its-kind wireless corrosion sensors. • Three feasible tiny-energy sources were exploited for long-term corrosion monitoring. • Automatic recharging control of heterogeneous tiny energy was proposed for human-free monitoring. • Corrosion itself was applied as an energy source to power the wireless corrosion-monitoring motes. - Abstract: Reinforcing steel corrosion is a significant factor leading to the durability deterioration of reinforced concrete (RC) structures. The on-line monitoring of the corrosion of RC structures in a long-term, human-free manner is not only valuable in industry, but also a significant challenge in academia. This paper presents a first-of-its-kind corrosion-monitoring approach that exploits only three heterogeneous tiny energy sources to power commercial-off-the-shelf wireless sensor motes, such that the corrosion-related data are automatically and autonomously captured and sent to users via wireless channels. We first investigated the availability of these three tiny energy sources: corrosion energy, a cement battery, and weak solar energy. In particular, the former two energy sources inherently exist in RC structures and are generated continually over the service life of RC structures, which is beneficial for the prospects of long-term corrosion monitoring. We then proposed a proof-of-concept prototype, consisting of a Telosb wireless sensor mote and an energy harvester, in order to evaluate the feasibility and effectiveness of ultralow-power ambient energy as a power supply in corrosion monitoring applications. The critical metrics for the holographic monitoring of RC structures, including electrochemical noise, humidity and temperature, were successfully acquired and analysed using a post-processing program. This paper describes a unique and novel approach towards the realisation of smart structural monitoring and control systems in the

  13. In situ crystallization of sputter-deposited TiNi by ion irradiation

    International Nuclear Information System (INIS)

    Ikenaga, Noriaki; Kishi, Yoichi; Yajima, Zenjiro; Sakudo, Noriyuki

    2013-01-01

    Highlights: ► We developed a sputtering deposition process equipped with an ion irradiation system. ► Ion irradiation enables crystallization at lower substrate temperatures. ► Ion fluence has an effective range for low-temperature crystallization. ► Crystallized films made on polyimide by this process show the shape memory effect. -- Abstract: TiNi is well known as a typical shape-memory alloy, and the shape-memory property appears only when the structure is crystalline. Until recently, the material was first formed as an amorphous film by single-target sputtering deposition and then crystallized by annealing at high temperatures, above 500 °C. It has therefore been difficult to make crystalline TiNi film directly on a substrate of polymer-based material because of the low heat resistance of the substrate. In order to realize an actuator from crystallized TiNi film on polymer substrates, the substrate temperature should be kept below 200 °C throughout the whole process. In our previous studies we found that the deposited film can be crystallized at very low temperature, without annealing, by simultaneous irradiation with Ar ions during sputter-deposition. We have also demonstrated the shape-memory effect with TiNi film made by the new process. In order to investigate which process parameters contribute to the low-temperature crystallization, we focused on the ion fluence of the irradiation. As a result, it was found that the transition from an amorphous to a crystalline structure has a threshold range of ion fluence.

  14. Pacific Research Platform - Creation of a West Coast Big Data Freeway System Applied to the CONNected objECT (CONNECT) Data Mining Framework for Earth Science Knowledge Discovery

    Science.gov (United States)

    Sellars, S. L.; Nguyen, P.; Tatar, J.; Graham, J.; Kawsenuk, B.; DeFanti, T.; Smarr, L.; Sorooshian, S.; Ralph, M.

    2017-12-01

    A new era in computational earth sciences is within our grasp, with the availability of ever-increasing earth observational data, enhanced computational capabilities, and innovative computational approaches that allow for the assimilation, analysis and modeling of complex earth science phenomena. The Pacific Research Platform (PRP), CENIC and associated technologies such as the Flash I/O Network Appliance (FIONA) provide scientists with a unique capability for advancing towards this new era. This presentation reports on the development of multi-institutional rapid data access capabilities and a data pipeline for applying a novel image characterization and segmentation approach, the CONNected objECT (CONNECT) algorithm, to study Atmospheric River (AR) events impacting the Western United States. ARs are often associated with torrential rains, swollen rivers, flash flooding, and mudslides. CONNECT is computationally intensive, reliant on very large data transfers, storage and data mining techniques. The ability to apply the method to multiple variables and datasets located at different University of California campuses was previously hampered by inadequate network bandwidth and computational constraints. The presentation will highlight how the inter-campus CONNECT data mining framework improved our prior download speeds of 10 MB/s to 500 MB/s using the PRP and the FIONAs. We present a worked example using the NASA MERRA data to describe how the PRP and FIONA have provided researchers with the capability of advancing knowledge about ARs. Finally, we will discuss future efforts to expand the scope to additional variables in the earth sciences.
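    The abstract does not define CONNECT's algorithm, but the "connected object" idea it builds on — grouping adjacent grid cells that exceed a threshold into labeled objects — can be illustrated generically with 4-connected component labeling over a toy 2-D field. The threshold, field values, and function name below are invented for the sketch and are not CONNECT itself.

```python
from collections import deque

def label_objects(field, threshold):
    """Label 4-connected components of grid cells whose value exceeds threshold.
    Returns a dict mapping each object label to its list of (row, col) cells."""
    rows, cols = len(field), len(field[0])
    labels = [[0] * cols for _ in range(rows)]
    objects = {}
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if field[r][c] > threshold and labels[r][c] == 0:
                next_label += 1                 # start a new object, flood-fill it
                labels[r][c] = next_label
                queue = deque([(r, c)])
                cells = []
                while queue:
                    i, j = queue.popleft()
                    cells.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and field[ni][nj] > threshold
                                and labels[ni][nj] == 0):
                            labels[ni][nj] = next_label
                            queue.append((ni, nj))
                objects[next_label] = cells
    return objects
```

Applied to a gridded atmospheric field, each labeled component would correspond to one candidate "object" (e.g. a moisture plume) that can then be tracked through time — the general pattern a segmentation approach like CONNECT operates on.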

  15. Fabrication of TiNi/CFRP smart composite using cold drawn TiNi wires

    Science.gov (United States)

    Xu, Ya; Otsuka, Kazuhiro; Toyama, Nobuyuki; Yoshida, Hitoshi; Jang, Byung-Koog; Nagai, Hideki; Oishi, Ryutaro; Kishi, Teruo

    2002-07-01

    In recent years, pre-strained TiNi shape memory alloys (SMAs) have been used to fabricate smart structures with carbon fiber reinforced plastics (CFRP) in order to suppress microscopic mechanical damage. However, since the cure temperature of CFRP is higher than the reverse transformation temperatures of TiNi SMAs, special fixture jigs have had to be used to maintain the pre-strain during fabrication, which has restricted practical application. To overcome this difficulty, we developed a new method for fabricating SMA/CFRP smart composites without special fixture jigs, by controlling the transformation temperatures of the SMA during fabrication. The method consists of using heavily cold-worked wires to increase the reverse transformation temperatures, and of flash electrical heating of the wires after fabrication to decrease the reverse transformation temperatures to a lower range again without damaging the epoxy resin around the SMA wires. By choosing a proper cold-working rate and composition of the TiNi alloy, the reverse transformation temperatures were well controlled, and a TiNi/CFRP hybrid smart composite was fabricated without special fixture jigs. The damage-suppressing effect of the cold drawn wires embedded in CFRP was confirmed.

  16. Cytocompatibility evaluation and surface characterization of TiNi deformed by high-pressure torsion

    Energy Technology Data Exchange (ETDEWEB)

    Awang Shri, Dayangku Noorfazidah, E-mail: AWANGSHRI.Dayangku@nims.go.jp [Graduate School of Pure and Applied Sciences, University of Tsukuba, Tsukuba, Ibaraki 305-8577 (Japan); Structural Materials Unit, National Institute for Materials Science, Tsukuba, Ibaraki 305-0047 (Japan); Tsuchiya, Koichi [Graduate School of Pure and Applied Sciences, University of Tsukuba, Tsukuba, Ibaraki 305-8577 (Japan); Structural Materials Unit, National Institute for Materials Science, Tsukuba, Ibaraki 305-0047 (Japan); Yamamoto, Akiko [Biomaterials Unit, International Center for Material Nanoarchitectonics (WPI-MANA), National Institute for Materials Science, Namiki 1-1, Tsukuba, Ibaraki 305-0044 (Japan)

    2014-10-01

    The effect of high-pressure torsion (HPT) deformation on the biocompatibility and surface chemistry of TiNi was systematically investigated. Ti–50 mol% Ni was subjected to HPT straining for different numbers of turns, N = 0.25, 0.5, 1, 5 and 10, at a rotation speed of 1 rpm. X-ray photoelectron spectroscopy observations after 7 days of cell culture revealed changes in the surface oxide composition, enrichment of Ti, and detection of nitrogen derived from organic molecules in the culture medium. The plating efficiency of L929 cells was slightly increased by HPT deformation, though no significant difference was observed. Albumin adsorption was higher in HPT-deformed samples, while vitronectin adsorption peaked at N = 1. HPT deformation was also found to effectively suppress Ni ion release from the TiNi samples into the cell culture medium, even after the low degree of deformation at N = 0.25. - Highlights: • Nanostructured Ti–50 mol% Ni alloy was produced using high-pressure torsion. • HPT deformation improved L929 growth on TiNi samples. • Changes in surface chemistry were observed in HPT-deformed samples. • Protein adsorption behavior was influenced by the surface chemistry. • Ni ion release was suppressed in HPT-deformed samples.

  17. Big data has big potential for applications to climate change adaptation

    NARCIS (Netherlands)

    Ford, James D.; Tilleard, Simon E.; Berrang-Ford, Lea; Araos, Malcolm; Biesbroek, Robbert; Lesnikowski, Alexandra C.; MacDonald, Graham K.; Hsu, Angel; Chen, Chen; Bizikova, Livia

    2016-01-01

    The capacity to collect and analyze massive amounts of data is transforming research in the natural and social sciences (1). And yet, the climate change adaptation community has largely overlooked these developments. Here, we examine how "big data" can inform adaptation research.

  18. Big data analysis new algorithms for a new society

    CERN Document Server

    Stefanowski, Jerzy

    2016-01-01

    This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint, as presented by some of the most eminent researchers in this area. It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought about by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued...

  19. Commentary: Epidemiology in the era of big data.

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-05-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called "three V's": variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field's future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future.

  20. THE ROLE OF TINY GRAINS ON THE ACCRETION PROCESS IN PROTOPLANETARY DISKS

    International Nuclear Information System (INIS)

    Bai Xuening

    2011-01-01

    Tiny grains such as polycyclic aromatic hydrocarbons (PAHs) have been thought to dramatically reduce the coupling between the gas and magnetic fields in weakly ionized gas, such as in protoplanetary disks (PPDs), because they provide a tremendous surface area on which to recombine free electrons. The presence of tiny grains in PPDs thus raises the question of whether the magnetorotational instability (MRI) is able to drive rapid accretion consistent with observations. Charged tiny grains have conduction properties similar to those of ions, and their presence leads to qualitatively new behaviors in the conductivity tensor, characterized by n̄/n_e > 1, where n_e and n̄ denote the number densities of free electrons and of all other charged species, respectively. In particular, the Ohmic conductivity becomes dominated by charged grains rather than by electrons when n̄/n_e exceeds about 10^3, and the Hall and ambipolar diffusion (AD) coefficients are reduced by a factor of (n̄/n_e)^2 in the AD-dominated regime relative to the Ohmic regime. Applying the methodology of Bai, we find that in PPDs, when PAHs are sufficiently abundant (≳10^-9 per H_2 molecule), there exists a transition radius r_trans of about 10-20 AU, beyond which the MRI active layer extends to the disk midplane. At r_trans, the optimistically predicted MRI-driven accretion rate Ṁ is one to two orders of magnitude smaller than in the grain-free case, which is too small compared with the observed rates, but is in general no smaller than the Ṁ predicted with solar-abundance 0.1 μm grains. At r > r_trans, we find that, remarkably, the predicted Ṁ exceeds the grain-free case due to a net reduction of AD by charged tiny grains, and reaches a few times 10^-8 M_sun yr^-1. This is sufficient to account for the observed Ṁ in transitional disks. Larger grains (≳0.1 μm) are too massive to reach as high an abundance as tiny grains and thus to facilitate the accretion process.
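
    The key scalings quoted in the abstract can be restated in equation form (a paraphrase in the abstract's own notation, not the paper's equations):

```latex
\frac{\bar{n}}{n_e} \gtrsim 10^{3}
  \;\Longrightarrow\; \text{Ohmic conductivity dominated by charged grains},
\qquad
\eta_{\mathrm{Hall}},\ \eta_{\mathrm{AD}}
  \;\sim\; \left(\frac{\bar{n}}{n_e}\right)^{-2}
  \eta^{(\mathrm{Ohmic\ regime})}
  \quad \text{(AD-dominated regime)} .
```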

  1. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors, with little attention to scaling to big datasets. This presents a serious impediment, since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system that tackles efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general-purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement to be aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
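
    BigDansing's actual rule interface is not shown in the abstract. The hypothetical sketch below (function name and data invented for illustration) shows what a declarative quality rule, here the functional dependency zip → city, reduces to operationally: enumerating pairs of tuples that agree on the left-hand side but disagree on the right. This pair enumeration is exactly the quadratic cost that, per the paper, motivates distributed execution:

```python
from itertools import combinations

def fd_violations(records, lhs, rhs):
    """Return pairs of records violating the functional dependency lhs -> rhs.

    Naive single-machine version: compares every pair of records, so the
    cost grows quadratically with the dataset size.
    """
    return [
        (a, b)
        for a, b in combinations(records, 2)
        if a[lhs] == b[lhs] and a[rhs] != b[rhs]
    ]

records = [
    {"zip": "10001", "city": "New York"},
    {"zip": "10001", "city": "NYC"},            # violates zip -> city
    {"zip": "94103", "city": "San Francisco"},
]
violations = fd_violations(records, "zip", "city")  # one violating pair
```

    A system in the spirit of BigDansing would avoid the all-pairs scan, e.g. by hashing records on the left-hand-side attribute so only records sharing a zip are ever compared.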

  2. The Sounds of the Little and Big Bangs

    Directory of Open Access Journals (Sweden)

    Edward Shuryak

    2017-11-01

    Studies of heavy ion collisions have discovered that tiny fireballs of a new phase of matter, quark gluon plasma (QGP), undergo an explosion, called the Little Bang. In spite of its small size, not only is it well described by hydrodynamics, but even small perturbations on top of the explosion turn out to be well described by hydrodynamical sound modes. The cosmological Big Bang also went through phase transitions, related to Quantum Chromodynamics (QCD) and electroweak/Higgs symmetry breaking, which are also expected to produce sounds. We discuss their subsequent evolution and a hypothetical inverse acoustic cascade amplifying their amplitude. Ultimately, the collision of two sound waves leads to the formation of gravity waves. We briefly discuss how these gravity waves can be detected.

  3. The Sounds of the Little and Big Bangs

    Science.gov (United States)

    Shuryak, Edward

    2017-11-01

    Studies of heavy ion collisions have discovered that tiny fireballs of a new phase of matter, quark gluon plasma (QGP), undergo an explosion, called the Little Bang. In spite of its small size, it is not only well described by hydrodynamics, but even small perturbations on top of the explosion turned out to be well described by hydrodynamical sound modes. The cosmological Big Bang also went through phase transitions, the QCD and electroweak ones, which are expected to produce sounds as well. We discuss their subsequent evolution and a hypothetical inverse acoustic cascade amplifying the amplitude. Ultimately, the collision of two sound waves leads to the formation of gravity waves with the smallest wavelengths. We briefly discuss how these can be detected.

  4. Machine learning for Big Data analytics in plants.

    Science.gov (United States)

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  6. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  7. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.

  8. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain, and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many active international projects of the research group, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  9. A Guided Inquiry on Hubble Plots and the Big Bang

    Science.gov (United States)

    Forringer, Ted

    2014-01-01

    In our science for non-science majors course "21st Century Physics," we investigate modern "Hubble plots" (plots of velocity versus distance for deep space objects) in order to discuss the Big Bang, dark matter, and dark energy. There are two potential challenges that our students face when encountering these topics for the…

  10. Small decisions with big impact on data analytics

    OpenAIRE

    Jana Diesner

    2015-01-01

    Big social data have enabled new opportunities for evaluating the applicability of social science theories that were formulated decades ago and were often based on small- to medium-sized samples. Big Data coupled with powerful computing has the potential to replace the statistical practice of sampling and estimating effects by measuring phenomena based on full populations. Preparing these data for analysis and conducting analytics involves a plethora of decisions, some of which are already em...

  11. The Role of Social Responsibility in Big Business Practices

    Directory of Open Access Journals (Sweden)

    V A Gurinov

    2010-06-01

    The study of corporate social responsibility has become especially relevant in national science in the context of the development of big business able to assume significant social responsibilities. The article focuses on the nature and specificity of the social responsibility of big business in Russia. The levels of social responsibility and the arrangements for implementing social programmes are also highlighted.

  12. Towards cloud based big data analytics for smart future cities

    OpenAIRE

    Khan, Zaheer; Anjum, Ashiq; Soomro, Kamran; Tahir, Muhammad

    2015-01-01

    A large amount of land-use, environment, socio-economic, energy and transport data is generated in cities. An integrated perspective on managing and analysing such big data can answer a number of science, policy, planning, governance and business questions and support decision making in enabling a smarter environment. This paper presents a theoretical and experimental perspective on smart-city-focused big data management and analysis by proposing a cloud-based analytics service. A proto...

  13. [Application of big data analyses for musculoskeletal cell differentiation].

    Science.gov (United States)

    Imai, Yuuki

    2016-04-01

    Next-generation sequencing has strongly advanced big data analyses in the life sciences. Among the various kinds of sequencing data sets, epigenetic platforms have become an important key to clarifying questions about broad and detailed phenomena in various forms of life. In this report, we introduce research on the identification of novel transcription factors in osteoclastogenesis using DNase-seq. Big data on musculoskeletal research will be organized by the IFMRS and is becoming more crucial.

  14. High temperature annealing effect on structural and magnetic properties of Ti/Ni multilayers

    International Nuclear Information System (INIS)

    Bhatt, Pramod; Ganeshan, V.; Reddy, V.R.; Chaudhari, S.M.

    2006-01-01

    The effect of high-temperature annealing on the structural and magnetic properties of Ti/Ni multilayers (MLs) up to 600 deg. C has been studied and is reported in this paper. Ti/Ni multilayer samples with constant layer thicknesses of 50 A each were deposited on float glass and Si(1 1 1) substrates using the electron-beam evaporation technique under ultra-high vacuum (UHV) conditions at room temperature. The micro-structural parameters and their evolution with temperature, for the as-deposited as well as the annealed multilayer samples up to 600 deg. C in steps of 100 deg. C for 1 h, were determined using X-ray diffraction (XRD) and grazing incidence X-ray reflectivity techniques. The X-ray diffraction pattern of the multilayer sample annealed at 300 deg. C shows an interesting structural transformation (from crystalline to amorphous) because of the solid-state reaction (SSR), with subsequent re-crystallization at higher annealing temperatures, particularly at ≥400 deg. C, due to the formation of TiNi3 and Ti2Ni alloy phases. Sample quality and surface morphology were examined using the atomic force microscopy (AFM) technique for both the as-deposited and the annealed multilayer samples. In addition, temperature-dependent dc resistivity measurements were used to study the structural transformation and subsequent alloy phase formation due to the annealing treatment. The corresponding magnetization behavior of the multilayer samples after each stage of annealing was investigated using the Magneto-Optical Kerr Effect (MOKE) technique, and the results are interpreted in terms of the observed micro-structural changes

  15. Effects of Surface Dipole Lengths on Evaporation of Tiny Water Aggregation

    International Nuclear Information System (INIS)

    Wang Shen; Wan Rongzheng; Fang Haiping; Tu Yusong

    2013-01-01

    Using molecular dynamics simulations, we compared the evaporation behavior of a tiny amount of water molecules adsorbed on solid surfaces with different dipole lengths, including surface dipole lengths of 1, 2, 4, 6 and 8 folds of 0.14 nm, and different charges from 0.1e to 0.9e. Surfaces with short dipole lengths (the 1-fold system) always maintain a hydrophobic character, and the evaporation speeds are not influenced whether the surface charges are enhanced or weakened; but when surface dipole lengths reach 8 folds, surfaces become more hydrophilic as the surface charge increases, and the evaporation speeds increase gradually and monotonically. By tuning dipole lengths from the 1-fold to the 8-fold systems, we confirmed the non-monotonic variation of the evaporation flux (first increasing, then decreasing) in the 4-fold system with charges (0.1e-0.7e), reported in our previous paper [S. Wang, et al., J. Phys. Chem. B 116 (2012) 13863], and also show the process from the enhancement of this unexpected non-monotonic variation to its vanishing as surface dipole lengths increase. Herein, we demonstrated two key factors that influence the evaporation flux of a tiny amount of water molecules adsorbed on solid surfaces: the exposed surface area of the water aggregation, from which the water molecules can evaporate directly, and the attraction potential from the substrate, which hinders evaporation. In addition, more interestingly, we showed an extra steric effect of surface dipoles on the further increase of the evaporation flux for the 2-fold, 4-fold, 6-fold and 8-fold systems with charges larger than about 0.7e. (The steric effect was first reported by some of the present authors [C. Wang, et al., Sci. Rep. 2 (2012) 358].) This study presents a complete physical picture of the influence of surface dipole lengths on the evaporation behavior of an adsorbed tiny amount of water. (condensed matter: structural, mechanical, and thermal properties)

  16. Tiny intracranial aneurysms: Endovascular treatment by coil embolisation or sole stent deployment

    International Nuclear Information System (INIS)

    Lu Jun; Liu Jiachun; Wang Lijun; Qi Peng; Wang Daming

    2012-01-01

    Purpose: Tiny intracranial aneurysms pose a significant therapeutic challenge for interventional neuroradiologists. The authors report their preliminary results of endovascular treatment of these aneurysms. Methods: Between January 2002 and December 2009, 52 tiny intracranial aneurysms (defined as ≤3 mm in maximum diameter) in 46 patients (22 men; mean age, 57.9 years) were treated by endosaccular coil embolisation or sole stent deployment in the parent artery. Of the 52 aneurysms, 29 had ruptured and 23 remained unruptured. The initial angiographic results, procedural complications, and clinical outcomes were assessed at discharge. Imaging follow-up was performed with cerebral angiography. Results: One aneurysm coiling procedure failed because of unsuccessful micro-catheterization. Forty-three aneurysms were successfully coil embolised, of which complete occlusion was obtained in 14, subtotal occlusion in 18 and incomplete occlusion in 11. The other 8 aneurysms were treated by sole stent deployment in the parent artery. Procedural complications (2 intraprocedural ruptures and 3 thromboembolic events) occurred in 5 (9.6%) of the 52 aneurysms, resulting in permanent morbidity in only 1 (2.2%, 1/46) patient. No rebleeding occurred during clinical follow-up (mean duration, 46.7 months). Of the 16 coiled aneurysms that received repeat angiography, 6 initially completely and 3 subtotally occluded aneurysms remained unchanged, while 4 initially subtotally and 3 incompletely occluded aneurysms progressed to total occlusion. Five sole-stent-deployed aneurysms received angiographic follow-up (mean duration, 10.0 months), of which 3 remained unchanged, 1 became smaller and 1 progressed to total occlusion. Conclusion: Endovascular treatment of tiny intracranial aneurysms is technically feasible and relatively safe. Coil embolisation seems to be effective in preventing early recanalisation, whereas the sole stenting technique needs further investigation to determine its effectiveness.

  17. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  18. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  19. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  20. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  1. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  2. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future may be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions

  3. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  4. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  5. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, the wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent the diversity of big data analytics workloads? Big data dwarfs are abstractions that capture frequently appearing operations in big data computing. One dwarf represen...

  6. What Lies Behind NSF Astronomer Demographics? Subjectivities of Women, Minorities and Foreign-born Astronomers within Meshworks of Big Science Astronomy

    Science.gov (United States)

    Guillen, Reynal; Gu, D.; Holbrook, J.; Murillo, L. F.; Traweek, S.

    2011-01-01

    Our current research focuses on the trajectory of scientists working with large-scale databases in astronomy, following them as they strategically build their careers and digital infrastructures and make their epistemological commitments. We look specifically at how gender, ethnicity, and nationality intersect in the process of subject formation in astronomy, as well as in the process of enrolling partners for the construction of instruments and the design and implementation of large-scale databases. Work once figured as merely technical support, such as assembling data catalogs, or as graphic design, generating pleasing images for public support, has been repositioned at the core of the field. Some have argued that such databases enable a new kind of scientific inquiry based on data exploration, such as the "fourth paradigm" or "data-driven" science. Our preliminary findings, based on oral history interviews and ethnography, provide insights into meshworks of women, African-American, "Hispanic," Asian-American and foreign-born astronomers. Our preliminary data suggest African-American men are more successful in sustaining astronomy careers than Chicano and Asian-American men. A distinctive theme in our data is the glocal character of meshworks available to and created by foreign-born women astronomers working at US facilities. Other data show that the proportions of Asian to Asian-American and of foreign-born Latina/o to Chicana/o astronomers are approximately equal. Furthermore, Asians and Latinas/os are represented in significantly greater numbers than Asian Americans and Chicanas/os. Among professional astronomers in the US, each ethnic minority group numbers on the order of tens, not hundreds. Project support is provided by the NSF EAGER program to the University of California, Los Angeles under award 0956589.

  7. Shape memory and pseudoelastic properties of Fe-Mn-Si and Ti-Ni based alloys

    International Nuclear Information System (INIS)

    Guenin, G.

    1997-01-01

    The aim of this presentation is to analyse and discuss some recent advances in shape memory and pseudoelastic properties of different alloys. Experimental work will be reviewed in connection with theoretical studies. The first part is devoted to the microstructural origin of shape memory properties of Fe-Mn-Si based alloys (γ-ε transformation); the second part is a synthetic analysis of the effects of thermomechanical treatments on shape memory and pseudoelastic effects in Ti-Ni alloys, with some focus on the behaviour of the R phase introduced. (orig.)

  8. Tiny optical fiber temperature sensor based on temperature-dependent refractive index of zinc telluride film

    Science.gov (United States)

    Bian, Qiang; Song, Zhangqi; Song, Dongyu; Zhang, Xueliang; Li, Bingsheng; Yu, Yang; Chen, Yuzhong

    2018-03-01

    The temperature-dependent refractive index of zinc telluride film can be used to develop a tiny, low-cost, film-coated optical fiber temperature sensor. A pulse reference-based compensation technique is used to largely reduce the background noise, which makes it possible to detect the minor reflectivity change of the film at different temperatures. The temperature sensitivity is 0.0034 dB/° and the background noise is measured to be 0.0005 dB, so the resolution can reach 0.2°.
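As a quick sanity check on the figures quoted in the abstract, the noise-limited resolution is the background noise divided by the sensitivity (values taken from the abstract; the sketch below is illustrative, not part of the original work):

```python
sensitivity = 0.0034  # dB per degree, from the abstract
noise = 0.0005        # dB, measured background noise
# Smallest temperature change whose signal exceeds the noise floor:
resolution = noise / sensitivity
print(round(resolution, 2))  # ~0.15 degrees, consistent with the ~0.2 degree figure quoted
```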

  9. Mechanical properties and related substructure of TiNi shape memory alloys

    International Nuclear Information System (INIS)

    Filip, P.; Kneissl, A.C.

    1995-01-01

    The mechanical properties of binary near equiatomic TiNi shape memory alloys were investigated after different types of mechanical and heat treatments. The changes of deformation behaviour are explained on the basis of substructure differences after work hardening. The ''elastic moduli'' of both the high-temperature phase B2 and the martensite B19' as well as the ''easy stage of deformation'' are dependent on the work hardening intensity and these changes are related to the mobility of B2/B19' interfaces. The martensite changes its morphology after work hardening. In contrast to a twinned martensite, typical for annealed alloys, the internally slipped martensite was detected after work hardening. (orig.)

  10. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  11. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
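To illustrate the kind of query IEJoin targets, here is a minimal, illustrative sketch (not the dissertation's implementation) of the sorted-order-plus-bit-array idea for a self-join with two inequality predicates. It assumes distinct attribute values and omits the bit-array compression of the real algorithm:

```python
def ie_self_join(rows):
    """Return pairs (i, j) with rows[i][0] < rows[j][0] and rows[i][1] > rows[j][1].

    Sketch of the IEJoin idea: two sort orders plus a bit array replace the
    naive double predicate check over all pairs. Assumes distinct x and y values.
    """
    n = len(rows)
    # L1: tuple ids sorted by x ascending; pos[i] = rank of tuple i in L1.
    L1 = sorted(range(n), key=lambda i: rows[i][0])
    pos = [0] * n
    for p, i in enumerate(L1):
        pos[i] = p
    # Visit tuples in decreasing y; every previously visited tuple has larger y.
    bits = [0] * n
    result = []
    for j in sorted(range(n), key=lambda i: -rows[i][1]):
        # Set bits mark visited tuples at their x-rank; set bits strictly left
        # of pos[j] belong to tuples with larger y AND smaller x -> matches.
        for p in range(pos[j]):
            if bits[p]:
                result.append((L1[p], j))
        bits[pos[j]] = 1
    return result
```

A brute-force double loop over all pairs gives the same answer and is a handy cross-check when experimenting with this sketch.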

  12. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedback of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from sensors from those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  13. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  14. Growth and surface morphology of ion-beam sputtered Ti-Ni thin films

    International Nuclear Information System (INIS)

    Rao, Ambati Pulla; Sunandana, C.S.

    2008-01-01

    Titanium-nickel thin films have been deposited on float glass substrates by ion beam sputtering in a 100% pure argon atmosphere. Sputtering is predominant for incident-ion energies in the region of 1000 eV to 100 keV. The as-deposited films were investigated by X-ray photoelectron spectroscopy (XPS) and atomic force microscopy (AFM). In this paper we study the surface morphology and elemental composition through AFM and XPS, respectively. Core level as well as valence band spectra of ion-beam sputtered Ti-Ni thin films at various Ar gas rates (5, 7 and 12 sccm) show that the thin film deposited at 3 sccm possesses two distinct peaks at binding energies of 458.55 eV and 464.36 eV, mainly due to TiO2. Upon increasing the Ar rate, oxidation of Ti-Ni is reduced and the Ti-2p peaks begin approaching those of pure elemental Ti; here the Ti-2p peaks are observed at binding energy positions of 454.7 eV and 460.5 eV. AFM results show that, upon increasing the Ar gas rate, the average grain size and roughness decrease from 2.90 μm to 0.096 μm and from 16.285 nm to 1.169 nm, respectively

  15. Shape memory characteristics of sputter-deposited Ti-Ni thin films

    International Nuclear Information System (INIS)

    Miyazaki, Shuichi; Ishida, Akira.

    1994-01-01

    Ti-Ni shape memory alloy thin films were deposited using an RF magnetron sputtering apparatus. The as-sputtered films were heat-treated in order to crystallize and memorize. After the heat treatment, the shape memory characteristics have been investigated using DSC and thermomechanical tests. Upon cooling the thin films, the solution-treated films showed a single peak in the DSC curve indicating a single stage transformation occurring from B2 to the martensitic phase, while the age-treated films showed double peaks indicating a two-stage transformation, i.e., from B2 to the R-phase, then to the martensitic phase. A perfect shape memory effect was achieved in these sputter-deposited Ti-Ni thin films in association both with the R-phase and martensitic transformations. Transformation temperatures increased linearly with increasing applied stress. The transformation strain also increased with increasing stress. The shape memory characteristics were strongly affected by heat-treatment conditions. (author)

  16. Detection of tiny amounts of fissile materials in large-sized containers with radioactive waste

    Directory of Open Access Journals (Sweden)

    Batyaev V.F.

    2018-01-01

    Full Text Available The paper is devoted to non-destructive control of tiny amounts of fissile materials in large-sized containers filled with radioactive waste (RAW). The aim of this work is to model an active neutron interrogation facility for detection of fissile materials inside NZK type containers with RAW and determine the minimal detectable mass of U-235 as a function of various parameters: matrix type, nonuniformity of container filling, neutron generator parameters (flux, pulse frequency, pulse duration, measurement time). As a result the dependence of minimal detectable mass on fissile materials location inside the container is shown. Nonuniformity of the thermal neutron flux inside a container is the main reason for the space-heterogeneity of minimal detectable mass inside a large-sized container. Our experiments with tiny amounts of uranium-235 (<1 g) confirm the detection of fissile materials in NZK containers by using the active neutron interrogation technique.

  17. Effect of Substrate Roughness on Adhesion and Structural Properties of Ti-Ni Shape Memory Alloy Thin Film.

    Science.gov (United States)

    Kim, Donghwan; Lee, Hyunsuk; Bae, Joohyeon; Jeong, Hyomin; Choi, Byeongkeun; Nam, Taehyun; Noh, Jungpil

    2018-09-01

    Ti-Ni shape memory alloy (SMA) thin films are very attractive materials for industrial and medical applications such as micro-actuators, micro-sensors, and stents for blood vessels. An important property besides the shape memory effect in the application of SMA thin films is the adhesion between the film and the substrate. When using thin films as micro-actuators or micro-sensors in MEMS, the film must be strongly adhered to the substrate. On the other hand, when using SMA thin films in medical devices such as stents, the deposited alloy thin film must be easily separable from the substrate for efficient processing. In this study, we investigated the effect of substrate roughness on the adhesion of Ti-Ni SMA thin films, as well as the structural properties and phase-transformation behavior of the fabricated films. Ti-Ni SMA thin films were deposited onto etched glass substrates by magnetron sputtering. Radio frequency plasma was used for etching the substrate. The adhesion properties were investigated through a progressive scratch test. Structural properties of the films were determined via field emission scanning electron microscopy, X-ray diffraction (XRD) and energy-dispersive X-ray spectroscopy analysis. Phase transformation behaviors were observed with differential scanning calorimetry and low-temperature XRD. A Ti-Ni SMA thin film deposited onto a rough substrate provides higher adhesive strength than one on a smooth substrate. However, the roughness of the substrate has no influence on the growth and crystallization of the Ti-Ni SMA thin films.

  18. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes difficult to process using traditional data processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, combat crime, and so on, we require larger data sets than before. Big data is hard to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we review the Hadoop architecture, the different tools used for big data, and its security issues.

  19. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  20. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  1. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  2. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  3. Implementing the “Big Data” Concept in Official Statistics

    Directory of Open Access Journals (Sweden)

    О. V.

    2017-02-01

    Full Text Available Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open economies. Statistical science should take into account such phenomena as gig-economy, common economy, institutional factors, etc. The concept of “Big Data” and open data are analyzed, problems of implementation of “Big Data” in the official statistics are shown. The ways of implementation of “Big Data” in the official statistics of Ukraine through active use of technological opportunities of mobile operators, navigation systems, surveillance cameras, social networks, etc. are presented. The possibilities of using “Big Data” in different sectors of the economy, also on the level of companies are shown. The problems of storage of large volumes of data are highlighted. The study shows that “Big Data” is a huge resource that should be used across the Ukrainian economy.

  4. Big Data and HPC: A Happy Marriage

    KAUST Repository

    Mehmood, Rashid

    2016-01-25

    International Data Corporation (IDC) defines Big Data technologies as “a new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data produced every day, by enabling high velocity capture, discovery, and/or analysis”. High Performance Computing (HPC) most generally refers to “the practice of aggregating computing power in a way that delivers much higher performance than one could get out of a typical desktop computer or workstation in order to solve large problems in science, engineering, or business”. Big data platforms are built primarily considering the economics and capacity of the system for dealing with the 4V characteristics of data. HPC traditionally has been more focussed on the speed of digesting (computing) the data. For these reasons, the two domains (HPC and Big Data) have developed their own paradigms and technologies. However, recently, these two have grown fond of each other. HPC technologies are needed by Big Data to deal with the ever increasing Vs of data in order to forecast and extract insights from existing and new domains, faster, and with greater accuracy. Increasingly more data is being produced by scientific experiments from areas such as bioscience, physics, and climate, and therefore, HPC needs to adopt data-driven paradigms. Moreover, there are synergies between them with unimaginable potential for developing new computing paradigms, solving long-standing grand challenges, and making new explorations and discoveries. Therefore, they must get married to each other. In this talk, we will trace the HPC and big data landscapes through time including their respective technologies, paradigms and major applications areas. Subsequently, we will present the factors that are driving the convergence of the two technologies, the synergies between them, as well as the benefits of their convergence to the biosciences field. The opportunities and challenges of the

  5. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  6. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, then moves from the fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented, and narrative. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  7. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, then moves from the fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented, and narrative. Volume 8 conveys, in an accessible way, relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  8. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, then moves from the fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented, and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  9. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, then moves from the fundamentals to applications, from the simple to the complex. Throughout, the language remains simple, everyday-oriented, and narrative. Volume 7, besides an introduction, covers many current aspects of quantum mechanics (e.g. teleportation) and electrodynamics (e.g. electrosmog), as well as the climate problem and chaos theory.

  10. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generica...

  11. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  12. Modeling and Analysis in Marine Big Data: Advances and Challenges

    Directory of Open Access Journals (Sweden)

    Dongmei Huang

    2015-01-01

    Full Text Available It is widely recognized that big data has gathered tremendous attention from academic research institutes, governments, and enterprises in all aspects of information sciences. With the development of a diversity of marine data acquisition techniques, marine data have grown exponentially in the last decade, forming marine big data. As an innovation, marine big data is a double-edged sword. On the one hand, there are many potential and highly useful values hidden in the huge volume of marine data, which is widely used in marine-related fields, such as tsunami and red-tide warning, prevention, and forecasting, disaster inversion, and visualization modeling after disasters. There is no doubt that future competition in marine sciences and technologies will converge on the exploration of marine data. On the other hand, marine big data also brings about many new challenges in data management, such as the difficulties in data capture, storage, analysis, and applications, as well as data quality control and data security. To highlight theoretical methodologies and practical applications of marine big data, this paper illustrates a broad view of marine big data and its management, makes a survey of key methods and models, introduces an engineering instance that demonstrates the management architecture, and discusses the existing challenges.

  13. Research on the Relationship between Science and Technology Talents' Big Five Personality Traits and Knowledge Sharing Performance

    Institute of Scientific and Technical Information of China (English)

    芮雪琴; 蒋媛卉

    2015-01-01

    This paper discusses the relationships among the big five personality traits of science and technology talents, the knowledge sharing environment, and knowledge sharing performance, constructs a theoretical model, and proposes research hypotheses. Using the big five personality inventory, a knowledge sharing environment scale, and a knowledge sharing performance scale as research tools, it surveys and interviews science and technology talents in typical enterprises with strong R&D capability, and applies multiple linear regression for empirical testing. The results show that the big five personality traits of enterprise science and technology talents can predict knowledge sharing performance, and that the knowledge sharing environment plays a partial mediating role between the personality traits of science and technology talents and knowledge sharing performance.

  14. EKALAVYA MODEL OF HIGHER EDUCATION – AN INNOVATION OF IBM’S BIG DATA UNIVERSITY

    OpenAIRE

    Dr. P. S. Aithal; Shubhrajyotsna Aithal

    2016-01-01

    Big Data Science is a new multi-disciplinary subject in society, comprising business intelligence, data analytics, and related fields that have become increasingly important in both the academic and business communities during the 21st century. Many organizations and business intelligence experts have foreseen significant development in the big data field as the next big wave of future research in many industry sectors and in society. To become an expert and skilled in this n...

  15. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 x 10^12 /cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed to a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed to a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  16. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  17. The Rise of Big Data in Oncology.

    Science.gov (United States)

    Fessele, Kristen L

    2018-05-01

    To describe big data and data science in the context of oncology nursing care. Peer-reviewed and lay publications. The rapid expansion of real-world evidence from sources such as the electronic health record, genomic sequencing, administrative claims and other data sources has outstripped the ability of clinicians and researchers to manually review and analyze it. To promote high-quality, high-value cancer care, big data platforms must be constructed from standardized data sources to support extraction of meaningful, comparable insights. Nurses must advocate for the use of standardized vocabularies and common data elements that represent terms and concepts that are meaningful to patient care. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  19. Concurrence of big data analytics and healthcare: A systematic review.

    Science.gov (United States)

    Mehta, Nishita; Pandit, Anil

    2018-06-01

The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption in healthcare. It also intends to identify strategies to overcome those challenges. A systematic search of the articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals, etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique in healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations and reduction of the cost of care; (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of

  20. Database Resources of the BIG Data Center in 2018.

    Science.gov (United States)

    2018-01-04

    The BIG Data Center at Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides freely open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. The Big Science of stockpile stewardship

    Energy Technology Data Exchange (ETDEWEB)

    Reis, Victor H.; Hanrahan, Robert J.; Levedahl, W. Kirk

    2016-08-15

    In the quarter century since the US last exploded a nuclear weapon, an extensive research enterprise has maintained the resources and know-how needed to preserve confidence in the country’s stockpile.

  2. ISOLDE takes big science to nanoscale

    CERN Multimedia

    Roberto Cantoni

    2010-01-01

New materials that could replace the semiconductors currently used in Blu-ray and other electronic devices, cost-efficient silicon for a new generation of solar panels, innovative investigation techniques for archaeology, biophysics and biochemistry… behind all this are studies using nuclear hyperfine interactions. Of paramount importance in such studies is the availability of a large variety of radioactive ion beams: at CERN, these are produced by the ISOLDE facility.   Students from the University of Leuven and ITN Lisbon working at ISOLDE on a technique used to locate impurities in materials. Nuclear hyperfine interactions and their wide range of applications were the focus of the third Joint International Conference on Hyperfine Interactions and International Symposium on Nuclear Quadrupole Interactions, held at CERN from 12 to 17 September. The conference featured theoretical talks but also studies on magnetic materials, semiconductors, thin films, nano-structures and quantum...

  3. Science gateways for biomedical big data analysis

    NARCIS (Netherlands)

    Shahand, S.

    2015-01-01

    Biomedical researchers are facing data deluge challenges such as dealing with large volume of complex heterogeneous data and complex and computationally demanding data processing methods. Such scale and complexity of biomedical research requires multi-disciplinary collaboration between scientists

  4. Effect of phase formation on valence band photoemission and photoresonance study of Ti/Ni multilayers using synchrotron radiation

    International Nuclear Information System (INIS)

    Bhatt, Pramod; Chaudhari, S.M.

    2006-01-01

This paper presents an investigation of Ti-Ni alloy phase formation and its effect on valence band (VB) photoemission and photoresonance of as-deposited as well as annealed Ti/Ni multilayers (MLs) up to 600 deg. C using synchrotron radiation. For this purpose, [Ti (50 Å)/Ni (50 Å)] × 10 ML structures were deposited using the electron-beam evaporation technique under ultra-high vacuum (UHV) conditions. The formation of different phases of Ti-Ni alloy due to the annealing treatment has been confirmed by the X-ray diffraction (XRD) technique. The XRD pattern corresponding to the as-deposited ML sample shows the crystalline nature of both the Ti and Ni deposited layers, whereas the ML sample annealed at 300 deg. C shows a solid-state reaction (SSR) leading to amorphization and subsequent recrystallization at higher annealing temperatures (≥400 deg. C) with the formation of TiNi, TiNi3 and Ti2Ni alloy phases. The survey scans corresponding to the ML samples annealed at 400, 500 and 600 deg. C show interdiffusion and intermixing of Ni atoms into the Ti layers, leading to chemical Ti-Ni alloy phase formation at the interface. The VB spectra recorded at 134 eV using synchrotron radiation on the as-deposited ML sample with successive sputtering show alternating photoemission bands due to Ti 3d and Ni 3d, respectively, indicating that there is no mixing of the consecutive layers and no phase formation at the interface during deposition. However, the ML samples annealed at higher temperatures, particularly at 400, 500 and 600 deg. C, show a clear shift of the Ni 3d band and its satellite peak position to the higher BE side, indicating Ti-Ni alloy phase formation. In addition, a reduction of the satellite peak intensity and of the Ni 3d density of states (DOS) near the Fermi level is also observed due to Ti-Ni phase formation at higher annealing temperatures. The variable photon energy VB measurements on the as-deposited ML sample and the sample annealed at 400 deg. C confirm the existence and BE position of the observed Ni 3d satellite

  5. Fixing the Big Bang Theory's Lithium Problem

    Science.gov (United States)

    Kohler, Susanna

    2017-02-01

How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the cosmological lithium problem. Have scientists now found a solution? Too Much Lithium: In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble-spot remains: the abundance of lithium. The arrows show the primary reactions involved in Big Bang nucleosynthesis, and their flux ratios, as predicted by the authors' model, are given on the right. Synthesizing primordial elements is complicated! [Hou et al. 2017] According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe's existence. This produced most of the universe's helium and small amounts of other light nuclides, including deuterium and lithium. But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the cosmological lithium problem, and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful. In a recent publication led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences) and advisor Jianjun He (Institute of Modern Physics/National Astronomical Observatories, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem. Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe. The authors' model (dotted lines

  6. Evaporation of tiny water aggregation on solid surfaces with different wetting properties.

    Science.gov (United States)

    Wang, Shen; Tu, Yusong; Wan, Rongzheng; Fang, Haiping

    2012-11-29

The evaporation of a tiny amount of water on solid surfaces with different wettabilities has been studied by molecular dynamics simulations. From nonequilibrium MD simulations, we found that, as the surface changed from hydrophobic to hydrophilic, the evaporation speed did not show a monotonic decrease as intuitively expected, but increased first and then decreased after reaching a maximum value. Analysis of the simulation trajectory and calculation of the surface-water interaction illustrate that the unexpected behavior of the evaporation results from the competition between the number of water molecules at the water-gas surface, from where water molecules can evaporate, and the potential barrier that prevents those water molecules from evaporating. This finding is helpful in understanding evaporation on biological surfaces, designing artificial surfaces for ultrafast water evaporation, or preserving water in soil.
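The competition the abstract describes, between the number of liquid-gas interface molecules and the escape barrier, can be illustrated with a toy model. This is not the paper's MD setup: the interaction-strength parameter `eps` and both functional forms below are invented for illustration only. The product of a growing "exposed molecules" term and a shrinking Boltzmann escape factor peaks at an intermediate wettability, reproducing the non-monotonic trend qualitatively.

```python
import math

def evaporation_rate(eps, n0=1000.0, kT=1.0):
    """Toy evaporation rate: exposed surface molecules x Boltzmann escape factor.

    `eps` is a made-up surface-water attraction strength (hydrophobic -> small,
    hydrophilic -> large); both functional forms are illustrative assumptions.
    """
    # Stronger attraction spreads the droplet, exposing more liquid-gas surface.
    n_surface = n0 * (1.0 - math.exp(-eps))
    # Stronger attraction also deepens the well a molecule must escape from.
    barrier = 2.0 * eps
    return n_surface * math.exp(-barrier / kT)

# Scan wettabilities: the rate rises, peaks, then falls, mirroring the
# non-monotonic behaviour reported in the simulations.
rates = [(e / 10.0, evaporation_rate(e / 10.0)) for e in range(1, 50)]
peak_eps, peak_rate = max(rates, key=lambda t: t[1])
```

Any pair of monotone factors with these opposite trends gives the same qualitative picture; the point is only that a maximum at intermediate wettability needs no exotic mechanism.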

  7. Geology and geochronology of the Sub-Antarctic Snares Islands/Tini Heke, New Zealand

    DEFF Research Database (Denmark)

    Scott, JM; Turnbull, IM; Sagar, MW

    2015-01-01

The first comprehensive geological map, a summary of lithologies and new radiogenic isotope data (U–Pb, Rb–Sr) are presented for crystalline rocks of the Sub-Antarctic Snares Islands/Tini Heke, 150 km south of Stewart Island. The main lithology is Snares Granite (c. 109 Ma from U–Pb dating... are prismatic and yield an essentially unimodal age population of c. 116 Ma that is within error of the granodiorite. These properties suggest that the dated raft represents a meta-igneous rock despite its mica-rich nature. Some schistose rocks on the Western Chain contain coarse relict plagioclase phenocrysts and appear to have an igneous protolith. No conclusive metasedimentary rocks have been identified, although sillimanite-bearing mica-rich schist occurs on Rua. Deformation of the crystalline rocks occurred after Snares Granite intrusion and before cooling below muscovite K–Ar closure at 400 ± 50 °C at 95 Ma...

  8. Tiny individuals attached to a new Silurian arthropod suggest a unique mode of brood care

    Science.gov (United States)

    Briggs, Derek E. G.; Siveter, Derek J.; Siveter, David J.; Sutton, Mark D.

    2016-04-01

    The ˜430-My-old Herefordshire, United Kingdom, Lagerstätte has yielded a diversity of remarkably preserved invertebrates, many of which provide fundamental insights into the evolutionary history and ecology of particular taxa. Here we report a new arthropod with 10 tiny arthropods tethered to its tergites by long individual threads. The head of the host, which is covered by a shield that projects anteriorly, bears a long stout uniramous antenna and a chelate limb followed by two biramous appendages. The trunk comprises 11 segments, all bearing limbs and covered by tergites with long slender lateral spines. A short telson bears long parallel cerci. Our phylogenetic analysis resolves the new arthropod as a stem-group mandibulate. The evidence suggests that the tethered individuals are juveniles and the association represents a complex brooding behavior. Alternative possibilities—that the tethered individuals represent a different epizoic or parasitic arthropod—appear less likely.

  9. Tiny Grains Give Huge Gains: Nanocrystal–Based Signal Amplification for Biomolecule Detection

    Science.gov (United States)

    Tong, Sheng; Ren, Binbin; Zheng, Zhilan; Shen, Han; Bao, Gang

    2013-01-01

    Nanocrystals, despite their tiny sizes, contain thousands to millions of atoms. Here we show that the large number of atoms packed in each metallic nanocrystal can provide a huge gain in signal amplification for biomolecule detection. We have devised a highly sensitive, linear amplification scheme by integrating the dissolution of bound nanocrystals and metal-induced stoichiometric chromogenesis, and demonstrated that signal amplification is fully defined by the size and atom density of nanocrystals, which can be optimized through well-controlled nanocrystal synthesis. Further, the rich library of chromogenic reactions allows implementation of this scheme in various assay formats, as demonstrated by the iron oxide nanoparticle linked immunosorbent assay (ILISA) and blotting assay developed in this study. Our results indicate that, owing to the inherent simplicity, high sensitivity and repeatability, the nanocrystal based amplification scheme can significantly improve biomolecule quantification in both laboratory research and clinical diagnostics. This novel method adds a new dimension to current nanoparticle-based bioassays. PMID:23659350
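The core idea above, signal gain set by the number of atoms packed in each dissolved nanocrystal, is simple sphere geometry. A back-of-the-envelope sketch (the atom density below is an approximate figure for Fe in magnetite, assumed for illustration; it is not a value taken from the paper):

```python
import math

FE_PER_NM3 = 40.6  # assumed: approx. Fe atoms per cubic nm in magnetite (Fe3O4)

def fe_atoms_per_particle(diameter_nm, atom_density=FE_PER_NM3):
    """Fe atoms released when one spherical nanocrystal dissolves."""
    volume = math.pi / 6.0 * diameter_nm ** 3  # sphere volume in nm^3
    return atom_density * volume

# Each bound 10 nm particle yields tens of thousands of chromogenic ions,
# so the per-binding-event signal gain scales with diameter cubed.
gain_10nm = fe_atoms_per_particle(10.0)
```

The cubic scaling is why the abstract can say amplification "is fully defined by the size and atom density of nanocrystals": doubling the diameter multiplies the per-event signal by eight.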

  10. Macronuclear genome structure of the ciliate Nyctotherus ovalis: Single-gene chromosomes and tiny introns

    Directory of Open Access Journals (Sweden)

    Landweber Laura F

    2008-12-01

Background: Nyctotherus ovalis is a single-celled eukaryote that has hydrogen-producing mitochondria and lives in the hindgut of cockroaches. Like all members of the ciliate taxon, it has two types of nuclei, a micronucleus and a macronucleus. N. ovalis generates its macronuclear chromosomes by forming polytene chromosomes that subsequently develop into macronuclear chromosomes by DNA elimination and rearrangement. Results: We examined the structure of these gene-sized macronuclear chromosomes in N. ovalis. We determined the telomeres, subtelomeric regions, UTRs, coding regions and introns by sequencing a large set of macronuclear DNA sequences (4,242) and cDNAs (5,484) and comparing them with each other. The telomeres consist of repeats CCC(AAAACCCC)n, similar to those in spirotrichous ciliates such as Euplotes, Sterkiella (Oxytricha) and Stylonychia. Per sequenced chromosome we found evidence for either a single protein-coding gene, a single tRNA, or the complete ribosomal RNA cluster. Hence the chromosomes appear to encode single transcripts. In the short subtelomeric regions we identified a few overrepresented motifs that could be involved in gene regulation, but there is no consensus polyadenylation site. The introns are short (21–29 nucleotides), and a significant fraction (1/3) of the tiny introns is conserved in the distantly related ciliate Paramecium tetraurelia. As has been observed in P. tetraurelia, the N. ovalis introns tend to contain in-frame stop codons or have a length that is not divisible by three. This pattern causes premature termination of mRNA translation in the event of intron retention, and potentially degradation of unspliced mRNAs by the nonsense-mediated mRNA decay pathway. Conclusion: The combination of short leaders, tiny introns and single genes leads to very minimal macronuclear chromosomes. The smallest we identified contained only 150 nucleotides.
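The observation that retained introns disrupt translation, because they either carry an in-frame stop codon or shift the reading frame, can be expressed as a simple check. This is an illustrative sketch, not the authors' analysis pipeline, and the frame handling is deliberately simplified:

```python
STOP_CODONS = {"TAA", "TAG", "TGA"}

def retention_disrupts_orf(intron, frame=0):
    """True if retaining this intron in the mRNA would break translation.

    Either the intron length is not a multiple of three (frameshift for all
    downstream codons) or the intron contains a stop codon in the reading
    frame entering it. `frame` is the codon offset at the 5' splice site;
    frame 0 is assumed by default for illustration.
    """
    if len(intron) % 3 != 0:
        return True  # frameshift downstream of the retained intron
    for i in range(frame, len(intron) - 2, 3):
        if intron[i:i + 3].upper() in STOP_CODONS:
            return True  # premature termination codon
    return False
```

Under this rule, an intron escapes both conditions only if its length is divisible by three and it happens to contain no in-frame stop, which is exactly the pattern the abstract reports as rare in N. ovalis.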

  11. Pollination networks of oil-flowers: a tiny world within the smallest of all worlds.

    Science.gov (United States)

    Bezerra, Elisângela L S; Machado, Isabel C; Mello, Marco A R

    2009-09-01

    1. In the Neotropics, most plants depend on animals for pollination. Solitary bees are the most important vectors, and among them members of the tribe Centridini depend on oil from flowers (mainly Malpighiaceae) to feed their larvae. This specialized relationship within 'the smallest of all worlds' (a whole pollination network) could result in a 'tiny world' different from the whole system. This 'tiny world' would have higher nestedness, shorter path lengths, lower modularity and higher resilience if compared with the whole pollination network. 2. In the present study, we contrasted a network of oil-flowers and their visitors from a Brazilian steppe ('caatinga') to whole pollination networks from all over the world. 3. A network approach was used to measure network structure and, finally, to test fragility. The oil-flower network studied was more nested (NODF = 0.84, N = 0.96) than all of the whole pollination networks studied. Average path lengths in the two-mode network were shorter (one node, both for bee and plant one-mode network projections) and modularity was lower (M = 0.22 and four modules) than in all of the whole pollination networks. Extinctions had no or small effects on the network structure, with an average change in nestedness smaller than 2% in most of the cases studied; and only two species caused coextinctions. The higher the degree of the removed species, the stronger the effect and the higher the probability of a decrease in nestedness. 4. We conclude that the oil-flower subweb is more cohesive and resilient than whole pollination networks. Therefore, the Malpighiaceae have a robust pollination service in the Neotropics. Our findings reinforce the hypothesis that each ecological service is in fact a mosaic of different subservices with a hierarchical structure ('webs within webs').
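The nestedness metric reported above (NODF) can be computed directly from a binary visitation matrix. A minimal sketch following the paired-overlap definition of Almeida-Neto et al. (2008); the example matrices below are invented, with rows standing for plants and columns for bees:

```python
def nodf(matrix):
    """NODF nestedness (0-100) of a binary interaction matrix.

    For each pair of rows (and columns) with strictly different fills,
    score the percentage of the sparser line's 1s that are shared with
    the fuller line; NODF is the mean over all row and column pairs.
    """
    def paired_score(lines):
        total = 0.0
        for i in range(len(lines)):
            for j in range(i + 1, len(lines)):
                hi, lo = lines[i], lines[j]
                if sum(hi) < sum(lo):
                    hi, lo = lo, hi
                if sum(hi) == sum(lo) or sum(lo) == 0:
                    continue  # equal fills (or an empty line) score zero
                shared = sum(a & b for a, b in zip(hi, lo))
                total += 100.0 * shared / sum(lo)
        return total

    cols = [list(c) for c in zip(*matrix)]
    n_pairs = (len(matrix) * (len(matrix) - 1) / 2
               + len(cols) * (len(cols) - 1) / 2)
    return (paired_score(matrix) + paired_score(cols)) / n_pairs

# A perfectly nested 3x3 matrix scores 100; a checkerboard scores 0.
perfect = nodf([[1, 1, 1], [1, 1, 0], [1, 0, 0]])
checker = nodf([[1, 0], [0, 1]])
```

On real data the matrix is built from observed visits (1 if a bee species visits a plant species), and values near 100, as in the oil-flower subweb, indicate that specialists interact with proper subsets of the generalists' partners.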

  12. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education, as well as critically exploring the perils of applying a data-driven approach to education. Despite the claimed value of the...

  13. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  14. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    Science.gov (United States)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: click on "Guest")

  15. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Finally, the methodology and working of a system that will use this data are briefly described.

  16. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

Big data is data beyond the storage capacity and processing power of traditional systems. The term is used for data sets so large or complex that traditional data-processing tools cannot handle them. Big data size is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; the amount of data produced by people, for example on social networking sites, is growing rapidly every year. Big data is not only data: it has become a complete subject encompassing various tools, techniques and frameworks. It reflects the explosive growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of a massive scale. Such data are difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to acquire, organize and analyse these various types of data. In this paper we describe applications, problems and tools of big data and give an overview of the field.

  17. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general-relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  18. Classification, (big) data analysis and statistical learning

    CERN Document Server

    Conversano, Claudio; Vichi, Maurizio

    2018-01-01

    This edited book focuses on the latest developments in classification, statistical learning, data analysis and related areas of data science, including statistical analysis of large datasets, big data analytics, time series clustering, integration of data from different sources, as well as social networks. It covers both methodological aspects as well as applications to a wide range of areas such as economics, marketing, education, social sciences, medicine, environmental sciences and the pharmaceutical industry. In addition, it describes the basic features of the software behind the data analysis results, and provides links to the corresponding codes and data sets where necessary. This book is intended for researchers and practitioners who are interested in the latest developments and applications in the field. The peer-reviewed contributions were presented at the 10th Scientific Meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in Santa Margherita di Pul...

  19. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today, truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  20. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  1. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, the healthcare system and others all use piles of data, which are further used for creating reports in order to ensure continuity of the services they have to offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  2. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than the data itself, realizing that there may be only a few generic processes involved in this, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  3. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  4. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  5. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  6. Making big sense from big data in toxicology by read-across.

    Science.gov (United States)

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in the safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data are fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Finally, REACH-across, a new web-based tool under development that aims to support and automate structure-based read-across, is presented.

  7. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, 3He, 4He, and 7Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and 4He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, 7Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed

  8. Adapting bioinformatics curricula for big data.

    Science.gov (United States)

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. © The Author 2015. Published by Oxford University Press.

  9. Adapting bioinformatics curricula for big data

    Science.gov (United States)

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  10. TinyONet: A Cache-Based Sensor Network Bridge Enabling Sensing Data Reusability and Customized Wireless Sensor Network Services

    Science.gov (United States)

    Jung, Eui-Hyun; Park, Yong-Jin

    2008-01-01

    In recent years, several protocol bridge research projects have been announced that enable seamless integration of Wireless Sensor Networks (WSNs) with the TCP/IP network. These studies have ensured transparent end-to-end communication between the two network sides in a node-centric manner. Researchers expect this integration to trigger the development of various application domains. However, prior research projects have not fully explored some essential features of WSNs, especially the reusability of sensing data and data-centric communication. To resolve these issues, we suggest a new protocol bridge system named TinyONet. In TinyONet, virtual sensors act as virtual counterparts of physical sensors and dynamically group to form a functional entity called a Slice. Instead of interacting directly with individual physical sensors, each sensor application uses its own WSN service provided by Slices. If a new kind of service is required in TinyONet, the corresponding function can be added dynamically at runtime. Besides data-centric communication, the system also supports node-centric communication and synchronous access. To show the effectiveness of the system, we implemented TinyONet on an embedded Linux machine and evaluated it with several experimental scenarios. PMID:27873968

  11. [Big data from clinical routine].

    Science.gov (United States)

    Mansmann, U

    2018-04-01

    Over the past 100 years, evidence-based medicine has undergone several fundamental changes. Through the field of physiology, medical doctors were introduced to the natural sciences. Since the late 1940s, randomized and epidemiological studies have provided the evidence for medical practice, which led to the emergence of clinical epidemiology as a new field in the medical sciences. Within the past few years, big data has become the driving force behind the vision of a comprehensive set of health-related data that tracks the healthcare histories of individuals and, consequently, of large populations. The aim of this article is to discuss the implications of data-driven medicine and to examine how it can find a place within clinical care. The EU-wide discussion on the development of data-driven medicine is presented. The following features and suggested actions were identified: harmonizing data formats, data processing and analysis, data exchange, related legal frameworks, and ethical challenges. For the effective development of data-driven medicine, pilot projects need to be conducted to allow for open and transparent discussion of the advantages and challenges. The Arthromark project of the Federal Ministry of Education and Research ("Bundesministerium für Bildung und Forschung," BMBF) is an important example, as is the BMBF Medical Informatics Initiative. The digital revolution affects clinical practice. Data can be generated and stored in quantities that are almost unimaginable. It is possible to take advantage of this for the development of a learning healthcare system if the principles of medical evidence generation are integrated into innovative IT infrastructures and processes.

  12. Evolution of the Air Toxics under the Big Sky Program

    Science.gov (United States)

    Marra, Nancy; Vanek, Diana; Hester, Carolyn; Holian, Andrij; Ward, Tony; Adams, Earle; Knuth, Randy

    2011-01-01

    As a yearlong exploration of air quality and its relation to respiratory health, the "Air Toxics Under the Big Sky" program offers opportunities for students to learn and apply science process skills through self-designed inquiry-based research projects conducted within their communities. The program follows a systematic scope and sequence…

  13. NASA EOSDIS Evolution in the BigData Era

    Science.gov (United States)

    Lynnes, Christopher

    2015-01-01

    NASA's EOSDIS system faces several challenges in the Big Data Era. Although volumes are large (but not unmanageably so), the variety of different data collections is daunting. That variety also brings with it a large and diverse user community. One key evolution EOSDIS is working toward is to enable more science analysis to be performed close to the data.

  14. Big Data analytics in the Geo-Spatial Domain

    NARCIS (Netherlands)

    R.A. Goncalves (Romulo); M.G. Ivanova (Milena); M.L. Kersten (Martin); H. Scholten; S. Zlatanova; F. Alvanaki (Foteini); P. Nourian (Pirouz); E. Dias

    2014-01-01

    Big data collections in many scientific domains have inherently rich spatial and geo-spatial features. Spatial location is among the core aspects of data in Earth observation sciences, astronomy, and seismology to name a few. The goal of our project is to design an efficient data

  15. The Big Bang: UK Young Scientists' and Engineers' Fair 2010

    Science.gov (United States)

    Allison, Simon

    2010-01-01

    The Big Bang: UK Young Scientists' and Engineers' Fair is an annual three-day event designed to promote science, technology, engineering and maths (STEM) careers to young people aged 7-19 through experiential learning. It is supported by stakeholders from business and industry, government and the community, and brings together people from various…

  16. Addressing big data challenges for scientific data infrastructure

    NARCIS (Netherlands)

    Demchenko, Y.; Zhao, Z.; Grosso, P.; Wibisono, A.; de Laat, C.

    2012-01-01

    This paper discusses the challenges that are imposed by Big Data Science on the modern and future Scientific Data Infrastructure (SDI). The paper refers to different scientific communities to define requirements on data management, access control and security. The paper introduces the Scientific

  17. Steering with big words: articulating ideographs in research programs

    NARCIS (Netherlands)

    Bos, Colette; Walhout, Bart; Walhout, Bart; Peine, Alexander; van Lente, Harro

    2014-01-01

    Nowadays, science should address societal challenges, such as ‘sustainability’, or ‘responsible research and innovation’. This emerging form of steering toward broad and generic goals involves the use of ‘big words’: encompassing concepts that are uncontested themselves, but that allow for multiple

  18. Steering with big words: articulating ideographs in nanotechnology

    NARCIS (Netherlands)

    Bos, Colette; Walhout, Albert; Peine, Alex; van Lente, Harro

    2014-01-01

    Nowadays, science should address societal challenges, such as ‘sustainability’, or ‘responsible research and innovation’. This emerging form of steering toward broad and generic goals involves the use of ‘big words’: encompassing concepts that are uncontested themselves, but that allow for multiple

  19. Recreating Big Bang to learn more about universe

    CERN Multimedia

    2005-01-01

    A multi-nation effort at the Geneva-based CERN laboratory to recreate conditions existing just after the Big Bang could give vital clues to the creation of the universe and help overcome prejudices against this widely held scientific theory, an eminent science writer said in Kolkata on Tuesday

  20. Technology for Mining the Big Data of MOOCs

    Science.gov (United States)

    O'Reilly, Una-May; Veeramachaneni, Kalyan

    2014-01-01

    Because MOOCs bring big data to the forefront, they confront learning science with technology challenges. We describe an agenda for developing technology that enables MOOC analytics. Such an agenda needs to efficiently address the detailed, low level, high volume nature of MOOC data. It also needs to help exploit the data's capacity to reveal, in…