WorldWideScience

Sample records for big heads small

  1. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  2. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  3. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially wide-spread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
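The gap between nominal and actual error rates that the abstract describes is easy to demonstrate by simulation. The sketch below is a hedged illustration, not the authors' method: it runs many one-sample t-tests on skewed (exponential) data for which the null hypothesis is actually true, and the empirical rejection rate comes out well above the nominal level at small n.

```python
import numpy as np

rng = np.random.default_rng(42)

def actual_type1_rate(n=5, n_tests=100_000, t_crit=4.604):
    """Empirical type I error of a two-sided one-sample t-test at
    nominal alpha = 0.01 (t_crit = 4.604 is the df = 4 critical value),
    run on skewed data: exponential with true mean 1, so H0: mu = 1 holds."""
    x = rng.exponential(scale=1.0, size=(n_tests, n))
    tstat = (x.mean(axis=1) - 1.0) / (x.std(axis=1, ddof=1) / np.sqrt(n))
    return (np.abs(tstat) > t_crit).mean()

rate = actual_type1_rate()  # well above the nominal 0.01 at n = 5
```

The excess comes almost entirely from one tail, which is exactly the kind of skewness-driven departure that Edgeworth expansions quantify.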

  4. Small millets, big potential

    International Development Research Centre (IDRC) Digital Library (Canada)

    consumption of small millets, mainly due to limited productivity, high ... for effective integration of small millets in the ... replicated in other cities. ... to micro-, small- and medium-entrepreneurs producing millet-based ... and Activities Network,.

  5. Small dose... big poison.

    Science.gov (United States)

    Braitberg, George; Oakley, Ed

    2010-11-01

    It is not possible to identify all toxic substances in a single journal article. However, there are some exposures that in small doses are potentially fatal. Many of these exposures are particularly toxic to children. Using data from poison control centres, it is possible to recognise this group of exposures. This article provides information to assist the general practitioner to identify potential toxic substance exposures in children. In this article the authors report the signs and symptoms of toxic exposures and identify the time of onset. Where clear recommendations on the period of observation and known fatal dose are available, these are provided. We do not discuss management or disposition, and advise readers to contact the Poison Information Service or a toxicologist for this advice.

  6. Starting Small, Thinking Big - Continuum Magazine | NREL

    Science.gov (United States)

NREL Helps Agencies Target New Federal Sustainability Goals; student engagements help solar power in the territory. Photo by Don Buchanan, VIEO. NREL's cross-organizational work supports these efforts.

  7. Small quarks make big nuggets

    International Nuclear Information System (INIS)

    Deligeorges, S.

    1985-01-01

After a brief review of the classification of subatomic particles, this paper deals with quark nuggets: particles with more than three quarks in a single big bag, called a ''nuclearite''. Neutron stars are, in fact, big sacks of quarks, gigantic nuggets. Physicists are now trying to calculate which type of nugget of strange quark matter is stable, and what influence quark nuggets had on primordial nucleosynthesis. At present, it is thought that if these ''nuggets'' exist, and in large proportion, they may be candidates for the missing mass

  8. Radioactivity: ''small users, big problems''

    International Nuclear Information System (INIS)

    McDonnell, C.

    1993-01-01

    In the United Kingdom there are at least one thousand small users of radioactivity in industry, in medicine, in higher education establishments and even schools. These users of small amounts of radioactivity, covering a wide variety of forms and applications, have difficulty in disposing of their wastes. Disposal provisions for users outside the nuclear industry, the practical problems they encounter and the future developments likely are discussed. (UK)

  9. The big head and the long tail

    DEFF Research Database (Denmark)

    Helles, Rasmus

    2013-01-01

This paper discusses how the advent of big data challenges established theories in Internet studies to redevelop existing explanatory strategies in order to incorporate the possibilities offered by this new empirical resource. The article suggests that established analytical procedures and theoretical frameworks used in Internet studies can be fruitfully employed to explain high-level structural phenomena that are only observable through the use of big data. The present article exemplifies this by offering a detailed analysis of how genre analysis of Web sites may be used to shed light on the generative mechanism behind the long-tail distribution of Web site use. The analysis shows that the long tail should be seen as a tiered version of popular top sites, and argues that downsizing of large-scale datasets in combination with qualitative and/or small-scale quantitative procedures may provide...

  10. Small head size after atomic irradiation

    International Nuclear Information System (INIS)

    Miller, R.W.; Mulvihill, J.J.

    1975-01-01

A study of children exposed to the nuclear explosions in Hiroshima and Nagasaki showed small head size and mental retardation when exposure occurred at less than 18 weeks of gestational age. The frequency of small head size increased when maternal exposure was 10 to 19 rad. Tables and graphs are presented to show the relationships between dose, gestational age, and frequency of small head size.

  11. Diet of the endangered big-headed turtle Platysternon megacephalum

    Directory of Open Access Journals (Sweden)

    Yik-Hei Sung

    2016-12-01

Populations of the big-headed turtle Platysternon megacephalum are declining at unprecedented rates across most of its distribution in Southeast Asia owing to unsustainable harvest for the pet, food, and Chinese medicine markets. Research on Asian freshwater turtles becomes more challenging as populations decline, and basic ecological information is needed to inform conservation efforts. We examined fecal samples collected from P. megacephalum in five streams in Hong Kong to quantify the diet, and we compared the germination success of ingested and uningested seeds. Fruits, primarily of Machilus spp., were most frequently consumed, followed by insects, plant matter, crabs and mollusks. The niche breadth of adults was wider than that of juveniles. Diet composition differed between sites, possibly attributable to a history of illegal trapping at some sites that reduced the proportion of larger and older individuals. Digestion of Machilus spp. fruits by P. megacephalum enhanced the germination success of seeds by about 30%. However, most digested seeds are likely defecated in water in this highly aquatic species, which limits the potential benefit to dispersal. The results of our study can be used by conservation-related captive breeding programs to ensure a more optimal diet is provided to captive P. megacephalum.

  12. Small decisions with big impact on data analytics

    OpenAIRE

    Jana Diesner

    2015-01-01

    Big social data have enabled new opportunities for evaluating the applicability of social science theories that were formulated decades ago and were often based on small- to medium-sized samples. Big Data coupled with powerful computing has the potential to replace the statistical practice of sampling and estimating effects by measuring phenomena based on full populations. Preparing these data for analysis and conducting analytics involves a plethora of decisions, some of which are already em...

  13. How big is big and how small is small the sizes of everything and why

    CERN Document Server

    Smith, Timothy Paul

    2013-01-01

This book is about how big the universe is and how small quarks are, and about the sizes of dozens of things between these two extremes. It describes the sizes of atoms and planets, quarks and galaxies, cells and sequoias. It is a romp through forty-five orders of magnitude, from the smallest sub-nuclear particles we have measured to the edge of the observed universe. It also looks at time, from the epic age of the cosmos to the fleeting lifetimes of ethereal particles. It is a narrative that trips its way from stellar magnitudes to the clocks on GPS satellites, from the nearly logarithmic scales of a piano keyboard through a system of numbers invented by Archimedes and on to the measurement of the size of an atom. Why do some things happen at certain scales? Why are cells a hundred-thousandth of a meter across? Why are stars never smaller than about 100 million meters in diameter? Why are trees limited to about 120 meters in height? Why are planets spherical, but asteroids not? Often the size of an objec...

  14. A Big Year for Small Bodies

    Science.gov (United States)

    Mayo, Louis; Erickson, K.

    2013-10-01

2013 is a watershed year for celestial events involving the solar system’s unsung heroes, small bodies. These include the Cosmic Valentine of asteroid 2012 DA14, which passed within ~3.5 Earth radii of the Earth's surface (February 15, 2013); Comet C/2011 L4 (PANSTARRS); and the Thanksgiving 2013 pass of Comet ISON, which will pass less than 0.012 AU (1.8 million km) from the solar surface and could be visible during the day. All this, in addition to Comet Lemmon and a host of meteor showers, makes 2013 a landmark year for delivering the excitement of planetary science to audiences worldwide. To deliver the excitement and wonder of our solar system’s small bodies to worldwide audiences, NASA’s JPL and GSFC education teams in partnership with NASA EDGE will reach out to the public through multiple venues including broadcast media, social media, science and math focused educational activities, observing challenges, interactive visualization tools like “Eyes on the Solar System” and more, culminating in the Thanksgiving Day Comet ISON perihelion passage. This talk will highlight NASA’s focused education effort to engage the public in small bodies science and the role these objects play in our understanding of the formation and evolution of the solar system.

  15. Small and low head pumped storage projects

    International Nuclear Information System (INIS)

    Makarechian, A.H.

    1991-01-01

The purpose of this paper is to focus attention on small and low-head pumped storage projects. These projects may be defined as having a capacity of less than 200-300 MW, down to about 20 MW, with heads ranging from about 1200 ft down to 300 ft or less. The advantages of these smaller pumped storage projects include more flexibility in siting, a considerably shorter licensing and construction period, adaptability to a closed-system design concept that reduces adverse environmental impacts, considerably reduced risk of delays and substantial cost overruns, better suitability for meeting the peaking capacity requirements of individual utilities, and much smaller transmission interconnection requirements. An overall licensing and construction schedule of about 3 to 3 1/2 years is realistic for many smaller pumped storage projects, and competitive costs in terms of dollars per kW installed can be achieved.
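The head and capacity ranges above map directly onto the standard hydraulic power relation P = ρgQHη. A minimal sketch of that arithmetic follows; the flow values and the 90% efficiency are illustrative assumptions, not figures from the paper.

```python
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2
FT_TO_M = 0.3048    # feet to metres

def power_mw(flow_m3s, head_ft, efficiency=0.9):
    """Generating power in MW from P = rho * g * Q * H * eta."""
    head_m = head_ft * FT_TO_M
    return RHO_WATER * G * flow_m3s * head_m * efficiency / 1e6

# Flow needed for a 20 MW unit at a 300 ft head (illustrative):
flow_needed = 20e6 / (RHO_WATER * G * 300 * FT_TO_M * 0.9)  # ~24.8 m^3/s
```

The same relation shows why low heads imply proportionally larger flows, and hence larger waterways, for the same installed capacity.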

  16. Small data, data infrastructures and big data (Working Paper 1)

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2014-01-01

    The production of academic knowledge has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will, however, continue to be important in the future because of their utility in answer...

  17. Small data in the era of big data

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2015-01-01

Academic knowledge building has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will, however, we argue, continue to be popular and valuable in the fut...

  18. How big of an effect do small dams have? Using geomorphological footprints to quantify spatial impact of low-head dams and identify patterns of across-dam variation

    Science.gov (United States)

    Fencl, Jane S.; Mather, Martha E.; Costigan, Katie H.; Daniels, Melinda D.

    2015-01-01

Longitudinal connectivity is a fundamental characteristic of rivers that can be disrupted by natural and anthropogenic processes. Dams are significant disruptions to streams. Over 2,000,000 low-head dams fragment streams in the United States, yet research and conservation are impaired by not knowing the magnitude of low-head dam impacts. Based on the geomorphic literature, we refined a methodology that allowed us to quantify the spatial extent of low-head dam impacts (herein, dam footprint), assessed variation in dam footprints across low-head dams within a river network, and identified select aspects of the context of this variation. Wetted width, depth, and substrate size distributions upstream and downstream of six low-head dams within the Upper Neosho River, Kansas, United States of America were measured. Total dam footprints averaged 7.9 km (3.0–15.3 km) or 287 wetted widths (136–437 wetted widths). Estimates included both upstream (mean: 6.7 km or 243 wetted widths) and downstream footprints (mean: 1.2 km or 44 wetted widths). Altogether the six low-head dams impacted 47.3 km (about 17%) of the mainstem in the river network. Despite differences in age, size, location, and primary function, the sizes of geomorphic footprints of individual low-head dams in the Upper Neosho river network were relatively similar. The number of upstream dams and distance to upstream dams, but not dam height, affected the spatial extent of dam footprints. In summary, ubiquitous low-head dams individually and cumulatively altered lotic ecosystems. Both characteristics of individual dams and the context of neighboring dams affected low-head dam impacts within the river network. For these reasons, low-head dams require a different, more integrative, approach for research and management than the individualistic approach that has been applied to larger dams.
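The footprint figures in the abstract are internally consistent, as a quick back-of-envelope check shows. The numbers are taken from the abstract; the implied mean wetted width and mainstem length are my inferences, not reported values.

```python
# Reported means for the six Upper Neosho low-head dams
mean_total_km = 7.9       # mean total footprint per dam, km
mean_total_widths = 287   # the same footprint expressed in wetted widths

# Implied mean wetted width in metres (inferred, not reported)
mean_width_m = mean_total_km * 1000 / mean_total_widths  # ~27.5 m

# Cumulative mainstem impact of six dams
cumulative_km = 6 * mean_total_km  # ~47.4 km, matching the reported 47.3 km

# "About 17%" of the mainstem implies a mainstem length of roughly:
mainstem_km = 47.3 / 0.17  # ~278 km (inferred)
```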

  19. Developing countries: small technology with big effects

    International Nuclear Information System (INIS)

    McRobie, G.; Carr, M.

    1978-01-01

As far as the poor countries of the world are concerned, during the past twenty years they have had access only to technologies developed by the rich to suit the rich. It is now beyond question that some of the most daunting problems confronting the majority of the world's population stem directly from the kind of technology transferred to them under current aid and development programs. That the technology of the rich is generally inappropriate to the needs and resources of the poor countries is becoming more widely recognized both by aid-givers and aid-receivers. Yet it is this technology that continues to be almost exclusively and most powerfully promoted in the developing countries. To meet their needs a new technology must be discovered or devised: one that lies between the sickle and the combine harvester and is small, simple and cheap enough to harmonise with local human and material resources, and lends itself to widespread reproduction with the minimum of outside help. What we now need most urgently is a new set of technologies, designed by people who are informed by the need to develop capital-saving technologies capable of being decentralized to the maximum extent. The technology gap is not only wide, but the knowledge and resources required to fill it, although they exist in the industrialized countries, have not been mobilized to provide the right kind of knowledge and to make it available to those who need it. It was to do this that the Intermediate Technology Development Group was set up ten years ago. (orig.)

  20. Small Area Model-Based Estimators Using Big Data Sources

    Directory of Open Access Journals (Sweden)

    Marchetti Stefano

    2015-06-01

The timely, accurate monitoring of social indicators, such as poverty or inequality, on a fine-grained spatial and temporal scale is a crucial tool for understanding social phenomena and policymaking, but poses a great challenge to official statistics. This article argues that an interdisciplinary approach, combining the body of statistical research in small area estimation with the body of research in social data mining based on Big Data, can provide novel means to tackle this problem successfully. Big Data derived from the digital crumbs that humans leave behind in their daily activities are in fact providing ever more accurate proxies of social life. Social data mining from these data, coupled with advanced model-based techniques for fine-grained estimates, have the potential to provide a novel microscope through which to view and understand social complexity. This article suggests three ways to use Big Data together with small area estimation techniques, and shows how Big Data has the potential to mirror aspects of well-being and other socioeconomic phenomena.
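The core idea of combining a direct survey estimate with a Big Data proxy can be sketched with a textbook area-level composite estimator in the Fay-Herriot spirit. This is a hedged illustration, not the article's method; the poverty-rate numbers and variances below are hypothetical.

```python
def composite_estimate(direct, var_direct, synthetic, var_model):
    """Area-level composite estimator: shrink a noisy direct survey
    estimate toward a synthetic prediction built from auxiliary
    (e.g. Big Data) covariates, weighting by the two variances."""
    # gamma -> 1 when the direct estimate is precise, -> 0 when it is noisy
    gamma = var_model / (var_model + var_direct)
    return gamma * direct + (1 - gamma) * synthetic

# Hypothetical small area: direct poverty-rate estimate 0.22 from a tiny
# sample (sampling variance 0.004); synthetic estimate 0.18 built from,
# say, mobile-phone activity data (model variance 0.001).
est = composite_estimate(0.22, 0.004, 0.18, 0.001)  # ≈ 0.188
```

With these values the weight on the direct estimate is only 0.2, so the noisy survey figure is pulled strongly toward the model-based prediction.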

  1. Effect of furosemide and dietary sodium on kidney and plasma big and small renin

    International Nuclear Information System (INIS)

    Iwao, H.; Michelakis, A.M.

    1981-01-01

Renin was found in mouse plasma in high-molecular-weight forms (big big renin, big renin) and a low-molecular-weight form (small renin). They were measured by a radioimmunoassay procedure for the direct measurement of renin. In the kidney, 89% of total renin was small renin and the rest was big big and big renin. This distribution pattern of renins was not changed when the kidney tissue was homogenized in the presence of protease inhibitors. Low-sodium or high-sodium diets changed renal renin content, but not the distribution pattern of renins in the kidney. Acute stimulation of renin release by furosemide increased small renin but not big big and big renin in plasma. However, dietary sodium depletion for 2 weeks significantly increased big big, big, and small renin in plasma of mice with or without submaxillary glands. In contrast, high-sodium intake significantly decreased big big, big, and small renin in plasma of mice with or without submaxillary glands

  2. Small Bodies, Big Discoveries: NASA's Small Bodies Education Program

    Science.gov (United States)

    Mayo, L.; Erickson, K. J.

    2014-12-01

    2014 is turning out to be a watershed year for celestial events involving the solar system's unsung heroes, small bodies. This includes the close flyby of comet C/2013 A1 / Siding Spring with Mars in October and the historic Rosetta mission with its Philae lander to comet 67P/Churyumov-Gerasimenko. Beyond 2014, the much anticipated 2015 Pluto flyby by New Horizons and the February Dawn Mission arrival at Ceres will take center stage. To deliver the excitement and wonder of our solar system's small bodies to worldwide audiences, NASA's JPL and GSFC education teams in partnership with NASA EDGE will reach out to the public through multiple venues including broadcast media, social media, science and math focused educational activities, observing challenges, interactive visualization tools like "Eyes on the Solar System" and more. This talk will highlight NASA's focused education effort to engage the public in small bodies mission science and the role these objects play in our understanding of the formation and evolution of the solar system.

  3. Confronting hip resurfacing and big femoral head replacement gait analysis

    Directory of Open Access Journals (Sweden)

    Panagiotis K. Karampinas

    2014-03-01

Improved hip kinematics and bone preservation have been reported after resurfacing total hip replacement (THRS). On the other hand, hip kinematics with standard total hip replacement (THR) is optimized with large-diameter femoral heads (BFH-THR). The purpose of this study is to evaluate the functional outcomes of THRS and BFH-THR and correlate these results to bone preservation or the large femoral heads. Thirty-one patients were included in the study. Gait speed, postural balance, proprioception and overall performance were assessed. Our results demonstrated a non-statistically-significant improvement in gait, postural balance and proprioception in the THRS group compared with the BFH-THR group. THRS provides outcomes identical to traditional BFH-THR. The choice of THRS as a bone-preserving procedure in younger patients is still to be evaluated.

  4. Small decisions with big impact on data analytics

    Directory of Open Access Journals (Sweden)

    Jana Diesner

    2015-11-01

Big social data have enabled new opportunities for evaluating the applicability of social science theories that were formulated decades ago and were often based on small- to medium-sized samples. Big Data coupled with powerful computing has the potential to replace the statistical practice of sampling and estimating effects by measuring phenomena based on full populations. Preparing these data for analysis and conducting analytics involves a plethora of decisions, some of which are already embedded in previously collected data and built tools. These decisions refer to the recording, indexing and representation of data and the settings for analysis methods. While these choices can have tremendous impact on research outcomes, they are not often obvious, not considered or not being made explicit. Consequently, our awareness and understanding of the impact of these decisions on analysis results and derived implications are highly underdeveloped. This might be attributable to occasional high levels of over-confidence in computational solutions as well as the possible yet questionable assumption that Big Data can wash out minor data quality issues, among other reasons. This article provides examples for how to address this issue. It argues that checking, ensuring and validating the quality of big social data and related auxiliary material is a key ingredient for empowering users to gain reliable insights from their work. Scrutinizing data for accuracy issues, systematically fixing them and diligently documenting these processes can have another positive side effect: Closely interacting with the data, thereby forcing ourselves to understand their idiosyncrasies and patterns, can help us to move from being able to precisely model and formally describe effects in society to also understand and explain them.

  5. "small problems, Big Trouble": An Art and Science Collaborative Exhibition Reflecting Seemingly small problems Leading to Big Threats

    Science.gov (United States)

    Waller, J. L.; Brey, J. A.

    2014-12-01

    "small problems, Big Trouble" (spBT) is an exhibition of artist Judith Waller's paintings accompanied by text panels written by Earth scientist Dr. James A. Brey and several science researchers and educators. The text panels' message is as much the focus of the show as the art--true interdisciplinarity! Waller and Brey's history of art and earth science collaborations include the successful exhibition "Layers: Places in Peril". New in spBT is extended collaboration with other scientists in order to create awareness of geoscience and other subjects (i.e. soil, parasites, dust, pollutants, invasive species, carbon, ground water contaminants, solar wind) small in scale which pose significant threats. The paintings are the size of a mirror, a symbol suggesting the problems depicted are those we increasingly need to face, noting our collective reflections of shared current and future reality. Naturalistic rendering and abstract form in the art helps reach a broad audience including those familiar with art and those familiar with science. The goal is that gallery visitors gain greater appreciation and understanding of both—and of the sober content of the show as a whole. "small problems, Big Trouble" premiers in Wisconsin April, 2015. As in previous collaborations, Waller and Brey actively utilize art and science (specifically geoscience) as an educational vehicle for active student learning. Planned are interdisciplinary university and area high school activities linked through spBT. The exhibition in a public gallery offers a means to enhance community awareness of and action on scientific issues through art's power to engage people on an emotional level. This AGU presentation includes a description of past Waller and Brey activities: incorporating art and earth science in lab and studio classrooms, producing gallery and museum exhibitions and delivering workshops and other presentations. They also describe how walking the paths of several past earth science

  6. Big data in small steps : Assessing the value of data

    NARCIS (Netherlands)

    Veenstra, A.F.E. van; Bakker, T.P.; Esmeijer, J.

    2013-01-01

    Data is seen as the new oil: an important driver of innovation and economic growth. At the same time, many find it difficult to determine the value of big data for their organization. TNO presents a stepwise big data model that supports private and public organizations to assess the potential of big

  7. Pocket data mining big data on small devices

    CERN Document Server

    Gaber, Mohamed Medhat; Gomes, Joao Bartolo

    2014-01-01

Owing to continuous advances in the computational power of handheld devices like smartphones and tablet computers, it has become possible to perform Big Data operations, including modern data mining processes, onboard these small devices. A decade of research has proved the feasibility of what has been termed Mobile Data Mining, with a focus on one mobile device running data mining processes. However, it was not until 2010 that the authors of this book initiated the Pocket Data Mining (PDM) project, exploiting the seamless communication among handheld devices performing data analysis tasks that were infeasible until recently. PDM is the process of collaboratively extracting knowledge from distributed data streams in a mobile computing environment. This book provides the reader with an in-depth treatment on this emerging area of research. Details of techniques used and thorough experimental studies are given. More importantly and exclusive to this book, the authors provide a detailed practical guide on the depl...

  8. Transforming fragments into candidates: small becomes big in medicinal chemistry.

    Science.gov (United States)

    de Kloe, Gerdien E; Bailey, David; Leurs, Rob; de Esch, Iwan J P

    2009-07-01

    Fragment-based drug discovery (FBDD) represents a logical and efficient approach to lead discovery and optimisation. It can draw on structural, biophysical and biochemical data, incorporating a wide range of inputs, from precise mode-of-binding information on specific fragments to wider ranging pharmacophoric screening surveys using traditional HTS approaches. It is truly an enabling technology for the imaginative medicinal chemist. In this review, we analyse a representative set of 23 published FBDD studies that describe how low molecular weight fragments are being identified and efficiently transformed into higher molecular weight drug candidates. FBDD is now becoming warmly endorsed by industry as well as academia and the focus on small interacting molecules is making a big scientific impact.

  9. "Small Steps, Big Rewards": You Can Prevent Type 2 Diabetes

    Science.gov (United States)

From the Winter 2008 issue: Fifty-four million Americans are at risk for type 2 diabetes.

  10. Small wormholes change our picture of the big bang

    CERN Multimedia

    1990-01-01

    Matt Visser has studied tiny wormholes, which may be produced on a subatomic scale by quantum fluctuations in the energy of the vacuum. He believes these quantum wormholes could change our picture of the origin of the Universe in the big bang (1/2 p)

  11. Making a Big Bang on the small screen

    Science.gov (United States)

    Thomas, Nick

    2010-01-01

    While the quality of some TV sitcoms can leave viewers feeling cheated out of 30 minutes of their lives, audiences and critics are raving about the science-themed US comedy The Big Bang Theory. First shown on the CBS network in 2007, the series focuses on two brilliant postdoc physicists, Leonard and Sheldon, who are totally absorbed by science. Adhering to the stereotype, they also share a fanatical interest in science fiction, video-gaming and comic books, but unfortunately lack the social skills required to connect with their 20-something nonacademic contemporaries.

  12. Theorizing the narrative dimension of psychotherapy and counseling: A big and small story approach

    NARCIS (Netherlands)

    Sools, Anna Maria; Schuhmann, Carmen

    2014-01-01

    In this article, we develop a theoretically substantiated narrative framework for assessing psychotherapy practices, based on a big and small story approach. This approach stretches the narrative scope of these practices by making explicit and advancing small story counseling. We demonstrate how

  13. More Differences or More Similarities Regarding Education in Big, Middle-sized and Small Companies

    Directory of Open Access Journals (Sweden)

    Marjana Merkač

    2001-12-01

    Full Text Available The article presents the results of research of education and qualifying of employees in small, middle-sized and big Slovenian companies. The research shows some differences regarding the attitude to the development of employees as a part of a company's business strategy, some obstacles for developing their abilities, and connections between job satisfaction and motivation for learning. It also shows how important it is for the subjects concerning education and qualifying if an individual works for a big, middle-sized, or small company.

  14. Small millets, big potential: diverse, nutritious, and climate smart ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-29

    Apr 29, 2016 ... Integrated and focused public support is now needed for ... Include small millets in the Indian public distribution system (PDS) to 10 kg per ... but post-harvest losses, due to factors such as poor handling, transport, and.

  15. When Small is Big: Microcredit and Economic Development

    Directory of Open Access Journals (Sweden)

    George Brown

    2010-10-01

    Full Text Available Microcredit - the extension of small loans - gives people who would otherwise not have access to credit the opportunity to begin or expand businesses or to pursue job-specific training. These borrowers lack the income, credit history, assets, or security to borrow from other sources. Although the popularity and success of microcredit in developing countries has been trumpeted in the media, microcredit is established and growing in the United States and Canada as well. Its appeal comes from its capacity to provide the means for those who have the ability, drive, and commitment to overcome the hurdles to self-sufficiency. In this article, the role of microcredit as a stimulant for economic development is examined. First, its importance for the establishment of small businesses is described. Second, the article provides an overview of the general microcredit climate in the United States and the local situation in the Ottawa area. Third, brief stories about individuals who have received this type of loan reveal the human impact behind the economic benefits. Finally, the role of microcredit in funding startups is analyzed in comparison to other sources of available funding. The article concludes with a summary of the benefits of microcredit as a win-win proposition for economic development.

  16. Denmark: Small state with big voice and bigger dilemmas

    DEFF Research Database (Denmark)

    Andersen, Mikael Skou; Nielsen, Helle Ørsted

    2016-01-01

    Denmark is a relatively small country by any standard and its influence in international climate negotiations remains limited. Nevertheless, the pioneering of certain climate-related policies, (including in renewables, energy efficiency and taxes on CO2), has, over the previous decades, been...... attracting international interest and has provided Denmark with some leverage to act as instigator and self-declared leader into international climate diplomacy. The independent Climate Change Performance Index, for instance, over three consecutive years ranked Denmark as its top performer in acknowledgement......), there are numerous dilemmas both at the substantive and diplomatic level when it comes to climate policy. Denmark’s actual greenhouse gas emissions (GHGE) per capita remains above the EU average (OECD, 2012:73) reflecting a high level of individual consumption as well as sectoral interests related to resource...

  17. Small Scaffolds, Big Potential: Developing Miniature Proteins as Therapeutic Agents.

    Science.gov (United States)

    Holub, Justin M

    2017-09-01

    Preclinical Research Miniature proteins are a class of oligopeptide characterized by their short sequence lengths and ability to adopt well-folded, three-dimensional structures. Because of their biomimetic nature and synthetic tractability, miniature proteins have been used to study a range of biochemical processes including fast protein folding, signal transduction, catalysis and molecular transport. Recently, miniature proteins have been gaining traction as potential therapeutic agents because their small size and ability to fold into defined tertiary structures facilitates their development as protein-based drugs. This research overview discusses emerging developments involving the use of miniature proteins as scaffolds to design novel therapeutics for the treatment and study of human disease. Specifically, this review will explore strategies to: (i) stabilize miniature protein tertiary structure; (ii) optimize biomolecular recognition by grafting functional epitopes onto miniature protein scaffolds; and (iii) enhance cytosolic delivery of miniature proteins through the use of cationic motifs that facilitate endosomal escape. These objectives are discussed not only to address challenges in developing effective miniature protein-based drugs, but also to highlight the tremendous potential miniature proteins hold for combating and understanding human disease. Drug Dev Res 78:268-282, 2017. © 2017 Wiley Periodicals, Inc.

  18. Thinking small: Onsite power generation may soon be big

    International Nuclear Information System (INIS)

    Davidson, K.G.; Braun, G.W.

    1993-01-01

    Utilities are rethinking the way they do business. Eventually, smaller and cleaner generation units located near major load centers could begin to supplement power from central plants. The technologies necessary to this transition are emerging in the form of "distributed generation." These technologies typically produce power on a relatively small scale (less than 50 MW per unit) and can be sited in congested urban areas as well as near remote customers. This allows utilities to meet new demand for electricity without building central generating stations and without substantially expanding or upgrading the power delivery system - in other words, at lower cost. Some distributed-generation technologies, such as fuel cells and solar energy harnessed by photovoltaic (PV) cells, are just beginning to carve out niches in the power market. Others, such as engine generator sets and battery storage, have evolved into robust, high-technology systems. In the case of fuel cells and engine-driven systems, natural gas is emerging as an environmentally friendly fuel that should remain available for decades at competitive prices. As gas-fueled distributed power is deployed, utility infrastructures for delivering gas and electricity to customers could become more integrated, allowing planners to smooth load profiles for energy services and creating greater synergies between the two. As distributed-generation technologies become more practical and cost-effective, utilities may find that change can be a path toward least-cost service and sustainable profitability.

  19. Focus on Extracellular Vesicles: Introducing the Next Small Big Thing

    Directory of Open Access Journals (Sweden)

    Hina Kalra

    2016-02-01

    Full Text Available Intercellular communication was long thought to be regulated exclusively through direct contact between cells or via release of soluble molecules that transmit the signal by binding to a suitable receptor on the target cell, and/or via uptake into that cell. With the discovery of small secreted vesicular structures that contain complex cargo, both in their lumen and the lipid membrane that surrounds them, a new frontier of signal transduction was discovered. These “extracellular vesicles” (EVs) were initially thought to be garbage bags through which the cell ejected its waste. Whilst this is a major function of one type of EV, i.e., apoptotic bodies, many EVs have intricate functions in intercellular communication and compound exchange, although their physiological roles are still ill-defined. Additionally, it is now becoming increasingly clear that EVs mediate disease progression, and therefore studying EVs has ignited significant interest among researchers from various fields of the life sciences. Consequently, the research effort into the pathogenic roles of EVs is significantly higher even though their protective roles are not well established. The “Focus on extracellular vesicles” series of reviews highlights the current state of the art regarding various topics in EV research, whilst this review serves as an introductory overview of EVs, their biogenesis and molecular composition.

  20. Evaluation research of small and medium-sized enterprise informatization on big data

    Science.gov (United States)

    Yang, Na

    2017-09-01

    Against the background of big data, raising the informatization level of small and medium-sized enterprises is a key construction task; the cost of informatization is large, but such investment can bring benefits to these enterprises. This paper establishes a small and medium-sized enterprise informatization evaluation system covering hardware and software security, information organization, information technology application and profit, and information capability. Rough set theory is used to reduce the evaluation indexes, and evaluation is then carried out with a support vector machine (SVM) model. Finally, examples are used to verify the approach and demonstrate the effectiveness of the method.
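
    The evaluation pipeline the abstract describes (a reduced set of indicator indexes scored by an SVM) can be sketched in miniature. The toy indicator values, the two retained indexes and the plain subgradient training loop below are illustrative assumptions for demonstration, not the paper's actual data or model:

```python
def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    """Minimal linear SVM trained by hinge-loss subgradient descent.
    Labels y must be +1 / -1."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # point violates the margin: hinge + regularization gradient
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # only the regularization gradient applies
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Hypothetical enterprise indicators, already reduced to two indexes
# (say, "IT application" and "information capability"), scaled to [0, 1].
X = [[0.9, 0.8], [0.8, 0.9], [0.7, 0.9], [0.2, 0.1], [0.1, 0.3], [0.3, 0.2]]
y = [1, 1, 1, -1, -1, -1]   # +1 = high informatization level, -1 = low

w, b = train_linear_svm(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

    In practice the rough-set reduction would select the index subset first; here the two indexes are simply assumed as given.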

  1. Comparing return to sport activities after short metaphyseal femoral arthroplasty with resurfacing and big femoral head arthroplasties.

    Science.gov (United States)

    Karampinas, Panagiotis K; Papadelis, Eustratios G; Vlamis, John A; Basiliadis, Hlias; Pneumaticos, Spiros G

    2017-07-01

    Young patients feel that maintaining sport activities after total hip arthroplasty constitutes an important part of their quality of life. The majority of hip surgeons allow patients to return to low-impact activities, but significant caution is advised for taking part in high-impact activities. The purpose of this study is to compare and evaluate the post-operative return to daily living habits and sport activities following short-metaphyseal hip arthroplasty and high-functional total hip arthroplasties (resurfacing and big femoral head arthroplasties). In the study design, 48 patients (55 hips) were enrolled in three comparative groups: one with short-metaphyseal arthroplasties, a second with high-functional resurfacing arthroplasties and a third with big femoral head arthroplasties. Each patient underwent a clinical examination, was evaluated with the Harris Hip Score, WOMAC, SF-36, UCLA activity score, satisfaction VAS, and anteroposterior and lateral X-rays of the hip, and was followed in an outpatient setting for 2 years. Statistical analysis revealed no notable differences between the three groups regarding their demographic data; however, significant differences were found between the preoperative and postoperative clinical scores of each group. We also failed to reveal any significant differences when comparing the data of all three groups regarding their clinical scores at the final 2-year postoperative control. The overall outcome of all three groups was similar: all the patients were satisfied and returned to their previous level of sport activities. Short-metaphyseal hip arthroplasties allow young patients intending to return to previous and even high-impact sport activities to do so, similarly to high-functional resurfacing and big femoral head arthroplasties. Short stems with hard-on-hard bearing surfaces might become an alternative to standard stems and hip resurfacing.

  2. Guide for the 2 infinities - the infinitely big and the infinitely small

    International Nuclear Information System (INIS)

    Armengaud, E.; Arnaud, N.; Aubourg, E.; Bassler, U.; Binetruy, P.; Bouquet, A.; Boutigny, D.; Brun, P.; Chassande-Mottin, E.; Chardin, G.; Coustenis, A.; Descotes-Genon, S.; Dole, H.; Drouart, A.; Elbaz, D.; Ferrando, Ph.; Glicenstein, J.F.; Giraud-Heraud, Y.; Halloin, H.; Kerhoas-Cavata, S.; De Kerret, H.; Klein, E.; Lachieze-Rey, M.; Lagage, P.O.; Langer, M.; Lebrun, F.; Lequeux, J.; Meheut, H.; Moniez, M.; Palanque-Delabrouille, N.; Paul, J.; Piquemal, F.; Polci, F.; Proust, D.; Richard, F.; Robert, J.L.; Rosnet, Ph.; Roudeau, P.; Royole-Degieux, P.; Sacquin, Y.; Serreau, J.; Shifrin, G.; Sida, J.L.; Smith, D.; Sordini, V.; Spiro, M.; Stolarczyk, Th.; Suomijärvi, T.; Tagger, M.; Vangioni, E.; Vauclair, S.; Vial, J.C.; Viaud, B.; Vignaud, D.

    2010-01-01

    This book is to be read from both ends: one is dedicated to the path towards the infinitely big and the other to the infinitely small. Each path is made of a series of various subject entries illustrating important concepts or achievements in the quest for the understanding of the concerned infinity. For instance, the part concerning the infinitely small includes entries like: quarks, Higgs bosons, radiation detection, Chooz neutrinos... while the part for the infinitely big includes: the universe, cosmic radiation, dark matter, antimatter... and a series of experiments such as HESS, INTEGRAL, ANTARES, JWST, LOFAR, Planck, LSST, SOHO, Virgo, VLT, or XMM-Newton. This popularization work also includes an important glossary that explains the scientific terms used in the entries. (A.C.)

  3. Head losses in small hydropower plant trash racks (SHP)

    Directory of Open Access Journals (Sweden)

    N. Walczak

    2016-12-01

    Full Text Available Small hydropower plants (SHP) are technical facilities that are part of alternative energy sources [Paish 2002]. They are primarily characterised by low unit power (in Poland below 5 MW) and are often constructed on existing barrages. Electrical current produced by these plants is used to meet local demand. Considering the exploitation of SHPs, it is important to ensure a stable flow through turbines. Aggidis et al. [2010] analysed SHP equipment costs depending on the turbine set. The turbines are protected against damage with trash racks applied for capturing water-borne detritus, such as plant debris carried by water. However, trash racks as solid equipment of SHPs cause head losses, and as a consequence reduce the efficiency of the system. These losses result not only from the spacing of bars, their shape and the technical condition of the inlet chamber, but also from plant debris, its nature, and the quantity of accumulated material that effectively limits the flow. The plant debris captured on trash racks is characterised by diversity in terms of species composition related to the vegetation period and the area where hydraulic facilities are located. Therefore, it is important to keep trash racks clean by regular removal of the accumulated material. In this context, modernised and newly built power plants are fitted with mechanical cleaners. In older facilities, manual intervention for regular cleaning is required. The present study analyses how the bar shape and the orientation angle of trash racks as well as the accumulated plant debris affect head losses. The results were obtained from laboratory tests. The research examined the impact the inclination angle of trash racks (30°, 60° and 80°) has on head loss values for three different shapes of bars (cylindrical, angled and flat rectangular) and various weight portions of plant debris (0.25, 0.375 and 0.5 kg). The summarised losses were determined by measuring the difference in water
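
    For clean racks, losses of this kind are commonly estimated with the classical Kirschmer relation, which involves exactly the variables the study varies: bar shape, bar spacing and rack inclination. The shape coefficients below are the standard textbook values and the geometry and velocity are assumed for illustration; none of the numbers come from this paper's measurements:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

# Kirschmer bar-shape coefficients (standard hydraulics values, not from
# this study): flat rectangular vs. circular (cylindrical) bars.
BETA = {"rectangular": 2.42, "cylindrical": 1.79}

def kirschmer_head_loss(shape, bar_thickness, clear_spacing, velocity, angle_deg):
    """Head loss [m] across a clean trash rack:
    h = beta * (s/b)**(4/3) * v**2 / (2*g) * sin(alpha),
    where s is bar thickness, b the clear spacing between bars, v the
    approach velocity and alpha the rack inclination from horizontal."""
    beta = BETA[shape]
    return (beta * (bar_thickness / clear_spacing) ** (4.0 / 3.0)
            * velocity ** 2 / (2.0 * G) * math.sin(math.radians(angle_deg)))

# Illustrative comparison at the inclinations tested in the paper
# (30°, 60° and 80°); 10 mm bars, 50 mm spacing, 1 m/s are assumed values.
losses = {angle: (kirschmer_head_loss("rectangular", 0.01, 0.05, 1.0, angle),
                  kirschmer_head_loss("cylindrical", 0.01, 0.05, 1.0, angle))
          for angle in (30, 60, 80)}
```

    The formula reproduces the qualitative behaviour the study measures: losses grow with inclination angle and are larger for flat rectangular bars than for cylindrical ones. Debris loading, the study's other factor, is outside the scope of this clean-rack estimate.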

  4. Eating on nightshift: A big vs small snack impairs glucose response to breakfast

    Directory of Open Access Journals (Sweden)

    Stephanie Centofanti

    2018-01-01

    Full Text Available Shift work is a risk factor for chronic diseases such as Type 2 diabetes. Food choice may play a role; however, simply eating at night when the body is primed for sleep may have implications for health. This study examined the impact of consuming a big versus small snack at night on glucose metabolism. N = 31 healthy subjects (21–35 y; 18 F) participated in a simulated nightshift laboratory study that included one baseline night of sleep (22:00 h–07:00 h) and one night awake with allocation to either a big snack (2100 kJ) or small snack (840 kJ) group. The snack was consumed between 00:00–00:30 h and consisted of low-fat milk, a sandwich, chips and fruit (big snack) or half a sandwich and fruit (small snack). Subjects ate an identical mixed-meal breakfast (2100 kJ) at 08:30 h after one full night of sleep and a simulated nightshift. Interstitial glucose was measured continuously during the entire study using Medtronic continuous glucose monitors. Only subjects with identical breakfast consumption and complete datasets were analysed (N = 20). Glucose data were averaged into 5-minute bins and area under the curve (AUC) was calculated for 90 min post-breakfast. Pre-breakfast, glucose levels were not significantly different between Day 1 and Day 2, nor were they different between snack groups (p > 0.05). A snack group by day interaction effect was found (F1,16 = 5.36, p = 0.034) and post-hocs revealed that in the big snack group, the AUC response to breakfast was significantly higher following nightshift (Day 2) compared to Day 1 (p = 0.001). This translated to a 20.8% (SEM 5.6) increase. AUC was not significantly different between days in the small snack group. Consuming a big snack at 00:00 h impaired the glucose response to breakfast at 08:30 h, compared to a smaller snack. Further research in this area will inform dietary advice for shift workers, which could include recommendations on how much to eat as well as content.
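
    The AUC outcome described (glucose averaged into 5-minute bins, integrated over 90 min post-breakfast) amounts to a trapezoidal rule over evenly spaced samples. The glucose trace below is invented for illustration, not the study's data:

```python
def auc_trapezoid(values, dt_min=5.0):
    """Area under the curve by the trapezoidal rule for evenly spaced
    samples: values are glucose readings (mmol/L), dt_min the bin width
    in minutes, so the result is in mmol/L * min."""
    return sum(dt_min * (a + b) / 2.0 for a, b in zip(values, values[1:]))

# Hypothetical 5-minute glucose bins covering 90 min post-breakfast
# (19 samples at t = 0, 5, ..., 90 min); a typical rise-and-fall shape.
glucose = [5.0, 5.4, 6.1, 6.9, 7.6, 8.1, 8.3, 8.2, 7.9, 7.5,
           7.1, 6.8, 6.5, 6.2, 6.0, 5.8, 5.6, 5.4, 5.3]
auc = auc_trapezoid(glucose)
```

    Comparing this quantity between Day 1 and Day 2 within each snack group is what yields the reported 20.8% increase for the big-snack condition.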

  5. Big data from small data: data-sharing in the ‘long tail’ of neuroscience

    Science.gov (United States)

    Ferguson, Adam R; Nielson, Jessica L; Cragin, Melissa H; Bandrowski, Anita E; Martone, Maryann E

    2016-01-01

    The launch of the US BRAIN and European Human Brain Projects coincides with growing international efforts toward transparency and increased access to publicly funded research in the neurosciences. The need for data-sharing standards and neuroinformatics infrastructure is more pressing than ever. However, ‘big science’ efforts are not the only drivers of data-sharing needs, as neuroscientists across the full spectrum of research grapple with the overwhelming volume of data being generated daily and a scientific environment that is increasingly focused on collaboration. In this commentary, we consider the issue of sharing of the richly diverse and heterogeneous small data sets produced by individual neuroscientists, so-called long-tail data. We consider the utility of these data, the diversity of repositories and options available for sharing such data, and emerging best practices. We provide use cases in which aggregating and mining diverse long-tail data convert numerous small data sources into big data for improved knowledge about neuroscience-related disorders. PMID:25349910

  6. Small values in big data: The continuing need for appropriate metadata

    Science.gov (United States)

    Stow, Craig A.; Webster, Katherine E.; Wagner, Tyler; Lottig, Noah R.; Soranno, Patricia A.; Cha, YoonKyung

    2018-01-01

    Compiling data from disparate sources to address pressing ecological issues is increasingly common. Many ecological datasets contain left-censored data – observations below an analytical detection limit. Studies from single and typically small datasets show that common approaches for handling censored data — e.g., deletion or substituting fixed values — result in systematic biases. However, no studies have explored the degree to which the documentation and presence of censored data influence outcomes from large, multi-sourced datasets. We describe left-censored data in a lake water quality database assembled from 74 sources and illustrate the challenges of dealing with small values in big data, including detection limits that are absent, range widely, and show trends over time. We show that substitutions of censored data can also bias analyses using ‘big data’ datasets, that censored data can be effectively handled with modern quantitative approaches, but that such approaches rely on accurate metadata that describe treatment of censored data from each source.
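
    The substitution bias the authors describe is easy to demonstrate: replacing below-detection-limit observations with a fixed value shifts summary statistics in predictable directions. The concentrations below are synthetic, chosen only to make the effect visible:

```python
def substitute_censored(values, detection_limit, sub):
    """Replace observations below the detection limit with a fixed
    substitute: a common but biased way to handle left-censored data."""
    return [v if v >= detection_limit else sub for v in values]

def mean(xs):
    return sum(xs) / len(xs)

# Synthetic "true" concentrations; in a real dataset every value below
# the detection limit would be reported only as "<DL".
true_vals = [0.01, 0.02, 0.04, 0.12, 0.30, 0.45, 0.60, 0.90]
DL = 0.10  # three observations fall below this limit

true_mean = mean(true_vals)
mean_zero = mean(substitute_censored(true_vals, DL, 0.0))      # underestimates
mean_dl = mean(substitute_censored(true_vals, DL, DL))         # overestimates
mean_half_dl = mean(substitute_censored(true_vals, DL, DL / 2))  # biased either way
```

    Substituting 0 always biases the mean low and substituting DL always biases it high (the censored values lie strictly between them); DL/2 merely splits the difference and here overestimates. This is why the paper argues for model-based treatment of censored values, which in turn requires metadata recording each source's detection limits.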

  7. Big Bath as a Determinant of Creative Accounting in Small and Micro Enterprises

    Directory of Open Access Journals (Sweden)

    Lenka Zemánková

    2015-01-01

    Full Text Available Creative accounting is a 21st century phenomenon and, in the context of the economic crisis and deficit budgets, it has been receiving increasing attention, in particular in the area of prevention and detection of accounting manipulation. The focus of the research on small and micro-enterprises stems from the little attention paid to these enterprises and their undeniable importance for the economy. The primary research is based on a phenomenological paradigm, i.e. it focuses on understanding human behaviour on the basis of a reference framework for research participants. The main research method used is a comparative case study, which is one of the few methods that allow research of this sensitive topic. The research focuses on the existence of a big bath in the company’s ratio of profit and turnover as a determinant of a change in the company’s approach to creative accounting.

  8. Affordable Development and Demonstration of a Small NTR Engine and Stage: How Small is Big Enough?

    Science.gov (United States)

    Borowski, Stanley K.; Sefcik, Robert J.; Fittje, James E.; McCurdy, David R.; Qualls, Arthur L.; Schnitzler, Bruce G.; Werner, James E.; Weitzberg (Abraham); Joyner, Claude R.

    2015-01-01

    The Nuclear Thermal Rocket (NTR) derives its energy from fission of uranium-235 atoms contained within fuel elements that comprise the engine's reactor core. It generates high thrust and has a specific impulse potential of approximately 900 seconds - a 100% increase over today's best chemical rockets. The Nuclear Thermal Propulsion (NTP) project, funded by NASA's AES program, includes five key task activities: (1) Recapture, demonstration, and validation of heritage graphite composite (GC) fuel (selected as the "Lead Fuel" option); (2) Engine Conceptual Design; (3) Operating Requirements Definition; (4) Identification of Affordable Options for Ground Testing; and (5) Formulation of an Affordable Development Strategy. During FY'14, a preliminary DDT&E plan and schedule for NTP development was outlined by GRC, DOE and industry that involved significant system-level demonstration projects that included GTD tests at the NNSS, followed by a FTD mission. To reduce cost for the GTD tests and FTD mission, small NTR engines, in either the 7.5 or 16.5 klbf thrust class, were considered. Both engine options used GC fuel and a "common" fuel element (FE) design. The small approximately 7.5 klbf "criticality-limited" engine produces approximately 157 megawatts of thermal power (MWt) and its core is configured with parallel rows of hexagonal-shaped FEs and tie tubes (TTs) with a FE to TT ratio of approximately 1:1. The larger approximately 16.5 klbf Small Nuclear Rocket Engine (SNRE), developed by LANL at the end of the Rover program, produces approximately 367 MWt and has a FE to TT ratio of approximately 2:1. Although both engines use a common 35 inch (approximately 89 cm) long FE, the SNRE's larger diameter core contains approximately 300 more FEs needed to produce an additional 210 MWt of power. To reduce the cost of the FTD mission, a simple "1-burn" lunar flyby mission was considered to reduce the LH2 propellant loading, the stage size and complexity. Use of existing and

  9. Blind links, a big challenge in the linked data idea: Analysis of Persian Subject Headings

    Directory of Open Access Journals (Sweden)

    Atefeh Sharif

    2014-12-01

    Full Text Available In this survey, the linked data concept, i.e. exposing, sharing, and connecting pieces of data, information, and knowledge on the Semantic Web, and some potential problems in converting Persian subject headings (PSHs) records into linked data are discussed. A data set (11,233 records) of PSHs was searched in three information retrieval systems: the National Library of Iran (NLI) online catalog, the Library of Congress (LC) online catalog and NOSA books. Correct links between Persian and English subject headings in the 9,519 common records of the two catalogs were recorded. The results indicate that the links between Persian and English subjects failed in 20% of the records. The maximum error was associated with the anonymous databases (6.7% in the NLI online catalog). It is recommended to preprocess the PSHs records before any conversion projects. During preprocessing, the potential errors could be identified and corrected.

  10. Head First Data Analysis A learner's guide to big numbers, statistics, and good decisions

    CERN Document Server

    Milton, Michael

    2009-01-01

    Today, interpreting data is a critical decision-making factor for businesses and organizations. If your job requires you to manage and analyze all kinds of data, turn to Head First Data Analysis, where you'll quickly learn how to collect and organize data, sort the distractions from the truth, find meaningful patterns, draw conclusions, predict the future, and present your findings to others. Whether you're a product developer researching the market viability of a new product or service, a marketing manager gauging or predicting the effectiveness of a campaign, a salesperson who needs data t

  11. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  12. Penguin head movement detected using small accelerometers: a proxy of prey encounter rate.

    Science.gov (United States)

    Kokubun, Nobuo; Kim, Jeong-Hoon; Shin, Hyoung-Chul; Naito, Yasuhiko; Takahashi, Akinori

    2011-11-15

    Determining temporal and spatial variation in feeding rates is essential for understanding the relationship between habitat features and the foraging behavior of top predators. In this study we examined the utility of head movement as a proxy of prey encounter rates in medium-sized Antarctic penguins, under the presumption that the birds should move their heads actively when they encounter and peck prey. A field study of free-ranging chinstrap and gentoo penguins was conducted at King George Island, Antarctica. Head movement was recorded using small accelerometers attached to the head, with simultaneous monitoring for prey encounter or body angle. The main prey was Antarctic krill (>99% in wet mass) for both species. Penguin head movement coincided with a slow change in body angle during dives. Active head movements were extracted using a high-pass filter (5 Hz acceleration signals) and the remaining acceleration peaks (higher than a threshold acceleration of 1.0 g) were counted. The timing of head movements coincided well with images of prey taken from the back-mounted cameras: head movement was recorded within ±2.5 s of a prey image on 89.1±16.1% (N=7 trips) of images. The number of head movements varied largely among dive bouts, suggesting large temporal variations in prey encounter rates. Our results show that head movement is an effective proxy of prey encounter, and we suggest that the method will be widely applicable for a variety of predators.
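
    The processing chain described (high-pass filtering the head acceleration to strip the slow body-angle component, then counting peaks above a 1.0 g threshold as a proxy of prey encounters) can be sketched as follows. The moving-average detrending is a crude stand-in for the authors' filter, and the acceleration trace is synthetic:

```python
def high_pass(signal, window=25):
    """Crude high-pass filter: subtract a centred moving average (the
    slow body-angle component), keeping only rapid head movements.
    A stand-in for the authors' filter, not their implementation."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        baseline = sum(signal[lo:hi]) / (hi - lo)
        out.append(signal[i] - baseline)
    return out

def count_peaks(signal, threshold=1.0):
    """Count upward threshold crossings, the proxy of prey encounters."""
    count, above = 0, False
    for v in signal:
        if v > threshold and not above:
            count += 1
            above = True
        elif v <= threshold:
            above = False
    return count

# Synthetic head acceleration (g): a slow drift from the changing body
# angle during a dive, plus three brief head-jerk spikes.
trace = [0.02 * i for i in range(100)]
for spike_at in (20, 50, 80):
    trace[spike_at] += 1.5

filtered = high_pass(trace)
n_events = count_peaks(filtered, threshold=1.0)
```

    After filtering, the slow drift sits near zero and only the three spikes cross the threshold, so `n_events` recovers the number of injected head movements.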

  13. Radiation-related small head sizes among prenatally exposed atomic bomb survivors

    International Nuclear Information System (INIS)

    Otaki, Masanori; Schull, William J.

    2004-01-01

    The population prenatally exposed to the atomic bombings of Hiroshima and Nagasaki, referred to as the In Utero Clinical Sample, on whom Dosimetry System 1986 doses are available consists of 1566 individuals (1242 in Hiroshima and 324 in Nagasaki). Of these study subjects, 1473 had the circumference of their heads measured at least once between ages 9 to 19. Among these 1473 individuals, 62 had small heads - the circumference of the head was two standard deviations or more below the observed specific age-at-measurement mean. Twenty-six of the 30 cases with severe mental retardation described elsewhere are included among these subjects. Of these 26 severely mentally retarded cases, 15 (58%) had small heads. Most (86%) of the individuals with small heads were exposed in the first or second trimester of pregnancy - 55% in the former period and 31% in the latter. Various dose-response relationships, with and without a threshold, have been fitted to the data grouped by the trimester or postovulatory age (weeks after ovulation) at which exposure occurred. A significant effect of radiation on the frequency of individuals with atypically small heads is observed only in the first and second trimesters and for the intervals postovulation of 0-7 weeks and 8-15 weeks. Although the risk of a small head at 0-7 weeks postovulation increases significantly with increasing dose, no increase in risk for severe mental retardation is noted in this period. No excess risk of a small head was seen in the third trimester or among individuals exposed at ≥ 16 weeks postovulation. The estimated threshold, based either on a linear or a linear-quadratic dose-response relationship, is zero or thereabouts. This apparent absence of a threshold and the somewhat different periods of vulnerability suggest an embryological difference in the development of both a small head and mental retardation. Mean IQ (using the Koga test) and its standard deviation are 63.8 and 8.5, respectively, for the

  14. Phosphorus mass balance in a highly eutrophic semi-enclosed inlet near a big metropolis: a small inlet can contribute towards particulate organic matter production.

    Science.gov (United States)

    Asaoka, Satoshi; Yamamoto, Tamiji

    2011-01-01

    Terrigenous loading into enclosed water bodies has been blamed for eutrophic conditions marked by massive algal growth and subsequent hypoxia due to decomposition of dead algal cells. This study aims to describe the eutrophication and hypoxia processes in a semi-enclosed water body lying near a big metropolis. The phosphorus mass balance in a small inlet, Ohko Inlet, located at the head of Hiroshima Bay, Japan, was quantified using a numerical model. Dissolved inorganic phosphorus inflow from Kaita Bay next to the inlet was five times higher than that from the terrigenous load, which may enhance primary production. It was therefore concluded that not only are a reduction of the material load from the land and suppression of the benthic flux needed, but reducing the inflow of phosphorus-rich, oxygen-depleted water from Kaita Bay would also be an effective complementary measure to remediate the environmental condition of the inlet. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Small Core, Big Network: A Comprehensive Approach to GIS Teaching Practice Based on Digital Three-Dimensional Campus Reconstruction

    Science.gov (United States)

    Cheng, Liang; Zhang, Wen; Wang, Jiechen; Li, Manchun; Zhong, Lishan

    2014-01-01

    Geographic information science (GIS) features a wide range of disciplines and has broad applicability. Challenges associated with rapidly developing GIS technology and the currently limited teaching and practice materials hinder universities from cultivating highly skilled GIS graduates. Based on the idea of "small core, big network," a…

  16. Big Data solutions on a small scale: Evaluating accessible high-performance computing for social research

    OpenAIRE

    Murthy, Dhiraj; Bowman, S. A.

    2014-01-01

    Though full of promise, Big Data research success is often contingent on access to the newest, most advanced, and often expensive hardware systems and the expertise needed to build and implement such systems. As a result, the accessibility of the growing number of Big Data-capable technology solutions has often been the preserve of business analytics. Pay as you store/process services like Amazon Web Services have opened up possibilities for smaller scale Big Data projects. There is high dema...

  17. Small Bodies, Big Concepts: Engaging Teachers and Their Students in Visual Analysis of Comets and Asteroids

    Science.gov (United States)

    Cobb, W. H.; Buxner, S.; Lebofsky, L. A.; Ristvey, J.; Weeks, S.; Zolensky, M.

    2011-12-01

Small Bodies, Big Concepts is a multi-disciplinary, professional development project that engages 5th - 8th grade teachers in high-end planetary science using a research-based pedagogical framework, Designing Effective Science Instruction (DESI). In addition to developing sound background knowledge with a focus on visual analysis, teachers' awareness of the process of learning new content is heightened, and they use that experience to deepen their science teaching practice. Culling from NASA E/PO educational materials, activities are sequenced to enhance conceptual understanding of big ideas in space science: what do we know, how do we know it, why do we care? Helping teachers develop a picture of the history and evolution of our understanding of the solar system, and homing in on the place of comets and asteroids in helping us answer old questions and discover new ones, teachers see the power and excitement underlying planetary science as a human endeavor. Research indicates that science inquiry is powerful in the classroom and mission scientists are real-life models of science inquiry in action. Using guest scientist facilitators from the Planetary Science Institute, NASA Johnson Space Center, Lockheed Martin, and NASA E/PO professionals from McREL and NASA AESP, teachers practice framing scientific questions, using current visual data, and adapting NASA E/PO activities related to current exploration of asteroids and comets in our Solar System. Cross-curricular elements included examining research-based strategies for enhancing English language learners' ability to engage in higher order questions and a professional astronomy artist's insight into how visual analysis requires not just our eyes engaged, but our brains: comparing, synthesizing, questioning, evaluating, and wondering. This summer we pilot tested the SBBC curriculum with thirteen 5th- 10th grade teachers modeling a variety of instructional approaches over eight days. 
Each teacher developed lesson plans

  18. On the equivalence between small-step and big-step abstract machines: a simple application of lightweight fusion

    DEFF Research Database (Denmark)

    Danvy, Olivier; Millikin, Kevin

    2008-01-01

    -step specification. We illustrate this observation here with a recognizer for Dyck words, the CEK machine, and Krivine’s machine with call/cc. The need for such a simple proof is motivated by our current work on small-step abstract machines as obtained by refocusing a function implementing a reduction semantics (a...... syntactic correspondence), and big-step abstract machines as obtained by CPStransforming and then defunctionalizing a function implementing a big-step semantics (a functional correspondence). © 2007 Elsevier B.V. All rights reserved....

  19. The Astronaut Glove Challenge: Big Innovation from a (Very) Small Team

    Science.gov (United States)

    Homer, Peter

    2008-01-01

    Many measurements were taken by test engineers from Hamilton Sundstrand, the prime contractor for the current EVA suit. Because the raw measurements needed to be converted to torques and combined into a final score, it was impossible to keep track of who was ahead in this phase. The final comfort and dexterity test was performed in a depressurized glove box to simulate real on-orbit conditions. Each competitor was required to exercise the glove through a defined set of finger, thumb, and wrist motions without any sign of abrasion or bruising of the competitor's hand. I learned a lot about arm fatigue! This was a pass-fail event, and both of the remaining competitors came through intact. After taking what seemed like an eternity to tally the final scores, the judges announced that I had won the competition. My glove was the only one to have achieved lower finger-bending torques than the Phase VI glove. Looking back, I see three sources of the success of this project that I believe also operate in other programs where small teams have broken new ground in aerospace technologies. These are awareness, failure, and trust. By remaining aware of the big picture, continuously asking myself, "Am I converging on a solution?" and "Am I converging fast enough?" I was able to see that my original design was not going to succeed, leading to the decision to start over. I was also aware that, had I lingered over this choice or taken time to analyze it, I would not have been ready on the first day of competition. Failure forced me to look outside conventional thinking and opened the door to innovation. Choosing to make incremental failures enabled me to rapidly climb the learning curve. Trusting my "gut" feelings-which are really an internalized accumulation of experiences-and my newly acquired skills allowed me to devise new technologies rapidly and complete both gloves just in time. 
Awareness, failure, and trust are intertwined: failure provides experiences that inform awareness

  20. [Cultivation strategy and path analysis on big brand Chinese medicine for small and medium-sized enterprises].

    Science.gov (United States)

    Wang, Yong-Yan; Yang, Hong-Jun

    2014-03-01

Small and medium-sized enterprises (SMEs) are important components of the Chinese medicine industry. However, the lack of big brands is becoming an urgent problem critical to the survival of SMEs. This article discusses the concept and traits of big-brand Chinese medicine from three aspects: clinical, scientific, and market value. Guided by market value, highlighting clinical value, and aiming at improving scientific value through big-brand cultivation, we put forward the key points of cultivation: obtaining branded Chinese medicines with widely recognized efficacy, a good quality control system, and a well-explained mechanism, which can in turn bring innovative improvements to the theory of Chinese medicine. Given the characteristics of SMEs, we hold the view that building a multidisciplinary research union is the basic path, and we probe the implementation strategy in three stages: top-level design, skill upgrading, and application.

  1. Genetic structure, nestmate recognition and behaviour of two cryptic species of the invasive big-headed ant Pheidole megacephala.

    Directory of Open Access Journals (Sweden)

    Denis Fournier

Full Text Available Biological invasions are recognized as a major cause of biodiversity decline and have considerable impact on the economy and human health. The African big-headed ant Pheidole megacephala is considered one of the world's most harmful invasive species. To better understand its ecological and demographic features, we combined behavioural (aggression tests), chemical (quantitative and qualitative analyses of cuticular lipids) and genetic (mitochondrial divergence and polymorphism of DNA microsatellite markers) data obtained for eight populations in Cameroon. Molecular data revealed two cryptic species of P. megacephala, one inhabiting urban areas and the other rainforests. Urban populations belong to the same phylogenetic group as those introduced in Australia and in other parts of the world. Behavioural analyses show that the eight populations sampled make up four mutually aggressive supercolonies. The maximum distance between nests from the same supercolony was 49 km and the closest distance between two nests belonging to different supercolonies was 46 m. The genetic data and chemical analyses confirmed the behavioural tests, as all of the nests were correctly assigned to their supercolony. Genetic diversity appears significantly greater in Africa than in introduced populations in Australia; by contrast, urban and Australian populations are characterized by a higher chemical diversity than rainforest ones. Overall, our study shows that populations of P. megacephala in Cameroon adopt a unicolonial social structure, like invasive populations in Australia. However, the size of the supercolonies appears several orders of magnitude smaller in Africa. This implies competition between African supercolonies and explains why they persist over evolutionary time scales.

  2. Husbandry and propagation of the Chinese big-headed turtle (Platysternon megacephalum) at the Wildlife Conservation Society's Prospect Park Zoo.

    Science.gov (United States)

    Shelmidine, Nichole; Murphy, Brittany; Massarone, Katelyn

    2016-01-01

Turtles worldwide are facing increasing pressures on their wild populations and many are listed as endangered or critically endangered. Chinese big-headed turtles (Platysternon megacephalum) are currently listed on the IUCN Red List as endangered and on CITES Appendix II. As part of the Wildlife Conservation Society's initiative on turtle and tortoise conservation, this species became a focus for propagation at Prospect Park Zoo (PPZ) in 2008. PPZ successfully bred and obtained eggs, with successful hatchings in 2013 and 2014. The staff fluctuated water and ambient temperatures along with photoperiod in order to simulate seasonal changes. Each May, the female was placed in the male's enclosure daily for at least 15 min for breeding. Once two confirmed copulations were observed, breeding introductions were discontinued. The female laid her eggs in July and August, and clutch sizes ranged from 5 to 6 eggs. Eggs were successfully incubated in an RCOM Juragon reptile incubator at 23.3°C with 90-95% humidity. The eggs hatched after an average incubation period of 102 days (98-105 days, n = 9). Hatchlings had a mean body mass of 8.84 g (8.11-10 g) and an average carapace length × width of 36.17 × 32.20 mm. This article aims to share the team's experiences working with this species as well as build upon previous publications and successes. Our hope is that, with continued efforts to increase our knowledge base, a viable, sustainable North American captive population will become a reality for this species. © 2016 Wiley Periodicals, Inc.

  3. Recombination and evolution of duplicate control regions in the mitochondrial genome of the Asian big-headed turtle, Platysternon megacephalum.

    Directory of Open Access Journals (Sweden)

    Chenfei Zheng

Full Text Available Complete mitochondrial (mt) genome sequences with duplicate control regions (CRs) have been detected in various animal species. In Testudines, duplicate mtCRs have been reported in the mtDNA of the Asian big-headed turtle, Platysternon megacephalum, which has three living subspecies. However, the evolutionary pattern of these CRs remains unclear. In this study, we report the complete sequences of duplicate CRs from 20 individuals belonging to the three subspecies of this turtle and present a micro-evolutionary analysis of the evolution of duplicate CRs. Genetic distances calculated with MEGA 4.1 using the complete duplicate CR sequences revealed that, within subspecies, genetic distances between orthologous copies from different individuals were 0.63% for CR1 and 1.2% for CR2, respectively, whereas the average distance between paralogous copies of CR1 and CR2 was 4.8%. Phylogenetic relationships were reconstructed from the CR sequences, excluding the variable number of tandem repeats (VNTRs) at the 3' end, using three methods: neighbor-joining, maximum likelihood, and Bayesian inference. These data show that the two CRs within an individual were more genetically distant from each other than from their orthologous copies in different individuals of the same subspecies. This suggests independent evolution of the two mtCRs within each P. megacephalum subspecies. Reconstruction of separate phylogenetic trees using different CR components (TAS, CD, CSB, and VNTRs) suggested a role for recombination in the evolution of duplicate CRs. Consequently, recombination events were detected using RDP software, with break points at ≈290 bp and ≈1,080 bp. Based on these results, we hypothesize that duplicate CRs in P. megacephalum originated from heterologous ancestral recombination of mtDNA. Subsequent recombination could have resulted in homogenization during independent evolutionary events, thus maintaining the functions of duplicate CRs in the mtDNA of P. megacephalum.
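The orthologous/paralogous distances quoted above (0.63%, 1.2%, 4.8%) are pairwise genetic distances between aligned CR sequences. A minimal sketch of an uncorrected p-distance, one of the simple distance options tools such as MEGA offer (illustrative only; the study's exact distance model is not specified here):

```python
def p_distance(seq1, seq2):
    """Uncorrected p-distance between two aligned sequences:
    the proportion of compared sites that differ, skipping
    positions where either sequence has an alignment gap ('-')."""
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != '-' and b != '-']
    if not pairs:
        return 0.0
    diffs = sum(1 for a, b in pairs if a != b)
    return diffs / len(pairs)
```

For example, two 8-site sequences differing at one site give a p-distance of 0.125, i.e. 12.5%; the percentages in the abstract are such proportions scaled by 100.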

  4. Start small, dream big: Experiences of physical activity in public spaces in Colombia.

    Science.gov (United States)

    Díaz Del Castillo, Adriana; González, Silvia Alejandra; Ríos, Ana Paola; Páez, Diana C; Torres, Andrea; Díaz, María Paula; Pratt, Michael; Sarmiento, Olga L

    2017-10-01

Multi-sectoral strategies to promote active recreation and physical activity in public spaces are crucial to building a "culture of health". However, studies on the sustainability and scalability of these strategies are limited. This paper identifies the factors related to the sustainability and scaling up of two community-based programs offering physical activity classes in public spaces in Colombia: Bogotá's Recreovía and Colombia's "Healthy Habits and Lifestyles Program-HEVS". Both programs have been sustained for more than 10 years and have benefited 1455 communities. We used a mixed-methods approach including semi-structured interviews, document review, and an analysis of data regarding the programs' history, characteristics, funding, capacity building, and challenges. Interviews were conducted between May and October 2015. Based on the sustainability frameworks of Shediac-Rizkallah and Bone and of Scheirer, we developed categories to independently code each interview. All information was independently analyzed by four of the authors and cross-compared between programs. Findings showed that these programs underwent adaptation processes to address the challenges that threatened their continuation and growth. The primary strategies included flexibility and adaptability, investing in the working conditions and training of instructors, allocating public funds and requesting accountability, diversifying resources, having community support and champions at different levels and positions, and carrying out continuous advocacy to include physical activity in public policies. Recreovía and HEVS illustrate sustainability as an incremental, multi-level process. Lessons learned for similar initiatives include the importance of individual actions and small events, a willingness to start small while dreaming big, being flexible, and prioritizing the human factor. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. CREDIT SCORING MODELS IN ESTIMATING THE CREDITWORTHINESS OF SMALL AND MEDIUM AND BIG ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Robert Zenzerović

    2011-02-01

Full Text Available This paper is focused on estimating credit scoring models for companies operating in the Republic of Croatia. Given the level of economic and legal development, especially in the areas of bankruptcy regulation and business ethics in the Republic of Croatia, the derived models can be applied in the wider region, particularly in South-eastern European countries that twenty years ago transitioned from state-directed to free-market economies. The purpose of this paper is to emphasize the relevance and possibilities of particular financial ratios in estimating the creditworthiness of business entities, which was realized through research on 110 companies. Alongside the commonly used research methods of description, analysis and synthesis, induction, deduction and surveys, the mathematical-statistical method of logistic regression took the central part in this research. The designed sample of 110 business entities represented the structure of firms operating in the Republic of Croatia according to their activities as well as their size. The sample was divided into two subsamples, the first consisting of small and medium enterprises (SMEs) and the second of big business entities. In the next phase, the logistic regression method was applied to the 50 independent variables – financial ratios calculated for each sample unit – in order to find the ones that best discriminate financially stable from unstable companies. As the result of the logistic regression analysis, two credit scoring models were derived. The first model includes liquidity, solvency and profitability ratios and is applicable to SMEs. With its classification accuracy of 97%, the model has high predictive ability and can be used as an effective decision support tool. The second model is applicable to big companies and includes only two independent variables – liquidity and solvency ratios. The classification accuracy of this model is 92.5% and, according to criteria of
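A two-ratio scoring model of the kind described (logistic regression on liquidity and solvency ratios, outputting a probability of financial stability) can be sketched as follows. The data, ratio ranges, and training setup below are synthetic assumptions, not the paper's sample of 110 Croatian companies:

```python
import math
import random

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression;
    w[0] is the intercept, w[1:] the per-ratio weights."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = yi - p
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def score(w, xi):
    """Estimated probability that a company is financially stable."""
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Synthetic training set: [liquidity ratio, solvency ratio]; 1 = stable, 0 = unstable.
random.seed(0)
stable = [[random.uniform(1.5, 3.0), random.uniform(0.5, 0.8)] for _ in range(30)]
unstable = [[random.uniform(0.2, 1.0), random.uniform(0.1, 0.4)] for _ in range(30)]
X, y = stable + unstable, [1] * 30 + [0] * 30
w = fit_logistic(X, y)
```

A company with healthy synthetic ratios (e.g. liquidity 2.5, solvency 0.7) then scores above 0.5, while one with weak ratios (0.4, 0.2) scores below it, mirroring how such a model separates stable from unstable firms.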

  6. Big Data solutions on a small scale: Evaluating accessible high-performance computing for social research

    Directory of Open Access Journals (Sweden)

    Dhiraj Murthy

    2014-11-01

Full Text Available Though full of promise, Big Data research success is often contingent on access to the newest, most advanced, and often expensive hardware systems and the expertise needed to build and implement such systems. As a result, the accessibility of the growing number of Big Data-capable technology solutions has often been the preserve of business analytics. Pay-as-you-store/process services like Amazon Web Services have opened up possibilities for smaller-scale Big Data projects. There is high demand for this type of research in the digital humanities and digital sociology, for example. However, scholars are increasingly finding themselves at a disadvantage as available data sets of interest continue to grow in size and complexity. Without a large amount of funding or the ability to form interdisciplinary partnerships, only a select few find themselves in the position to successfully engage Big Data. This article identifies several notable and popular Big Data technologies typically implemented using large and extremely powerful cloud-based systems and investigates the feasibility and utility of developing Big Data analytics systems implemented using low-cost commodity hardware in basic and easily maintainable configurations for use within academic social research. Through our investigation and experimental case study (in the growing field of social Twitter analytics), we found that not only are solutions like Cloudera's Hadoop feasible, but that they can also enable robust, deep, and fruitful research outcomes in a variety of use-case scenarios across the disciplines.
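The Hadoop-style analyses the article evaluates reduce to map and reduce passes over records. A minimal pure-Python sketch of that pattern for a hashtag count, the sort of social Twitter analytics mentioned (this mimics the MapReduce programming model only; it does not use the actual Hadoop API, and the tweets are invented):

```python
from collections import Counter
from itertools import chain

def map_phase(tweet):
    """Map step: emit one (hashtag, 1) pair per hashtag in a tweet."""
    return [(word.lower(), 1) for word in tweet.split() if word.startswith('#')]

def reduce_phase(pairs):
    """Reduce step: sum the counts emitted for each hashtag key."""
    counts = Counter()
    for tag, n in pairs:
        counts[tag] += n
    return dict(counts)

tweets = ["Big Data on a #budget", "#budget #hadoop cluster", "commodity #hadoop"]
result = reduce_phase(chain.from_iterable(map_phase(t) for t in tweets))
```

On a real cluster the map calls run in parallel across data nodes and the framework shuffles pairs by key before reducing; the local version preserves the same mapper/reducer contract at toy scale.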

  7. Modified duval procedure for small-duct chronic pancreatitis without head dominance.

    Science.gov (United States)

    Oida, Takatsugu; Aramaki, Osamu; Kano, Hisao; Mimatsu, Kenji; Kawasaki, Atsushi; Kuboi, Youichi; Fukino, Nobutada; Kida, Kazutoshi; Amano, Sadao

    2011-01-01

In the case of small-duct chronic pancreatitis, surgery for pain relief is broadly divided into resection and drainage procedures. These procedures should be selected according to the location of the dominant lesion, the diameter of the pancreatic duct and the extent of the disease. The appropriate procedure for the treatment of small-duct chronic pancreatitis, especially small-duct chronic pancreatitis without head dominance, remains controversial. We developed the modified Duval procedure for the treatment of small-duct chronic pancreatitis without head dominance and determined the efficacy of this procedure. We retrospectively studied 14 patients who underwent surgical drainage with or without pancreatic resection for chronic pancreatitis with a small pancreatic duct (the modified Puestow procedure group and the modified Duval procedure group). No complications occurred in the modified Duval procedure group. In the modified Puestow procedure group, complete and partial pain relief were observed in 62.5% and 37.5% of patients, respectively. In contrast, complete pain relief was observed in all the patients in the modified Duval procedure group. Our modified Duval procedure is useful and should be considered the appropriate surgical technique for the treatment of small-duct chronic pancreatitis without head dominance.

  8. In a world of big data, small effects can still matter: a reply to Boyce, Daly, Hounkpatin, and Wood (2017)

    OpenAIRE

    Matz, SC; Gladstone, JJ; Stillwell, David John

    2017-01-01

    We make three points in response to Boyce, Daly, Hounkpatin, and Wood (2017). First, we clarify a misunderstanding of the goal of our analyses, which was to investigate the links between life satisfaction and spending patterns, rather than spending volume. Second, we report a simulation study we ran to demonstrate that our results were not driven by the proposed statistical artifact. Finally, we discuss the broader issue of why, in a world of big data, small but reliable effect sizes can be v...

  9. «If a woman has a big head»: Physiognomy and female nature in an assyro-babylonian text

    OpenAIRE

    Couto Ferreira, Érica

    2008-01-01

    In Mesopotamia, the human body was understood as an object for divination, that is, a system of signs, which carried messages about the individual, and whose meaning had to be decoded by means of observation and interpretation. Taking the physiognomic series Šumma sinništu qaqqada rabât («If a woman has a big head») as the main source of my article, I analyse, on the one hand, the processes that take part in the promotion of a particular perception of women based on a speci...

  10. Big is not always beautiful - small can be a short cut to blue oceans

    DEFF Research Database (Denmark)

    Kvistgaard, Peter

    2007-01-01

    Often it is claimed that big investments are the only way to success in tourism and the experience economy. Only by building some of the world's biggest hotels - like the ones in Dubai or Las Vegas where hotels with 3-4,000 rooms are not uncommon - success can be achieved. It is understandable...... that hotels have to be big in Las Vegas in order to secure a good return on investment. It is also understandable that they build big hotels when 37 million people came to visit and 22,000 conventions were held in Las Vegas in 2004 according to the official website of Las Vegas (www.lasvegasnevada.gov/factsstatistics/funfacts.htm)....

  11. New solar telescope in Big Bear: evidence for super-diffusivity and small-scale solar dynamos?

    International Nuclear Information System (INIS)

    Goode, Philip R; Abramenko, Valentyna; Yurchyshyn, Vasyl

    2012-01-01

The 1.6 m clear aperture New Solar Telescope (NST) in Big Bear Solar Observatory (BBSO) is now providing the highest resolution solar data ever. These data have revealed surprises about the Sun on small scales, including bright points (BPs), which can be used as proxies for the intense, compact magnetic elements apparent in photospheric intergranular lanes. The BPs are ever more numerous on ever smaller spatial scales, as though there were no limit to how small the BPs can be. Here we discuss high resolution NST data on BPs that provide support for the ideas that a turbulent regime of super-diffusivity dominates in the quiet Sun, and that local dynamos operate near the solar surface. (comment)
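Super-diffusivity of the kind discussed means the mean squared displacement (MSD) of tracked elements grows faster than linearly with time, MSD ∝ t^γ with γ > 1 (γ = 1 is ordinary diffusion). A hypothetical sketch of estimating γ as the log-log slope of MSD versus time, the standard diagnostic; the synthetic track below is illustrative, not NST data:

```python
import math

def diffusion_exponent(times, msd):
    """Least-squares slope of log(msd) vs log(t),
    i.e. gamma in the scaling msd ~ t**gamma."""
    xs = [math.log(t) for t in times]
    ys = [math.log(m) for m in msd]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic super-diffusive track: msd grows as t**1.5.
times = [1.0, 2.0, 4.0, 8.0, 16.0]
msd = [t ** 1.5 for t in times]
gamma = diffusion_exponent(times, msd)
```

A fitted γ well above 1 (here 1.5 by construction) is what would mark a turbulent, super-diffusive regime in bright-point tracking.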

  12. Small head size following in utero exposure to atomic radiation, Hiroshima and Nagasaki

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R W; Blot, W J

    1972-01-01

    There was a progressive increase with dose in the frequency of abnormality among persons whose mothers were exposed before the 18th week of pregnancy. In Hiroshima the minimum dose producing an effect was 10 to 19 rad, but in Nagasaki no effect was observed under 150 rad. At maternal doses of > 150 rad, small head circumference was often accompanied by mental retardation. The low doses in Hiroshima are not directly applicable to medical radiology because of the presence of neutrons and environmental disturbances. (DLC)

  13. Risk assessment for fish passage through small, low-head turbines

    Energy Technology Data Exchange (ETDEWEB)

    Turnpenny, A.W.H.; Clough, S.; Hanson, K.P.; Ramsay, R.; McEwan, D.

    2000-07-01

    This report summarises the findings of a study to improve the accuracy of prediction methods for fish fatalities for small-head Francis and Kaplan propeller turbine designs and gives details of computational fluid dynamic modelling to estimate pressure fluxes and shear stresses. Biological data is reviewed, and the STRIKER Excel spreadsheet model is used to predict death caused by pressure flux, shear turbulence, and blade strike. Field validation is discussed, and drawings of the Francis 1 and Kaplan 1 turbines, results of the fish passage trials, and STRIKER instructions and sample runs are presented in the appendices.

  14. Risk assessment for fish passage through small, low-head turbines

    International Nuclear Information System (INIS)

    Turnpenny, A.W.H.; Clough, S.; Hanson, K.P.; Ramsay, R.; McEwan, D.

    2000-01-01

    This report summarises the findings of a study to improve the accuracy of prediction methods for fish fatalities for small-head Francis and Kaplan propeller turbine designs and gives details of computational fluid dynamic modelling to estimate pressure fluxes and shear stresses. Biological data is reviewed, and the STRIKER Excel spreadsheet model is used to predict death caused by pressure flux, shear turbulence, and blade strike. Field validation is discussed, and drawings of the Francis 1 and Kaplan 1 turbines, results of the fish passage trials, and STRIKER instructions and sample runs are presented in the appendices

  15. Comparison of micelle structure of glycolipids with different head groups by small angle neutron scattering

    International Nuclear Information System (INIS)

    He, Lizhong; Middelberg, Anton; Hartmann, Thorsten; Niemeyer, Bernd; Garamus, V.M.; Willumeit, Regine

    2005-01-01

Full text: Glycolipids such as n-alkyl-beta-D-glucopyranoside and n-alkyl-beta-D-maltopyranoside can self-assemble into different structures depending on solution conditions. Their amphiphilic properties enable them to serve as biosurfactants in biology and biotechnology, especially for solubilizing membrane proteins. The physicochemical properties of glycolipids have attracted attention from several research groups aiming to better understand their application in biological and environmental processes. For example, small angle neutron and X-ray scattering have been used to study micelle structures formed by glycolipids. Our previous work has shown that n-octyl-beta-D-glucopyranoside and n-octyl-beta-D-maltopyranoside form micelles with different structures, suggesting an important role of the sugar head group in micelle formation. In the present work, we further compare the micelle structures of n-octyl-beta-D-glucopyranoside and n-octyl-beta-D-galactopyranoside. These two glycolipids have the same hydrophobic tail, and their sugar head groups differ only in conformation, with one hydroxyl group pointing in a different direction. Our SANS data, together with phase behaviours reported by other groups, suggest that a slight alteration of head group conformation can significantly affect the self-assembly of glycolipids. (authors)

  16. Make your small practice thrive. Physicians moving from big practices to small must know the business side of medicine.

    Science.gov (United States)

    Cowan, D

    2001-01-01

    Trying to gain a measure of control over their working lives, some physicians are abandoning large group practices for smaller groups. Large groups enjoy whole teams of people performing vital business tasks. Small practices rely on one or two key physicians and managers to tackle everything from customer service to marketing, medical records to human resources. Learn valuable tips for thriving in a small environment and using that extra control to achieve job satisfaction.

  17. Digital loyalty card "big data' and small business marketing: Formal versus informal or complementary?

    OpenAIRE

    Donnelly, Christina; Simmons, Geoff; Armstrong, Gillian; Fearne, Andrew

    2015-01-01

    This article proposes that a complementary relationship exists between the formalised nature of digital loyalty card data, and the informal nature of small business market orientation. A longitudinal, case-based research approach analysed this relationship in small firms given access to Tesco Clubcard data. The findings reveal a new-found structure and precision in small firm marketing planning from data exposure; this complemented rather than conflicted with an intuitive feel for markets. In...

  18. Linking Big and Small Data Across the Social, Engineering, and Earth Sciences

    Science.gov (United States)

    Chen, R. S.; de Sherbinin, A. M.; Levy, M. A.; Downs, R. R.

    2014-12-01

    The challenges of sustainable development cut across the social, health, ecological, engineering, and Earth sciences, across a wide range of spatial and temporal scales, and across the spectrum from basic to applied research and decision making. The rapidly increasing availability of data and information in digital form from a variety of data repositories, networks, and other sources provides new opportunities to link and integrate both traditional data holdings as well as emerging "big data" resources in ways that enable interdisciplinary research and facilitate the use of objective scientific data and information in society. Taking advantage of these opportunities not only requires improved technical and scientific data interoperability across disciplines, scales, and data types, but also concerted efforts to bridge gaps and barriers between key communities, institutions, and networks. Given the long time perspectives required in planning sustainable approaches to development, it is also imperative to address user requirements for long-term data continuity and stewardship by trustworthy repositories. We report here on lessons learned by CIESIN working on a range of sustainable development issues to integrate data across multiple repositories and networks. This includes CIESIN's roles in developing policy-relevant climate and environmental indicators, soil data for African agriculture, and exposure and risk measures for hazards, disease, and conflict, as well as CIESIN's participation in a range of national and international initiatives related both to sustainable development and to open data access, interoperability, and stewardship.

  19. From Big Data to Small Transportable Products for Decision Support for Floods in Namibia

    Science.gov (United States)

    Mandl, D.; Frye, S.; Cappelaere, P.; Policelli, F.; Handy, M.; Sohlberg, R. A.; Grossman, R.

    2013-12-01

    During the past four years, a team from NASA, Oklahoma University, University of Maryland and University of Chicago in collaboration with the Namibia Hydrological Services (NHS) has explored ways to provide decision support products for floods. The products include a variety of data including a hydrological model, ground measurements such as river gauges, and earth remote sensing data. This poster or presentation highlights the lessons learned in acquiring, storing, managing big data on the cloud and turning it into relevant products for GEOSS users. Technology that has been explored includes the use of Hadoop/MapReduce and Accumulo to process and manage the large data sets. OpenStreetMap was explored for use in cataloging water boundaries and enabling collaborative mapping of the base water mask and floods. A Flood Dashboard was created to customize displays of various data products. Finally, a higher level Geo-Social Application Processing Interface (API) was developed so that users can discover, generate products dynamically for their specific needs/societal benefit areas and then share them with their Community of Practice over social networks. Results of this experiment have included 100x reduction in size of some flood products, making it possible to distribute these products to mobile platforms and/or bandwidth-limited users.

  20. 'The big ole gay express': sexual minority stigma, mobility and health in the small city.

    Science.gov (United States)

    Keene, Danya E; Eldahan, Adam I; White Hughto, Jaclyn M; Pachankis, John E

    2017-03-01

    Recent research has examined how gay and bisexual men experience and navigate the variations in sexual minority stigma that exist across geographic contexts, with implications for their health. We extend this literature on stigma, mobility, and health by considering the unique and understudied setting of the small city. Drawing on semi-structured interviews (n = 29) conducted in two small US cities (New Haven and Hartford), we find that these small cities serve as both destinations and points of departure for gay and bisexual men in the context of stigma. New Haven and Hartford attracted gay and bisexual men from surrounding suburbs where sexual minority stigma was more prevalent and where there were fewer spaces and opportunities for gay life. Conversely, participants noted that these small cities did not contain the same identity affirming communities as urban gay enclaves, thus motivating movement from small cities to larger ones. Our data suggest these forms of mobility may mitigate stigma, but may also produce sexual health risks, thus drawing attention to small cities as uniquely important sites for HIV prevention. Furthermore, our analysis contributes to an understanding of how place, stigma and mobility can intersect to generate spatially distinct experiences of stigmatised identities and related health consequences.

  1. Determination of material distribution in heading process of small bimetallic bar

    Science.gov (United States)

    Presz, Wojciech; Cacko, Robert

    2018-05-01

    Electrical connectors mostly have silver contacts joined by riveting. To reduce costs, the core of the contact rivet can be replaced with a cheaper material, e.g. copper. A wide range of bimetallic (silver-copper) rivets for the production of contacts is commercially available. Riveting a bimetallic object, however, creates new conditions in the riveting process. In the analyzed example the object is small, placing it at the border of microforming. Based on FEM modeling of the loading process of bimetallic rivets with different material distributions, the desired distribution was chosen and the choice justified. Possible material distributions were parameterized with two parameters describing the desired distribution characteristics. A parameter, the Coefficient of Mutual Interactions of Plastic Deformations, and a method for its determination are proposed. The parameter is determined from two-parameter stress-strain curves and is a function of these parameters and the range of equivalent strains occurring in the analyzed process. The proposed method was applied to the upsetting process of the bimetallic head of an electrical contact. A nomogram was established to predict the distribution of materials in the rivet head and to guide the selection of a pair of materials that achieves the desired distribution.

  2. Petit bourgeois health care? The big small-business of private complementary medical practice.

    Science.gov (United States)

    Andrews, Gavin J; Phillips, David R

    2005-05-01

    Although small-business private complementary medicine (CAM) has grown into a significant provider of health care in many Western societies, there has been relatively little research on the sector in business terms or on its wider socio-economic position and role. Using a combined questionnaire and interview survey, and the concept of the small-business petit bourgeoisie as a framework, this paper considers the character of therapists and their businesses in England and Wales. The findings suggest that therapists share core characteristics of the petit bourgeoisie: they sell goods with considerable market viability yet face financial insecurity; their businesses are modest in size; they generate little direct employment; and the owners undertake the everyday 'hands-on' work themselves. Some characteristics of the therapists and their businesses depart from the stereotypical image of a small-business class, such as the high incidence of part-time self-employment and of incomes supplemented by unrelated waged employment. However, given the acknowledged diversity of the petit bourgeoisie across societies and over time, the framework is arguably appropriate in this context, and private CAM its latest guise. Indeed, just as the petit bourgeoisie have traditionally found market niches either neglected or rejected by bigger business, small-business CAM provides forms of health care neglected, and sometimes rejected, by orthodox medicine.

  3. Affordable Development and Demonstration of a Small Nuclear Thermal Rocket (NTR) Engine and Stage: How Small Is Big Enough?

    Science.gov (United States)

    Borowski, Stanley K.; Sefcik, Robert J.; Fittje, James E.; McCurdy, David R.; Qualls, Arthur L.; Schnitzler, Bruce G.; Werner, James E.; Weitzberg, Abraham; Joyner, Claude R.

    2016-01-01

    The Nuclear Thermal Rocket (NTR) derives its energy from fission of uranium-235 atoms contained within fuel elements that comprise the engine's reactor core. It generates high thrust and has a specific impulse potential of approximately 900 seconds - a 100 percent increase over today's best chemical rockets. The Nuclear Thermal Propulsion (NTP) project, funded by NASA's Advanced Exploration Systems (AES) program, includes five key task activities: (1) Recapture, demonstration, and validation of heritage graphite composite (GC) fuel (selected as the Lead Fuel option); (2) Engine Conceptual Design; (3) Operating Requirements Definition; (4) Identification of Affordable Options for Ground Testing; and (5) Formulation of an Affordable Development Strategy. During fiscal year (FY) 2014, a preliminary Design Development Test and Evaluation (DDT&E) plan and schedule for NTP development was outlined by the NASA Glenn Research Center (GRC), Department of Energy (DOE), and industry that involved significant system-level demonstration projects, including Ground Technology Demonstration (GTD) tests at the Nevada National Security Site (NNSS), followed by a Flight Technology Demonstration (FTD) mission. To reduce cost for the GTD tests and FTD mission, small NTR engines, in either the 7.5 or 16.5 kilopound-force thrust class, were considered. Both engine options used GC fuel and a common fuel element (FE) design. The smaller, approximately 7.5 kilopound-force, criticality-limited engine produces approximately 157 thermal megawatts and its core is configured with parallel rows of hexagonal-shaped FEs and tie tubes (TTs) with a FE-to-TT ratio of approximately 1:1. The larger, approximately 16.5 kilopound-force, Small Nuclear Rocket Engine (SNRE), developed by Los Alamos National Laboratory (LANL) at the end of the Rover program, produces approximately 367 thermal megawatts and has a FE-to-TT ratio of approximately 2:1. 
Although both engines use a common 35-inch (approximately

  4. III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.

    Science.gov (United States)

    Davis-Kean, Pamela E; Jager, Justin

    2017-06-01

    For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited in both the power to detect differences and the demographic diversity needed to generalize clearly and broadly. Thus, in this chapter we discuss the value of using existing large-scale data sets to test the complex questions of child development and how to develop future large-scale data sets that are both representative and able to answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.

  5. Big concerns with small projects: Evaluating the socio-ecological impacts of small hydropower projects in India.

    Science.gov (United States)

    Jumani, Suman; Rao, Shishir; Machado, Siddarth; Prakash, Anup

    2017-05-01

    Although Small Hydropower Projects (SHPs) are encouraged as sources of clean and green energy, there is a paucity of research examining their socio-ecological impacts. We assessed the perceived socio-ecological impacts of four SHPs within the Western Ghats in India by conducting semi-structured interviews with local respondents. Primary interview data were sequentially validated against secondary data, and respondent perceptions were subsequently compared against the expected baseline of assured impacts. We evaluated the level of awareness about SHPs, their perceived socio-economic impacts, their influence on resource access, and their impacts on human-elephant interactions. The general level of awareness about SHPs was low, and assurances of local electricity and employment generation remained largely unfulfilled. Additionally, most respondents faced numerous unanticipated adverse impacts. We found a strong relationship between SHP construction and increasing levels of human-elephant conflict. Based on the disparity between assured and actual social impacts, we suggest that policies regarding SHPs be suitably revised.

  6. Using Small Models for Big Issues : Exploratory System Dynamics Modelling and Analysis for Insightful Crisis Management

    NARCIS (Netherlands)

    Pruyt, E.

    2010-01-01

    The main goal of this paper is to explain and illustrate different exploratory uses of small System Dynamics models for analysis and decision support in the case of dynamically complex issues that are deeply uncertain. The applied focus of the paper is the field of inter/national safety and security.

  7. Population and harvest trends of big game and small game species: a technical document supporting the USDA Forest Service Interim Update of the 2000 RPA Assessment

    Science.gov (United States)

    Curtis H. Flather; Michael S. Knowles; Stephen J. Brady

    2009-01-01

    This technical document supports the Forest Service's requirement to assess the status of renewable natural resources as mandated by the Forest and Rangeland Renewable Resources Planning Act of 1974 (RPA). It updates past reports on national and regional trends in population and harvest estimates for species classified as big game and small game. The trends...

  8. The importance of hunting and hunting grounds for big and small game for tourism development in the basin of Crna Reka the Republic of Macedonia

    OpenAIRE

    Koteski, Cane; Jakovlev, Zlatko; Mitreva, Elizabeta; Angelkova, Tanja; Kitanov, Vladimir

    2012-01-01

    This paper presents the hunting grounds for big and small game, the structure of the areas of particular hunting grounds, fishing-water objects, fish species, and fishponds up to 20 years old, shown by municipality and by individual farms with ponds in the basin of Crna Reka.

  9. Small Farmers and Big Retail: trade-offs of supplying supermarkets in Nicaragua

    OpenAIRE

    Michelson, Hope; Reardon, Thomas; Perez, Francisco Jose

    2010-01-01

    In Nicaragua and elsewhere in Central America, small-scale farmers are weighing the risks of entering into contracts with supermarket chains. We use unique data on negotiated prices from Nicaraguan farm cooperatives supplying supermarkets to study the impact of supply agreements on producers’ mean output prices and price stability. We find that prices paid by the domestic retail chain approximate the traditional market in mean and variance. In contrast, we find that mean prices paid by Wal-ma...

  10. Big catch, little sharks: Insight into Peruvian small-scale longline fisheries

    OpenAIRE

    Doherty, Philip D; Alfaro-Shigueto, Joanna; Hodgson, David J; Mangel, Jeffrey C; Witt, Matthew J; Godley, Brendan J

    2014-01-01

    Shark take, driven by vast demand for meat and fins, is increasing. We set out to gain insights into the impact of small-scale longline fisheries in Peru. Onboard observers were used to document catch from 145 longline fishing trips (1668 fishing days) originating from Ilo, southern Peru. Fishing effort is divided into two seasons: targeting dolphinfish (Coryphaena hippurus; December to February) and sharks (March to November). A total of 16,610 sharks were observed caught, with 11,166 identi...

  11. Small molecules, big players: the National Cancer Institute's Initiative for Chemical Genetics.

    Science.gov (United States)

    Tolliday, Nicola; Clemons, Paul A; Ferraiolo, Paul; Koehler, Angela N; Lewis, Timothy A; Li, Xiaohua; Schreiber, Stuart L; Gerhard, Daniela S; Eliasof, Scott

    2006-09-15

    In 2002, the National Cancer Institute created the Initiative for Chemical Genetics (ICG), to enable public research using small molecules to accelerate the discovery of cancer-relevant small-molecule probes. The ICG is a public-access research facility consisting of a tightly integrated team of synthetic and analytical chemists, assay developers, high-throughput screening and automation engineers, computational scientists, and software developers. The ICG seeks to facilitate the cross-fertilization of synthetic chemistry and cancer biology by creating a research environment in which new scientific collaborations are possible. To date, the ICG has interacted with 76 biology laboratories from 39 institutions and more than a dozen organic synthetic chemistry laboratories around the country and in Canada. All chemistry and screening data are deposited into the ChemBank web site (http://chembank.broad.harvard.edu/) and are available to the entire research community within a year of generation. ChemBank is both a data repository and a data analysis environment, facilitating the exploration of chemical and biological information across many different assays and small molecules. This report outlines how the ICG functions, how researchers can take advantage of its screening, chemistry and informatic capabilities, and provides a brief summary of some of the many important research findings.

  12. Small Groups, Big Change: Preliminary Findings from the Sparks for Change Institute

    Science.gov (United States)

    Kirsch, R.; Batchelor, R. L.; Habtes, S. Y.; King, B.; Crockett, J.

    2017-12-01

    The geoscience professoriate continues to underrepresent women and minorities, yet the value of diversity, both for science and for recruiting and retaining diverse students, is well known. While there are growing numbers of early-career tenure-track minority faculty, low retention rates pose a challenge for sustained diversity in the professoriate. Part of this challenge is the lack of institutional support and recognition in tenure and promotion pathways for faculty who undertake broadening-participation efforts. Sparks for Change is an NSF Geoscience Opportunities for Leadership in Diversity (GOLD)-funded project that aims to change departmental culture to better value and reward inclusion and broadening-participation efforts. By encouraging, recognizing, and rewarding diversity, equity, and inclusion (DEI) efforts at the department level, we aim to support and retain underrepresented minority (URM) faculty, who often disproportionately undertake these efforts, and to build more inclusive environments for faculty, staff, and students alike. Sparks for Change utilizes a small-group theory of change, arguing that the effort of a small group of committed individuals inside the organization is the best way to overcome the institutional inertia that makes academic departments resistant to change. For this effort, we propose that the ideal composition of these small groups is a junior URM faculty member interested in DEI in the geosciences, a senior member of the same department who can lend weight to efforts and is positioned to help enact department policy, and an external broadening-participation expert who can share best practices and provide accountability for the group. Eleven of these small groups, representing a range of institutions, will be brought together at the Sparks for Change Institute in Boulder, CO, in September. There they will receive leadership training on adaptive, transformative, and solidarity practices, share DEI experiences and

  13. Multi-modality image reconstruction for dual-head small-animal PET

    International Nuclear Information System (INIS)

    Huang, Chang-Han; Chou, Cheng-Ying

    2015-01-01

    Hybrid positron emission tomography/computed tomography (PET/CT) and positron emission tomography/magnetic resonance imaging (PET/MRI) have become routine practice in clinics, and multi-modality imaging can also benefit research. Dedicated small-animal imaging systems such as dual-head small-animal PET (DHAPET), which offer high detection sensitivity and high resolution, can likewise exploit structural information from CT or MRI. However, the special detector arrangement of DHAPET leads to severe data truncation, thereby degrading image quality. We propose to take advantage of anatomical priors and total variation (TV) minimization to reconstruct the PET activity distribution from incomplete measurement data. The objective is to solve a penalized least-squares function consisting of a data fidelity term, a TV norm, and median root priors. In this work, we employed a splitting-based fast iterative shrinkage/thresholding algorithm to separate the smooth and non-smooth functions in the convex optimization problem. Our simulation studies validated that images reconstructed with the proposed method outperform those obtained with conventional expectation maximization algorithms or without the anatomical prior information. Additionally, the convergence rate is accelerated.
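The penalized least-squares idea described above (a data fidelity term plus a TV penalty) can be sketched in miniature with gradient descent on a smoothed TV norm for a 1D signal with missing samples. This is a simplified stand-in, not the paper's splitting-based shrinkage/thresholding algorithm, and all sizes and weights below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth 1D "activity" signal: piecewise constant, like a simple phantom.
x_true = np.zeros(64)
x_true[20:40] = 1.0

# Truncated measurement: only a random subset of samples is observed,
# loosely mimicking the limited angular coverage of a dual-head scanner.
mask = rng.random(64) < 0.6
y = x_true[mask] + 0.01 * rng.standard_normal(mask.sum())

lam, eps, step = 0.05, 1e-3, 0.1  # TV weight, TV smoothing, gradient step

def objective(x):
    """Penalized least squares: data fidelity + smoothed TV norm."""
    fidelity = 0.5 * np.sum((x[mask] - y) ** 2)
    tv = np.sum(np.sqrt(np.diff(x) ** 2 + eps))
    return fidelity + lam * tv

def gradient(x):
    g = np.zeros_like(x)
    g[mask] = x[mask] - y          # gradient of the fidelity term
    d = np.diff(x)
    w = d / np.sqrt(d ** 2 + eps)  # gradient of the smoothed TV term
    g[:-1] -= lam * w
    g[1:] += lam * w
    return g

x = np.zeros(64)
for _ in range(1000):
    x -= step * gradient(x)

print(objective(x) < objective(np.zeros(64)))  # True: descent reduced the objective
```

The smoothed TV norm makes the penalty differentiable so plain gradient descent suffices here; the paper's splitting approach instead handles the non-smooth TV norm exactly via a proximal (shrinkage/thresholding) step.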

  14. Finding big shots: small-area mapping and spatial modelling of obesity among Swiss male conscripts.

    Science.gov (United States)

    Panczak, Radoslaw; Held, Leonhard; Moser, André; Jones, Philip A; Rühli, Frank J; Staub, Kaspar

    2016-01-01

    In Switzerland, as in other developed countries, the prevalence of overweight and obesity has increased substantially since the early 1990s. Most analyses so far have been based on sporadic surveys or self-reported data and did not offer potential for small-area analyses. The goal of this study was to investigate spatial variation and determinants of obesity among young Swiss men using recent conscription data. A complete, anonymized dataset of conscription records for the 2010-2012 period was provided by the Swiss Armed Forces. We used a series of Bayesian hierarchical logistic regression models to investigate the spatial pattern of obesity across 3,187 postcodes, varying them by type of random effects (spatially unstructured and structured) and by level of adjustment for individual (age and professional status) and area-based [urbanicity and index of socio-economic position (SEP)] characteristics. The analysed dataset consisted of 100,919 conscripts, of whom 5,892 (5.8%) were obese. Crude obesity prevalence increased with age among conscripts of lower individual and area-based SEP and varied greatly over postcodes. The best model's estimates of adjusted odds ratios of obesity at the postcode level ranged from 0.61 to 1.93 and showed a strong spatial pattern of obesity risk across the country. Odds ratios above 1 concentrated in central and northern Switzerland. Smaller pockets of elevated obesity risk also emerged around the cities of Geneva, Fribourg, and Lausanne. Lower estimates were observed in the North-East and East, as well as south of the Alps. Importantly, small regional outliers were observed, and patterning did not follow administrative boundaries. As with crude obesity prevalence, the best-fitting model confirmed increasing risk of obesity with age and among conscripts of lower professional status. The risk decreased with higher area-based SEP and, to a lesser degree, in rural areas. In Switzerland, there is a substantial spatial variation in obesity risk

  15. Small Engine, Big Power: MicroRNAs as Regulators of Cardiac Diseases and Regeneration

    Directory of Open Access Journals (Sweden)

    Darukeshwara Joladarashi

    2014-09-01

    Cardiac diseases are the predominant cause of human mortality in the United States and around the world. MicroRNAs (miRNAs are small non-coding RNAs that have been shown to modulate a wide range of biological functions under various pathophysiological conditions. miRNAs alter target expression by post-transcriptional regulation of gene expression. Numerous studies have implicated specific miRNAs in cardiovascular development, pathology, regeneration and repair. These observations suggest that miRNAs are potential therapeutic targets to prevent or treat cardiovascular diseases. This review focuses on the emerging role of miRNAs in cardiac development, pathogenesis of cardiovascular diseases, cardiac regeneration and stem cell-mediated cardiac repair. We also discuss the novel diagnostic and therapeutic potential of these miRNAs and their targets in patients with cardiac diseases.

  16. Small team, big reach : Leduc's AMR process translates cutting-edge engineering skills into worldwide contracts

    International Nuclear Information System (INIS)

    Gullage, R.

    2009-01-01

    AMR Process specializes in oil and gas process engineering. The small company, headquartered in Leduc, Alberta, provides oil and gas packages, proprietary licensed designs, and process engineering services in Brazil, China, Iraq, and more than a dozen other countries. AMR's expertise includes troubleshooting, de-bottlenecking, and process-train upgrading. The company helps clients avoid complicated, costly changes to a project by targeting potential problems at a project's inception. The president and founder of AMR credits the company's success in part to the Nisku-Leduc Economic Development Authority (EDA), which brought AMR together with I. Matheson and Company Ltd., a steel manufacturer in New Glasgow, Nova Scotia, whose primary products include pressure vessels. This article describes how AMR Process Inc. has worked with Matheson to secure contracts to build equipment used in oil processing for markets around the world. 1 fig

  17. Small eyes big problems: is cataract surgery the best option for the nanophthalmic eyes

    International Nuclear Information System (INIS)

    Utman, S.A.K.

    2013-01-01

    Nanophthalmos refers to an eyeball of short axial length, usually less than 20 mm, which leads to angle-closure glaucoma due to a relatively large lens. Intraocular lens extraction relieves the angle closure in nanophthalmos. Cataract surgery in a nanophthalmic eye is technically difficult, with a high risk of complications such as posterior capsular rupture, uveal effusion, choroidal haemorrhage, vitreous haemorrhage, malignant glaucoma, retinal detachment, and aqueous misdirection. Various options for performing cataract surgery in nanophthalmos are described in the literature, such as extracapsular cataract extraction with or without sclerostomy, and small-incision cataract extraction by phacoemulsification, which not only helps maintain the anterior chamber during surgery but also reduces the incidence of complications owing to less fluctuation of intraocular pressure (IOP) during the surgery. Cataract surgery deepens and widens the anterior chamber angle in nanophthalmic eyes and has beneficial effects on IOP in eyes with nanophthalmos, but is associated with a high incidence of complications. (author)

  18. Managing Astronomy Research Data: Case Studies of Big and Small Research Projects

    Science.gov (United States)

    Sands, Ashley E.

    2015-01-01

    Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies. The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework. This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data. Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects. The multitude of practices complicates coordinated efforts to maintain data. While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy

  19. Big cat, small cat: reconstructing body size evolution in living and extinct Felidae.

    Science.gov (United States)

    Cuff, A R; Randau, M; Head, J; Hutchinson, J R; Pierce, S E; Goswami, A

    2015-08-01

    The evolution of body mass is a fundamental topic in evolutionary biology, because it is closely linked to manifold life history and ecological traits and is readily estimable for many extinct taxa. In this study, we examine patterns of body mass evolution in Felidae (Placentalia, Carnivora) to assess the effects of phylogeny, mode of evolution, and the relationship between body mass and prey choice in this charismatic mammalian clade. Our data set includes 39 extant and 26 extinct taxa, with published body mass data supplemented by estimates based on condylobasal length. These data were run through 'SURFACE' and 'bayou' to test for patterns of body mass evolution and convergence between taxa. Body masses of felids are significantly different among prey choice groupings (small, mixed and large). We find that body mass evolution in cats is strongly influenced by phylogeny, but different patterns emerged depending on inclusion of extinct taxa and assumptions about branch lengths. A single Ornstein-Uhlenbeck optimum best explains the distribution of body masses when first-occurrence data were used for the fossil taxa. However, when mean occurrence dates or last known occurrence dates were used, two selective optima for felid body mass were recovered in most analyses: a small optimum around 5 kg and a large one around 100 kg. Across living and extinct cats, we infer repeated evolutionary convergences towards both of these optima, but, likely due to biased extinction of large taxa, our results shift to supporting a Brownian motion model when only extant taxa are included in analyses. © 2015 European Society For Evolutionary Biology.
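The Ornstein-Uhlenbeck model invoked above can be illustrated with a minimal Euler-Maruyama simulation in which lineages are pulled toward a selective optimum on a log-mass scale. The optima, pull strength, and noise level below are illustrative choices, not the study's fitted values:

```python
import math
import random

random.seed(42)

def simulate_ou(x0, theta, alpha, sigma, dt=0.01, steps=5000):
    """Euler-Maruyama simulation of dX = alpha*(theta - X) dt + sigma dW.
    X is log10(body mass), so theta is the log of the optimal mass."""
    x = x0
    for _ in range(steps):
        x += alpha * (theta - x) * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
    return x

# Two hypothetical selective optima on a log10(body mass in kg) scale.
small_optimum = math.log10(5)    # ~5 kg optimum
large_optimum = math.log10(100)  # ~100 kg optimum

# Lineages starting from a 20 kg ancestor converge on whichever optimum
# governs them, illustrating attraction under an OU process.
end_small = [simulate_ou(math.log10(20), small_optimum, 2.0, 0.1) for _ in range(50)]
end_large = [simulate_ou(math.log10(20), large_optimum, 2.0, 0.1) for _ in range(50)]

mean_small = 10 ** (sum(end_small) / len(end_small))
mean_large = 10 ** (sum(end_large) / len(end_large))
print(round(mean_small, 1), round(mean_large, 1))  # close to 5 and 100
```

The same dynamic run along the branches of a phylogeny, with the optimum allowed to switch between regimes, is essentially what tools like 'SURFACE' and 'bayou' fit to comparative data.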

  20. Small Molecule, Big Prospects: MicroRNA in Pregnancy and Its Complications

    Directory of Open Access Journals (Sweden)

    Meng Cai

    2017-01-01

    MicroRNAs are small, noncoding RNA molecules that regulate target gene expression at the posttranscriptional level. Unlike siRNAs, microRNAs are "fine-tuners" rather than "switches" in the regulation of gene expression; thus they play key roles in maintaining tissue homeostasis. Aberrant microRNA expression is implicated in disease processes. To date, numerous studies have demonstrated the regulatory roles of microRNAs in various pathophysiological conditions. In contrast, the study of microRNAs in pregnancy and its associated complications, such as preeclampsia (PE), fetal growth restriction (FGR), and preterm labor, is a young field. Over the last decade, knowledge of pregnancy-related microRNAs has increased, and the molecular mechanisms by which microRNAs regulate pregnancy and its associated complications are emerging. In this review, we focus on recent advances in the research of pregnancy-related microRNAs, especially their function in pregnancy-associated complications and their potential clinical applications. Here, microRNAs associated with pregnancy are classified as placenta-specific, placenta-associated, placenta-derived circulating, and uterine microRNAs according to their localization and origin. MicroRNAs offer great potential for developing diagnostic and therapeutic targets in pregnancy-related disorders.

  1. Big Fish in Small Ponds: massive stars in the low-mass clusters of M83

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, J. E.; Calzetti, D.; McElwee, Sean [Department of Astronomy, University of Massachusetts, Amherst, MA 01003 (United States); Chandar, R. [Department of Physics and Astronomy, University of Toledo, Toledo, OH 43606 (United States); Elmegreen, B. G. [IBM T. J. Watson Research Center, Yorktown Heights, NY 10598 (United States); Kennicutt, R. C. [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Kim, Hwihyun [School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Krumholz, Mark R. [Department of Astronomy and Astrophysics, University of California, 1156 High Street, Santa Cruz, CA 95064 (United States); Lee, J. C.; Whitmore, B. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); O' Connell, R. W., E-mail: jandrews@astro.umass.edu, E-mail: callzetti@astro.umass.edu [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904-4325 (United States)

    2014-09-20

    We have used multi-wavelength Hubble Space Telescope WFC3 data of the starbursting spiral galaxy M83 in order to measure variations in the upper end of the stellar initial mass function (uIMF) using the production rate of ionizing photons in unresolved clusters with ages ≤ 8 Myr. As in earlier papers on M51 and NGC 4214, the uIMF in M83 is consistent with a universal IMF, and stochastic sampling of the stellar populations in the ≲10³ M☉ clusters is responsible for any deviations from this universality. The ensemble cluster population, as well as individual clusters, also implies that the most massive star in a cluster does not depend on the cluster mass. In fact, we have found that these small clusters seem to have an over-abundance of ionizing photons when compared to an expected universal or truncated IMF. This also suggests that the presence of massive stars in these clusters does not affect the star formation in a destructive way.
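Stochastic sampling of a cluster's stellar population, the effect invoked above, can be sketched by drawing stars from a Salpeter-like power-law IMF until a target cluster mass is reached (illustrative parameters; not the authors' actual procedure):

```python
import random

random.seed(1)
ALPHA, M_MIN, M_MAX = 2.35, 0.1, 120.0  # Salpeter-like slope, mass range (Msun)

def draw_star():
    """Inverse-transform sample from dN/dm ∝ m^-ALPHA on [M_MIN, M_MAX]."""
    a = M_MIN ** (1 - ALPHA)
    b = M_MAX ** (1 - ALPHA)
    u = random.random()
    return (a + u * (b - a)) ** (1 / (1 - ALPHA))

def sample_cluster(target_mass):
    """Draw stars until the cluster reaches its target total mass."""
    stars, total = [], 0.0
    while total < target_mass:
        m = draw_star()
        stars.append(m)
        total += m
    return stars

# For small (~10^3 Msun) clusters the most massive star drawn varies
# strongly from realization to realization -- the stochastic-sampling effect.
maxima = [max(sample_cluster(1000.0)) for _ in range(200)]
print(round(min(maxima), 1), round(max(maxima), 1))  # wide spread across realizations
```

The wide spread of maximum stellar masses at fixed cluster mass is why ionizing-photon output alone cannot cleanly distinguish a truncated IMF from ordinary sampling noise in low-mass clusters.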

  2. Microcredit in West Africa: how small loans make a big impact on poverty.

    Science.gov (United States)

    Gbezo, B E

    1999-01-01

    This article examines the impact of microfinancing schemes in West Africa and the role of the International Labor Organization (ILO) in their development. Microfinancing or microcredit schemes are meant to create the kind of jobs that can keep households severely hit by the economic crisis afloat. They affect not only the financial sector, but also agriculture, crafts, the financing of the social economy, and social protection. Thus, they contribute to improved access to basic social, health, and family planning services and to drinking water. The challenge, then, is for institutions to adopt microfinancing and to reach more than 100 million families in the region. To this end, nongovernmental organizations are setting themselves up as veritable microfinancing institutions capable of realizing the resulting benefits so as to be economically viable. In the context of its role in the development of microfinancing schemes, the ILO manages a portfolio of technical cooperation and research projects aimed at identifying and removing constraints on access to credit, savings, insurance, and other financial services through its Social Finance Unit. In addition, the ILO promotes women's entrepreneurship through the International Small Enterprise Programme and the International Programme on More and Better Jobs for Women.

  3. Think big, start small, move fast a blueprint for transformation from the Mayo Clinic Center for Innovation

    CERN Document Server

    LaRusso, Nicholas; Farrugia, Gianrico

    2015-01-01

    The Only Innovation Guide You Will Ever Need--from the Award-Winning Minds at Mayo Clinic. A lot of businesspeople talk about innovation, but few companies have achieved truly transformative innovation as brilliantly--or as famously--as the legendary Mayo Clinic. Introducing Think Big, Start Small, Move Fast, the first innovation guide based on the proven, decade-long program that’s made Mayo Clinic one of the most respected and successful organizations in the world. This essential guide shows you how to: Inspire and ignite trailblazing innovation in your workplace Design a new business model that’s creative, collaborative, and sustainable Apply the traditional scientific method to the latest innovations in "design thinking" Build a customized toolkit of the best practices, project portfolios, and strategies Increase your innovation capacity--and watch how quickly you succeed These field-tested techniques grew out of the health care industry but are designed ...

  4. Association of prenatal phenobarbital and phenytoin exposure with small head size at birth and with learning problems

    NARCIS (Netherlands)

    Dessens, A. B.; Cohen-Kettenis, P. T.; Mellenbergh, G. J.; Koppe, J. G.; van de Poll, N. E.; Boer, K.

    2000-01-01

    Small head size has been observed in prenatally anticonvulsant-exposed neonates. In infancy, cognitive impairments were revealed. It is presently unknown whether these impairments are permanent or disappear after puberty. We studied the link between the prenatal influence of anticonvulsants on brain

  5. Why small males have big sperm: dimorphic squid sperm linked to alternative mating behaviours.

    Science.gov (United States)

    Iwata, Yoko; Shaw, Paul; Fujiwara, Eiji; Shiba, Kogiku; Kakiuchi, Yasutaka; Hirohashi, Noritaka

    2011-08-10

    Sperm cells are the target of strong sexual selection that may drive changes in sperm structure and function to maximize fertilization success. Sperm evolution is regarded as one of the major consequences of sperm competition in polyandrous species; however, it can also be driven by adaptation to the environmental conditions at the site of fertilization. Strong stabilizing selection limits intra-specific variation, and therefore polymorphism, among fertile sperm (eusperm). Here we analyzed reproductive morphology differences among males employing characteristic alternative mating behaviours, and so potentially different conditions of sperm competition and fertilization environment, in the squid Loligo bleekeri. Large consort males transfer smaller (average total length = 73 μm) sperm to a female's internal sperm storage location, inside the oviduct; whereas small sneaker males transfer larger (99 μm) sperm to an external location around the seminal receptacle near the mouth. No significant difference in swimming speed was observed between consort and sneaker sperm. Furthermore, sperm precedence in the seminal receptacle was not biased toward longer sperm, suggesting no evidence for large sperm being favoured in competition for space in the sperm storage organ among sneaker males. Here we report the first case, in the squid Loligo bleekeri, where distinctly dimorphic eusperm are produced by different sized males that employ alternative mating behaviours. Our results found no evidence that the distinct sperm dimorphism was driven by between- and within-tactic sperm competition. We propose that the presence of alternative fertilization environments with distinct characteristics (i.e. internal or external), whether or not in combination with the effects of sperm competition, can drive the disruptive evolution of sperm size.

  6. Why small males have big sperm: dimorphic squid sperm linked to alternative mating behaviours

    Directory of Open Access Journals (Sweden)

    Shiba Kogiku

    2011-08-01

    Abstract Background Sperm cells are the target of strong sexual selection that may drive changes in sperm structure and function to maximize fertilization success. Sperm evolution is regarded as one of the major consequences of sperm competition in polyandrous species; however, it can also be driven by adaptation to the environmental conditions at the site of fertilization. Strong stabilizing selection limits intra-specific variation, and therefore polymorphism, among fertile sperm (eusperm). Here we analyzed reproductive morphology differences among males employing characteristic alternative mating behaviours, and so potentially different conditions of sperm competition and fertilization environment, in the squid Loligo bleekeri. Results Large consort males transfer smaller (average total length = 73 μm) sperm to a female's internal sperm storage location, inside the oviduct; whereas small sneaker males transfer larger (99 μm) sperm to an external location around the seminal receptacle near the mouth. No significant difference in swimming speed was observed between consort and sneaker sperm. Furthermore, sperm precedence in the seminal receptacle was not biased toward longer sperm, suggesting no evidence for large sperm being favoured in competition for space in the sperm storage organ among sneaker males. Conclusions Here we report the first case, in the squid Loligo bleekeri, where distinctly dimorphic eusperm are produced by different sized males that employ alternative mating behaviours. Our results found no evidence that the distinct sperm dimorphism was driven by between- and within-tactic sperm competition. We propose that the presence of alternative fertilization environments with distinct characteristics (i.e. internal or external), whether or not in combination with the effects of sperm competition, can drive the disruptive evolution of sperm size.

  7. The p53-reactivating small molecule RITA induces senescence in head and neck cancer cells.

    Directory of Open Access Journals (Sweden)

    Hui-Ching Chuang

    TP53 is the most commonly mutated gene in head and neck cancer (HNSCC), with mutations being associated with resistance to conventional therapy. Restoring normal p53 function has previously been investigated via the use of RITA (reactivation of p53 and induction of tumor cell apoptosis), a small molecule that induces a conformational change in p53, leading to activation of its downstream targets. In the current study we found that RITA indeed exerts significant effects in HNSCC cells. However, in this model, we found that a significant outcome of RITA treatment was accelerated senescence. RITA induced senescence in a variety of p53 backgrounds, including p53-null cells. Also, inhibition of p53 expression did not appear to significantly inhibit RITA-induced senescence. Thus, this phenomenon appears to be partially p53-independent. Additionally, RITA-induced senescence appears to be partially mediated by activation of the DNA damage response and SIRT1 (Silent information regulator T1) inhibition, with a synergistic effect seen by combining either ionizing radiation or SIRT1 inhibition with RITA treatment. These data point toward a novel mechanism of RITA function as well as hint at its possible therapeutic benefit in HNSCC.

  8. The p53-reactivating small molecule RITA induces senescence in head and neck cancer cells.

    Science.gov (United States)

    Chuang, Hui-Ching; Yang, Liang Peng; Fitzgerald, Alison L; Osman, Abdullah; Woo, Sang Hyeok; Myers, Jeffrey N; Skinner, Heath D

    2014-01-01

    TP53 is the most commonly mutated gene in head and neck cancer (HNSCC), with mutations being associated with resistance to conventional therapy. Restoring normal p53 function has previously been investigated via the use of RITA (reactivation of p53 and induction of tumor cell apoptosis), a small molecule that induces a conformational change in p53, leading to activation of its downstream targets. In the current study we found that RITA indeed exerts significant effects in HNSCC cells. However, in this model, we found that a significant outcome of RITA treatment was accelerated senescence. RITA induced senescence in a variety of p53 backgrounds, including p53-null cells. Also, inhibition of p53 expression did not appear to significantly inhibit RITA-induced senescence. Thus, this phenomenon appears to be partially p53-independent. Additionally, RITA-induced senescence appears to be partially mediated by activation of the DNA damage response and SIRT1 (Silent information regulator T1) inhibition, with a synergistic effect seen by combining either ionizing radiation or SIRT1 inhibition with RITA treatment. These data point toward a novel mechanism of RITA function as well as hint at its possible therapeutic benefit in HNSCC.

  9. The small molecule inhibitor QLT0267 Radiosensitizes squamous cell carcinoma cells of the head and neck.

    Directory of Open Access Journals (Sweden)

    Iris Eke

    BACKGROUND: The constant increase of cancer cell resistance to radio- and chemotherapy hampers improvement of patient survival and requires novel targeting approaches. Integrin-Linked Kinase (ILK) has been postulated as a potent druggable cancer target. On the basis of our previous findings clearly showing that ILK transduces antisurvival signals in cells exposed to ionizing radiation, this study evaluated the impact of the small molecule inhibitor QLT0267, reported as a putative ILK inhibitor, on the cellular radiation survival response of human head and neck squamous cell carcinoma cells (hHNSCC). METHODOLOGY/PRINCIPAL FINDINGS: Parental FaDu cells and FaDu cells stably transfected with a constitutively active ILK mutant (FaDu-IH) or empty vectors, UTSCC45 cells, and ILK(fl/fl) and ILK(-/-) mouse fibroblasts were used. Cells grew either two-dimensionally (2D) on or three-dimensionally (3D) in laminin-rich extracellular matrix. Cells were treated with QLT0267 alone or in combination with irradiation (X-rays, 0-6 Gy single dose). ILK knockdown was achieved by small interfering RNA transfection. ILK kinase activity, clonogenic survival, number of residual DNA double strand breaks (rDSB; γH2AX/53BP1 foci assay), cell cycle distribution, and protein expression and phosphorylation (e.g. Akt, p44/42 mitogen-activated protein kinase (MAPK)) were measured. Data on ILK kinase activity and phosphorylation of Akt and p44/42 MAPK revealed a broad inhibitory spectrum of QLT0267 without specificity for ILK. QLT0267 significantly reduced basal cell survival and enhanced the radiosensitivity of FaDu and UTSCC45 cells in a time- and concentration-dependent manner. QLT0267 exerted differential, cell culture model-dependent effects with regard to radiogenic rDSB and accumulation of cells in the G2 cell cycle phase. Relative to corresponding controls, FaDu-IH and ILK(fl/fl) fibroblasts showed enhanced radiosensitivity, which failed to be antagonized by QLT0267. A

  10. Revealing the Value of “Green” and the Small Group with a Big Heart in Transportation Mode Choice

    Directory of Open Access Journals (Sweden)

    David Gaker

    2013-07-01

    To address issues of climate change, people are increasingly presented with the greenhouse gas emissions associated with their alternatives. Statements of pounds or kilograms of CO2 are showing up in trip planners, car advertisements, and even restaurant menus, under the assumption that this information influences behavior. This research contributes to the literature that investigates how travelers respond to such information. Our objective is to better understand the “value of green”, or how much travelers are willing to pay to reduce the CO2 associated with their travel. As with previous work, we designed and conducted a mode choice experiment using methods that have long been used to study value of time. The contributions of this paper are twofold. First, we employ revealed preference data, whereas previous studies have been based on stated preferences. Second, we provide new insight into how the value of green is distributed in the population. Whereas previous work has specified heterogeneity either systematically or with a continuous distribution, we find that a latent class choice model specification better fits the data and is also attractive behaviorally. The best fitting latent class model has two classes: one large class (76% of the sample) who are not willing to spend any time or money to reduce their CO2, and a second class (24% of the sample) who value reducing their CO2 at a very high rate of $2.68 per pound of reduction, our so-called small group with a big heart. We reanalyzed three datasets that we had previously collected and found considerable robustness of this two-class result.
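
    The two-class structure reported above can be sketched as a small latent class logit (the mode costs and CO2 figures below are hypothetical, and this is not the authors' estimation code; the paper estimates the class shares rather than assuming them):

```python
import math

# Class shares and value-of-green estimates as reported in the abstract.
classes = [
    {"share": 0.76, "wtp_per_lb": 0.00},  # unwilling to pay for CO2 cuts
    {"share": 0.24, "wtp_per_lb": 2.68},  # the "small group with a big heart"
]

def mode_probability(alternatives, wtp_per_lb):
    """Logit choice probabilities; utility = -(money cost + WTP * CO2 lbs)."""
    utils = [-(a["cost"] + wtp_per_lb * a["co2_lb"]) for a in alternatives]
    exps = [math.exp(u) for u in utils]
    return [e / sum(exps) for e in exps]

# Hypothetical choice set: a low-carbon and a high-carbon mode.
alts = [{"cost": 3.00, "co2_lb": 1.0},
        {"cost": 2.50, "co2_lb": 4.0}]

per_class = [mode_probability(alts, c["wtp_per_lb"]) for c in classes]
# Population-level probability of each mode = share-weighted mixture.
mix = [sum(c["share"] * p[j] for c, p in zip(classes, per_class))
       for j in range(len(alts))]

# Population-average value of green: 0.76 * $0 + 0.24 * $2.68 ≈ $0.64/lb.
avg_wtp = sum(c["share"] * c["wtp_per_lb"] for c in classes)
```

    The mixture illustrates why a two-class specification fits better than a single average: the minority class almost deterministically chooses the low-CO2 mode, while the majority trades off only money and time.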

  11. «If a woman has a big head»: Physiognomy and female nature in an Assyro-Babylonian text

    Directory of Open Access Journals (Sweden)

    Couto Ferreira, Érica

    2008-06-01

    In Mesopotamia, the human body was understood as an object for divination, that is, a system of signs which carried messages about the individual and whose meaning had to be decoded by means of observation and interpretation. Taking the physiognomic series Šumma sinništu qaqqada rabât («If a woman has a big head») as the main source of my article, I analyse, on the one hand, the processes that take part in the promotion of a particular perception of women based on a specific reading of the female body. On the other hand, I deal with the elements that characterize this female perception: basically, the image of the ideal woman centred on motherhood and, in close relation to this, the dangers that threaten women's lives during pregnancy.

  12. Small Artifacts - Big Technologies

    DEFF Research Database (Denmark)

    Kreiner, Kristian

    2005-01-01

    The computer IC is the heart of information and telecommunication technology. It is a tiny artifact, but one with incredible organizing powers. We use this physical artifact as the location for studying central problems of the knowledge economy. First, the paper describes the history of chip design...

  13. Small country, big business?

    DEFF Research Database (Denmark)

    Martens, Kerstin; Starke, Peter

    2008-01-01

    This paper discusses New Zealand's role in the global market for tertiary education. The internationalisation and liberalisation of education markets is progressing rapidly in today's globalising world, as reflected by the incorporation of education as a service into the GATS framework. Through...

  14. Big Data, Small Media

    Directory of Open Access Journals (Sweden)

    Grant Bollmer

    2014-08-01

    Andrew Dubber Radio in the Digital Age Polity, Cambridge, 2013   Charles Ess Digital Media Ethics, Second Edition Polity, Cambridge, 2014   Graeme Kirkpatrick Computer Games and the Social Imaginary Polity, Cambridge, 2013   Dhiraj Murthy Twitter: Social Communication in the Twitter Age Polity, Cambridge, 2013   Jill Walker Rettberg Blogging, Second Edition Polity, Cambridge, 2014

  15. Small risk, big price

    International Nuclear Information System (INIS)

    Maclaine, D.

    1994-01-01

    A conference held in the United Kingdom on the harmful effects of low frequency electromagnetic (EM) fields, such as those emitted by power lines, is reported. It was sponsored by solicitors acting on behalf of families taking legal action on the issue of power lines and health risks, and the delegates ranged from leading cancer specialists to campaigning groups. The National Grid Company expressed the view that, since no cause-and-effect relationship has been established, it would be premature to take astronomically expensive measures to shield substations and house underground lines in steel pipes in order to achieve the very low field levels acceptable to campaign groups. The possibility of a cancer link with exposure to EM fields could not be ruled out, however. On behalf of one of the pressure groups it was argued that, faced with a suspected hazard for which there is statistically significant evidence of association but incomplete evidence of cause, the electricity companies should take some positive action. An epidemiologist considered the evidence sufficiently unclear to allow people to arrive at differing conclusions, and called for a policy response that was something less than panic but something greater than negligence. A solicitor's view was that some form of self-regulation, before conclusive proof either way is found, would ease public concern, and that any such code of practice should specify that new lines be placed at least 50 m from houses. (UK)

  16. Small becomes big

    International Nuclear Information System (INIS)

    Carpenter, W.L.; Brooks, J.R.

    1993-01-01

    The owner of an old-line manufacturing company arranged with a waste hauler to dispose of six 208-L (55-gal) drums of mixed waste, including at least two containing used oils from transformers or gear boxes. A lab analysis was run, but the last item on the list, PCBs, was not checked. The hauler transferred the contents to 3 different tanks before taking them to a cement plant kiln that was permitted to burn EPA-approved wastes as supplementary fuel. The kiln's lab analysis identified PCB contamination and they refused to take the load. The hauler returned the load to his storage facility, unloaded it into storage tanks, and called the EPA, which told him the owner was legally responsible for disposal. The disposal problem had grown much larger than the original two drums. Initially it would have cost about $600/drum to properly analyze and dispose of the wastes; in the end, the final cost was $470,000. As this case study illustrates, getting environmental advice is often viewed by business owners as an expensive answer to an environmental problem, but in the final analysis it is the most economical thing to do.

  17. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and on patient and clinical care specifically. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than it solves, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  18. An innovative design of small low head hydropower units for low cost decentralized production

    International Nuclear Information System (INIS)

    Holmen, E.; Dennehy, T.

    1991-01-01

    A design allowing turbine operation at heads as low as 1 m, with a rotational speed of 500 RPM at a flow of 2.6 m³/s and a runner diameter of 700 mm. This eliminates the need for a gear box and helps achieve an efficiency of 60% for a 21 kW installation at a 1 m head site and 85% for a 69 kW installation at a 3.2 m head site. Present turbine designs for such low head sites are very expensive to produce and have low efficiency. The design uses an all-plastic waterway, guide vane assembly, and reinforced plastic runner blades. Pay-back periods are short: for example, 4.5 years for the 21 kW unit and 2.0 years for the 69 kW unit. These payback periods assume a cost per kW of 0.00 ECU. The design is attractive for decentralized production. 3 figs
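
    The quoted output is consistent with the standard hydropower relation P = η·ρ·g·Q·H; a quick check (illustrative, not from the paper):

```python
# Standard hydropower relation P = eta * rho * g * Q * H.
RHO_WATER = 1000.0  # water density, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def hydro_power_kw(head_m, flow_m3s, efficiency):
    """Shaft power in kW for a given head, flow, and overall efficiency."""
    return RHO_WATER * G * flow_m3s * head_m * efficiency / 1000.0

# The 3.2 m site: 85% efficiency at 2.6 m^3/s gives roughly 69 kW, as quoted.
p_32m = hydro_power_kw(3.2, 2.6, 0.85)
```

    The 21 kW figure for the 1 m site does not follow from the same 2.6 m³/s flow, so that case presumably corresponds to a different design flow.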

  19. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    …modern astronomy requires big data know-how; in particular, it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with…, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications.

  20. Small cell carcinoma of the head and neck: A comparative study by primary site based on population data.

    Science.gov (United States)

    Kuan, Edward C; Alonso, Jose E; Tajudeen, Bobby A; Arshi, Armin; Mallen-St Clair, Jon; St John, Maie A

    2017-08-01

    Small cell carcinoma (SmCC) of the head and neck is an extremely rare neuroendocrine malignancy. In this study, we describe the incidence and determinants of survival of patients with SmCC of the head and neck between 1973 and 2012, stratified by primary site, using the Surveillance, Epidemiology, and End Results database. Retrospective, population-based cohort study. A total of 237 cases of SmCC of the head and neck were identified and divided into sinonasal primaries (n = 82) and all other head and neck primaries (n = 155). Clinicopathologic and epidemiologic variables were analyzed as predictors of overall survival (OS) and disease-specific survival (DSS) based on the Kaplan-Meier method. More than half of sinonasal primaries presented with Kadish stage C or D. On multivariate analysis, surgery was the only independent predictor of improved DSS (P = .008) for sinonasal primaries; in contrast, radiation therapy was a favorable prognosticator for OS (P = .007) and DSS (P = .043) in extrasinonasal sites. Comparison of survival between sinonasal primaries and all other sites demonstrated that sinonasal SmCC had uniformly better OS (P = .002) and DSS (P = .006). SmCC in the head and neck remains rare, and sinonasal primaries appear to have improved survival compared to other sites. Based on these results, optimal treatment for sinonasal SmCC appears to be surgical therapy, whereas radiation therapy is the preferred treatment for SmCC of other primary sites, particularly the larynx. Level of Evidence: 4. Laryngoscope, 127:1785-1790, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
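
    Survival comparisons of this kind rest on the Kaplan-Meier estimator, which steps the survival curve down at each observed death while censored subjects simply leave the risk set. A minimal sketch on toy data (not the SEER cohort analysed above):

```python
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns a list of (time, survival probability) at each death time."""
    data = sorted(zip(times, events))
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            s *= 1.0 - deaths / at_risk  # product-limit step
            curve.append((t, s))
        while i < len(data) and data[i][0] == t:
            i += 1  # skip past all records tied at this time
    return curve

# Toy example: deaths at 5, 10, 20 months; one subject censored at 10.
surv = kaplan_meier([5, 10, 10, 20], [1, 1, 0, 1])
```

    Note how the subject censored at 10 months reduces the denominator at later times without triggering a step of its own.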

  1. The Importance of Hunting and Hunting Areas for Big and Small Game (Food) for the Tourism Development in the Crna River Basin in the Republic of Macedonia

    OpenAIRE

    Koteski, Cane; Josheski, Dushko; Jakovlev, Zlatko; Bardarova, Snezana; Serafimova, Mimoza

    2014-01-01

    The Crna River is a river in the Republic of Macedonia, a right tributary of the Vardar. Its source is in the mountains of Western Macedonia, west of Krusevo. It flows through the village of Sopotnica, and southwards through the plains east of Bitola. The name means “black river” in Macedonian, a translation of its former Thracian name. The purpose of this paper is to show the hunting and hunting areas for big and small game (food), the structure of the areas of certain hunting, fi...

  2. Radio frequency-induced temperature elevations in the human head considering small anatomical structures

    International Nuclear Information System (INIS)

    Schmid, G.; Ueberbacher, R.; Samaras, T.

    2007-01-01

    In order to enable a detailed numerical radio frequency (RF) dosimetry and the computations of RF-induced temperature elevations, high-resolution (0.1 mm) numerical models of the human eye, the inner ear organs and the pineal gland were developed and inserted into a commercially available head model. As radiation sources, generic models of handsets at 400, 900 and 1850 MHz operating in close proximity to the head were considered. The results, obtained by finite-difference time domain-based computations, showed a highly heterogeneous specific absorption rate (SAR) distribution and SAR-peaks inside the inner ear structures; however, the corresponding RF-induced temperature elevations were well below 0.1 deg. C, when considering typical output power values of hand-held devices. In case of frontal exposure, with the radiation sources ∼2.5 cm in front of the closed eye, maximum temperature elevations in the eye in the range of ∼0.2-0.6 deg. C were found for typical device output powers. A reduction in tissue perfusion mainly affected the maximum RF-induced temperature elevation of tissues deep inside the head. Similarly, worst-case considerations regarding pulsed irradiation affected temperature elevations in deep tissue significantly more than in superficial tissues. (authors)
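
    The quantities involved follow standard dosimetry relations (a sketch of the textbook formulas, not the paper's FDTD and bioheat computations): local SAR is σE²/ρ, and an adiabatic estimate ΔT = SAR·t/c bounds the temperature rise from above, since it ignores the perfusion and conduction that keep real elevations far smaller.

```python
def sar_w_per_kg(sigma_s_per_m, e_rms_v_per_m, rho_kg_per_m3):
    """Local specific absorption rate: SAR = sigma * E^2 / rho (W/kg)."""
    return sigma_s_per_m * e_rms_v_per_m ** 2 / rho_kg_per_m3

def adiabatic_delta_t(sar, seconds, specific_heat=3500.0):
    """Adiabatic upper bound on temperature rise, dT = SAR * t / c,
    with c the tissue specific heat in J/(kg*K); perfusion ignored."""
    return sar * seconds / specific_heat

# e.g. 2 W/kg sustained for 350 s gives at most 0.2 deg C adiabatically.
dt = adiabatic_delta_t(2.0, 350.0)
```

    This also explains the abstract's perfusion result: removing blood flow pushes deep-tissue temperatures toward the adiabatic limit.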

  3. Interleaved 3D-CNNs for joint segmentation of small-volume structures in head and neck CT images.

    Science.gov (United States)

    Ren, Xuhua; Xiang, Lei; Nie, Dong; Shao, Yeqin; Zhang, Huan; Shen, Dinggang; Wang, Qian

    2018-02-26

    Accurate 3D image segmentation is a crucial step in radiation therapy planning of head and neck tumors. These segmentation results are currently obtained by manual outlining of tissues, which is a tedious and time-consuming procedure. Automatic segmentation provides an alternative solution, which, however, is often difficult for small tissues (i.e., chiasm and optic nerves in head and neck CT images) because of their small volumes and highly diverse appearance/shape information. In this work, we propose to interleave multiple 3D Convolutional Neural Networks (3D-CNNs) to attain automatic segmentation of small tissues in head and neck CT images. A 3D-CNN was designed to segment each structure of interest. To make full use of the image appearance information, multiscale patches are extracted to describe the center voxel under consideration and then input to the CNN architecture. Next, as neighboring tissues are often highly related in the physiological and anatomical perspectives, we interleave the CNNs designated for the individual tissues. In this way, the tentative segmentation result of a specific tissue can contribute to refine the segmentations of other neighboring tissues. Finally, as more CNNs are interleaved and cascaded, a complex network of CNNs can be derived, such that all tissues can be jointly segmented and iteratively refined. Our method was validated on a set of 48 CT images, obtained from the Medical Image Computing and Computer Assisted Intervention (MICCAI) Challenge 2015. The Dice coefficient (DC) and the 95% Hausdorff Distance (95HD) are computed to measure the accuracy of the segmentation results. The proposed method achieves higher segmentation accuracy (with the average DC: 0.58 ± 0.17 for optic chiasm, and 0.71 ± 0.08 for optic nerve; 95HD: 2.81 ± 1.56 mm for optic chiasm, and 2.23 ± 0.90 mm for optic nerve) than the MICCAI challenge winner (with the average DC: 0.38 for optic chiasm, and 0.68 for optic nerve; 95HD: 3.48 for
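
    The two reported metrics can be computed as follows (a minimal sketch on toy inputs, not the authors' evaluation code; real pipelines operate on 3D voxel masks and surface points):

```python
import math

def dice(a, b):
    """Dice coefficient between two flat binary masks (0/1 sequences)."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    size = sum(a) + sum(b)
    return 2.0 * inter / size if size else 1.0

def hd95(pts_a, pts_b):
    """95% Hausdorff distance between two point sets (e.g. surface voxels):
    the 95th percentile of all nearest-neighbour distances, both directions."""
    def nearest(src, dst):
        return [min(math.dist(p, q) for q in dst) for p in src]
    d = sorted(nearest(pts_a, pts_b) + nearest(pts_b, pts_a))
    return d[int(round(0.95 * (len(d) - 1)))]
```

    Using the 95th percentile instead of the maximum makes the boundary metric robust to a few outlier voxels, which matters for structures as small as the optic chiasm.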

  4. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  5. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  6. Crowd-funded micro-grants for genomics and "big data": an actionable idea connecting small (artisan) science, infrastructure science, and citizen philanthropy.

    Science.gov (United States)

    Özdemir, Vural; Badr, Kamal F; Dove, Edward S; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N; Sabra, Ramzi; Sarkissian, Christineh N; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K; Kickbusch, Ilona

    2013-04-01

    Biomedical science in the 21st century is embedded in, and draws from, a digital commons and "Big Data" created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., "the lone genius" or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists, if only some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the "bottom one billion", the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein.
Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while sharing similar disease

  7. [Effectiveness of multiple small-diameter drilling decompression combined with hip arthroscopy for early osteonecrosis of the femoral head].

    Science.gov (United States)

    Li, Ji; Li, Zhongli; Su, Xiangzheng; Liu, Chunhui; Zhang, Hao; Wang, Ketao

    2017-09-01

To evaluate the effectiveness of multiple small-diameter drilling decompression combined with hip arthroscopy for early osteonecrosis of the femoral head (ONFH). Between March 2010 and December 2013, 91 patients with early ONFH were treated with multiple small-diameter drilling decompression combined with hip arthroscopy in 39 cases (53 hips, group A) or with drilling decompression alone in 52 cases (74 hips, group B). Patients in both groups had obvious hip pain and limited motion before operation. There was no significant difference in gender, age, etiology, affected side, stage of osteonecrosis, or preoperative Harris score between the 2 groups (P>0.05). All operations succeeded and all incisions healed by first intention. The operation time was significantly longer in group A [(73.3±10.6) minutes] than in group B [(41.5±7.2) minutes] (t=8.726, P=0.000). Temporary sciatic nerve apraxia occurred after operation in 2 patients of group A; no complication occurred in the other patients. Patients were followed up 24-52 months (mean, 39.3 months) in group A and 24-48 months (mean, 34.6 months) in group B. At last follow-up, the Harris scores were 83.34±8.76 in group A and 76.61±9.22 in group B, showing significant differences both between the 2 groups (t=-4.247, P=0.029) and against the preoperative values in each group (t=-10.327, P=0.001; t=-8.216, P=0.008). X-ray films showed collapse of the femoral head in 6 hips (1 hip at stage Ⅰ and 5 hips at stage Ⅱ) in group A, and in 16 hips (4 hips at stage Ⅰ and 12 hips at stage Ⅱ) in group B; hip arthroplasty was then performed. The total effective rates were 88.68% (47/53) in group A and 78.38% (58/74) in group B, showing a significant difference between the 2 groups (χ²=5.241, P=0.041). Multiple small-diameter drilling decompression combined with hip arthroscopy is effective in pain relief, improvement of hip function, slowing down the

  8. Radiological image interpretation for hematoma and small tumors simulated in a head and neck phantom

    International Nuclear Information System (INIS)

    Thompson, Larissa; Campos, Tarcisio Passos Ribeiro

    2009-01-01

Subarachnoid hemorrhages (SAH) are caused by aneurysms, and their symptoms usually become evident only after a rupture. Nevertheless, there are situations in which an aneurysm compresses a nerve or produces a small bleed before the rupture happens, so that a warning sign such as headache occurs, often minutes to weeks before the major rupture. The main goal is to prevent a massive hemorrhage. A Computed Tomography (CT) scan of the skull thus provides a basic and specific function: to reveal the position where the hemorrhage was produced, guiding additional medical procedures. On the other hand, CT does not prevent the development of a cerebral tumor, but a precise diagnosis for symptoms such as vomiting, nausea, epileptic seizures, or weakness in the arms or legs requires this imaging protocol; CT is of fundamental importance for tumor detection and early diagnosis. Specialized training in CT analysis is therefore needed: for a precise diagnosis enabling early intervention, CT diagnostic training is essential to a favorable prognosis. In this context, to support radiological inquiries, a head and neck phantom was used to simulate hematomas and cerebral tumors. Skull CT images were used to identify these lesions physically implanted in the phantom, and the radiological response was analyzed in a double-blind test in order to validate the skull CT diagnosis. With non-contrast CT, only subjects (tumors) larger than 5 mm in diameter were identified in the double-blind test; the hemorrhage was identified only by the administrator (single-blind test). In conclusion, the authors hypothesize that this phantom can provide assistance for specialized training in the interpretation of pathology on radiological images. (author)

  9. Big Atoms for Small Children: Building Atomic Models from Common Materials to Better Visualize and Conceptualize Atomic Structure

    Science.gov (United States)

    Cipolla, Laura; Ferrari, Lia A.

    2016-01-01

    A hands-on approach to introduce the chemical elements and the atomic structure to elementary/middle school students is described. The proposed classroom activity presents Bohr models of atoms using common and inexpensive materials, such as nested plastic balls, colored modeling clay, and small-sized pasta (or small plastic beads).

  10. A simple non-invasive method for measuring gross brain size in small live fish with semi-transparent heads

    Directory of Open Access Journals (Sweden)

    Joacim Näslund

    2014-09-01

    Full Text Available This paper describes a non-invasive method for estimating gross brain size in small fish with semi-transparent heads, using system camera equipment. Macro-photographs were taken from above on backlit free-swimming fish undergoing light anaesthesia. From the photographs, the width of the optic tectum was measured. This measure (TeO-measure correlates well with the width of the optic tectum as measured from out-dissected brains in both brown trout fry and zebrafish (Pearson r > 0.90. The TeO-measure also correlates well with overall brain wet weight in brown trout fry (r = 0.90, but less well for zebrafish (r = 0.79. A non-invasive measure makes it possible to quickly assess brain size from a large number of individuals, as well as repeatedly measuring brain size of live individuals allowing calculation of brain growth.
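The validation step described above amounts to correlating two paired measurements. A minimal sketch of that computation follows; the tectum-width values are illustrative inventions, not the study's data, chosen only to mimic the strong agreement (Pearson r > 0.90) the abstract reports.

```python
import math

# Hypothetical paired measurements (mm): tectum width from the photograph
# vs. width measured on the dissected brain. Illustrative values only.
photo = [1.10, 1.25, 1.32, 1.48, 1.55, 1.62, 1.78, 1.90]
dissected = [1.08, 1.27, 1.30, 1.50, 1.53, 1.65, 1.75, 1.93]

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(photo, dissected)
print(round(r, 3))  # strong positive correlation for these illustrative data
```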

  11. New camera-based microswitch technology to monitor small head and mouth responses of children with multiple disabilities.

    Science.gov (United States)

    Lancioni, Giulio E; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N; O'Reilly, Mark F; Green, Vanessa A; Furniss, Fred

    2014-06-01

    Assessing a new camera-based microswitch technology, which did not require the use of color marks on the participants' face. Two children with extensive multiple disabilities participated. The responses selected for them consisted of small, lateral head movements and mouth closing or opening. The intervention was carried out according to a multiple probe design across responses. The technology involved a computer with a CPU using a 2-GHz clock, a USB video camera with a 16-mm lens, a USB cable connecting the camera and the computer, and a special software program written in ISO C++ language. The new technology was satisfactorily used with both children. Large increases in their responding were observed during the intervention periods (i.e. when the responses were followed by preferred stimulation). The new technology may be an important resource for persons with multiple disabilities and minimal motor behavior.

  12. Crowd-Funded Micro-Grants for Genomics and “Big Data”: An Actionable Idea Connecting Small (Artisan) Science, Infrastructure Science, and Citizen Philanthropy

    Science.gov (United States)

    Badr, Kamal F.; Dove, Edward S.; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J.; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N.; Sabra, Ramzi; Sarkissian, Christineh N.; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K.; Kickbusch, Ilona

    2013-01-01

Biomedical science in the 21st century is embedded in, and draws from, a digital commons and “Big Data” created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., “the lone genius” or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists—only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the “bottom one billion”—the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while

  13. Small Businesses Save Big: A Borrower's Guide To Increase the Bottom Line Using Energy Efficiency (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2015-01-01

    Dollars saved through energy efficiency can directly impact your bottom line. Whether you are planning for a major renovation or upgrading individual pieces of building equipment, these improvements can help reduce operating costs, save on utility bills, and boost profits. This fact sheet provides a guide for small businesses to find the resources to increase the energy efficiency of their buildings.

  14. On the Equivalence between Small-Step and Big-Step Abstract Machines: A Simple Application of Lightweight Fusion

    DEFF Research Database (Denmark)

    Danvy, Olivier; Millikin, Kevin

    2007-01-01

    -step specification. We illustrate this observation here with a recognizer for Dyck words, the CEK machine, and Krivine's machine with call/cc. The need for such a simple proof is motivated by our current work on small-step abstract machines as obtained by refocusing a function implementing a reduction semantics (a...
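The small-step/big-step distinction at issue can be made concrete with a toy example (a sketch of the two evaluation styles only, not the paper's lightweight-fusion construction): a big-step evaluator computes a term's value in one recursive pass, a small-step reducer rewrites the term one step at a time, and the two agree on every term.

```python
# Expressions are nested tuples: int | ('add', e1, e2).

def eval_big(e):
    """Big-step: evaluate an expression to its value in one recursive pass."""
    if isinstance(e, int):
        return e
    _, e1, e2 = e
    return eval_big(e1) + eval_big(e2)

def step(e):
    """Small-step: perform one leftmost reduction; return None at a value."""
    if isinstance(e, int):
        return None
    _, e1, e2 = e
    if isinstance(e1, int) and isinstance(e2, int):
        return e1 + e2
    if isinstance(e1, int):
        return ('add', e1, step(e2))
    return ('add', step(e1), e2)

def eval_small(e):
    """Drive the small-step reducer until it reaches a value."""
    while not isinstance(e, int):
        e = step(e)
    return e

expr = ('add', ('add', 1, 2), ('add', 3, 4))
print(eval_big(expr), eval_small(expr))  # both styles evaluate to 10
```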

  15. Children's Behaviors and Emotions in Small-Group Argumentative Discussion: Explore the Influence of Big Five Personality Factors

    Science.gov (United States)

    Dong, Ting

    2009-01-01

    The assessment and structure of personality traits and small group learning during classroom discussions are both research fields that have undergone fast development in the past few decades. However, very few studies have investigated the relationship between individual personality characteristics and performance in discussions, especially with…

  16. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  17. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

Full Text Available Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation before Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  18. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  19. A randomised comparison of radical radiotherapy with or without chemotherapy for patients with non-small cell lung cancer: Results from the Big Lung Trial

    International Nuclear Information System (INIS)

    Fairlamb, David; Milroy, Robert; Gower, Nicole; Parmar, Mahesh; Peake, Michael; Rudd, Robin; Souhami, Robert; Spiro, Stephen; Stephens, Richard; Waller, David

    2005-01-01

    Background: A meta-analysis of trials comparing primary treatment with or without chemotherapy for patients with non-small cell lung cancer published in 1995 suggested a survival benefit for cisplatin-based chemotherapy in each of the primary treatment settings studied, but it included many small trials, and trials with differing eligibility criteria and chemotherapy regimens. Methods: The Big Lung Trial was a large pragmatic trial designed to confirm the survival benefits seen in the meta-analysis, and this paper reports the findings in the radical radiotherapy setting. The trial closed before the required sample size was achieved due to slow accrual, with a total of 288 patients randomised to receive radical radiotherapy alone (146 patients) or sequential radical radiotherapy and cisplatin-based chemotherapy (142 patients). Results: There was no evidence that patients allocated sequential chemotherapy and radical radiotherapy had a better survival than those allocated radical radiotherapy alone, HR 1.07 (95% CI 0.84-1.38, P=0.57), median survival 13.0 months for the sequential group and 13.2 for the radical radiotherapy alone group. In addition, exploratory analyses could not identify any subgroup that might benefit more or less from chemotherapy. Conclusions: Despite not suggesting a survival benefit for the sequential addition of chemotherapy to radical radiotherapy, possibly because of the relatively small sample size and consequently wide confidence intervals, the results can still be regarded as consistent with the meta-analysis, and other similarly designed recently published large trials. Combining all these results suggests there may be a small median survival benefit with chemotherapy of between 2 and 8 weeks
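The reported hazard ratio, confidence interval, and p-value can be cross-checked with standard Wald-statistic arithmetic. The sketch below is an illustration of that consistency check, not part of the trial's own analysis; it recovers a two-sided p close to the published P=0.57, with the small gap attributable to rounding of the published CI.

```python
import math

# On the log scale, a 95% CI half-width equals 1.96 standard errors, so the
# SE (and hence the Wald z and p) can be recovered from the reported figures.
hr, lo, hi = 1.07, 0.84, 1.38          # reported HR and 95% CI
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
z = math.log(hr) / se                  # Wald z-statistic
p = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
print(round(p, 2))  # ~0.59, close to the reported P=0.57
```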

  20. Big data analytics — A review of data-mining models for small and medium enterprises in the transportation sector.

    OpenAIRE

    Selamat, Siti Aishah Mohd; Prakoonwit, Simant; Sahandi, Reza; Khan, W.; Ramachandran, M.

    2018-01-01

    The need for small and medium enterprises (SMEs) to adopt data analytics has reached a critical point, given the surge of data implied by the advancement of technology. Despite data mining (DM) being widely used in the transportation sector, it is staggering to note that there are minimal research case studies being done on the application of DM by SMEs, specifically in the transportation sector. From the extensive review conducted, the three most common DM models used by large enterprises in...

  1. Collaborative Work without Large, Shared Displays: Looking for “the Big Picture” on a Small Screen?

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2017-01-01

Large, shared displays – such as electronic whiteboards – have proven successful in supporting actors in forming and maintaining an overview of tightly coupled collaborative activities. However, in many developing countries the technology of choice is mobile phones, which have neither a large nor a shared screen. It therefore appears relevant to ask: How may mobile devices with small screens support, or fail to support, actors in forming and maintaining an overview of their collaborative activities?

  2. A Little Here, A Little There, A Fairly Big Problem Everywhere: Small Quantity Site Transuranic Waste Disposition Alternatives

    International Nuclear Information System (INIS)

    Luke, Dale Elden; Parker, Douglas Wayne; Moss, J.; Monk, Thomas Hugh; Fritz, Lori Lee; Daugherty, B.; Hladek, K.; Kosiewicx, S.

    2000-01-01

    Small quantities of transuranic (TRU) waste represent a significant challenge to the waste disposition and facility closure plans of several sites in the Department of Energy (DOE) complex. This paper presents the results of a series of evaluations, using a systems engineering approach, to identify the preferred alternative for dispositioning TRU waste from small quantity sites (SQSs). The TRU waste disposition alternatives evaluation used semi-quantitative data provided by the SQSs, potential receiving sites, and the Waste Isolation Pilot Plant (WIPP) to select and recommend candidate sites for waste receipt, interim storage, processing, and preparation for final disposition of contact-handled (CH) and remote-handled (RH) TRU waste. The evaluations of only four of these SQSs resulted in potential savings to the taxpayer of $33 million to $81 million, depending on whether mobile systems could be used to characterize, package, and certify the waste or whether each site would be required to perform this work. Small quantity shipping sites included in the evaluation included the Battelle Columbus Laboratory (BCL), University of Missouri Research Reactor (MURR), Energy Technology Engineering Center (ETEC), and Mound. Candidate receiving sites included the Idaho National Engineering and Environmental Laboratory (INEEL), the Savannah River Site (SRS), Los Alamos National Laboratory (LANL), Oak Ridge (OR), and Hanford. At least 14 additional DOE sites having TRU waste may be able to save significant money if cost savings are similar to the four evaluated thus far

  3. Small drains, big problems: the impact of dry weather runoff on shoreline water quality at enclosed beaches.

    Science.gov (United States)

    Rippy, Megan A; Stein, Robert; Sanders, Brett F; Davis, Kristen; McLaughlin, Karen; Skinner, John F; Kappeler, John; Grant, Stanley B

    2014-12-16

    Enclosed beaches along urban coastlines are frequent hot spots of fecal indicator bacteria (FIB) pollution. In this paper we present field measurements and modeling studies aimed at evaluating the impact of small storm drains on FIB pollution at enclosed beaches in Newport Bay, the second largest tidal embayment in Southern California. Our results suggest that small drains have a disproportionate impact on enclosed beach water quality for five reasons: (1) dry weather surface flows (primarily from overirrigation of lawns and ornamental plants) harbor FIB at concentrations exceeding recreational water quality criteria; (2) small drains can trap dry weather runoff during high tide, and then release it in a bolus during the falling tide when drainpipe outlets are exposed; (3) nearshore turbulence is low (turbulent diffusivities approximately 10(-3) m(2) s(-1)), limiting dilution of FIB and other runoff-associated pollutants once they enter the bay; (4) once in the bay, runoff can form buoyant plumes that further limit vertical mixing and dilution; and (5) local winds can force buoyant runoff plumes back against the shoreline, where water depth is minimal and human contact likely. Outdoor water conservation and urban retrofits that minimize the volume of dry and wet weather runoff entering the local storm drain system may be the best option for improving beach water quality in Newport Bay and other urban-impacted enclosed beaches.
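Point (3) can be put on a rough quantitative footing with the diffusive time scale t ~ L²/2K implied by the quoted diffusivity; the length scales below are illustrative choices, not values taken from the study.

```python
# With turbulent diffusivity K ~ 1e-3 m^2/s (as quoted in the abstract), the
# time for a pollutant patch to spread over a length L scales as t ~ L^2 / 2K,
# which is why dilution near these shorelines is slow.
K = 1e-3  # m^2/s, nearshore turbulent diffusivity

for L in (0.5, 1.0, 5.0):  # metres (illustrative)
    t = L**2 / (2 * K)     # seconds
    print(f"L = {L:>3} m  ->  t ~ {t / 60:.0f} min")
```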

  4. A little here, a little there, a fairly big problem everywhere: Small-quantity-site transuranic waste disposition alternatives

    International Nuclear Information System (INIS)

    D. Luke; D. Parker; J. Moss; T. Monk; L. Fritz; B. Daugherty; K. Hladek; S. Kosiewicx

    2000-01-01

    Small quantities of transuranic (TRU) waste represent a significant challenge to the waste disposition and facility closure plans of several sites in the Department of Energy (DOE) complex. This paper presents the results of a series of evaluations, using a systems engineering approach, to identify the preferred alternative for dispositioning TRU waste from small quantity sites (SQSs). The TRU waste disposition alternatives evaluation used semi-quantitative data provided by the SQSs, potential receiving sites, and the Waste Isolation Pilot Plant (WIPP) to select and recommend candidate sites for waste receipt, interim storage, processing, and preparation for final disposition of contact-handled (CH) and remote-handled (RH) TRU waste. The evaluations of only four of these SQSs resulted in potential savings to the taxpayer of $33 million to $81 million, depending on whether mobile systems could be used to characterize, package, and certify the waste or whether each site would be required to perform this work. Small quantity shipping sites included in the evaluation included the Battelle Columbus Laboratory (BCL), University of Missouri Research Reactor (MURR), Energy Technology Engineering Center (ETEC), and Mound Laboratory. Candidate receiving sites included the Idaho National Engineering and Environmental Laboratory (INEEL), the Savannah River Site (SRS), Los Alamos National Laboratory (LANL), Oak Ridge (OR), and Hanford. At least 14 additional DOE sites having TRU waste may be able to save significant money if cost savings are similar to the four evaluated thus far

  5. The method of UCN "small heating" measurement in the big gravitational spectrometer (BGS) and studies of this effect on Fomblin oil Y-HVAC 18/8

    Science.gov (United States)

    Nesvizhevsky, V. V.; Voronin, A. Yu.; Lambrecht, A.; Reynaud, S.; Lychagin, E. V.; Muzychka, A. Yu.; Nekhaev, G. V.; Strelkov, A. V.

    2018-02-01

The Big Gravitational Spectrometer (BGS) takes advantage of the strong influence of the Earth's gravity on the motion of ultracold neutrons (UCNs) that makes it possible to shape and measure UCN spectra. We optimized the BGS to investigate the "small heating" of UCNs, that is, the inelastic reflection of UCNs from a surface accompanied by an energy change comparable with the initial UCN energy. UCNs whose energy increases are referred to as "Vaporized UCNs" (VUCNs). The BGS provides the narrowest UCN spectra of a few cm and the broadest "visible" VUCN energy range of up to ~150 cm (UCN energy is given in units of its maximum height in the Earth's gravitational field, where 1.00 cm ≈ 1.02 neV). The dead-zone between the UCN and VUCN spectra is the narrowest ever achieved (a few cm). We performed measurements with and without samples without breaking vacuum. BGS provides the broadest range of temperatures (77-600 K) and the highest sensitivity to the small heating effect, up to ~10^-8 per bounce, i.e., two orders of magnitude higher than the sensitivity of alternative methods. We describe the method to measure the probability of UCN "small heating" using the BGS and illustrate it with a study of samples of the hydrogen-free oil Fomblin Y-HVAC 18/8. The data obtained are well reproducible, do not depend on sample thickness, and do not evolve over time. The measured model-independent probability P+ of UCN small heating from an energy "mono-line" 30.2 ± 2.5 cm to the energy range 35-140 cm is in the range (1.05 ± 0.02 (stat)) × 10^-5 to (1.31 ± 0.24 (stat)) × 10^-5 at a temperature of 24 °C. The associated systematic uncertainty would disappear if a VUCN spectrum shape were known, for instance, from a particular model of small heating. This experiment provides the most precise and reliable value of small heating probability on Fomblin measured so far. These results are of importance for studies of UCN small heating as well as for analyzing and designing neutron

  6. The method of UCN "small heating" measurement in the big gravitational spectrometer (BGS) and studies of this effect on Fomblin oil Y-HVAC 18/8.

    Science.gov (United States)

    Nesvizhevsky, V V; Voronin, A Yu; Lambrecht, A; Reynaud, S; Lychagin, E V; Muzychka, A Yu; Nekhaev, G V; Strelkov, A V

    2018-02-01

The Big Gravitational Spectrometer (BGS) takes advantage of the strong influence of the Earth's gravity on the motion of ultracold neutrons (UCNs) that makes it possible to shape and measure UCN spectra. We optimized the BGS to investigate the "small heating" of UCNs, that is, the inelastic reflection of UCNs from a surface accompanied by an energy change comparable with the initial UCN energy. UCNs whose energy increases are referred to as "Vaporized UCNs" (VUCNs). The BGS provides the narrowest UCN spectra of a few cm and the broadest "visible" VUCN energy range of up to ∼150 cm (UCN energy is given in units of its maximum height in the Earth's gravitational field, where 1.00 cm ≈ 1.02 neV). The dead-zone between the UCN and VUCN spectra is the narrowest ever achieved (a few cm). We performed measurements with and without samples without breaking vacuum. BGS provides the broadest range of temperatures (77-600 K) and the highest sensitivity to the small heating effect, up to ∼10^-8 per bounce, i.e., two orders of magnitude higher than the sensitivity of alternative methods. We describe the method to measure the probability of UCN "small heating" using the BGS and illustrate it with a study of samples of the hydrogen-free oil Fomblin Y-HVAC 18/8. The data obtained are well reproducible, do not depend on sample thickness, and do not evolve over time. The measured model-independent probability P+ of UCN small heating from an energy "mono-line" 30.2 ± 2.5 cm to the energy range 35-140 cm is in the range (1.05 ± 0.02 (stat)) × 10^-5 to (1.31 ± 0.24 (stat)) × 10^-5 at a temperature of 24 °C. The associated systematic uncertainty would disappear if a VUCN spectrum shape were known, for instance, from a particular model of small heating. This experiment provides the most precise and reliable value of small heating probability on Fomblin measured so far. These results are of importance for studies of UCN small heating as well as for analyzing and designing neutron lifetime
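The unit convention used in both abstracts above (UCN energy quoted as a maximum rise height, with 1.00 cm ≈ 1.02 neV) follows directly from E = m_n·g·h. A quick check with standard constants:

```python
# Converting a neutron rise height in Earth's gravity to energy: E = m_n*g*h.
M_N = 1.674927e-27   # neutron mass, kg (CODATA)
G = 9.80665          # standard gravity, m/s^2
EV = 1.602177e-19    # joules per electron-volt

def cm_to_nev(h_cm):
    """Energy (neV) a neutron gains or loses over h_cm centimetres of height."""
    return M_N * G * (h_cm / 100.0) / EV * 1e9

print(round(cm_to_nev(1.00), 3))   # ~1.025 neV, matching "1.00 cm ≈ 1.02 neV"
print(round(cm_to_nev(30.2), 1))   # the 30.2 cm "mono-line" is ~31 neV
```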

  7. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  8. Small cause - big effect: improvement in interface design results in improved data quality - a multicenter crossover study.

    Science.gov (United States)

    Ahlbrandt, Janko; Henrich, Michael; Hartmann, Bernd A; Bundschuh, Bettina B; Schwarz, Julia; Klasen, Joachim; Röhrig, Rainer

    2012-01-01

In Germany the core data set for anesthesia version 3.0 was recently introduced for external quality assurance, which includes five surgical tracer procedures. We found a low rate of correctly documented tracers when compared to procedure data (OPS codes) documented separately. Examination revealed that the graphical user interface (GUI) contravened the dialogue principles as defined in EN ISO 9241-110. We worked with the manufacturer to implement small improvements and roll out the software. A crossover study was conducted at a university hospital and a municipal hospital chain with five hospitals. Across all study sites and surgical tracer procedures combined, we found an improvement from 42% to 65%. The results demonstrate what a big effect small changes to the GUI can have on data quality. They also raise the question of whether highly flexible and parameterized clinical documentation systems are suited to achieve high usability. Finding the right balance between GUIs designed by usability experts and the flexibility of parameterization by administrators will be a difficult task for the future and subject to further research.

  9. Making big communities small: using network science to understand the ecological and behavioral requirements for community social capital.

    Science.gov (United States)

    Neal, Zachary

    2015-06-01

    The concept of social capital is becoming increasingly common in community psychology and elsewhere. However, the multiple conceptual and operational definitions of social capital challenge its utility as a theoretical tool. The goals of this paper are to clarify two forms of social capital (bridging and bonding), explicitly link them to the structural characteristics of small world networks, and explore the behavioral and ecological prerequisites of its formation. First, I use the tools of network science and specifically the concept of small-world networks to clarify what patterns of social relationships are likely to facilitate social capital formation. Second, I use an agent-based model to explore how different ecological characteristics (diversity and segregation) and behavioral tendencies (homophily and proximity) impact communities' potential for developing social capital. The results suggest diverse communities have the greatest potential to develop community social capital, and that segregation moderates the effects that the behavioral tendencies of homophily and proximity have on community social capital. The discussion highlights how these findings provide community-based researchers with both a deeper understanding of the contextual constraints with which they must contend, and a useful tool for targeting their efforts in communities with the greatest need or greatest potential.
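The small-world structure this paper leans on can be made concrete with a minimal Watts-Strogatz-style sketch (an illustration of the network concepts only, not the paper's agent-based model): start from a clustered ring lattice and rewire a small fraction of edges, then compare clustering (high clustering supports bonding capital) and mean path length (shortcuts support bridging capital).

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Each node connects to its k nearest neighbours on each side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

def rewire(adj, p, rng):
    """Rewire each edge with probability p toward a random new endpoint."""
    n = len(adj)
    edges = [(u, v) for u in adj for v in adj[u] if u < v]
    for u, v in edges:
        if rng.random() < p:
            w = rng.randrange(n)
            if w != u and w not in adj[u]:
                adj[u].discard(v); adj[v].discard(u)
                adj[u].add(w); adj[w].add(u)
    return adj

def clustering(adj):
    """Mean local clustering coefficient."""
    total = 0.0
    for u, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for v in nbrs for w in nbrs if v < w and w in adj[v])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def mean_path_length(adj):
    """Average shortest-path length, via BFS from every node."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

rng = random.Random(1)
regular = ring_lattice(60, 3)
small_world = rewire(ring_lattice(60, 3), 0.1, rng)
print(clustering(regular), mean_path_length(regular))
print(clustering(small_world), mean_path_length(small_world))
```

A few rewired shortcuts shorten average paths while most local clustering survives, which is the structural signature the paper maps onto community social capital.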

  10. Chlorine Incorporation in the CH3NH3PbI3 Perovskite: Small Concentration, Big Effect.

    Science.gov (United States)

    Quarti, Claudio; Mosconi, Edoardo; Umari, Paolo; De Angelis, Filippo

    2017-01-03

The role of chlorine doping in CH3NH3PbI3 represents an important open issue in the use of hybrid perovskites for photovoltaic applications. In particular, even if a positive role of chlorine doping on perovskite film formation and on material morphology has been demonstrated, an inherent positive effect on the electronic and photovoltaic properties cannot be excluded. Here we carried out periodic density functional theory and Car-Parrinello molecular dynamics simulations, going down to ∼1% doping, to investigate the effect of chlorine on CH3NH3PbI3. We found that such a small doping has important effects on the dynamics of the crystalline structure, both with respect to the inorganic framework and with respect to the cation libration motion. Together, we observe a dynamic spatial localization of the valence and conduction states in separated spatial material regions, which takes place on the 10^-1 ps time scale and which could be the key to ease of exciton dissociation and, likely, to small charge recombination in hybrid perovskites. Moreover, such localization is enhanced by chlorine doping, demonstrating an inherent positive role of chlorine doping on the electronic properties of this class of materials.

  11. Performance evaluation of a rotatory dual-head PET system with 90° increments for small animal imaging

    Science.gov (United States)

    Meng, F.; Zhu, S.; Li, L.; Wang, J.; Cao, X.; Cao, X.; Chen, X.; Liang, J.

    2017-09-01

    A rotatory dual-head positron emission tomography (PET) system with 90° increments has been built by our lab. In this study, a geometric calibration phantom was designed and used to calibrate the geometric offset of the system. With this calibration, artifacts in the reconstructed images were largely eliminated. We then measured imaging performance, including resolution, sensitivity and image quality. The results showed that the full widths at half maximum (FWHM) of a point source were about 1.1 mm in all three directions. The peak absolute sensitivity in the center of the field of view varied from 5.66% to 3.17% when the time window was fixed at 10 ns and the energy window was changed from 200-800 keV to 350-650 keV. The recovery coefficients ranged from 0.13 with a standard deviation of 17.5% to 0.98 with a standard deviation of 15.76%. For the air-filled and water-filled chambers, the spill-over ratios were 14.48% and 15.38%, respectively. An in vivo mouse experiment was carried out and further demonstrated the potential of our system for small animal studies.
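
    The resolution figure quoted above is the full width at half maximum of a point-source profile. As a minimal sketch (not the system's actual analysis code), FWHM can be estimated from a sampled 1-D profile by linearly interpolating the two half-maximum crossings; the Gaussian test profile below is illustrative:

```python
import math

def fwhm(xs, ys):
    """Estimate the full width at half maximum of a sampled 1-D profile by
    linear interpolation at the rising and falling half-maximum crossings."""
    half = max(ys) / 2.0
    # left crossing: first rising pass through the half-maximum level
    i = next(k for k in range(len(ys) - 1) if ys[k] < half <= ys[k + 1])
    left = xs[i] + (half - ys[i]) / (ys[i + 1] - ys[i]) * (xs[i + 1] - xs[i])
    # right crossing: last falling pass through the half-maximum level
    j = next(k for k in range(len(ys) - 2, -1, -1) if ys[k] >= half > ys[k + 1])
    right = xs[j] + (half - ys[j]) / (ys[j + 1] - ys[j]) * (xs[j + 1] - xs[j])
    return right - left

# demo: sampled Gaussian point-spread profile, sigma = 0.5 mm
sigma = 0.5
xs = [-3 + 0.05 * k for k in range(121)]
ys = [math.exp(-x * x / (2 * sigma * sigma)) for x in xs]
width = fwhm(xs, ys)   # analytic value is 2*sqrt(2*ln 2)*sigma ~ 1.18 mm
```

    For a Gaussian response the analytic relation FWHM = 2*sqrt(2*ln 2)*sigma provides a cross-check on the interpolated estimate.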

  12. Performance evaluation of a rotatory dual-head PET system with 90° increments for small animal imaging

    International Nuclear Information System (INIS)

    Meng, F.; Zhu, S.; Li, L.; Wang, J.; Cao, X.; Cao, X.; Chen, X.; Liang, J.

    2017-01-01

    A rotatory dual-head positron emission tomography (PET) system with 90° increments has been built by our lab. In this study, a geometric calibration phantom was designed and used to calibrate the geometric offset of the system. With this calibration, artifacts in the reconstructed images were largely eliminated. We then measured imaging performance, including resolution, sensitivity and image quality. The results showed that the full widths at half maximum (FWHM) of a point source were about 1.1 mm in all three directions. The peak absolute sensitivity in the center of the field of view varied from 5.66% to 3.17% when the time window was fixed at 10 ns and the energy window was changed from 200-800 keV to 350-650 keV. The recovery coefficients ranged from 0.13 with a standard deviation of 17.5% to 0.98 with a standard deviation of 15.76%. For the air-filled and water-filled chambers, the spill-over ratios were 14.48% and 15.38%, respectively. An in vivo mouse experiment was carried out and further demonstrated the potential of our system for small animal studies.

  13. The p53-reactivating small-molecule RITA enhances cisplatin-induced cytotoxicity and apoptosis in head and neck cancer.

    Science.gov (United States)

    Roh, Jong-Lyel; Ko, Jung Ho; Moon, Soo Jin; Ryu, Chang Hwan; Choi, Jun Young; Koch, Wayne M

    2012-12-01

    We evaluated whether the restoration of p53 function by the p53-reactivating small molecule RITA (reactivation of p53 and induction of tumor cell apoptosis) enhances cisplatin-induced cytotoxicity and apoptosis in head and neck cancer (HNC). RITA induced prominent accumulation and reactivation of p53 in a wild-type TP53-bearing HNC cell line. RITA showed maximal growth suppression in tumor cells exhibiting MDM2-dependent p53 degradation. RITA promoted apoptosis in association with upregulation of p21, BAX, and cleaved caspase-3; notably, the apoptotic response was blocked by pifithrin-α, demonstrating its p53 dependence. With increasing concentrations, RITA strongly induced apoptosis rather than G2-phase arrest. In combination therapy, RITA enhanced cisplatin-induced growth inhibition and apoptosis of HNC cells in vitro and in vivo. Our data suggest that the restoration of p53 tumor-suppressive function by RITA enhances the cytotoxicity and apoptosis of cisplatin, an action that may offer an attractive strategy for treating HNC. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. Cognitive ability in adolescents born small for gestational age: Associations with fetal growth velocity, head circumference and postnatal growth.

    Science.gov (United States)

    Jensen, Rikke Beck; Juul, Anders; Larsen, Torben; Mortensen, Erik Lykke; Greisen, Gorm

    2015-12-01

    Small size at birth may be associated with impaired cognitive ability later in life. The aim of this study was to examine the effect of being born small for gestational age (SGA), with or without intrauterine growth restriction (IUGR), on cognitive ability in late adolescence. A follow-up study of a former cohort included 123 participants (52 males): 47 born SGA and 76 born appropriate for gestational age (AGA). Fetal growth velocity (FGV) was determined by serial ultrasound measurements during the third trimester. A control group matched for age and birthplace was included. The original Wechsler Adult Intelligence Scale (WAIS) was administered, and verbal, performance and full-scale Intelligence Quotient (IQ) scores were calculated. There was no difference in IQ between adolescents born SGA and AGA. Neither FGV nor IUGR during the third trimester influenced cognitive ability in late adolescence. Full-scale IQ was positively related to head circumference (HC) in adolescence (B: 1.30, 95% CI: 0.32-2.28, p=0.01). HC at birth and at three months was positively associated with full-scale IQ. Catch-up growth in the group of SGA children was associated with significantly increased height, larger HC, increased levels of insulin-like growth factor-I (IGF-I) and increased full-scale IQ compared with those born SGA without catch-up growth. SGA and IUGR may not be harmful to adult cognitive ability, at least not in individuals born near term. However, known risk factors for impaired fetal growth may explain the link between early growth and cognitive ability in adulthood. Copyright © 2015. Published by Elsevier Ireland Ltd.

  15. Decision based on big data research for non-small cell lung cancer in medical artificial system in developing country.

    Science.gov (United States)

    Wu, Jia; Tan, Yanlin; Chen, Zhigang; Zhao, Ming

    2018-06-01

    Non-small cell lung cancer (NSCLC) is a high-risk cancer that is usually scanned by PET-CT for testing and prediction before treatment methods are chosen. In actual hospital practice, however, at least 640 images must be generated for each patient through PET-CT scanning, and in developing countries in particular doctors attend to a huge number of NSCLC patients; manual selection of observations results in low work efficiency for doctors, whereas an artificial medical system can predict and support decisions rapidly. In this study, the data of 2,789,675 patients in three hospitals in China were collected, compiled, and used as the research basis; these data were obtained through image acquisition and a diagnostic-parameter machine decision-making method, on the basis of a machine diagnosis and medical system design model for adjuvant therapy. By combining image and diagnostic parameters, a machine-decision diagnosis auxiliary algorithm was established. Experimental results show that the accuracy reached 77% in NSCLC. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. From Vocal Replication to Shared Combinatorial Speech Codes: A Small Step for Evolution, A Big Step for Language

    Science.gov (United States)

    Oudeyer, Pierre-Yves

    Humans use spoken vocalizations, or their signed equivalent, as a physical support to carry language. This support is highly organized: vocalizations are built by re-using a small number of articulatory units, which are themselves discrete elements carved out of the articulatory continuum by each linguistic community. Moreover, the repertoires of these elementary units (the gestures, the phonemes, the morphemes) show a number of structural regularities: for example, while our vocal tract physically allows the production of hundreds of vowels, each language most often uses 5 of them, and never more than 20. Also, certain vowels are very frequent, like /a,e,i,o,u/, and some others are very rare, like /en/. All the speakers of a given linguistic community categorize speech sounds in the same manner and share the same repertoire of vocalizations. Speakers of different communities may have very different ways of categorizing sounds (for example, Chinese uses tones to distinguish sounds) and different repertoires of vocalizations. Such an organized physical support is crucial for the existence of language, and asking how it may have appeared in the biological and/or cultural history of humans is thus a fundamental question. In particular, one can wonder how much the evolution of human speech codes relied on specific evolutionary innovations, and thus how difficult (or not) it was for speech to appear.

  17. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys, using a Schmidt telescope with an objective prism, produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore tends to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  18. Big Books and Small Marvels

    Science.gov (United States)

    Stanistreet, Paul

    2012-01-01

    The Reader Organisation's Get into Reading programme is all about getting people together in groups to engage with serious books. The groups are mixed and the participants sometimes challenging, but the outcomes are often remarkable. Jane Davis, who founded the Reader Organisation and continues to oversee Get into Reading, has witnessed a massive…

  19. Appendix: small organ, big problem

    International Nuclear Information System (INIS)

    Ilieva, E.

    2013-01-01

    Full text: Introduction: The widespread penetration into practice of ultrasound (US), multidetector computed tomography (MDCT) and magnetic resonance imaging (MRI) allows a significant reduction of false positive diagnoses of acute appendicitis (AA), reducing the number of unnecessary appendectomies, identifying complications in its course, and differentiating other causes of acute abdominal pain. What you will learn: The aim of the lecture is to review the anatomy and pathophysiology of AA; to present the main diagnostic imaging methods, highlighting their advantages and disadvantages, especially in childhood and during pregnancy; to clarify which imaging method should be preferred in cases of suspected atypical presentation or complications of AA; and to address the most common differential diagnostic difficulties. Discussion: Ultrasound examination is usually the first method in the diagnostic work-up of AA owing to its low price, wide availability and lack of radiation, but it depends largely on the experience of the operator and is difficult to apply under certain conditions. MRI, despite its high cost and duration, is particularly suitable in late pregnancy and in children, when ultrasound does not provide the information necessary for diagnosis and the avoidance of radiation exposure is crucial. MDCT provides detailed anatomical information in a short time and is applicable in atypical presentations of AA, for the diagnosis of complications, or for the exclusion of other causes of severe abdominal pain. Conclusion: Imaging studies contribute significantly to determining the management of suspected AA and its complications. The two main and complementary imaging methods most widely used in practice are ultrasound and CT. In inconclusive ultrasound findings in children and in pregnant women, it is appropriate to carry out an MRI study

  20. Big Pile or Small Pile?

    Science.gov (United States)

    Branca, Mario; Quidacciolu, Rossana G.; Soletta, Isabella

    2013-01-01

    The construction of a voltaic pile (battery) is a simple laboratory activity that commemorates the invention of this important device and is of great help in teaching physics. The voltaic pile is often seen as a scientific toy, with the "pile" being constructed from fruit. These toys use some strips of copper and zinc inserted in a piece…

  1. Big news in small samples

    NARCIS (Netherlands)

    P.C. Schotman (Peter); S. Straetmans; C.G. de Vries (Casper)

    1997-01-01

    textabstractUnivariate time series regressions of the forex return on the forward premium generate mostly negative slope coefficients. Simple and refined panel estimation techniques yield slope estimates that are much closer to unity. We explain the two apparently opposing results by allowing for

  2. Small Big Data Congress 2015

    NARCIS (Netherlands)

    Koers, W.A.; Bomhof, F.W.

    2015-01-01

    Nowadays data is everywhere and it is rapidly becoming business' biggest resource. Through numerous systems and devices, a huge amount of data can be collected. Just think of the many sensor systems, such as surveillance cameras, loops in the road, smartphones or the digital markers we create

  3. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  4. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  5. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  6. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  7. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  8. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  9. Plasma chromogranin A levels are increased in a small portion of patients with hereditary head and neck paragangliomas

    NARCIS (Netherlands)

    van Duinen, Nicolette; Kema, Ido P.; Romijn, Johannes A.; Corssmit, Eleonora P. M.

    2011-01-01

    The majority of patients with head and neck paragangliomas (HNPGL) have biochemically silent tumours. Chromogranin A (CgA) is a tumour marker for neuroendocrine tumours. To assess the role of CgA as a tumour marker in patients with hereditary HNPGL. We included 95 consecutive patients with

  10. Plasma chromogranin A levels are increased in a small portion of patients with hereditary head and neck paragangliomas

    NARCIS (Netherlands)

    van Duinen, Nicolette; Kema, Ido P.; Romijn, Johannes A.; Corssmit, Eleonora P. M.

    Context: The majority of patients with head and neck paragangliomas (HNPGL) have biochemically silent tumours. Chromogranin A (CgA) is a tumour marker for neuroendocrine tumours. Objective: To assess the role of CgA as a tumour marker in patients with hereditary HNPGL. Patients and Methods: We

  11. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)]

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with special reference to any future inquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors, and an attempt is made to tackle the problem that participation in a public inquiry comes far too late to be objective. The author feels that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries, but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  12. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of the bachelor thesis is to describe the Big Data issue and the OLAP aggregation operations for decision support, which are applied to Big Data using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and with the issues of their implementation. This is followed by an overall evaluation of the thesis and the possibilities for using the resulting system in the future.

  13. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative methods and technologies to capture, store, distribute, manage and analyse petabyte-sized or larger sets of data with high velocity and diverse structures. Big data can be structured, semi-structured or unstructured, rendering conventional data-management techniques inadequate. Data are generated from many different sources and can arrive in the system at different rates. In order to handle this...

  14. Modelling the species distribution of flat-headed cats (Prionailurus planiceps), an endangered South-East Asian small felid.

    Science.gov (United States)

    Wilting, Andreas; Cord, Anna; Hearn, Andrew J; Hesse, Deike; Mohamed, Azlan; Traeholdt, Carl; Cheyne, Susan M; Sunarto, Sunarto; Jayasilan, Mohd-Azlan; Ross, Joanna; Shapiro, Aurélie C; Sebastian, Anthony; Dech, Stefan; Breitenmoser, Christine; Sanderson, Jim; Duckworth, J W; Hofer, Heribert

    2010-03-17

    The flat-headed cat (Prionailurus planiceps) is one of the world's least known, highly threatened felids with a distribution restricted to tropical lowland rainforests in Peninsular Thailand/Malaysia, Borneo and Sumatra. Throughout its geographic range large-scale anthropogenic transformation processes, including the pollution of fresh-water river systems and landscape fragmentation, raise concerns regarding its conservation status. Despite an increasing number of camera-trapping field surveys for carnivores in South-East Asia during the past two decades, few of these studies recorded the flat-headed cat. In this study, we designed a predictive species distribution model using the Maximum Entropy (MaxEnt) algorithm to reassess the potential current distribution and conservation status of the flat-headed cat. Eighty-eight independent species occurrence records were gathered from field surveys, literature records, and museum collections. These current and historical records were analysed in relation to bioclimatic variables (WorldClim), altitude (SRTM) and minimum distance to larger water resources (Digital Chart of the World). Distance to water was identified as the key predictor for the occurrence of flat-headed cats (>50% explanation). In addition, we used different land cover maps (GLC2000, GlobCover and SarVision LLC for Borneo), information on protected areas and regional human population density data to extract suitable habitats from the potential distribution predicted by the MaxEnt model. Between 54% and 68% of suitable habitat has already been converted to unsuitable land cover types (e.g. croplands, plantations), and only between 10% and 20% of suitable land cover is categorised as fully protected according to the IUCN criteria. The remaining habitats are highly fragmented and only a few larger forest patches remain. Based on our findings, we recommend that future conservation efforts for the flat-headed cat should focus on the identified remaining key
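
    The habitat fractions reported above come from overlaying the predicted suitability map with land-cover and protected-area layers. The following is a minimal grid-overlay sketch under assumed toy inputs, not the GIS workflow used in the study; the class names and grid values are illustrative:

```python
def habitat_summary(suitable, landcover, protected, unsuitable_classes):
    """Overlay a boolean suitability grid with a land-cover class grid and a
    protected-area mask (all the same shape, given as lists of rows).
    Returns (fraction of suitable cells already converted to unsuitable
    cover, fraction of the remaining suitable cells that are protected)."""
    suitable_cells = converted = remaining = protected_remaining = 0
    for i, row in enumerate(suitable):
        for j, is_suitable in enumerate(row):
            if not is_suitable:
                continue
            suitable_cells += 1
            if landcover[i][j] in unsuitable_classes:
                converted += 1          # suitable climate, but converted cover
            else:
                remaining += 1          # still-suitable habitat
                if protected[i][j]:
                    protected_remaining += 1
    return converted / suitable_cells, protected_remaining / remaining

# toy 2x2 example: half the suitable cells converted, half the rest protected
suitable = [[True, True], [True, True]]
landcover = [["forest", "cropland"], ["plantation", "forest"]]
protected = [[True, False], [False, False]]
fractions = habitat_summary(suitable, landcover, protected,
                            {"cropland", "plantation"})   # → (0.5, 0.5)
```

    On real rasters the same tallies would be computed per pixel over the MaxEnt suitability surface and the GLC2000/GlobCover layers.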

  15. Modelling the species distribution of flat-headed cats (Prionailurus planiceps), an endangered South-East Asian small felid.

    Directory of Open Access Journals (Sweden)

    Andreas Wilting

    Full Text Available BACKGROUND: The flat-headed cat (Prionailurus planiceps) is one of the world's least known, highly threatened felids, with a distribution restricted to tropical lowland rainforests in Peninsular Thailand/Malaysia, Borneo and Sumatra. Throughout its geographic range, large-scale anthropogenic transformation processes, including the pollution of fresh-water river systems and landscape fragmentation, raise concerns regarding its conservation status. Despite an increasing number of camera-trapping field surveys for carnivores in South-East Asia during the past two decades, few of these studies recorded the flat-headed cat. METHODOLOGY/PRINCIPAL FINDINGS: In this study, we designed a predictive species distribution model using the Maximum Entropy (MaxEnt) algorithm to reassess the potential current distribution and conservation status of the flat-headed cat. Eighty-eight independent species occurrence records were gathered from field surveys, literature records, and museum collections. These current and historical records were analysed in relation to bioclimatic variables (WorldClim), altitude (SRTM) and minimum distance to larger water resources (Digital Chart of the World). Distance to water was identified as the key predictor for the occurrence of flat-headed cats (>50% explanation). In addition, we used different land cover maps (GLC2000, GlobCover and SarVision LLC for Borneo), information on protected areas and regional human population density data to extract suitable habitats from the potential distribution predicted by the MaxEnt model. Between 54% and 68% of suitable habitat has already been converted to unsuitable land cover types (e.g. croplands, plantations), and only between 10% and 20% of suitable land cover is categorised as fully protected according to the IUCN criteria. The remaining habitats are highly fragmented and only a few larger forest patches remain. 
CONCLUSION/SIGNIFICANCE: Based on our findings, we recommend that future conservation

  16. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  17. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  18. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of "data science," all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, overlooking how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  19. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  20. Therapeutic Efficacy Comparison of 5 Major EGFR-TKIs in Advanced EGFR-positive Non-Small-cell Lung Cancer: A Network Meta-analysis Based on Head-to-Head Trials.

    Science.gov (United States)

    Zhang, Yaxiong; Zhang, Zhonghan; Huang, Xiaodan; Kang, Shiyang; Chen, Gang; Wu, Manli; Miao, Siyu; Huang, Yan; Zhao, Hongyun; Zhang, Li

    2017-09-01

    Five major first- and second-generation epidermal growth factor receptor-tyrosine kinase inhibitors (EGFR-TKIs), including erlotinib, gefitinib, icotinib, afatinib, and dacomitinib, are currently optional for patients with advanced non-small-cell lung cancer (NSCLC) who harbor EGFR mutations. However, there was no head-to-head-based network meta-analysis among all the TKIs in EGFR-mutated populations. Eligible literature was searched from an electronic database. Data of objective response rate, disease control rate, progression-free survival, and overall survival were extracted from enrolled studies. Multiple treatment comparisons based on Bayesian network integrated the efficacy of all included treatments. Six phase III randomized trials involving 1055 EGFR-mutated patients with advanced NSCLC were enrolled. Multiple treatment comparisons showed that 5 different EGFR-TKIs shared equivalent therapeutic efficacy in terms of all outcome measures. Rank probabilities indicated that dacomitinib and afatinib had potentially better efficacy compared with erlotinib, gefitinib, and icotinib in the EGFR-mutated patients. When compared with other agents, potential survival benefits (progression-free and overall survival) were observed in dacomitinib, whereas afatinib showed a better rank probability in overall response rate and disease control rate. Our study indicated a preferable therapeutic efficacy in the second-generation TKIs (dacomitinib and afatinib) when compared with the first-generation TKIs (erlotinib, gefitinib, and icotinib). Copyright © 2016 Elsevier Inc. All rights reserved.
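
    The rank probabilities cited above are typically obtained by ranking the treatments within each posterior draw and tabulating how often each treatment occupies each rank. The sketch below is a minimal illustration with hypothetical posterior samples; the effect means are made up for demonstration and are not the trial estimates:

```python
import random

def rank_probabilities(draws):
    """draws maps treatment name -> list of posterior effect samples
    (one sample per MCMC iteration, aligned by index; larger = better).
    Returns treatment -> [P(rank 1), P(rank 2), ...]."""
    names = list(draws)
    n_iter = len(draws[names[0]])
    counts = {t: [0] * len(names) for t in names}
    for i in range(n_iter):
        # rank all treatments within this single posterior draw
        ordered = sorted(names, key=lambda t: draws[t][i], reverse=True)
        for rank, t in enumerate(ordered):
            counts[t][rank] += 1
    return {t: [c / n_iter for c in counts[t]] for t in counts}

# hypothetical posterior draws for the five TKIs (illustrative means only)
rng = random.Random(0)
draws = {t: [rng.gauss(mu, 1.0) for _ in range(2000)]
         for t, mu in [("dacomitinib", 0.6), ("afatinib", 0.5),
                       ("erlotinib", 0.0), ("gefitinib", 0.0),
                       ("icotinib", -0.1)]}
probs = rank_probabilities(draws)
```

    Each treatment's probabilities sum to one across ranks, and the rank-1 probabilities sum to one across treatments, which is a useful sanity check on any rank-probability table.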

  1. De las grandes a las pequeñas pantallas. Nuevas narrativas africanas de entretenimiento / From Big to Small Screens. New African Narratives of Entertainment

    Directory of Open Access Journals (Sweden)

    Alexie Tcheuyap

    2016-03-01

been made possible not only because local postcolonial governments have lost their monopoly over television production, but especially because of private business initiatives. This article uses the examples of Cameroon, Senegal and Ivory Coast to explore this rapid development, which has shifted spectatorship from big to small screens. The spread of TV series may have accelerated the «death» of a moribund cinema, but it has also led to the creation of a new visual culture which is likely to expand. Keywords: postcolonial, video, cinema theatres, Nigeria, Ghana, Cameroon, Senegal, Ivory Coast, audiences, television, series, audiovisual narrative, entertainment, comedy.

  2. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seeds of new, valuable operational insights for private companies and public organizations. While optimistic pronouncements abound, research on Big Data in the public sector...... has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing volume of data while taking public values into account. The article builds on a case study of the use of large volumes of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD......). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off not only between economic rationales and quality considerations, but also between control over sensitive personal data and ethical implications for the citizen. In the DAMD case, data are on the one hand used "in service of a good cause" to...

  3. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

Unique insights to implement big data analytics and reap big returns to your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values, and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportunities.

  4. Continuous infusion of small-volume fluid resuscitation in the treatment of combined uncontrolled hemorrhagic shock and head injury

    International Nuclear Information System (INIS)

    Hayrettin, O.; Yagmur, Y.; Tas, A.; Topcu, S.; Orak, M.

    2007-01-01

To determine the effect of continuous limited fluid resuscitation on the hemodynamic response and survival of rats in a model of uncontrolled hemorrhagic shock due to massive splenic injury (MSI) and head injury (HI). Seventy Sprague-Dawley rats were used in this study. Group 1 rats (n=10) were sham-operated. In group 2 (n=10), MSI alone was performed and left untreated. In group 3 (n=10), HI alone was performed and left untreated. In group 4 (n=10), HI and MSI were performed and left untreated. In group 5 (n=10), HI and MSI were performed and treated 15 minutes later with 7.5% NaCl. In group 6 (n=10), HI and MSI were performed, and rats were treated with Ringer's lactate (RL) solution. In group 7 (n=10), HI and MSI were performed, and rats were treated with 0.9% NaCl. In groups 2, 4, 5, 6 and 7, the midline incision was reopened and splenectomy was performed at 45 minutes. In group 4 rats, mean arterial pressure (MAP) decreased from 104 +- 6.1 mmHg to 75 +- 19.5 mmHg at 15 minutes; heart rate decreased from 357 +- 24.9 beats/min to 321 +- 62.1 beats/min, and hematocrit decreased from 46 +- 1.3% to 43 +- 2.5% (p<0.01). Similar early changes in MAP, heart rate and hematocrit were observed in groups 5, 6 and 7 at 15 minutes. At 45, 60 and 120 minutes, MAP, heart rate and hematocrit values in the fluid-resuscitated rats (groups 5, 6, 7) were higher than in groups 2 and 4 (p<0.01 for all). At 120 minutes, hematocrit in group 6 was higher than in groups 4, 5 and 7; in group 6, total blood loss after splenectomy was calculated at 20 +- 2.4% of blood volume, the best value compared with the other fluid-resuscitated groups 5 and 7 (28% and 27% of blood volume) (p<0.01). Mortality was lower in all fluid-resuscitated groups compared with groups 3 and 4 (p<0.05). The median survival time was likewise higher in the fluid-resuscitated groups. Continuous infusion of 7.5% NaCl, RL and 0.9% NaCl following uncontrolled hemorrhagic shock with massive splenic injury and

  5. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  6. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottleneck, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinguished and require new computational and statistical paradigm. This article gives overviews on the salient features of Big Data and how these features impact on paradigm change on statistical and computational methods as well as computing architectures. We also provide various new perspectives on the Big Data analysis and computation. In particular, we emphasize on the viability of the sparsest solution in high-confidence set and point out that exogeneous assumptions in most statistical methods for Big Data can not be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.

  7. Small steps for hydro

    International Nuclear Information System (INIS)

    Wicke, Peter

    1998-01-01

    The government in Peru has decided to utilise its gas reserves and restrict hydro to relatively small schemes. A number of reasons for the decision are given. In 1997, the Shell-Mobile-Bechtel-COSAPI consortium was formed and agreements were signed regarding exploiting Gas de Camisea. The country's energy needs to 2010 are being assessed. It is likely that by 2001 the whole of south Peru will be receiving gas from Camisea. The Peru situation is discussed under the headings of (i) existing capacity, (ii) growing demands, (iii) a history of hydro in Peru, (iv) electrification and SHP and (v) outlook. The future for Peru's electric energy development is bright. While most of its new power capacity will come from natural gas, the small hydros also have a part to play. A stronger commitment of national and regional political authorities to consider supplies outside the big cities is said to be needed. (UK)

  8. Small Data

    NARCIS (Netherlands)

    S. Pemberton (Steven)

    2014-01-01

The term “Open Data” often goes hand in hand with the term “Big Data”, where large data sets get released allowing for analysis, but the Cinderella of the Open Data ball is Small Data, small amounts of data, nonetheless possibly essential, that are too small to be put in some database or

  9. Head first Ajax

    CERN Document Server

    Riordan, Rebecca M

    2008-01-01

Ajax is no longer an experimental approach to website development, but the key to building browser-based applications that form the cornerstone of Web 2.0. Head First Ajax gives you an up-to-date perspective that lets you see exactly what you can do -- and has been done -- with Ajax. With it, you get a highly practical and in-depth view of what is now a mature development approach. Using the unique and highly effective visual format that has turned Head First titles into runaway bestsellers, this book offers a big picture overview to introduce Ajax, and then explores the use of ind

  10. Heads Up

    Science.gov (United States)


  11. Implications of improved diagnostic imaging of small nodal metastases in head and neck cancer: Radiotherapy target volume transformation and dose de-escalation.

    Science.gov (United States)

    van den Bosch, Sven; Vogel, Wouter V; Raaijmakers, Cornelis P; Dijkema, Tim; Terhaard, Chris H J; Al-Mamgani, Abrahim; Kaanders, Johannes H A M

    2018-05-03

    Diagnostic imaging continues to evolve, and now has unprecedented accuracy for detecting small nodal metastasis. This influences the tumor load in elective target volumes and subsequently has consequences for the radiotherapy dose required to control disease in these volumes. Small metastases that used to remain subclinical and were included in elective volumes, will nowadays be detected and included in high-dose volumes. Consequentially, high-dose volumes will more often contain low-volume disease. These target volume transformations lead to changes in the tumor burden in elective and "gross" tumor volumes with implications for the radiotherapy dose prescribed to these volumes. For head and neck tumors, nodal staging has evolved from mere palpation to combinations of high-resolution imaging modalities. A traditional nodal gross tumor volume in the neck typically had a minimum diameter of 10-15 mm, while nowadays much smaller tumor deposits are detected in lymph nodes. However, the current dose levels for elective nodal irradiation were empirically determined in the 1950s, and have not changed since. In this report the radiobiological consequences of target volume transformation caused by modern imaging of the neck are evaluated, and theoretically derived reductions of dose in radiotherapy for head and neck cancer are proposed. The concept of target volume transformation and subsequent strategies for dose adaptation applies to many other tumor types as well. Awareness of this concept may result in new strategies for target definition and selection of dose levels with the aim to provide optimal tumor control with less toxicity. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
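The dose de-escalation reasoning above (fewer clonogenic cells in a small, imaging-detected deposit require a lower dose for the same control probability) can be illustrated with the standard Poisson tumour control probability (TCP) model; the cell numbers and radiosensitivity α below are assumed round figures, not the authors' derived dose levels:

```python
import math

def tcp(n_clonogens: float, dose_gy: float, alpha: float = 0.3) -> float:
    """Poisson TCP with simple linear cell kill:
    surviving cells S = N * exp(-alpha * D), TCP = exp(-S)."""
    return math.exp(-n_clonogens * math.exp(-alpha * dose_gy))

def dose_for_tcp(n_clonogens: float, target_tcp: float = 0.9,
                 alpha: float = 0.3) -> float:
    """Invert the model: D = ln(N / -ln(TCP)) / alpha."""
    return math.log(n_clonogens / -math.log(target_tcp)) / alpha

# A ~10 mm node (roughly 1e8 cells) vs. a small imaging-detected
# deposit (roughly 1e5 cells): the smaller target needs a much
# lower dose for the same 90% control probability.
print(dose_for_tcp(1e8))
print(dose_for_tcp(1e5))
```

Because dose enters the surviving fraction exponentially, a thousand-fold reduction in tumour load translates into a substantial but not proportional dose reduction, which is the qualitative point of the target volume transformation argument.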

  12. Characterization of small-to-medium head-and-face dimensions for developing respirator fit test panels and evaluating fit of filtering facepiece respirators with different faceseal design

    Science.gov (United States)

    Lin, Yi-Chun

    2017-01-01

A respirator fit test panel (RFTP) with a facial size distribution representative of intended users is essential to the evaluation of respirator fit for new models of respirators. In this study an anthropometric survey was conducted among youths representing respirator users in mid-Taiwan to characterize the head-and-face dimensions key to RFTPs for application to small-to-medium facial features. The participants were fit-tested with three N95 masks of different facepiece design, and the results were compared with the facial size distributions specified in the RFTPs of bivariate and principal component analysis (PCA) design developed in this study, to understand the influence of facial characteristics on respirator fit in relation to facepiece design. Nineteen dimensions were measured for 206 participants. In fit testing, the qualitative fit test (QLFT) procedures prescribed by the U.S. Occupational Safety and Health Administration were adopted. As the results show, the bizygomatic breadths of the male and female participants were 90.1% and 90.8% of those reported for their U.S. youth counterparts (P < 0.001), respectively. Compared with the bivariate distribution, the PCA design better accommodated variation in facial contours among different respirator user groups or populations, with the RFTPs reported in this study and from the literature consistently covering over 92% of the participants. Overall, the facial fit of filtering facepieces increased with increasing facial dimensions. The total percentages of the tests wherein the final maneuver completed was “Moving head up-and-down”, “Talking” or “Bending over” were 13.3–61.9% in the bivariate RFTPs and 22.9–52.8% in the PCA RFTPs, respectively. The respirators with a three-panel flat-fold structure in the facepiece provided greater fit, particularly when the users moved their heads. When the facial size distribution in a bivariate RFTP did not sufficiently represent petite facial sizes, the fit testing was inclined to overestimate the general fit
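The PCA-based panel design mentioned above can be sketched as follows; the facial measurements, covariance values, and the ±2 SD panel boundary are hypothetical stand-ins for the survey data, not the study's actual panel specification:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical head-and-face measurements (mm) for 206 subjects:
# columns could be face length, bizygomatic breadth, nose breadth.
X = rng.multivariate_normal(mean=[120.0, 135.0, 55.0],
                            cov=[[40, 20, 5], [20, 60, 8], [5, 8, 15]],
                            size=206)

# Principal component analysis on standardized dimensions.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
order = eigval.argsort()[::-1]
pcs = Z @ eigvec[:, order[:2]]          # scores on the first two PCs

# Panel boundary: a symmetric region on the PC plane; coverage is the
# share of subjects falling inside (here +/- 2 SD on each PC).
sd = pcs.std(axis=0)
inside = np.all(np.abs(pcs) <= 2 * sd, axis=1)
print(f"panel coverage: {inside.mean():.1%}")
```

A PCA panel is defined on linear combinations of all measured dimensions rather than on two raw dimensions, which is why it can accommodate contour variation that a bivariate (face-length x face-width) grid misses.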

  13. Big Sky Legacy. In Montana, Small Schools Aren't a Bold New Idea. They're a Way of Life.

    Science.gov (United States)

    Boss, Suzie

    2000-01-01

    Two-thirds of Montana's school districts are rural, and most students attend schools with enrollments under 300. Such recent trends as peer tutoring, multigrade classrooms, and project-based learning have always been practiced in these small schools. One small community's successful effort to save its school, classroom practices in one-room…

  14. Model of the material removal function and an experimental study on a magnetorheological finishing process using a small ball-end permanent-magnet polishing head.

    Science.gov (United States)

    Chen, Mingjun; Liu, Henan; Cheng, Jian; Yu, Bo; Fang, Zhen

    2017-07-01

    In order to achieve the deterministic finishing of optical components with concave surfaces of a curvature radius less than 10 mm, a novel magnetorheological finishing (MRF) process using a small ball-end permanent-magnet polishing head with a diameter of 4 mm is introduced. The characteristics of material removal in the proposed MRF process are studied. The model of the material removal function for the proposed MRF process is established based on the three-dimensional hydrodynamics analysis and Preston's equation. The shear stress on the workpiece surface is calculated by means of resolving the presented mathematical model using a numerical solution method. The analysis result reveals that the material removal in the proposed MRF process shows a positive dependence on shear stress. Experimental research is conducted to investigate the effect of processing parameters on the material removal rate and improve the surface accuracy of a typical rotational symmetrical optical component. The experimental results show that the surface accuracy of the finished component of K9 glass material has been improved to 0.14 μm (PV) from the initial 0.8 μm (PV), and the finished surface roughness Ra is 0.0024 μm. It indicates that the proposed MRF process can be used to achieve the deterministic removal of surface material and perform the nanofinishing of small curvature radius concave surfaces.
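The removal model above couples a hydrodynamic shear analysis with Preston's equation; in its minimal form, Preston's law states that the depth removal rate is proportional to contact pressure times relative surface velocity. A toy numeric sketch (all coefficients and operating values below are assumed, not taken from the paper):

```python
def preston_removal_rate(k_p: float, pressure_pa: float,
                         velocity_m_s: float) -> float:
    """Preston's equation: material removal rate dz/dt = k_p * p * v,
    where k_p is an empirically fitted Preston coefficient (m^2/N)."""
    return k_p * pressure_pa * velocity_m_s

# Illustrative values only: a small ball-end polishing head produces
# a local contact pressure and a relative surface speed at the workpiece.
k_p = 5e-13            # m^2/N, assumed Preston coefficient
p = 2.0e4              # Pa, assumed contact pressure
v = 2.0                # m/s, assumed relative surface speed
rate = preston_removal_rate(k_p, p, v)   # depth removal rate in m/s
print(f"removal rate: {rate * 1e9 * 60:.1f} nm/min")
```

In practice k_p is fitted from spot-removal experiments, and the deterministic finishing step convolves this local removal function with a dwell-time map over the surface.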

  15. Radar detectability studies of slow and small zodiacal dust cloud particles. I. The case of Arecibo 430 MHz meteor head echo observations

    International Nuclear Information System (INIS)

    Janches, D.; Plane, J. M. C.; Feng, W.; Nesvorný, D.; Vokrouhlický, D.; Nicolls, M. J.

    2014-01-01

    Recent model development of the Zodiacal Dust Cloud (ZDC) argues that the incoming flux of meteoric material into the Earth's upper atmosphere is mostly undetected by radars because they cannot detect small extraterrestrial particles entering the atmosphere at low velocities due to the relatively small production of electrons. In this paper, we present a new methodology utilizing meteor head echo radar observations that aims to constrain the ZDC physical model by ground-based measurements. In particular, for this work, we focus on Arecibo 430 MHz observations since this is the most sensitive radar utilized for this type of observations to date. For this, we integrate and employ existing comprehensive models of meteoroid ablation, ionization, and radar detection to enable accurate interpretation of radar observations and show that reasonable agreement in the hourly rates is found between model predictions and Arecibo observations when (1) we invoke the lower limit of the model predicted flux (∼16 t d –1 ) and (2) we estimate the ionization probability of ablating metal atoms using laboratory measurements of the ionization cross sections of high-speed metal atom beams, resulting in values up to two orders of magnitude lower than the extensively utilized figure reported by Jones for low-speed meteors. However, even at this lower limit, the model overpredicts the slow portion of the Arecibo radial velocity distributions by a factor of three, suggesting that the model requires some revision.
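The detectability argument above (slow, small particles produce too few electrons for radar detection) can be illustrated with a toy threshold model; the functional form, constants, and detection threshold below are assumptions for illustration, not the paper's calibrated ablation/ionization model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: the electron line density produced by an ablating meteoroid
# scales with mass and a steep power of entry speed (exponent assumed).
def line_density(mass_kg, speed_km_s, c=1e17, speed_exp=3.5):
    return c * mass_kg * speed_km_s ** speed_exp

# Sample a ZDC-like population of small particles over Earth-entry speeds.
mass = 10 ** rng.uniform(-9, -6, size=100_000)    # 1 ng .. 1 mg
speed = rng.uniform(11.2, 72.0, size=100_000)     # km/s

detected = line_density(mass, speed) > 1e14       # assumed radar threshold
frac_slow_detected = detected[speed < 20].mean()
frac_fast_detected = detected[speed > 50].mean()
print(frac_slow_detected, frac_fast_detected)
```

With any steep speed dependence of ionization, the detected sample is strongly biased toward fast particles, which is why a radar-based flux estimate can miss much of a slow zodiacal population.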

  17. Radar Detectability Studies of Slow and Small Zodiacal Dust Cloud Particles: I. The Case of Arecibo 430 MHz Meteor Head Echo Observations

    Science.gov (United States)

    Janches, D.; Plane, J. M. C.; Nesvorny, D.; Feng, W.; Vokrouhlicky, D.; Nicolls, M. J.

    2014-01-01

Recent development of the Zodiacal Dust Cloud (ZDC) model (Nesvorny et al. 2010, 2011b) argues that the incoming flux of meteoric material into the Earth's upper atmosphere is mostly undetected by radars because they cannot detect small extraterrestrial particles entering the atmosphere at low velocities, owing to the relatively small production of electrons. In this paper we present a new methodology utilizing meteor head echo radar observations that aims to constrain the ZDC physical model by ground-based measurements. In particular, for this work we focus on Arecibo 430 MHz observations, since this is the most sensitive radar utilized for this type of observation to date. For this, we integrate and employ existing comprehensive models of meteoroid ablation, ionization and radar detection to enable accurate interpretation of radar observations, and show that reasonable agreement in the hourly rates is found between model predictions and Arecibo observations when: 1) we invoke the lower limit of the model-predicted flux (approximately 16 t/d) and 2) we estimate the ionization probability of ablating metal atoms using laboratory measurements of the ionization cross sections of high-speed metal atom beams, resulting in values up to two orders of magnitude lower than the extensively utilized figure reported by Jones (1997) for low-speed meteors. However, even at this lower limit the model overpredicts the slow portion of the Arecibo radial velocity distributions by a factor of 3, suggesting the model requires some revision.

  18. A three-component system incorporating Ppd-D1, copy number variation at Ppd-B1, and numerous small-effect quantitative trait loci facilitates adaptation of heading time in winter wheat cultivars of worldwide origin.

    Science.gov (United States)

    Würschum, Tobias; Langer, Simon M; Longin, C Friedrich H; Tucker, Matthew R; Leiser, Willmar L

    2018-06-01

    The broad adaptability of heading time has contributed to the global success of wheat in a diverse array of climatic conditions. Here, we investigated the genetic architecture underlying heading time in a large panel of 1,110 winter wheat cultivars of worldwide origin. Genome-wide association mapping, in combination with the analysis of major phenology loci, revealed a three-component system that facilitates the adaptation of heading time in winter wheat. The photoperiod sensitivity locus Ppd-D1 was found to account for almost half of the genotypic variance in this panel and can advance or delay heading by many days. In addition, copy number variation at Ppd-B1 was the second most important source of variation in heading, explaining 8.3% of the genotypic variance. Results from association mapping and genomic prediction indicated that the remaining variation is attributed to numerous small-effect quantitative trait loci that facilitate fine-tuning of heading to the local climatic conditions. Collectively, our results underpin the importance of the two Ppd-1 loci for the adaptation of heading time in winter wheat and illustrate how the three components have been exploited for wheat breeding globally. © 2018 John Wiley & Sons Ltd.
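The variance decomposition described above (one major locus explaining about half the genotypic variance, plus many small-effect loci) can be sketched with simulated data; genotype coding, effect sizes and the regression R² estimator are illustrative assumptions, not the study's pipeline:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 1110, 200

# Hypothetical genotypes: one major locus (a Ppd-D1-like 0/1 allele)
# plus many small-effect loci; effect sizes are chosen so the major
# locus explains roughly half the variance, echoing the abstract.
major = rng.integers(0, 2, size=n)
minor = rng.integers(0, 2, size=(n, m))
beta_small = rng.normal(0, 0.15, size=m)

heading = 3.0 * major + minor @ beta_small + rng.normal(0, 1.0, size=n)

# Share of phenotypic variance explained by the major locus alone,
# estimated as the R^2 of a single-marker regression.
resid = heading - np.polyval(np.polyfit(major, heading, 1), major)
r2_major = 1 - resid.var() / heading.var()
print(f"variance explained by the major locus: {r2_major:.0%}")
```

Fitting the remaining loci jointly (e.g. with ridge regression, as in genomic prediction) would then attribute the residual genotypic variance to the many small effects.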

  19. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  20. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  1. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  2. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  3. Novel Emergency Medicine Curriculum Utilizing Self-Directed Learning and the Flipped Classroom Method: Head, Eyes, Ears, Nose and Throat Emergencies Small Group Module

    Directory of Open Access Journals (Sweden)

    Andrew King

    2017-09-01

Audience: This curriculum created and implemented at The Ohio State University Wexner Medical Center was designed to educate our emergency medicine (EM) residents, PGY-1 to PGY-3, as well as medical students and attending physicians. Introduction: Head, Eyes, Ears, Nose and Throat (HEENT) complaints are very commonly seen in the Emergency Department. Numbers vary as to exact prevalence, but sources show that there are about 2 million annual emergency department (ED) visits in the United States for non-traumatic dental problems, representing 1.5% of all ED visits.1 Other sources show that symptoms referable to the throat encompass 2,496,000 visits or 1.9% of total visits.2 Notably, about 8% of the written exam in emergency medicine covers the topic of head and neck complaints, making it the second most tested topic behind cardiovascular.3 Residents must be proficient in the differential diagnosis and management of the wide variety of HEENT emergencies. The flipped classroom curricular model emphasizes self-directed learning activities completed by learners, followed by small group discussions pertaining to the topic reviewed. The active learning fostered by this curriculum increases faculty and learner engagement and interaction time typically absent in traditional lecture-based formats.4-6 Studies have revealed that the application of knowledge through case studies, personal interaction with content experts, and integrated questions are effective learning strategies for emergency medicine residents.6-8 The Ohio State University EM Residency didactic curriculum recently transitioned to a “flipped classroom” approach.9-13 We created this innovative curriculum aimed to improve our residency education program and to share educational resources with other EM residency programs. Our curriculum utilizes an 18-month curricular cycle to cover the defined emergency medicine content. The flipped classroom curriculum maximizes didactic time and resident

  4. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  5. The big bang cosmology - enigmas and nostrums

    International Nuclear Information System (INIS)

    Dicke, R.H.; Peebles, P.J.E.

    1979-01-01

    Some outstanding problems in connection with the big bang cosmology and relativity theory are reviewed under the headings: enigmas; nostrums and elixirs (the universe as Phoenix (an oscillating universe), the anthropomorphic universe (existence of observers in the present universe), reproducing universes (could a mini big bang bounce, perhaps adding entropy and matter and eventually developing into a suitable home for observers), variable strength of the gravitational interaction and oscillating universes (possible bounce models that have led eventually to the present hospitable environment). (U.K.)

  6. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  7. Big bacteria

    DEFF Research Database (Denmark)

    Schulz, HN; Jørgensen, BB

    2001-01-01

A small number of prokaryotic species have a unique physiology or ecology related to their development of unusually large size. The biomass of bacteria varies over more than 10 orders of magnitude, from the 0.2 μm wide nanobacteria to the largest cells of the colorless sulfur bacteria......, Thiomargarita namibiensis, with a diameter of 750 μm. All bacteria, including those that swim around in the environment, obtain their food molecules by molecular diffusion. Only the fastest and largest swimmers known, Thiovulum majus, are able to significantly increase their food supply by motility...... and by actively creating an advective flow through the entire population. Diffusion limitation generally restricts the maximal size of prokaryotic cells and provides a selective advantage for μm-sized cells at the normally low substrate concentrations in the environment. The largest heterotrophic bacteria

  8. Big bacteria

    DEFF Research Database (Denmark)

    Schulz, HN; Jørgensen, BB

    2001-01-01

A small number of prokaryotic species have a unique physiology or ecology related to their development of unusually large size. The biomass of bacteria varies over more than 10 orders of magnitude, from the 0.2 μm wide nanobacteria to the largest cells of the colorless sulfur bacteria...... and by actively creating an advective flow through the entire population. Diffusion limitation generally restricts the maximal size of prokaryotic cells and provides a selective advantage for μm-sized cells at the normally low substrate concentrations in the environment. The largest heterotrophic bacteria......, the 80 x 600 μm large Epulopiscium sp. from the gut of tropical fish, are presumably living in a very nutrient-rich medium. Many large bacteria contain numerous inclusions in the cells that reduce the volume of active cytoplasm. The most striking examples of competitive advantage from large cell size

  9. BIG SCHOOL - SMALL SCHOOL. STUDIES OF THE EFFECTS OF HIGH SCHOOL SIZE UPON THE BEHAVIOR AND EXPERIENCES OF STUDENTS. FINAL REPORT.

    Science.gov (United States)

    BARKER, ROGER G.; AND OTHERS

Studies were made in Kansas high schools to determine the effect of school size upon the behavior and experiences of students. The following areas were considered: the schools involved in the study, the data gathered from records and research, out-of-school activities, and the place of high school students in the total life of four small towns.…

  10. Adenylyl cyclase-associated protein 1 in metastasis of squamous cell carcinoma of the head and neck and non-small cell lung cancer

    Science.gov (United States)

    Kakurina, G. V.; Kolegova, E. S.; Cheremisina, O. V.; Zavyalov, A. A.; Shishkin, D. A.; Kondakova, I. V.; Choinzonov, E. L.

    2016-08-01

Progression of tumors, and metastasis in particular, is one of the main reasons for the high mortality rate among cancer patients. Cell locomotion, which requires remodeling of the actin cytoskeleton, plays the primary role in the development of metastases. The form, dynamics, localization and mechanical properties of the actin cytoskeleton are regulated by a variety of actin-binding proteins, including adenylyl cyclase-associated protein 1 (CAP1). This study investigates the CAP1 level depending on the presence or absence of metastases in patients with squamous cell carcinoma of the head and neck (SCCHN) and non-small cell lung cancer (NSCLC). The results show the contribution of CAP1 to SCCHN and NSCLC progression. We detected a connection between the tissue CAP1 protein level and the stage of NSCLC and SCCHN disease. The levels of the CAP1 protein also differed between primary tumor tissues and metastases in lung cancer. Our data show that CAP1 is important in the development of metastases, which suggests further prospects in the study of this protein for predicting metastasis of NSCLC and SCCHN.

  11. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  12. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  13. Reviews Book: Nucleus Book: The Wonderful World of Relativity Book: Head Shot Book: Cosmos Close-Up Places to Visit: Physics DemoLab Book: Quarks, Leptons and the Big Bang EBook: Shooting Stars Equipment: Victor 70C USB Digital Multimeter Web Watch

    Science.gov (United States)

    2012-09-01

WE RECOMMEND Nucleus: A Trip into the Heart of Matter A coffee-table book for everyone to dip into and learn from The Wonderful World of Relativity A charming, stand-out introduction to relativity The Physics DemoLab, National University of Singapore A treasure trove of physics for hands-on science experiences Quarks, Leptons and the Big Bang Perfect to polish up on particle physics for older students Victor 70C USB Digital Multimeter Equipment impresses for usability and value WORTH A LOOK Cosmos Close-Up Weighty tour of the galaxy that would make a good display Shooting Stars Encourage students to try astrophotography with this ebook HANDLE WITH CARE Head Shot: The Science Behind the JFK Assassination Exploration of the science behind the crime fails to impress WEB WATCH App-lied science for education: a selection of free Android apps are reviewed and iPhone app options are listed

  14. A little big history of Tiananmen

    OpenAIRE

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why people built the gate the way they did can be found. These explanations are useful in their own right and may also be used to deepen our understanding of more traditional explanations of why Tiananmen ...

  15. Small Data

    OpenAIRE

    Pemberton, Steven

    2014-01-01

The term “Open Data” often goes hand in hand with the term “Big Data”, where large data sets get released allowing for analysis, but the Cinderella of the Open Data ball is Small Data: small amounts of data, nonetheless possibly essential, that are too small to be put in some database or online dataset to be put to use. RDFa is a technology that allows Cinderella to go to the ball.

  16. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, related to the storage, analytics and visualization of big data; the human aspects of big data; and the process management dimension, which covers the aspects of big data management from both a technological and a business perspective.

  17. Long-term Outcome of Multiple Small-diameter Drilling Decompression Combined with Hip Arthroscopy versus Drilling Alone for Early Avascular Necrosis of the Femoral Head.

    Science.gov (United States)

    Li, Ji; Li, Zhong-Li; Zhang, Hao; Su, Xiang-Zheng; Wang, Ke-Tao; Yang, Yi-Meng

    2017-06-20

Avascular necrosis of the femoral head (AVNFH) typically presents in young adults and progresses quickly without proper treatment. However, the optimal treatment for early-stage AVNFH is still controversial. This study was conducted to evaluate the therapeutic effects of multiple small-diameter drilling decompression combined with hip arthroscopy for early AVNFH compared to drilling alone. This is a nonrandomized retrospective case series study. Between April 2006 and November 2010, 60 patients (98 hips) with early-stage AVNFH participated in this study. The patients underwent multiple small-diameter drilling decompression combined with hip arthroscopy in 26 cases/43 hips (Group A) or drilling decompression alone in 34 cases/55 hips (Group B). Patients were followed up at 6, 12, and 24 weeks, and every 6 months thereafter. Radiographs were taken at every follow-up, and Harris scores were recorded at the last follow-up; the paired t-test was used to compare the postoperative Harris scores, and the surgery effective rate of the two groups was compared using the Chi-square test. All patients were followed up for an average of 57.6 months (range: 17-108 months). Pain relief and improvement of hip function were assessed in all patients at 6 months after surgery. At the last follow-up, Group A had a better outcome, with mean Harris scores improved from 68.23 ± 11.37 to 82.07 ± 2.92 (t = -7.21, P = 0.001), than Group B, with mean Harris scores improved from 69.46 ± 9.71 to 75.79 ± 4.13 (t = -9.47, P = 0.037) (significantly different: t = -2.54, P = 0.017). The total surgery effective rate was also significantly different between Groups A and B (86.0% vs. 74.5%; χ2 = 3.69, P = 0.02). For early-stage AVNFH, multiple small-diameter drilling decompression combined with hip arthroscopy is more effective than drilling decompression alone.
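The group comparison above rests on a Pearson chi-square test of a 2x2 contingency table (effective vs. not effective, by group). The sketch below shows that computation in outline; the hip counts are illustrative assumptions reconstructed from the reported rates (86.0% of 43 hips, 74.5% of 55 hips), not values taken verbatim from the paper.

```python
# Hedged sketch: Pearson chi-square statistic (no continuity correction)
# for a 2x2 table [[a, b], [c, d]]. The counts used below are assumptions
# inferred from the abstract's percentages, so the result need not match
# the paper's reported chi2 = 3.69.

def chi_square_2x2(a, b, c, d):
    """Return the Pearson chi-square statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n          # expected count under independence
        chi2 += (obs - expected) ** 2 / expected
    return chi2

# Group A: ~37 of 43 hips effective; Group B: ~41 of 55 hips effective.
chi2 = chi_square_2x2(37, 43 - 37, 41, 55 - 41)
print(f"chi2 = {chi2:.2f}")
```

With these reconstructed counts the uncorrected statistic comes out near 2.0, below the reported χ2 = 3.69, which suggests the authors used different exact counts or a corrected variant of the test; the sketch only illustrates the mechanics.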

  18. Long-term Outcome of Multiple Small-diameter Drilling Decompression Combined with Hip Arthroscopy versus Drilling Alone for Early Avascular Necrosis of the Femoral Head

    Science.gov (United States)

    Li, Ji; Li, Zhong-Li; Zhang, Hao; Su, Xiang-Zheng; Wang, Ke-Tao; Yang, Yi-Meng

    2017-01-01

Background: Avascular necrosis of the femoral head (AVNFH) typically presents in young adults and progresses quickly without proper treatment. However, the optimal treatment for early-stage AVNFH is still controversial. This study was conducted to evaluate the therapeutic effects of multiple small-diameter drilling decompression combined with hip arthroscopy for early AVNFH compared to drilling alone. Methods: This is a nonrandomized retrospective case series study. Between April 2006 and November 2010, 60 patients (98 hips) with early-stage AVNFH participated in this study. The patients underwent multiple small-diameter drilling decompression combined with hip arthroscopy in 26 cases/43 hips (Group A) or drilling decompression alone in 34 cases/55 hips (Group B). Patients were followed up at 6, 12, and 24 weeks, and every 6 months thereafter. Radiographs were taken at every follow-up, and Harris scores were recorded at the last follow-up; the paired t-test was used to compare the postoperative Harris scores, and the surgery effective rate of the two groups was compared using the Chi-square test. Results: All patients were followed up for an average of 57.6 months (range: 17–108 months). Pain relief and improvement of hip function were assessed in all patients at 6 months after surgery. At the last follow-up, Group A had a better outcome, with mean Harris scores improved from 68.23 ± 11.37 to 82.07 ± 2.92 (t = −7.21, P = 0.001), than Group B, with mean Harris scores improved from 69.46 ± 9.71 to 75.79 ± 4.13 (t = −9.47, P = 0.037) (significantly different: t = −2.54, P = 0.017). The total surgery effective rate was also significantly different between Groups A and B (86.0% vs. 74.5%; χ2 = 3.69, P = 0.02). Conclusion: For early-stage AVNFH, multiple small-diameter drilling decompression combined with hip arthroscopy is more effective than drilling decompression alone. PMID:28584206

  19. Is HEADS in our heads?

    DEFF Research Database (Denmark)

    Boisen, Kirsten A; Hertz, Pernille Grarup; Blix, Charlotte

    2016-01-01

    contraception], Safety, Self-harm) interview is a feasible way of exploring health risk behaviors and resilience. OBJECTIVE: The purpose of this study was to evaluate how often HEADS topics were addressed according to young patients and staff in pediatric and adult outpatient clinics. METHODS: We conducted...... care professionals participated. We found only small reported differences between staff and young patients regarding whether home, education, and activity were addressed. However, staff reported twice the rate of addressing smoking, alcohol, illegal drugs, sexuality, and contraception compared to young...... patients. Young patients reported that smoking, alcohol, illegal drugs, sexuality, and contraception were addressed significantly more at adult clinics in comparison to pediatric clinics. After controlling for age, gender and duration of illness, according to young patients, adjusted odds ratios...

  20. Data Decision and Drug Therapy Based on Non-Small Cell Lung Cancer in a Big Data Medical System in Developing Countries

    Directory of Open Access Journals (Sweden)

    Jia Wu

    2018-05-01

In many developing or underdeveloped countries, limited medical resources and large populations may affect survival. Research into medical information systems and the recommendation of effective treatment methods may improve diagnosis and drug therapy for patients in developing or underdeveloped countries. In this study, we built a system model for drug therapy, relevance parameter analysis, and data-based decision making in non-small cell lung cancer. Based on probability analysis and status decisions, an optimized therapeutic schedule can be calculated and selected, and effective drug therapy methods can then be determined to improve the relevance parameters. Statistical analysis of clinical data proves that the probability analysis and decision-making model can provide fast and accurate clinical data.
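The abstract above describes selecting an optimized therapeutic schedule from probability analysis. A minimal expected-utility decision rule of that general kind can be sketched as follows; every therapy name, outcome probability, and utility value here is an illustrative assumption, not a value from the study.

```python
# Hedged sketch of a probability-weighted treatment choice, in the spirit of
# the decision model described above. All therapy names, probabilities, and
# utilities are hypothetical placeholders.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one therapy."""
    return sum(p * u for p, u in outcomes)

def choose_therapy(therapies):
    """Pick the therapy whose outcome profile has the highest expected utility."""
    return max(therapies, key=lambda name: expected_utility(therapies[name]))

therapies = {
    # hypothetical response profiles: (probability of outcome, utility of outcome)
    "chemo_A":  [(0.55, 1.0), (0.45, 0.2)],
    "chemo_B":  [(0.40, 1.0), (0.60, 0.3)],
    "targeted": [(0.65, 0.9), (0.35, 0.1)],
}
print(choose_therapy(therapies))
```

A real system of the kind the paper describes would estimate the probabilities from clinical data and fold in resource constraints; the point of the sketch is only the decision step, ranking candidate schedules by probability-weighted benefit.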

  1. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply nonexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  2. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence for an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second later), and the fate of the Universe are all discussed. (U.K.)

  3. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

In this research, some of the most promising Big Data usage domains are connected with the distinct player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the major types of business ecosystem players disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players were explained through new Big Data opportunities and threats and by players' responsive strategies. System dynamics was used to visualize the relationships in the provided model.

  4. Summary big data

    CERN Document Server

    2014-01-01

This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on identifying patterns in the data, rather than trying to understand the underlying causes in more detail. It highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  5. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative......For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers...

  6. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

Denmark has a digital infrastructure, a culture of registration, and IT-competent employees and customers, which make a leading position possible, but only if companies get ready for the next big data wave.

  7. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  8. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  9. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  10. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  11. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.

  12. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. It also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  13. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  14. Big-Men and Small Chiefs

    DEFF Research Database (Denmark)

    Iversen, Rune

    2017-01-01

    This paper investigates to what extent the significant material changes observable at the end of the Neolithic reflect transformations of the underlying social dynamics. Answering this question will help us to understand the formation of Bronze Age societies. The analysis concerns southern Scandi...

  15. A big measurement of a small moment

    Science.gov (United States)

    E Sauer, B.; Devlin, J. A.; Rabey, I. M.

    2017-07-01

    A beam of ThO molecules has been used to make the most precise measurement of the electron’s electric dipole moment (EDM) to date. In their recent paper, the ACME collaboration set out in detail their experimental and data analysis techniques. In a tour-de-force, they explain the many ways in which their apparatus can produce a signal which mimics the EDM and show how these systematic effects are measured and controlled.

  16. A small jab - a big effect

    DEFF Research Database (Denmark)

    Benn, Christine Stabell; Netea, Mihai G; Selin, Liisa K

    2013-01-01

    Recent epidemiological studies have shown that, in addition to disease-specific effects, vaccines against infectious diseases have nonspecific effects on the ability of the immune system to handle other pathogens. For instance, in randomized trials tuberculosis and measles vaccines are associated...... with a substantial reduction in overall child mortality, which cannot be explained by prevention of the target disease. New research suggests that the nonspecific effects of vaccines are related to cross-reactivity of the adaptive immune system with unrelated pathogens, and to training of the innate immune system...... through epigenetic reprogramming. Hence, epidemiological findings are backed by immunological data. This generates a new understanding of the immune system and about how it can be modulated by vaccines to impact the general resistance to disease....

  17. String Theory: Big Problem for Small Size

    Science.gov (United States)

    Sahoo, S.

    2009-01-01

String theory is the most promising candidate theory for a unified description of all the fundamental forces that exist in nature. It provides a mathematical framework that combines quantum theory with Einstein's general theory of relativity. The typical size of a string is of the order of 10^-33 cm, called the Planck length. But due…

  18. A Small Essay About a Big Boulevard

    Directory of Open Access Journals (Sweden)

    Elena Grigoryeva

    2015-08-01

The essay presents analogies of turning city borders and walls into boulevards and gardens, through the examples of Moscow and Irkutsk. The main street of Irkutsk is viewed in a new light, as a boulevard. It is proposed to take into consideration the peculiarities of the four sense-parts of the street when working on design codes.

  19. Sustainable Offices: Small Practices for Big Benefits

    Science.gov (United States)

    2012-05-01

Reducing Fuel Consumption • Participate in meetings via telephone • Telecommute or alternate work schedules • Carpool • Bike or walk around

  20. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  1. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview...... of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big...... data....

  2. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  3. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  4. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

Big data is, and will increasingly be, used as a tool for everything that happens both online and offline. Online, of course, is a real habit; Big Data is found in this medium, offering many advantages and being a real help for all consumers. In this paper we discuss Big Data as a plus in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit brought by this paper is presented in the cloud section.

  5. Head Lice

    Science.gov (United States)

    ... nits. You should also use hot water to wash any bed linens, towels, and clothing recently worn by the person who had head lice. Vacuum anything that can’t be washed, such as the couch, carpets, your child’s car seat, and any stuffed animals. Because head lice ...

  6. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  7. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Ariel Hamlin, Nabil... (contact: arkady@ll.mit.edu)

  8. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  9. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  10. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper’s commentators. We initially deal with the issue of social data and the role it plays in the current data revolution...... and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim...... that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data....

  11. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore, we compare and contrast the two geometries throughout.

  12. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  13. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. The information value is defined not only by extracting value from huge data sets, as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data, in an innovative manner, using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria, and its architecture, as well as its impact on processes worldwide.

  14. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  15. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  16. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data......’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD......) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  17. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data, and finally it sought to identify the most relevant characteristics in the management of Big Data, in order to cover everything concerning the central topic of the research. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; presenting some of the NoSQL databases, which are those that allow the processing of data in unstructured formats; and showing data models and the technologies for analyzing them, closing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the field of Big Data.

  18. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  19. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": the huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  20. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  1. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
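
    The bias pitfall described in this abstract can be illustrated with a minimal simulation (hypothetical data, not from the paper): a very large sample that systematically omits part of the population yields a confidently wrong estimate, while a far smaller simple random sample is noisier per observation but stays centred on the truth.

```python
import random

random.seed(0)

# A synthetic "population" whose true mean we want to estimate.
population = [random.gauss(0.0, 1.0) for _ in range(100_000)]
true_mean = sum(population) / len(population)

# A huge but biased collection: records below -0.5 were never captured.
biased_big = [x for x in population if x > -0.5]

# A small simple random sample of the same population.
small_random = random.sample(population, 500)

biased_error = abs(sum(biased_big) / len(biased_big) - true_mean)
small_error = abs(sum(small_random) / len(small_random) - true_mean)

# The biased "big" estimate has a large systematic error; the small
# random sample has only modest sampling noise.
print(len(biased_big), round(biased_error, 3), round(small_error, 3))
```

    More data does not repair the systematic error here; only fixing the collection mechanism would.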

  2. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  3. Head Injuries

    Science.gov (United States)

    ... a severe blow to the head can still knock the brain into the side of the skull ... following certain precautions and taking a break from sports and other activities that make symptoms worse. Playing ...

  4. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
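
    As a rough single-machine sketch of the kind of declarative rule such a system accepts (the rule, data, and function names here are illustrative assumptions, not BigDansing's actual API), a functional dependency such as zip → city can be checked with one grouping pass; a distributed engine can evaluate the same rule as a shared scan followed by a group-by:

```python
from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Return the lhs values that violate the functional dependency lhs -> rhs,
    i.e. groups where one lhs value maps to more than one rhs value."""
    groups = defaultdict(set)
    for row in rows:
        groups[row[lhs]].add(row[rhs])
    return {key: vals for key, vals in groups.items() if len(vals) > 1}

rows = [
    {"zip": "10001", "city": "New York"},
    {"zip": "10001", "city": "NYC"},       # violates zip -> city
    {"zip": "60601", "city": "Chicago"},
]
print(fd_violations(rows, "zip", "city"))
```

    The rule is expressed over logical records only; nothing in it refers to the execution platform, which is the separation the abstract describes.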

  5. Big and small women facing small culture / Reet Varblane

    Index Scriptorium Estoniae

    Varblane, Reet, 1952-

    1999-01-01

    Feminist art in Estonia since 1994. On the exhibitions EST.FEM, Kood-eks, and Private Views. Videos. Photo installations. Best-known artists: Mare Tralla, Ene-Liis Semper, Mari Laanemets, Eve Kiiler, Tiina Tammetalu, Kai Kaljo, Ly Lestberg, Liina Siib, and others. Artist groups: F.F.F.F. Themes: the mother, the small woman, the body. 6 illustrations

  6. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  7. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  8. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms, with "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.

  10. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  11. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  12. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  13. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  14. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  15. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  16. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that can not be set up in lab, either because they are too big, too far away, or happened in a very distant past. The authors of "How Far are the Stars?" show how the…

  17. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they cause the single singular events predicted by the corresponding quintessential (phantom) models to appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future are connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions.

  18. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  19. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10²¹) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  20. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  1. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  2. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
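
    The sorted-array idea behind inequality joins can be sketched on a single machine (a simplified illustration of the principle, not the published IEJoin algorithm, and assuming distinct coordinates): to count pairs satisfying p.x < q.x AND p.y > q.y, sort by x and query a Fenwick tree over y-ranks instead of running a quadratic nested loop.

```python
class BIT:
    """Fenwick tree counting how many y-ranks have been inserted."""
    def __init__(self, n):
        self.n = n
        self.t = [0] * (n + 1)
    def add(self, i):            # 1-based rank
        while i <= self.n:
            self.t[i] += 1
            i += i & -i
    def prefix(self, i):         # number of inserted ranks <= i
        s = 0
        while i > 0:
            s += self.t[i]
            i -= i & -i
        return s

def count_inequality_pairs(points):
    """Count pairs (p, q) with p.x < q.x and p.y > q.y.
    Assumes distinct x and distinct y values (illustrative restriction)."""
    ys = sorted(y for _, y in points)
    rank = {y: i + 1 for i, y in enumerate(ys)}
    bit = BIT(len(points))
    total = inserted = 0
    for _, y in sorted(points):            # scan in ascending x order
        r = rank[y]
        total += inserted - bit.prefix(r)  # earlier points with larger y
        bit.add(r)
        inserted += 1
    return total
```

    The scan touches each point once and each tree operation is logarithmic, so the O(n²) pair enumeration collapses to O(n log n), which is the kind of search-space reduction the abstract attributes to sorted arrays and bit-arrays.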

  3. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid-technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...
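
    The recovery pattern described above can be sketched as follows (a toy single-process illustration, not the actual Grid middleware): a large task is split into named jobs, and only a job that fails transiently is re-run, up to a retry limit, while already-completed jobs act as checkpoints.

```python
def run_with_retries(jobs, max_retries=3):
    """Run each named job; re-run only the jobs that fail transiently."""
    results, attempts = {}, {}
    for name, job in jobs.items():
        for attempt in range(1, max_retries + 1):
            attempts[name] = attempt
            try:
                results[name] = job()   # success checkpoints this job
                break
            except RuntimeError:
                if attempt == max_retries:
                    raise               # permanent failure: give up
    return results, attempts

# A flaky job that fails twice before succeeding, simulating a transient error.
state = {"failures_left": 2}

def flaky_job():
    if state["failures_left"] > 0:
        state["failures_left"] -= 1
        raise RuntimeError("transient failure")
    return "ok"

results, attempts = run_with_retries({"stable": lambda: 42, "flaky": flaky_job})
print(results, attempts)
```

    Like re-sending a dropped TCP/IP packet, only the failed unit is repeated; the stable job runs exactly once.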

  4. Head Start.

    Science.gov (United States)

    Greenman, Geri

    2000-01-01

    Discusses an art project in which students created drawings of mop heads. Explains that the approach of drawing was more important than the subject. States that the students used the chiaroscuro technique, used by Rembrandt and Caravaggio, in which light appears out of the darkness. (CMK)

  5. The Study of “big data” to support internal business strategists

    Science.gov (United States)

    Ge, Mei

    2018-01-01

    How is big data different from previous data analysis systems? The primary purpose behind the traditional small data analytics that all managers are more or less familiar with is to support internal business strategies. But big data also offers a promising new dimension: discovering new opportunities to offer customers high-value products and services. This study introduces some of the strategies that big data supports. Business decisions using big data can also involve several areas of analytics, including customer satisfaction, customer journeys, supply chains, risk management, competitive intelligence, pricing, discovery and experimentation, and facilitating big data discovery.

  6. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  7. Pengembangan Aplikasi Antarmuka Layanan Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Gede Karya

    2017-11-01

    Full Text Available In the 2016 Higher Education Competitive Grants Research (Hibah Bersaing Dikti), we successfully developed models, infrastructure and modules for a Hadoop-based big data analysis application. We also successfully developed a virtual private network (VPN) that allows integration with, and access to, the infrastructure from outside the FTIS Computer Laboratory. The infrastructure and analysis modules are now to be offered as services to small and medium enterprises (SMEs) in Indonesia. This research aims to develop an application serving as a big data analysis service interface integrated with the Hadoop cluster. The research began with finding appropriate methods and techniques for scheduling jobs, invoking ready-made Java Map-Reduce (MR) application modules, tunneling input/output, and constructing the meta-data of service requests (input) and service outputs. These methods and techniques were then developed into a web-based service application, as well as an executable module that runs in a Java and J2EE based programming environment and can access the Hadoop cluster in the FTIS Computer Lab. The resulting application can be accessed by the public through the site http://bigdata.unpar.ac.id. Based on the test results, the application functions well, in accordance with its specifications, and can be used to perform big data analysis. Keywords: web-based service, big data analysis, Hadoop, J2EE
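
A minimal sketch of the kind of service front end this record describes: queuing incoming service requests and assembling the standard `hadoop jar` invocation for a pre-built Java MR module. The paths, class name, and the toy FIFO scheduler are illustrative assumptions, not the actual application hosted at bigdata.unpar.ac.id.

```python
import shlex
from collections import deque

def build_mr_command(jar_path, main_class, input_path, output_path, extra_args=()):
    """Assemble the standard 'hadoop jar' command line for a pre-built
    Java Map-Reduce module. All concrete paths here are hypothetical."""
    return ["hadoop", "jar", jar_path, main_class, input_path, output_path, *extra_args]

class JobScheduler:
    """Minimal FIFO job scheduler, a stand-in for the web service's
    job-scheduling component described in the abstract."""
    def __init__(self):
        self.queue = deque()

    def submit(self, request):
        self.queue.append(request)
        return len(self.queue)  # position in the queue

    def next_job(self):
        return self.queue.popleft() if self.queue else None

sched = JobScheduler()
sched.submit(build_mr_command("/opt/mr/wordcount.jar", "WordCount",
                              "hdfs:///user/sme/input", "hdfs:///user/sme/output"))
job = sched.next_job()
print(" ".join(shlex.quote(p) for p in job))
```

In a real deployment the dequeued command would be executed on a gateway host with access to the cluster (e.g. via `subprocess.run`), with the output path returned to the requesting SME as the service result.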

  8. A numerical simulation of pre-big bang cosmology

    CERN Document Server

    Maharana, J P; Veneziano, Gabriele

    1998-01-01

    We analyse numerically the onset of pre-big bang inflation in an inhomogeneous, spherically symmetric Universe. Adding a small dilatonic perturbation to a trivial (Milne) background, we find that suitable regions of space undergo dilaton-driven inflation and quickly become spatially flat ($\Omega \to 1$). Numerical calculations are pushed close enough to the big bang singularity to allow cross-checks against previously proposed analytic asymptotic solutions.

  9. Circulating Tumor DNA in Predicting Outcomes in Patients With Stage IV Head and Neck Cancer or Stage III-IV Non-small Cell Lung Cancer

    Science.gov (United States)

    2018-01-12

    Metastatic Squamous Neck Cancer With Occult Primary Squamous Cell Carcinoma; Salivary Gland Squamous Cell Carcinoma; Stage IIIA Non-small Cell Lung Cancer; Stage IIIB Non-small Cell Lung Cancer; Stage IV Non-small Cell Lung Cancer; Stage IV Squamous Cell Carcinoma of the Hypopharynx; Stage IV Squamous Cell Carcinoma of the Nasopharynx; Stage IVA Salivary Gland Cancer; Stage IVA Squamous Cell Carcinoma of the Larynx; Stage IVA Squamous Cell Carcinoma of the Lip and Oral Cavity; Stage IVA Squamous Cell Carcinoma of the Oropharynx; Stage IVA Squamous Cell Carcinoma of the Paranasal Sinus and Nasal Cavity; Stage IVA Verrucous Carcinoma of the Larynx; Stage IVA Verrucous Carcinoma of the Oral Cavity; Stage IVB Salivary Gland Cancer; Stage IVB Squamous Cell Carcinoma of the Larynx; Stage IVB Squamous Cell Carcinoma of the Lip and Oral Cavity; Stage IVB Squamous Cell Carcinoma of the Oropharynx; Stage IVB Squamous Cell Carcinoma of the Paranasal Sinus and Nasal Cavity; Stage IVB Verrucous Carcinoma of the Larynx; Stage IVB Verrucous Carcinoma of the Oral Cavity; Stage IVC Salivary Gland Cancer; Stage IVC Squamous Cell Carcinoma of the Larynx; Stage IVC Squamous Cell Carcinoma of the Lip and Oral Cavity; Stage IVC Squamous Cell Carcinoma of the Oropharynx; Stage IVC Squamous Cell Carcinoma of the Paranasal Sinus and Nasal Cavity; Stage IVC Verrucous Carcinoma of the Larynx; Stage IVC Verrucous Carcinoma of the Oral Cavity; Tongue Cancer; Untreated Metastatic Squamous Neck Cancer With Occult Primary

  10. Big fish in small pond or small fish in big pond? An analysis of job mobility

    OpenAIRE

    Cardoso, Ana Rute

    2005-01-01

    The statement that individuals care for status and for their position within a hierarchy has been subject to sparse economic analysis. I check this assertion by analyzing wages and status within the firm, with status measured as the worker rank in the firm wage hierarchy. More precisely, I focus on worker mobility between jobs, to compare movers and stayers in terms of gains/losses in wage level versus gains/losses in rank position. The following questions are addressed: Upon switching firm, ...

  11. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that they become hard to process using conventional data-handling applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, conflicts, etc., we require bigger data sets compared with smaller ones. Big data is hard to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we present an observation on Hadoop architecture, the different tools used for big data, and its security issues.
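
The MapReduce model that Hadoop implements can be illustrated without Hadoop itself. The canonical word-count example below shows the three phases, map, shuffle and reduce, in plain Python; in a Hadoop cluster the same phases run in parallel across many servers, with the framework performing the shuffle.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in the document."""
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    """Shuffle: group emitted values by key, as the framework does
    between the map and reduce phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data is big", "data is data"]
pairs = list(chain.from_iterable(map_phase(d) for d in docs))
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'big': 2, 'data': 3, 'is': 2}
```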

  12. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  13. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  14. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)
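
The presentation this record proposes, expressing accident consequences as "loss of life expectancy", amounts to an expected-value calculation over accident scenarios. The sketch below shows only the arithmetic; the probabilities and consequence values are purely illustrative placeholders, not figures from the paper.

```python
# Hypothetical accident classes: (annual probability, loss of life
# expectancy in days if the accident occurs). All numbers are
# illustrative assumptions, not values from the paper.
scenarios = [
    (1e-4, 400.0),  # severe release
    (1e-3, 20.0),   # moderate release
    (1e-2, 0.5),    # minor release
]

# Expected loss of life expectancy per year of exposure: sum of
# probability times consequence over all scenarios.
expected_loss_days = sum(p * loss for p, loss in scenarios)
print(round(expected_loss_days, 4))  # 0.065
```

A figure of this kind can then be compared with everyday risks (the paper's "equivalent pattern of compulsory cigarette smoking"), which is exactly the communication device the authors advocate.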

  15. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  16. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  17. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language stays simple, everyday and narrative. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  18. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language stays simple, everyday and narrative. Volume 8 conveys, in an accessible way, the theory of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  19. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language stays simple, everyday and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  20. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then proceeds from fundamentals to applications, from the simple to the complex. Throughout, the language stays simple, everyday and narrative. Alongside an introduction, Volume 7 treats many current aspects of quantum mechanics (e.g. teleportation) and electrodynamics (e.g. electrosmog), as well as the climate problem and chaos theory.

  1. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short-range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generica...

  2. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided, in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. For the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  3. Comprehensive small animal imaging strategies on a clinical 3 T dedicated head MR-scanner; adapted methods and sequence protocols in CNS pathologies.

    Directory of Open Access Journals (Sweden)

    Deepu R Pillai

    Full Text Available BACKGROUND: Small animal models of human diseases are an indispensable aspect of pre-clinical research. Being dynamic, most pathologies demand extensive longitudinal monitoring to understand disease mechanisms, drug efficacy and side effects. These considerations often demand the concomitant development of monitoring systems with sufficient temporal and spatial resolution. METHODOLOGY AND RESULTS: This study attempts to configure and optimize a clinical 3 Tesla magnetic resonance scanner to facilitate imaging of small animal central nervous system pathologies. The hardware of the scanner was complemented by a custom-built, 4-channel phased array coil system. Extensive modification of standard sequence protocols was carried out based on tissue relaxometric calculations. Proton density differences between the gray and white matter of the rodent spinal cord along with transverse relaxation due to magnetic susceptibility differences at the cortex and striatum of both rats and mice demonstrated statistically significant differences. The employed parallel imaging reconstruction algorithms had distinct properties dependent on the sequence type and in the presence of the contrast agent. The attempt to morphologically phenotype a normal healthy rat brain in multiple planes delineated a number of anatomical regions, and all the clinically relevant sequels following acute cerebral ischemia could be adequately characterized. Changes in blood-brain-barrier permeability following ischemia-reperfusion were also apparent at a later time. Typical characteristics of intra-cerebral haemorrhage at acute and chronic stages were also visualized up to one month. Two models of rodent spinal cord injury were adequately characterized and closely mimicked the results of histological studies. In the employed rodent animal handling system a mouse model of glioblastoma was also studied with unequivocal results. 
CONCLUSIONS: The implemented customizations including extensive

  4. Comprehensive Small Animal Imaging Strategies on a Clinical 3 T Dedicated Head MR-Scanner; Adapted Methods and Sequence Protocols in CNS Pathologies

    Science.gov (United States)

    Pillai, Deepu R.; Heidemann, Robin M.; Lanz, Titus; Dittmar, Michael S.; Sandner, Beatrice; Beier, Christoph P.; Weidner, Norbert; Greenlee, Mark W.; Schuierer, Gerhard; Bogdahn, Ulrich; Schlachetzki, Felix

    2011-01-01

    Background Small animal models of human diseases are an indispensable aspect of pre-clinical research. Being dynamic, most pathologies demand extensive longitudinal monitoring to understand disease mechanisms, drug efficacy and side effects. These considerations often demand the concomitant development of monitoring systems with sufficient temporal and spatial resolution. Methodology and Results This study attempts to configure and optimize a clinical 3 Tesla magnetic resonance scanner to facilitate imaging of small animal central nervous system pathologies. The hardware of the scanner was complemented by a custom-built, 4-channel phased array coil system. Extensive modification of standard sequence protocols was carried out based on tissue relaxometric calculations. Proton density differences between the gray and white matter of the rodent spinal cord along with transverse relaxation due to magnetic susceptibility differences at the cortex and striatum of both rats and mice demonstrated statistically significant differences. The employed parallel imaging reconstruction algorithms had distinct properties dependent on the sequence type and in the presence of the contrast agent. The attempt to morphologically phenotype a normal healthy rat brain in multiple planes delineated a number of anatomical regions, and all the clinically relevant sequels following acute cerebral ischemia could be adequately characterized. Changes in blood-brain-barrier permeability following ischemia-reperfusion were also apparent at a later time. Typical characteristics of intra-cerebral haemorrhage at acute and chronic stages were also visualized up to one month. Two models of rodent spinal cord injury were adequately characterized and closely mimicked the results of histological studies. In the employed rodent animal handling system a mouse model of glioblastoma was also studied with unequivocal results. Conclusions The implemented customizations including extensive sequence protocol
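
The sequence-protocol modifications in the two records above rest on tissue relaxometry. A minimal sketch of the standard spin-echo signal model, S = S0 · (1 − e^(−TR/T1)) · e^(−TE/T2), shows how assumed gray/white matter relaxation times translate into image contrast; the T1/T2 values below are illustrative placeholders, not the rodent values the study measured.

```python
import math

def spin_echo_signal(s0, tr_ms, te_ms, t1_ms, t2_ms):
    """Standard spin-echo signal model:
    S = S0 * (1 - exp(-TR/T1)) * exp(-TE/T2)."""
    return s0 * (1.0 - math.exp(-tr_ms / t1_ms)) * math.exp(-te_ms / t2_ms)

# Illustrative relaxation times (ms); the actual rodent gray/white
# matter values must come from relaxometric measurement, as in the study.
gray = spin_echo_signal(1.0, tr_ms=3000, te_ms=80, t1_ms=1800, t2_ms=100)
white = spin_echo_signal(1.0, tr_ms=3000, te_ms=80, t1_ms=1100, t2_ms=80)

# Relative gray/white contrast for this (TR, TE) choice.
contrast = (gray - white) / (gray + white)
print(round(contrast, 3))
```

Optimizing a protocol then amounts to picking TR and TE that maximize such contrast for the measured tissue parameters, which is what the extensive sequence modifications on the clinical 3 T scanner aim at.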

  5. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 × 10^12 /cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed to a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed to a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  6. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  7. Plentiful natural gas headed for big growth in Mideast

    International Nuclear Information System (INIS)

    Hamid, S.H.; Aitani, A.M.

    1995-01-01

    Natural gas is increasingly becoming a major contributor to the industrial development of most Middle Eastern countries. Demand there will rise steeply in coming years. This is because of the abundant and growing natural gas resources in the region, the economic benefits of using local resources, as well as increased emphasis on a cleaner environment. Today, proved reserves of natural gas in the Middle East are 45 trillion cu meters (tcm), or 1,488 trillion cu ft (tcf). This is over 30% of the world's natural gas reserves. A table presents data on reserves and production of natural gas in the region. About 20% of this gross production is re-injected for oil field pressure maintenance, 13% is flared or vented, and 7% is accounted for as losses. The remaining 60% represents consumption in power generation, water desalination, petrochemicals and fertilizers production, aluminum and copper smelting, and fuel for refineries and other industries. The use of natural gas in these various industries is discussed. Thirteen tables present data on gas consumption by country and sector, power generation capacity, major chemicals derived from natural gas, and petrochemical plant capacities

  8. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  9. Flued head replacement alternatives

    International Nuclear Information System (INIS)

    Smetters, J.L.

    1987-01-01

    This paper discusses flued head replacement options. Section 2 discusses complete flued head replacement with a design that eliminates the inaccessible welds. Section 3 discusses alternate flued head support designs that can drastically reduce flued head installation costs. Section 4 describes partial flued head replacement designs. Finally, Section 5 discusses flued head analysis methods. (orig./GL)

  10. Implementing the “Big Data” Concept in Official Statistics

    Directory of Open Access Journals (Sweden)

    О. V.

    2017-02-01

    Full Text Available Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical modeling of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open economies. Statistical science should take into account such phenomena as the gig economy, the sharing economy, institutional factors, etc. The concepts of “Big Data” and open data are analyzed, and problems of implementing “Big Data” in official statistics are shown. Ways of implementing “Big Data” in the official statistics of Ukraine, through active use of the technological capabilities of mobile operators, navigation systems, surveillance cameras, social networks, etc., are presented. The possibilities of using “Big Data” in different sectors of the economy, including at the level of individual companies, are shown. The problems of storing large volumes of data are highlighted. The study shows that “Big Data” is a huge resource that should be used across the Ukrainian economy.

  11. Goniometer head

    International Nuclear Information System (INIS)

    Dzhazairov-Kakhramanov, V.; Berger, V.D.; Kadyrzhanov, K.K.; Zarifov, R.A.

    1994-01-01

    The goniometer head is an electromechanical instrument that provides independent translation of a test sample along three coordinate axes (X, Y, Z) within limits of ±8 mm and independent rotation about these axes. The instrument comprises a sample holder, a bellows component and three electrometer drives. The sample holder rotates around the X and Y axes and is installed on a central arm which rotates around the Z axis. One characteristic of this instrument is its self-contained design, which allows its use in any chamber for research in the field of radiation physics. 2 figs

  12. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Head injury in children

    International Nuclear Information System (INIS)

    Sugiura, Makoto; Mori, Nobuhiko; Yokosuka, Reiko; Yamamoto, Masaaki; Imanaga, Hirohisa

    1981-01-01

    Findings of computerized tomography (CT) in 183 cases of head injury in children were investigated, with special reference to CT findings in mild head injury. As was expected, CT findings in mild head injury fell within the normal range in almost all cases. However, abnormal findings were noticed in 4 out of 34 cases (12%) in the acute stage and 7 out of 76 cases (9%) in the chronic stage. They comprised 3 cases of a localized low-density area in the acute stage and 6 cases of mild cerebral atrophy in the chronic stage, among others. There were some cases of mild head injury in which CT findings were normal while EEG examination revealed abnormality. Also, in some cases, x-ray study demonstrated a linear skull fracture which CT failed to show. These conventional techniques thus remain useful adjuncts in the diagnosis of head injury. CT findings in cases of cerebral contusion in the acute stage were classified as follows: normal, low density, small ventricle, and ventricular and/or cisternal hemorrhage, the frequencies of incidence being 38, 17, 22 and 11%, respectively. These findings invariably converted to cerebral atrophy from 10 days to 2 months after the impact. Of the cases with intracranial hematoma revealed by CT, only 32% showed clinical signs of Araki's type IV in the acute stage, and 63% showed no neurological deficits, that is, Araki's types I and II. A case of extreme diffuse cerebral atrophy which followed acute subdural hematoma, caused by tearing of bridging veins without cortical contusion, is presented. (author)

  14. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and advance education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education, and critically explores the perils of applying a data-driven approach to education. Despite the claimed value of the...

  15. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  16. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Finally, the methodology and working of a system that will use this data are briefly described.

  17. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

Full Text Available Big data refers to data sets so large or complex that they exceed the storage capacity and processing power of traditional data tools. The size of big data is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; on social networking sites, for example, the amount of data produced by people grows rapidly every year. Big data is not only data: it has become a complete subject encompassing various tools, techniques and frameworks, and it covers the growth and evolution of data both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of massive scale. Such data are difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to capture, organize and resolve the various types of data. In this paper we describe applications, problems and tools of big data and give an overview of the field.

  18. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general theory of relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  19. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable precedent was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big is Earth expands on the Eratosthenes Project through the online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database that collects data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
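The measurement described above reduces to simple trigonometry: the difference between the noon shadow angles at two sites is the arc of the Earth's circumference separating them. A minimal sketch in Python (the stick height, the 7.2 degree angle and the 800 km baseline are illustrative Eratosthenes-style numbers, not data from the project):

```python
import math

def shadow_angle_deg(stick_height_m, shadow_length_m):
    """Angle of the Sun from the vertical, measured from a stick's noon shadow."""
    return math.degrees(math.atan(shadow_length_m / stick_height_m))

def circumference_km(angle_a_deg, angle_b_deg, baseline_km):
    """Earth's circumference from two shadow angles and the distance between sites."""
    arc_deg = abs(angle_a_deg - angle_b_deg)  # degrees of arc separating the sites
    return 360.0 / arc_deg * baseline_km

# No shadow at one site (Sun overhead), a 7.2 degree shadow ~800 km due north.
angle_a = shadow_angle_deg(1.0, 0.0)
angle_b = shadow_angle_deg(1.0, math.tan(math.radians(7.2)))
print(circumference_km(angle_a, angle_b, 800.0))  # roughly 40000 km
```

Students at two collaborating schools only need to share their angle and the distance between them to reproduce the calculation.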

  20. Effect of Osteonecrosis Intervention Rod Versus Core Decompression Using Multiple Small Drill Holes on Early Stages of Necrosis of the Femoral Head: A Prospective Study on a Series of 60 Patients with a Minimum 1-Year-Follow-Up.

    Science.gov (United States)

    Miao, Haixiong; Ye, Dongping; Liang, Weiguo; Yao, Yicun

    2015-01-01

Conventional core decompression (CD) used 10 mm drill holes and was associated with a lack of structural support. Alternative methods, such as a tantalum implant, small drill holes, and biological treatment, were therefore developed to prevent deterioration of the joint. Performing CD through multiple 3.2 mm drill holes could reduce the risk of femoral neck fracture, and partial weight bearing was allowed. This study aimed to evaluate the effect of an osteonecrosis intervention rod versus core decompression using multiple small drill holes on early stages of necrosis of the femoral head. From January 2011 to January 2012, 60 patients undergoing surgery for osteonecrosis with core decompression were randomly assigned to 2 groups based on the type of core decompression used: (1) 30 osteonecrosis patients (16 hips at Steinberg stage I, 20 hips at Steinberg stage II) were treated with a porous tantalum rod insertion; the diameter of the drill hole for the intervention rod was 10 mm. (2) 30 osteonecrosis patients (14 hips at Steinberg stage I, 20 hips at Steinberg stage II) were treated with core decompression using five drill holes on the lateral femur; the diameter of each hole was 3.2 mm. The average age of the patients was 32.6 years (20-45 years) and the average follow-up was 25.6 months (12-28 months) in the rod implanted group. The average age of the patients was 35.2 years (22-43 years) and the average follow-up was 26.3 months (12-28 months) in the small drill holes group. The average surgical time was 40 min, and the mean volume of blood loss was 30 ml in both surgical groups. The average Harris score improved from 56.2 ± 7.1 preoperatively to 80.2 ± 11.4 at the last follow-up in the rod implanted group (p < 0.05), and improved significantly in the small drill holes group as well (p < 0.05). No significant difference was observed in radiographic stage between the two groups. There was no favourable result on the outcome of a tantalum intervention implant compared to multiple small drill holes. CD via

  1. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  2. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  3. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

Full Text Available The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, the healthcare system and others all use piles of data, which are further used to create reports that ensure continuity of the services they have to offer. The process behind the results that these entities request represents a challenge for software developers and for companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  4. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than the data itself, realizing that there may be only few generic processes involved in this, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  5. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  6. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  7. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  8. Management of non-traumatic avascular necrosis of the femoral head-a comparative analysis of the outcome of multiple small diameter drilling and core decompression with fibular grafting.

    Science.gov (United States)

    Mohanty, S P; Singh, K A; Kundangar, R; Shankar, V

    2017-04-01

The purpose of this study was to compare the clinical and radiological outcomes of multiple small diameter drilling and core decompression with fibular strut grafting in the management of non-traumatic avascular necrosis (AVN) of the femoral head. Outcomes of patients with AVN treated by multiple small diameter drilling (group 1) were compared retrospectively with patients treated by core decompression and fibular grafting (group 2). Harris hip score (HHS) was used to assess the clinical status pre- and postoperatively. Modified Ficat and Arlet classification was used to assess the radiological stage pre- and postoperatively. Forty-six patients (68 hips) were included in this study. Group 1 consisted of 33 hips, and group 2 consisted of 35 hips. In stages I and IIB, there was no statistically significant difference in the final HHS between the two groups. However, in stages IIA and III, hips in group 2 had a better final HHS (P < 0.05). In terms of radiographic progression, there was no statistical difference between hips in stages I, IIA and IIB. However, in stage III, hips belonging to group 2 had better results (P < 0.05). Kaplan-Meier survivorship analysis showed a better outcome in group 2 in stage III (P < 0.05). Hips with AVN in the precollapse stage can be salvaged by core decompression with or without fibular grafting. Multiple small diameter drilling is relatively simple and carries less morbidity, and is hence preferred in stages I and II. However, in stage III disease, core decompression with fibular strut grafting gives better results.
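The Kaplan-Meier survivorship analysis mentioned above uses the standard product-limit estimator. A minimal sketch in Python, where the follow-up times and event flags are made-up illustrations rather than data from this study (in practice the "event" would be something like radiographic progression or conversion to arthroplasty):

```python
def kaplan_meier(times, events):
    """Product-limit survival curve.
    times:  follow-up time of each hip (e.g. months)
    events: 1 if the endpoint occurred at that time, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)  # number still at risk at time t
        if d > 0:
            surv *= 1.0 - d / n
            curve.append((t, surv))
    return curve

# Hypothetical follow-up: events at 6 and 12 months, two hips censored.
print(kaplan_meier([6, 12, 12, 18], [1, 0, 1, 0]))
```

Censored observations (event flag 0) still count toward the at-risk denominator until their follow-up ends, which is what distinguishes this estimator from a naive failure fraction.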

  9. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, 3He, 4He, and 7Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and 4He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, 7Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed.
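Propagating cross-section uncertainties into abundance uncertainties, as described above, is commonly done by Monte Carlo sampling of the reaction rates. A toy sketch in Python: abundance_model is a made-up stand-in for a real BBN reaction-network calculation, and the rate's mean and 5% spread are illustrative values, not measured inputs.

```python
import random

def abundance_model(rate):
    # Stand-in for a full BBN network: a smooth, monotone dependence of a
    # light-element yield on a single normalized reaction rate (hypothetical).
    return 2.5e-5 * rate ** 0.7

def propagate(rate_mean, rate_sigma, n_samples=10000, seed=0):
    """Sample the rate from its (assumed Gaussian) uncertainty and collect
    the induced mean and spread of the predicted abundance."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        rate = max(rng.gauss(rate_mean, rate_sigma), 0.0)  # rates are non-negative
        samples.append(abundance_model(rate))
    mean = sum(samples) / n_samples
    sd = (sum((s - mean) ** 2 for s in samples) / (n_samples - 1)) ** 0.5
    return mean, sd

mean, sd = propagate(1.0, 0.05)  # a 5% rate uncertainty...
print(mean, sd)                  # ...mapped into an abundance uncertainty
```

Real analyses repeat this over every reaction in the network, so correlations between rates also propagate into the predicted abundances.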

  10. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Data science methods that are emerging ensure that these data be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives, and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  11. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  12. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

Big data has not only led to challenging technical questions; it also comes with all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  13. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

Agriculture's external conditions and competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized and more concentrated. Big business will be a dominant development in Danish agriculture - but not the only one...

  14. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information and used to make 'smart' decisions.

  15. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

On 2 June 2013, CERN inaugurates the Passport to the Big Bang project at a big public event. Poster and programme. On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and Programme.

  16. Envisioning the future of 'big data' biomedicine.

    Science.gov (United States)

    Bui, Alex A T; Van Horn, John Darrell

    2017-05-01

Through the increasing availability of more efficient data collection procedures, biomedical scientists are now confronting ever larger sets of data, often finding themselves struggling to process and interpret what they have gathered, even as still more data continues to accumulate. This torrent of biomedical information necessitates creative thinking about how the data are being generated, how they might be best managed and analyzed, and eventually how they can be transformed into further scientific understanding for improving patient care. Recognizing this as a major challenge, the National Institutes of Health (NIH) has spearheaded the "Big Data to Knowledge" (BD2K) program - the agency's most ambitious biomedical informatics effort ever undertaken to date. In this commentary, we describe how the NIH has taken on "big data" science head-on, how a consortium of leading research centers is developing the means for handling large-scale data, and how such activities are being marshalled for the training of a new generation of biomedical data scientists. All in all, the NIH BD2K program seeks to position data science at the heart of 21st-century biomedical research. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  18. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data imply also specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges, which ultimately seems to be of greater importance than the sheer data volume.

  19. Fixing the Big Bang Theory's Lithium Problem

    Science.gov (United States)

    Kohler, Susanna

    2017-02-01

How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the cosmological lithium problem. Have scientists now found a solution? Too Much Lithium: In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble spot remains: the abundance of lithium. [Figure caption: The arrows show the primary reactions involved in Big Bang nucleosynthesis, and the flux ratios predicted by the authors' model are given on the right. Synthesizing primordial elements is complicated! (Hou et al. 2017)] According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe's existence. This produced most of the universe's helium and small amounts of other light nuclides, including deuterium and lithium. But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the cosmological lithium problem, and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful. In a recent publication led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences) and advisor Jianjun He (Institute of Modern Physics National Astronomical Observatories, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem. [Figure caption: Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe.]

  20. The big data telescope

    International Nuclear Information System (INIS)

    Finkel, Elizabeth

    2017-01-01

    On a flat, red mulga plain in the outback of Western Australia, preparations are under way to build the most audacious telescope astronomers have ever dreamed of - the Square Kilometre Array (SKA). Next-generation telescopes usually aim to double the performance of their predecessors. The Australian arm of SKA will deliver a 168-fold leap on the best technology available today, to show us the universe as never before. It will tune into signals emitted just a million years after the Big Bang, when the universe was a sea of hydrogen gas, slowly percolating with the first galaxies. Their starlight illuminated the fledgling universe in what is referred to as the “cosmic dawn”.

  1. The Big Optical Array

    International Nuclear Information System (INIS)

    Mozurkewich, D.; Johnston, K.J.; Simon, R.S.

    1990-01-01

This paper describes the design and the capabilities of the Naval Research Laboratory Big Optical Array (BOA), an interferometric optical array for high-resolution imaging of stars, stellar systems, and other celestial objects. There are four important differences between the BOA design and the design of the Mark III Optical Interferometer on Mount Wilson (California): a long passive delay line will be used in BOA to do most of the delay compensation, so that the fast delay line will have a very short travel; the beam combination in BOA will be done in triplets, to allow measurement of closure phase; the same light will be used for both star and fringe tracking; and the fringe tracker will use several wavelength channels.

  2. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  3. Nonstandard big bang models

    International Nuclear Information System (INIS)

    Calvao, M.O.; Lima, J.A.S.

    1989-01-01

The usual FRW hot big-bang cosmologies have been generalized by considering the equation of state ρ = Anm + (γ-1)^-1 p, where m is the rest mass of the fluid particles and A is a dimensionless constant. Explicit analytic solutions are given for the flat case (ε=0). For large cosmological times these extended models behave as the standard Einstein-de Sitter universes regardless of the values of A and γ. Unlike the usual FRW flat case, the deceleration parameter q is a time-dependent function, and its present value, q ≅ 1, obtained from the luminosity distance versus redshift relation, may be fitted by taking, for instance, A=1 and γ = 5/3 (monatomic relativistic gas with mc^2 >> k_B T). In all cases the universe cools obeying the same temperature law as the FRW models, and it is shown that the age of the universe is only slightly modified. (author) [pt

  4. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  5. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  6. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  7. DPF Big One

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, though Yagil could show one intriguing event of a type expected from the heaviest quark.

  8. DPF Big One

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1993-01-15

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, though Yagil could show one intriguing event of a type expected from the heaviest quark.

  9. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe.

  10. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered a proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  11. determination of head for small hydropower development

    African Journals Online (AJOL)

    eobe

    volumetric flow of 31.73 m³/s, the power potential of the River Ethiope was placed at 2.43 MW. ... water projects such as water supply, irrigation and ... f is the friction factor (unit-less), L is the length of the pipe (m), D ...
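The snippet above alludes to the standard small-hydro relations (Darcy-Weisbach friction loss and P = ρgQH). A hedged sketch: the flow value 31.73 m³/s comes from the record, while the pipe geometry, friction factor, gross head, and efficiency below are invented purely for illustration.

```python
import math

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def friction_head_loss(f, L, D, Q):
    """Darcy-Weisbach head loss h_f = f * (L/D) * v^2 / (2g) for a full pipe.

    f: friction factor (unit-less), L: pipe length (m), D: pipe diameter (m),
    Q: volumetric flow (m^3/s).
    """
    area = math.pi * D ** 2 / 4.0   # pipe cross-sectional area, m^2
    v = Q / area                    # mean flow velocity, m/s
    return f * (L / D) * v ** 2 / (2 * G)

def hydropower(Q, gross_head, h_loss, efficiency=0.85):
    """Power potential P = rho * g * Q * H_net * eta, in watts."""
    return RHO * G * Q * (gross_head - h_loss) * efficiency

Q = 31.73                                               # m^3/s (from the record)
h_f = friction_head_loss(f=0.02, L=120.0, D=2.5, Q=Q)   # assumed penstock
P = hydropower(Q, gross_head=10.0, h_loss=h_f)          # assumed gross head
print(f"friction loss = {h_f:.2f} m, power = {P / 1e6:.2f} MW")
```

With these assumed inputs the net head comes out near 8 m and the power near 2 MW, the same order as the 2.43 MW quoted for the River Ethiope.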

  12. Head Impact Laboratory (HIL)

    Data.gov (United States)

    Federal Laboratory Consortium — The HIL uses testing devices to evaluate vehicle interior energy attenuating (EA) technologies for mitigating head injuries resulting from head impacts during mine/...

  13. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  14. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets. Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute Engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  15. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of the Neuroticism factor. They found associations between empathy and the domains of Openness, Agreeableness, Conscientiousness and Extraversion. In our data, women likewise score significantly higher on both empathy and the Big Five ...

  16. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  17. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article covers the analysis of ‘Big Data’, which has been discussed over the last 10 years. The reasons and factors behind the issue are revealed. It is shown that the factors creating the ‘Big Data’ issue have existed for quite a long time and, from time to time, have caused informational barriers. Such barriers were successfully overcome through science and technology. The conducted analysis classifies the ‘Big Data’ issue as a form of information barrier. The issue can therefore be solved correctly, and it encourages the development of scientific and computational methods.

  18. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  19. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  20. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  1. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  2. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  3. Heading and head injuries in soccer.

    Science.gov (United States)

    Kirkendall, D T; Jordan, S E; Garrett, W E

    2001-01-01

    In the world of sports, soccer is unique because of the purposeful use of the unprotected head for controlling and advancing the ball. This skill obviously places the player at risk of head injury and the game does carry some risk. Head injury can be a result of contact of the head with another head (or other body parts), the ground, a goal post, other unknown objects or even the ball. Such impacts can lead to contusions, fractures, eye injuries, concussions or even, in rare cases, death. Coaches, players, parents and physicians are rightly concerned about the risk of head injury in soccer. Current research shows that selected soccer players have some degree of cognitive dysfunction. It is important to determine the reasons behind such deficits. Purposeful heading has been blamed, but a closer look at the studies that focus on heading has revealed methodological concerns that question the validity of blaming purposeful heading of the ball. The player's history and age (did they play when the ball was leather and could absorb significant amounts of water), alcohol intake, drug intake, learning disabilities, concussion definition and control group use/composition are all factors that cloud the ability to blame purposeful heading. What does seem clear is that a player's history of concussive episodes is a more likely explanation for cognitive deficits. While it is likely that the subconcussive impact of purposeful heading is a doubtful factor in the noted deficits, it is unknown whether multiple subconcussive impacts might have some lingering effects. In addition, it is unknown whether the noted deficits have any effect on daily life. Proper instruction in the technique is critical because if the ball contacts an unprepared head (as in accidental head-ball contacts), the potential for serious injury is possible. To further our understanding of the relationship of heading, head injury and cognitive deficits, we need to: learn more about the actual impact of a ball on the

  4. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014: Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on the basis of existing knowledge and experience with such types of estimation problems.
The second task requires the computing power: explore the hyperspace to
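As a minimal illustration of the bootstrap confidence intervals mentioned above, here is a generic percentile bootstrap for the mean of a sample. This is a textbook sketch, not the cited book's specific methods, and the "climate anomaly" numbers are invented.

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic.

    Resamples the data with replacement n_boot times, evaluates the
    statistic on each resample, and returns the (alpha/2, 1 - alpha/2)
    percentiles of the bootstrap distribution.
    """
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])  # one resample
        for _ in range(n_boot)
    )
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Toy 'climate anomaly' series (assumed data, for illustration only)
sample = [0.1, 0.3, -0.2, 0.5, 0.4, 0.0, 0.2, 0.6, -0.1, 0.3]
lo, hi = bootstrap_ci(sample)
print(f"95% CI for the mean: [{lo:.3f}, {hi:.3f}]")
```

The interval brackets the sample mean (0.21 here); for serially dependent climate series, block-resampling variants would be needed instead of this i.i.d. resampling.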

  5. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just

  6. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome, such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples.
All in

  7. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced in big data analysis and visualization have been identified, and technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been carried out, and the functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  8. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  9. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  10. Mountain big sagebrush age distribution and relationships on the northern Yellowstone Winter Range

    Science.gov (United States)

    Carl L. Wambolt; Trista L. Hoffman

    2001-01-01

    This study was conducted within the Gardiner Basin, an especially critical wintering area for native ungulates utilizing the Northern Yellowstone Winter Range. Mountain big sagebrush plants on 33 sites were classified as large (≥22 cm canopy cover), small (

  11. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  12. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  13. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8)×10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.
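The quoted band is easy to apply mechanically. A small sketch: the interval endpoints come from the abstract, while the "measured" value and the function name are hypothetical.

```python
# 95% C.L. band on primordial D/H from the abstract: (3.5 +2.7/-1.8) x 10^-5
D_H_BEST = 3.5e-5
D_H_LOW = D_H_BEST - 1.8e-5   # lower edge of the 95% band
D_H_HIGH = D_H_BEST + 2.7e-5  # upper edge of the 95% band

def consistent_with_sbbn(d_to_h):
    """True if a measured D/H ratio lies inside the predicted 95% band."""
    return D_H_LOW <= d_to_h <= D_H_HIGH

# Hypothetical high-redshift QSO absorption measurement (illustrative only)
measured = 2.5e-5
print(consistent_with_sbbn(measured))
```

A QSO measurement falling outside this band would be in tension with standard big bang nucleosynthesis, which is the test the abstract anticipates.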

  14. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  15. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  16. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  17. Head Trauma: First Aid

    Science.gov (United States)

    Head trauma: First aid. By Mayo Clinic Staff. Most head trauma involves injuries that are minor and don't require ... Original article: http://www.mayoclinic.org/first-aid/first-aid-head-trauma/basics/ART-20056626

  18. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the “big data” term are analyzed. The article proposes and describes the elements of a generalized formal model of big data, analyzes the peculiarities of applying the proposed model components, and describes the fundamental differences between Big Data technology and business analytics. Big Data is supported by the distributed file system Google File System ...

  19. Implementing the “Big Data” Concept in Official Statistics

    OpenAIRE

    О. V.

    2017-01-01

    Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open ec...

  20. Coastal Flooding in Florida's Big Bend Region with Application to Sea Level Rise Based on Synthetic Storms Analysis

    Directory of Open Access Journals (Sweden)

    Scott C. Hagen; Peter Bacopoulos

    2012-01-01

    Full Text Available Flooding is examined by comparing maximum envelopes of water against the 0.2% (= 1-in-500-year return-period flooding surface generated as part of revising the Federal Emergency Management Agency's flood insurance rate maps for Franklin, Wakulla, and Jefferson counties in Florida's Big Bend Region. The analysis condenses the number of storms to a small fraction of the original 159 used in production. The analysis is performed by assessing which synthetic storms contributed to inundation extent (the extent of inundation into the floodplain, coverage (the overall surface area of the inundated floodplain and the spatially variable 0.2% flooding surface. The results are interpreted in terms of storm attributes (pressure deficit, radius to maximum winds, translation speed, storm heading, and landfall location and the physical processes occurring within the natural system (storm surge and waves; both are contextualized against existing and new hurricane scales. The approach identifies what types of storms and storm attributes lead to what types of inundation, as measured in terms of extent and coverage, in Florida's Big Bend Region and provides a basis for identifying a select subset of synthetic storms for studying the impact of sea level rise. The sea level rise application provides a clear contrast between a dynamic approach and a static approach.
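As a side note on the 0.2% (1-in-500-year) figure: a return period T corresponds to an annual exceedance probability of 1/T, and the chance of at least one exceedance over a horizon of n independent years is 1 − (1 − 1/T)^n. A minimal illustration of this relation (not part of the study's methodology; the 30-year horizon is an arbitrary example):

```python
# Annual exceedance probability for a T-year return-period event,
# and the chance of at least one exceedance over a planning horizon.

def annual_exceedance_probability(return_period_years: float) -> float:
    """P(event exceeded in any given year) for a T-year return period."""
    return 1.0 / return_period_years

def exceedance_over_horizon(return_period_years: float, horizon_years: int) -> float:
    """P(at least one exceedance in `horizon_years` independent years)."""
    p = annual_exceedance_probability(return_period_years)
    return 1.0 - (1.0 - p) ** horizon_years

# The 0.2% flooding surface corresponds to a 500-year return period.
p_annual = annual_exceedance_probability(500)   # 0.002
p_30yr = exceedance_over_horizon(500, 30)       # roughly a 5.8% chance over 30 years
```

The second function makes clear why "1-in-500-year" events are not negligible over typical planning horizons.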

  1. Head and neck cancer

    International Nuclear Information System (INIS)

    Vogl, S.E.

    1988-01-01

    This book contains 10 chapters. Some of the titles are: Combined Surgical Resection and Irradiation for Head and Neck Cancers; Analysis of Radiation Therapy Oncology Group Head and Neck Database: Identification of Prognostic Factors and the Re-evaluation of American Joint Committee Stages; Combined Modality Approach to Head and Neck Cancer; Induction Combination Chemotherapy of Regionally Advanced Head and Neck Cancer; and Outcome after Complete Remission to Induction Chemotherapy in Head and Neck Cancer

  2. Does Implementation of Big Data Analytics Improve Firms’ Market Value? Investors’ Reaction in Stock Market

    Directory of Open Access Journals (Sweden)

    Hansol Lee

    2017-06-01

    Full Text Available Recently, due to the development of social media, multimedia, and the Internet of Things (IoT), various types of data have increased. As existing data analytics tools cannot cover this huge volume of data, big data analytics has become one of the emerging technologies for business today. Because big data analytics is a relatively recent development, the present study investigates the impact of implementing big data analytics from a short-term perspective. We used an event study methodology to investigate the changes in stock price caused by announcements of investment in big data analytics solutions. A total of 54 investment announcements of firms publicly traded on NASDAQ and NYSE from 2010 to 2015 were collected. Our results empirically demonstrate that the announcement of a firm's investment in a big data solution leads to positive stock market reactions. In addition, we found that investments in small vendors' solutions with industry-oriented functions tend to result in higher abnormal returns than investments in big vendors' solutions with general functions. Finally, our results also suggest that stock market investors evaluate big data analytics investments of big firms more highly than those of small firms.
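The event-study logic the abstract describes can be sketched as follows: fit a market model on a pre-event estimation window, then measure abnormal returns (actual minus model-predicted) around the announcement and sum them into a cumulative abnormal return (CAR). A minimal sketch with fabricated returns; the window lengths, seed, and injected announcement effect are illustrative assumptions, not the study's data or parameters:

```python
import numpy as np

def abnormal_returns(stock_ret, market_ret, est_end, event_start, event_end):
    """Market-model event study: OLS alpha/beta on the estimation window,
    abnormal return = actual - (alpha + beta * market) in the event window."""
    beta, alpha = np.polyfit(market_ret[:est_end], stock_ret[:est_end], 1)
    expected = alpha + beta * market_ret[event_start:event_end]
    ar = stock_ret[event_start:event_end] - expected
    return ar, ar.sum()  # daily abnormal returns and their sum (CAR)

rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 120)                      # synthetic market returns
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.005, 120)   # stock tracks market
stock[110] += 0.03  # hypothetical positive reaction on the announcement day

ar, car = abnormal_returns(stock, market, est_end=100, event_start=105, event_end=115)
```

With the injected jump at day 110, the largest abnormal return falls on the announcement day (index 5 of the ten-day event window).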

  3. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  4. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  5. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  6. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  7. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  8. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  9. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally spaced levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  10. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data with highlighting the big data analytics in medicine and healthcare. Big data characteristics: value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers integration and analysis of large amount of complex heterogeneous data such as various - omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues about big data privacy and security. Regarding big data characteristics, some directions of using suitable and promising open-source distributed data processing software platform are given.

  11. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  12. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  13. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  14. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term and unfamiliar to many. The research is motivated by the scarcity of studies on the topic, the complete absence of Finnish-language research, and the potentially essential role of social customer relationship management in business in the future. Studies on big data often concentrate on its technical side rather than its applicat...

  15. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is intriguing but problematic with respect to causality, atheism, and stereotypes about hunter-gatherers.

  16. Big Data and Social Media

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    A critical analysis of the "keep everything" Big Data era, the impact on our lives of the information, at first glance "convenient for future use" that we make known about ourselves on the network. NB! The lecture will be recorded like all Academic Training lectures. Lecturer's biography: Father of the Internet, see https://internethalloffame.org/inductees/vint-cerf or https://en.wikipedia.org/wiki/Vint_Cerf The video on slide number 9 is from page https://www.gapminder.org/tools/#$state$time$value=2018&value;;&chart-type=bubbles   Keywords: Big Data, Internet, History, Applications, tools, privacy, technology, preservation, surveillance, google, Arpanet, CERN, Web  

  17. Baryon symmetric big bang cosmology

    International Nuclear Information System (INIS)

    Stecker, F.W.

    1978-01-01

    It is stated that the framework of baryon symmetric big bang (BSBB) cosmology offers our greatest potential for deducing the evolution of the Universe because its physical laws and processes involve the minimum number of arbitrary assumptions about initial conditions in the big bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the Universe and how galaxies and galaxy clusters are formed. BSBB cosmology also provides the only acceptable explanation at present for the origin of the cosmic γ-ray background radiation. (author)

  18. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  19. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by connecting, processing, and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for making effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions toward their solution are also presented.

  20. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  1. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long-sought explanation of the big bang and perhaps even opens a window into a time before the big bang: Space itself may have come from an earlier collapsing universe tha...

  2. Gender Differences in Personality across the Ten Aspects of the Big Five.

    Science.gov (United States)

    Weisberg, Yanna J; Deyoung, Colin G; Hirsh, Jacob B

    2011-01-01

    This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness, and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscientiousness, the gender differences were found to diverge at the aspect level, rendering them either small or undetectable at the Big Five level. These findings clarify the nature of gender differences in personality and highlight the utility of measuring personality at the aspect level.
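The paper's key finding, that opposite-signed differences in a domain's two aspects can cancel and become undetectable at the Big Five level, can be illustrated numerically. A toy sketch with fabricated scores (not the study's data), using Cohen's d as the standardized mean difference; the effect size of 0.4 per aspect is an arbitrary illustrative choice:

```python
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference between two groups (pooled SD)."""
    pooled = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (a.mean() - b.mean()) / pooled

rng = np.random.default_rng(42)
n = 5000
# Two aspects of one domain, with opposite-direction gender differences
# (e.g. Extraversion: women higher on Enthusiasm, men higher on Assertiveness).
women_asp1, men_asp1 = rng.normal(0.2, 1, n), rng.normal(-0.2, 1, n)
women_asp2, men_asp2 = rng.normal(-0.2, 1, n), rng.normal(0.2, 1, n)

d_asp1 = cohens_d(women_asp1, men_asp1)   # clearly positive (~0.4)
d_asp2 = cohens_d(women_asp2, men_asp2)   # clearly negative (~-0.4)

# Domain score = mean of the two aspects: the two differences cancel,
# so the domain-level d is near zero even though both aspects differ.
d_domain = cohens_d((women_asp1 + women_asp2) / 2, (men_asp1 + men_asp2) / 2)
```

The sketch shows why measuring only at the Big Five level can hide real aspect-level differences.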

  3. Addressing Data Veracity in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Aman, Saima [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Computer Science; Chelmis, Charalampos [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering; Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering

    2014-10-27

    Big data applications such as in smart electric grids, transportation, and remote environment monitoring involve geographically dispersed sensors that periodically send back information to central nodes. In many cases, data from sensors is not available at central nodes at a frequency that is required for real-time modeling and decision-making. This may be due to physical limitations of the transmission networks, or due to consumers limiting frequent transmission of data from sensors located at their premises for security and privacy concerns. Such scenarios lead to a partial data problem and raise the issue of data veracity in big data applications. We describe a novel solution to the problem of making short-term predictions (up to a few hours ahead) in the absence of real-time data from sensors in the Smart Grid. A key implication of our work is that by using real-time data from only a small subset of influential sensors, we are able to make predictions for all sensors. We thus reduce the communication complexity involved in transmitting sensory data in Smart Grids. We use real-world electricity consumption data from smart meters to empirically demonstrate the usefulness of our method. Our dataset consists of data collected at 15-min intervals from 170 smart meters in the USC Microgrid for 7 years, totaling 41,697,600 data points.
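The core idea, predicting readings for all sensors from real-time data of a small influential subset, can be sketched as a regression map fitted on historical data. A minimal illustration with synthetic consumption-like data; the sensor counts, latent-factor data model, and least-squares estimator are assumptions for the sketch, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_train, n_test = 20, 500, 50

# Synthetic history: every sensor is a noisy mixture of 3 latent demand factors,
# so a few sensors carry most of the information about all the others.
factors = rng.normal(size=(n_train + n_test, 3))
mixing = rng.uniform(0.5, 1.5, size=(3, n_sensors))
readings = factors @ mixing + 0.1 * rng.normal(size=(n_train + n_test, n_sensors))

influential = [0, 1, 2]  # the small subset still reporting in real time
X_train, X_test = readings[:n_train, influential], readings[n_train:, influential]
Y_train, Y_test = readings[:n_train], readings[n_train:]

# Least-squares map from the influential subset to all sensors,
# fitted on history and applied when only the subset reports.
W, *_ = np.linalg.lstsq(X_train, Y_train, rcond=None)
Y_pred = X_test @ W

rmse = np.sqrt(np.mean((Y_pred - Y_test) ** 2))  # small relative to signal scale
```

Here three reporting sensors suffice to reconstruct all twenty, which is the communication-complexity saving the abstract describes.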

  4. The hot big bang and beyond

    Energy Technology Data Exchange (ETDEWEB)

    Turner, M.S. [Departments of Physics and of Astronomy & Astrophysics, Enrico Fermi Institute, The University of Chicago, Chicago, Illinois 60637-1433 (United States)]|[NASA/Fermilab Astrophysics Center, Fermi National Accelerator Laboratory, Batavia, Illinois 60510-0500 (United States)

    1995-08-01

    The hot big-bang cosmology provides a reliable accounting of the Universe from about 10⁻² sec after the bang until the present, as well as a robust framework for speculating back to times as early as 10⁻⁴³ sec. Cosmology faces a number of important challenges; foremost among them are determining the quantity and composition of matter in the Universe and developing a detailed and coherent picture of how structure (galaxies, clusters of galaxies, superclusters, voids, great walls, and so on) developed. At present there is a working hypothesis, cold dark matter, which is based upon inflation and which, if correct, would extend the big bang model back to 10⁻³² sec and cast important light on the unification of the forces. Many experiments and observations, from CBR anisotropy experiments to Hubble Space Telescope observations to experiments at Fermilab and CERN, are now putting the cold dark matter theory to the test. At present it appears that the theory is viable only if the Hubble constant is smaller than current measurements indicate (around 30 km s⁻¹ Mpc⁻¹), or if the theory is modified slightly, e.g., by the addition of a cosmological constant, a small admixture of hot dark matter (5 eV "worth of neutrinos"), more relativistic particles, or a tilted spectrum of density perturbations.

  5. Head injury - first aid

    Science.gov (United States)

    ... medlineplus.gov/ency/article/000028.htm Head injury - first aid To use the sharing features on this page, ... a concussion can range from mild to severe. First Aid Learning to recognize a serious head injury and ...

  6. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... the head uses special x-ray equipment to help assess head injuries, severe headaches, dizziness, and other ... aneurysm, bleeding, stroke and brain tumors. It also helps your doctor to evaluate your face, sinuses, and ...

  7. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... for Brain Tumors Radiation Therapy for Head and Neck Cancer Others American Stroke Association National Stroke Association ... Computer Tomography (CT) Safety During Pregnancy Head and Neck Cancer X-ray, Interventional Radiology and Nuclear Medicine ...

  8. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ... Safety Images related to Computed Tomography (CT) - Head Videos related to Computed Tomography (CT) - Head Sponsored by ...

  9. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... the limitations of CT Scanning of the Head? What is CT Scanning of the Head? Computed tomography, ... than regular radiographs (x-rays). top of page What are some common uses of the procedure? CT ...

  10. Big Cities, Big Problems: Reason for the Elderly to Move?

    NARCIS (Netherlands)

    Fokkema, T.; de Jong-Gierveld, J.; Nijkamp, P.

    1996-01-01

    In many European countries, data on geographical patterns of internal elderly migration show that the elderly (55+) are more likely to leave than to move to the big cities. Besides emphasising the attractive features of the destination areas (pull factors), it is often assumed that this negative

  11. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  12. Ocean Networks Canada's "Big Data" Initiative

    Science.gov (United States)

    Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.

    2013-12-01

    Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system that provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: study of the impact of climate change on the ocean; the exploration and understanding of the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution where we deliver simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing. Test results guide future tool design and development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and

  13. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data .... Concept Drifting: concept drifting mean the classifier .... transactions generated by a prefix tree structure. EstDec ...

  14. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  15. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  16. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  17. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  18. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which present challenges as to how the city’s functions are seen and

  19. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  20. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first review the main basic concepts of equal treatment and discrimination (section 6.2). We then consider the Dutch and European legal frameworks on non-discrimination (sections 6.3-6.5) and how those rules should be applied to big

  1. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  2. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presents the basic concepts of Big Data and the new field to which it gave rise, Data Science. Within Data Science, the notion of dimensionality reduction of data is discussed and exemplified.

  3. The Case for "Big History."

    Science.gov (United States)

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  4. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  5. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller (Lincoln ...). Sampling methods fall into several categories: these include edge sampling methods, where edges are selected by a predetermined criterion, and snowball sampling methods, where algorithms start ... process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and ...
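    The snowball-sampling category named in this record can be illustrated with a minimal sketch. Everything below is illustrative rather than taken from the paper: the function name `snowball_sample`, the toy adjacency list, and the parameters `k` and `max_nodes` are assumptions.

    ```python
    from collections import deque

    def snowball_sample(adjacency, seeds, k, max_nodes):
        """Breadth-first snowball sample: starting from seed nodes, repeatedly
        pull in up to k unseen neighbours of each sampled node until max_nodes."""
        sampled = set(seeds)
        frontier = deque(seeds)
        while frontier and len(sampled) < max_nodes:
            node = frontier.popleft()
            # take at most k previously unseen neighbours of this node
            new = [n for n in adjacency.get(node, []) if n not in sampled][:k]
            for n in new:
                if len(sampled) >= max_nodes:
                    break
                sampled.add(n)
                frontier.append(n)
        return sampled

    # toy graph as an adjacency list
    graph = {
        0: [1, 2, 3],
        1: [0, 4],
        2: [0, 5],
        3: [0],
        4: [1],
        5: [2],
    }
    print(sorted(snowball_sample(graph, seeds=[0], k=2, max_nodes=4)))  # → [0, 1, 2, 4]
    ```

    The defining feature of snowball designs shows up here: membership depends on links from already-sampled nodes rather than on uniform selection, which is why such samples are cheap to grow but biased toward well-connected vertices.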

  6. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  7. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on topics such as the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  8. Big Math for Little Kids

    Science.gov (United States)

    Greenes, Carole; Ginsburg, Herbert P.; Balfanz, Robert

    2004-01-01

    "Big Math for Little Kids," a comprehensive program for 4- and 5-year-olds, develops and expands on the mathematics that children know and are capable of doing. The program uses activities and stories to develop ideas about number, shape, pattern, logical reasoning, measurement, operations on numbers, and space. The activities introduce the…

  9. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large amounts of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. Thus, companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also presents graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  10. From Big Bang to Eternity?

    Indian Academy of Sciences (India)

    at different distances (that is, at different epochs in the past) to come to this ... that the expansion started billions of years ago from an explosive Big Bang. Recent research sheds new light on the key cosmological question about the distant ...

  11. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  12. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are inadequate. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  13. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is the subject of many publications. Some pay attention to the many possibilities of big data; others warn us about its consequences. This special

  14. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  15. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  16. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  17. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.;

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 quasars with 2.2 < z < 3.5. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  18. Delivering advanced therapies: the big pharma approach.

    Science.gov (United States)

    Tarnowski, J; Krishna, D; Jespers, L; Ketkar, A; Haddock, R; Imrie, J; Kili, S

    2017-09-01

    After two decades of focused development and some recent clinical successes, cell and gene therapy (CGT) is emerging as a promising approach to personalized medicines. Genetically engineered cells as a medical modality are poised to stand alongside or in combination with small molecule and biopharmaceutical approaches to bring new therapies to patients globally. Big pharma can have a vital role in industrializing CGT by focusing on diseases with high unmet medical need and compelling genetic evidence. Pharma should invest in manufacturing and supply chain solutions that deliver reproducible, high-quality therapies at a commercially viable cost. Owing to the fast pace of innovation in this field proactive engagement with regulators is critical. It is also vital to understand the needs of patients all along the patient care pathway and to establish product pricing that is accepted by prescribers, payers and patients.

  19. The Economics of Big Area Additive Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Post, Brian [Oak Ridge National Laboratory (ORNL)]; Lloyd, Peter D. [ORNL]; Lindahl, John [ORNL]; Lind, Randall F. [ORNL]; Love, Lonnie J. [ORNL]; Kunc, Vlastimil [ORNL]

    2016-01-01

    Case studies on the economics of Additive Manufacturing (AM) suggest that processing time is the dominant cost in manufacturing. Most additive processes have similar performance metrics: small part sizes, low production rates and expensive feedstocks. Big Area Additive Manufacturing is based on transitioning polymer extrusion technology from a wire to a pellet feedstock. Utilizing pellets significantly increases deposition speed and lowers material cost by utilizing low cost injection molding feedstock. The use of carbon fiber reinforced polymers eliminates the need for a heated chamber, significantly reducing machine power requirements and size constraints. We hypothesize that the increase in productivity coupled with decrease in feedstock and energy costs will enable AM to become more competitive with conventional manufacturing processes for many applications. As a test case, we compare the cost of using traditional fused deposition modeling (FDM) with BAAM for additively manufacturing composite tooling.
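    The abstract's central claim, that deposition rate and feedstock price drive the cost gap between wire-fed FDM and pellet-fed BAAM, can be sketched as simple arithmetic. All figures below (deposition rates, hourly machine rates, material prices, density) are hypothetical placeholders, not numbers from the report.

    ```python
    # Illustrative cost model: processing time dominates AM cost.
    # All numeric inputs are hypothetical, for illustration only.
    def part_cost(volume_cm3, deposition_rate_cm3_per_h, machine_rate_per_h,
                  material_price_per_kg, density_g_per_cm3):
        hours = volume_cm3 / deposition_rate_cm3_per_h          # build time
        material_kg = volume_cm3 * density_g_per_cm3 / 1000.0   # feedstock mass
        return hours * machine_rate_per_h + material_kg * material_price_per_kg

    # Hypothetical figures: wire-fed FDM vs pellet-fed BAAM for a 10-litre part
    fdm = part_cost(10_000, 16, 30.0, 100.0, 1.2)    # slow deposition, costly filament
    baam = part_cost(10_000, 900, 60.0, 5.0, 1.2)    # much faster, cheap pellets
    print(round(fdm), round(baam))  # → 19950 727
    ```

    Under these placeholder numbers the time term dwarfs the material term for FDM, which is the pattern the abstract describes: raising deposition rate (pellets instead of wire) attacks the dominant cost directly, while cheaper feedstock shrinks the secondary term.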

  20. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  1. Head, Neck, and Oral Cancer

    Medline Plus

    Full Text Available ... Head and Neck Pathology Oral, Head and Neck Pathology Close to 49,750 Americans will be diagnosed ...

  2. String Theory and Pre-big bang Cosmology

    CERN Document Server

    Gasperini, M.

    In string theory, the traditional picture of a Universe that emerges from the inflation of a very small and highly curved space-time patch is a possibility, not a necessity: quite different initial conditions are possible, and not necessarily unlikely. In particular, the duality symmetries of string theory suggest scenarios in which the Universe starts inflating from an initial state characterized by very small curvature and interactions. Such a state, being gravitationally unstable, will evolve towards higher curvature and coupling, until string-size effects and loop corrections make the Universe "bounce" into a standard, decreasing-curvature regime. In such a context, the hot big bang of conventional cosmology is replaced by a "hot big bounce" in which the bouncing and heating mechanisms originate from the quantum production of particles in the high-curvature, large-coupling pre-bounce phase. Here we briefly summarize the main features of this inflationary scenario, proposed a quarter century ago. In its si...

  3. The Problem with Big Data: Operating on Smaller Datasets to Bridge the Implementation Gap.

    Science.gov (United States)

    Mann, Richard P; Mushtaq, Faisal; White, Alan D; Mata-Cervantes, Gabriel; Pike, Tom; Coker, Dalton; Murdoch, Stuart; Hiles, Tim; Smith, Clare; Berridge, David; Hinchliffe, Suzanne; Hall, Geoff; Smye, Stephen; Wilkie, Richard M; Lodge, J Peter A; Mon-Williams, Mark

    2016-01-01

    Big datasets have the potential to revolutionize public health. However, there is a mismatch between the political and scientific optimism surrounding big data and the public's perception of its benefit. We suggest a systematic and concerted emphasis on developing models derived from smaller datasets to illustrate to the public how big data can produce tangible benefits in the long term. In order to highlight the immediate value of a small data approach, we produced a proof-of-concept model predicting hospital length of stay. The results demonstrate that existing small datasets can be used to create models that generate a reasonable prediction, facilitating health-care delivery. We propose that greater attention (and funding) needs to be directed toward the utilization of existing information resources in parallel with current efforts to create and exploit "big data."

  4. Možnosti využitia Big Data pre Competitive Inteligence

    OpenAIRE

    Verníček, Marek

    2016-01-01

    The main purpose of this thesis is to investigate the use of Big Data for the methods and procedures of Competitive Intelligence. Among the goals of the work is a toolkit for small and large businesses to support their work throughout the Big Data process. Another goal is to design an effective solution for processing Big Data to gain a competitive advantage in business. The theoretical part of the work reviews the available scientific literature in the Czech Republic a...

  5. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  6. Generating a hot big bang via a change in topology

    International Nuclear Information System (INIS)

    Kandrup, H.E.

    1990-01-01

    This paper uses ideas developed recently in semiclassical quantum gravity to argue that many qualitative features of the hot big bang generally assumed in cosmology may be explained by the hypothesis that, interpreted semiclassically, the universe tunnelled into being via a quantum fluctuation from a small (Planck-sized), topologically complex entity to a topologically trivial entity (like a Friedmann universe) that rapidly grew to a more macroscopic size

  7. Generating a hot big bang via a change in topology

    Energy Technology Data Exchange (ETDEWEB)

    Kandrup, H.E. (Florida Univ., Gainesville, FL (USA). Space Astronomy Lab.); Masur, P.O. (Institute for Fundamental Theory, Univ. of Florida, Gainesville, FL (US))

    1990-08-01

    This paper uses ideas developed recently in semiclassical quantum gravity to argue that many qualitative features of the hot big bang generally assumed in cosmology may be explained by the hypothesis that, interpreted semiclassically, the universe tunnelled into being via a quantum fluctuation from a small (Planck-sized), topologically complex entity to a topologically trivial entity (like a Friedmann universe) that rapidly grew to a more macroscopic size.

  8. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  9. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  10. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  11. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  12. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  13. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  14. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  15. Nowcasting using news topics Big Data versus big bank

    OpenAIRE

    Thorsrud, Leif Anders

    2016-01-01

    The agents in the economy use a plethora of high-frequency information, including news media, to guide their actions and thereby shape aggregate economic fluctuations. Traditional nowcasting approaches have made relatively little use of such information. In this paper, I show how unstructured textual information in a business newspaper can be decomposed into daily news topics and used to nowcast quarterly GDP growth. Compared with a big bank of experts, here represented by official c...

  16. Big Bang synthesis of nuclear dark matter

    International Nuclear Information System (INIS)

    Hardy, Edward; Lasenby, Robert; March-Russell, John; West, Stephen M.

    2015-01-01

    We investigate the physics of dark matter models featuring composite bound states carrying a large conserved dark “nucleon” number. The properties of sufficiently large dark nuclei may obey simple scaling laws, and we find that this scaling can determine the number distribution of nuclei resulting from Big Bang Dark Nucleosynthesis. For plausible models of asymmetric dark matter, dark nuclei of large nucleon number, e.g. ≳10^8, may be synthesised, with the number distribution taking one of two characteristic forms. If small-nucleon-number fusions are sufficiently fast, the distribution of dark nuclei takes on a logarithmically-peaked, universal form, independent of many details of the initial conditions and small-number interactions. In the case of a substantial bottleneck to nucleosynthesis for small dark nuclei, we find the surprising result that even larger nuclei, with size ≫10^8, are often finally synthesised, again with a simple number distribution. We briefly discuss the constraints arising from the novel dark sector energetics, and the extended set of (often parametrically light) dark sector states that can occur in complete models of nuclear dark matter. The physics of the coherent enhancement of direct detection signals, the nature of the accompanying dark-sector form factors, and the possible modifications to astrophysical processes are discussed in detail in a companion paper.
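The qualitative mechanism, a number distribution built up by successive fusions that conserve nucleon number, can be illustrated with a toy aggregation loop. This is a cartoon of coagulation with invented parameters, not the paper's model (which includes expansion, freeze-out, and size-dependent rates):

```python
import random

# Toy "dark nucleosynthesis": start from free dark nucleons and repeatedly
# fuse random pairs. Each fusion conserves total dark nucleon number.

def fuse(nuclei, steps, rng):
    """Merge one randomly chosen pair per step; return the final size list."""
    nuclei = list(nuclei)
    for _ in range(steps):
        if len(nuclei) < 2:
            break
        i, j = rng.sample(range(len(nuclei)), 2)
        a, b = nuclei[i], nuclei[j]
        # Pop the higher index first so the lower index stays valid.
        for k in sorted((i, j), reverse=True):
            nuclei.pop(k)
        nuclei.append(a + b)
    return nuclei

rng = random.Random(0)
start = [1] * 1024                  # 1024 free dark nucleons
final = fuse(start, steps=1000, rng=rng)

print(len(final), max(final), sum(final))
```

Since every step replaces two nuclei with one, 1000 steps leave 24 nuclei, and the total nucleon number stays at 1024; the interesting physics in the paper is precisely which size distribution those survivors take.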

  17. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  18. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  19. The Operation “Big Saturn”: Plan and Implementation Opportunities

    Directory of Open Access Journals (Sweden)

    Ochirov Utash Borisovich

    2015-11-01

    Full Text Available The article analyzes the plan of one of the most important operations of the radical turning point in the course of the Great Patriotic War. Initially, the plan, conceived as the development of Operation “Uranus”, which resulted in the encirclement of the 6th Army headed by Field Marshal Paulus at Stalingrad, was called “Saturn”. However, after a scaled-down version named “Small Saturn” was created, it became known as “Big Saturn”. The main purpose of “Saturn” was a strategic offensive by four or five armies from the middle Don and the Chir towards Rostov-on-Don, in order to cut off the enemy's Army Group “Don” and Army Group “A” in the Caucasus. These army groups comprised almost a third of the units of the Wehrmacht and its allies fighting against the Soviet Union, and their defeat, or their delay on the Taman Peninsula, where there was no port capacity for the rapid evacuation of such a large body of men, equipment and supplies, could have significantly changed the balance of forces on the Eastern Front. Unfortunately, the Chief of the General Staff of the Red Army, Alexander Vasilevsky, who was initially in charge of the “Saturn” operation as the representative of Stavka (General Headquarters), was appointed by Stalin in late November as coordinator of the Soviet troops reducing the encircled grouping of Paulus. Vasilevsky, using his authority, managed to transfer a significant part of the strategic reserves from the middle Don direction, first to the Stalingrad direction and then to the Kotelnikovsky direction against Manstein's relief grouping. The analysis of the forces of the Stalingrad Front given in the article shows that there were enough troops to repulse that attack. The transferred strategic reserves proved to be redundant, and their offensive on Rostov failed, while their use according to the plan “Saturn” would have led to undeniable success. As a result, Manstein

  20. Big fish, dry pond

    International Nuclear Information System (INIS)

    Casey, A.

    1995-01-01

    The successes and failures of the Williston Wildcatters Oil Corp., a junior oil company headquartered in Arcola, Saskatchewan, were chronicled. Practically all other oil companies operate from Calgary, Alberta. Apart from location and management (a female CEO), Williston's operations are also different. Unlike most junior oil companies, which rely on consultants and service companies to do most of their work, Williston has re-introduced vertical integration to the industry. By owning and operating most of the needed equipment itself, Williston spends most of its money locally, thus helping to keep alive a small prairie town that would otherwise have had a very difficult time of it. Williston CEO Mary Tidlund and her partner, described by the press as “rebels of the oil patch”, are disliked by many in the industry for their antagonistic partnering style and, at least in recent times, failure to pay suppliers and service companies. Initially, the company had good success with horizontal drilling, a technology new to Saskatchewan. Williston managed to decrease the cost of a horizontal hole to roughly twice that of a vertical one (considered very good in view of the potential payout). However, the company's initial successes were followed by financial problems and difficulty in paying its debts. High payroll costs associated with vertical integration, and a lack of focus, have been cited as the main reasons for Williston's financial troubles. Although an agreement has been reached with creditors to allow the company to continue operating, there is considerable doubt that it will be able to meet scheduled loan repayment deadlines

  1. Big Russian oil round

    International Nuclear Information System (INIS)

    Slovak, K.; Beer, G.

    2006-01-01

    The departure of Mikhail Khodorkovsky has brought an end to the idyllic times of supplies of Russian oil to the MOL-Slovnaft group. The group used to purchase oil directly from Yukos. But now brokers have again entered the Central European oil business. And their aim is to take control over all of the oil business. The Russians demonstrated the changed situation to Slovakia last autumn: you will either accept the new model, or there will be problems with oil deliveries. Consumers got the message. The main brokers of Russian oil in Central Europe are the Swiss companies Glencore and Fisotra. Little information is available regarding these commodity brokers. But the information available is sufficient to indicate that these are not small companies. Glencore undertakes 3% of all international oil trades. With an annual turnover of 72 billion USD, it was the biggest Swiss company by turnover in 2004. Fisotra also has an extensive product portfolio. It offers financial and commercial services and does not hide its good relations with Russian oil companies. Between 1994 and 1998, it managed their financial operations with major western companies such as BP, Cargill, Elf, Exxon, Shell, Total, and Mitsubishi and also with Glencore. Fisotra states that some of its clients achieved an annual turnover of 1.5 billion USD. At present, the Swiss brokers receive a fee of 1 to 1.5 USD per barrel. The Russian political elite must be aware of these brokerage services as the oil transport through the transit system is closely monitored by the state owned company Transneft. (authors)

  2. What are Head Cavities? - A History of Studies on Vertebrate Head Segmentation.

    Science.gov (United States)

    Kuratani, Shigeru; Adachi, Noritaka

    2016-06-01

    Motivated by the discovery of segmental epithelial coeloms, or "head cavities," in elasmobranch embryos toward the end of the 19th century, the debate over the presence of mesodermal segments in the vertebrate head became a central problem in comparative embryology. The classical segmental view assumed only one type of metamerism in the vertebrate head, in which each metamere was thought to contain one head somite and one pharyngeal arch, innervated by a set of cranial nerves serially homologous to dorsal and ventral roots of spinal nerves. The non-segmental view, on the other hand, rejected the somite-like properties of head cavities. A series of small mesodermal cysts in early Torpedo embryos, which were thought to represent true somite homologs, provided a third possible view on the nature of the vertebrate head. Recent molecular developmental data have shed new light on the vertebrate head problem, explaining that head mesoderm evolved, not by the modification of rostral somites of an amphioxus-like ancestor, but through the polarization of unspecified paraxial mesoderm into head mesoderm anteriorly and trunk somites posteriorly.

  3. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  4. Big Data where N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the application of 'big data' in health care has only just begun, but may in time become a great help in organizing more personal and holistic health care for patients with multiple chronic conditions. Personal health technology, briefly presented in this chapter, holds great potential for carrying out 'big data' analyses for the individual person, that is, where N=1. There are major technological challenges in building the technologies and methods needed to collect and handle personal data that can be shared, across settings, in a standardized, responsible, robust, secure and not...

  5. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  6. Did the Big Bang begin?

    International Nuclear Information System (INIS)

    Levy-Leblond, J.

    1990-01-01

    It is argued that the age of the universe may well be numerically finite (20 billion years or so) and conceptually infinite. A new and natural time scale is defined on a physical basis using group-theoretical arguments. An additive notion of time is obtained according to which the age of the universe is indeed infinite. In other words, never did the Big Bang begin. This new time scale is not supposed to replace the ordinary cosmic time scale, but to supplement it (in the same way as rapidity has taken a place by the side of velocity in Einsteinian relativity). The question is discussed within the framework of conventional (big-bang) and classical (nonquantum) cosmology, but could easily be extended to more elaborate views, as the purpose is not so much to modify present theories as to reach a deeper understanding of their meaning
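The abstract's central move, an additive time scale on which the age becomes infinite, can be illustrated with a generic logarithmic-clock sketch (this is an illustration in the spirit of the abstract, not the author's specific group-theoretical construction). Replace cosmic time $t$ by

```latex
\tau(t) = T \,\ln\!\left(\frac{t}{t_0}\right), \qquad T,\ t_0 \ \text{fixed constants.}
```

Equal ratios $t_2/t_1$ then correspond to equal intervals $\tau_2 - \tau_1$, so $\tau$ is additive under rescalings of $t$; and as $t \to 0$ (the Big Bang), $\tau \to -\infty$. The numerical age $t_0 \approx 20$ billion years stays finite, yet on the $\tau$ clock the beginning lies infinitely far in the past.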

  7. Big Data in Drug Discovery.

    Science.gov (United States)

    Brown, Nathan; Cambruzzi, Jean; Cox, Peter J; Davies, Mark; Dunbar, James; Plumbley, Dean; Sellwood, Matthew A; Sim, Aaron; Williams-Jones, Bryn I; Zwierzyna, Magdalena; Sheppard, David W

    2018-01-01

    Interpretation of Big Data in the drug discovery community should enhance project timelines and reduce clinical attrition through improved early decision making. The issues we encounter start with the sheer volume of data and how we first ingest it before building an infrastructure to house it to make use of the data in an efficient and productive way. There are many problems associated with the data itself including general reproducibility, but often, it is the context surrounding an experiment that is critical to success. Help, in the form of artificial intelligence (AI), is required to understand and translate the context. On the back of natural language processing pipelines, AI is also used to prospectively generate new hypotheses by linking data together. We explain Big Data from the context of biology, chemistry and clinical trials, showcasing some of the impressive public domain sources and initiatives now available for interrogation. © 2018 Elsevier B.V. All rights reserved.

  8. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  9. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the “energy dominance” of the energy density of vacuum fluctuations in curved space-time and the presence of a singularity is discussed. It is pointed out that a de Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedmann solution may describe the history of the Universe before the Big Bang. (P.L.)

  10. Big bang is not needed

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.D.

    1976-02-01

    Recent computer simulations indicate that a system of n gravitating masses breaks up, even when the total energy is negative. As a result, almost any initial phase-space distribution results in a universe that eventually expands under the Hubble law. Hence Hubble expansion implies little about an initial cosmic state. In particular, it does not imply the singularly dense superposed state used in the big bang model.

  11. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  12. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  13. A small step for mankind

    NARCIS (Netherlands)

    Huizing, C.; Koymans, R.L.C.; Kuiper, R.; Dams, D.; Hannemann, U.; Steffen, M.

    2010-01-01

    For many programming languages, the only formal semantics published is an SOS big-step semantics. Such a semantics is not suited for investigations that observe intermediate states, such as invariant techniques. In this paper, a construction is proposed that automatically generates a small-step SOS semantics.
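The big-step/small-step distinction the abstract turns on can be made concrete for a toy arithmetic language. This is a sketch only: the paper's construction operates on SOS rules themselves, while the hypothetical evaluator below just shows why small-step exposes the intermediate states that invariant techniques need:

```python
# Small-step evaluator for a toy language: terms are int literals or
# ("add", left, right). A big-step evaluator would jump straight to the
# final value; small-step yields every intermediate state along the way.

def step(t):
    """Perform one reduction step; return None if t is already a value."""
    if isinstance(t, int):
        return None                      # values do not reduce
    op, l, r = t
    if isinstance(l, int) and isinstance(r, int):
        return l + r                     # both operands are values: add
    if not isinstance(l, int):
        return (op, step(l), r)          # reduce the left subterm first
    return (op, l, step(r))              # then the right subterm

def trace(t):
    """All intermediate states from t down to its final value."""
    states = [t]
    while (nxt := step(states[-1])) is not None:
        states.append(nxt)
    return states

prog = ("add", ("add", 1, 2), 4)
print(trace(prog))
```

Each element of the returned list is an observable program state, which is exactly the property a big-step semantics fails to provide.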

  14. Head CT scan

    Science.gov (United States)

    Alternative names: CT scan - orbits; CT scan - sinuses; computed tomography - cranial; CAT scan - brain. Indications include abnormal head size in children, changes in thinking or behavior, fainting, and headache when accompanied by certain other signs.

  15. Bottom head assembly

    International Nuclear Information System (INIS)

    Fife, A.B.

    1998-01-01

    A bottom head dome assembly is described which includes, in one embodiment, a bottom head dome and a liner configured to be positioned proximate the bottom head dome. The bottom head dome has a plurality of openings extending therethrough. The liner also has a plurality of openings extending therethrough, and each liner opening aligns with a respective bottom head dome opening. A seal is formed, such as by welding, between the liner and the bottom head dome to resist entry of water between the liner and the bottom head dome at the edge of the liner. In the one embodiment, a plurality of stub tubes are secured to the liner. Each stub tube has a bore extending therethrough, and each stub tube bore is coaxially aligned with a respective liner opening. A seat portion is formed by each liner opening for receiving a portion of the respective stub tube. The assembly also includes a plurality of support shims positioned between the bottom head dome and the liner for supporting the liner. In one embodiment, each support shim includes a support stub having a bore therethrough, and each support stub bore aligns with a respective bottom head dome opening. 2 figs

  16. Drug packaging in 2013: small changes would reap big benefits.

    Science.gov (United States)

    2014-05-01

    Drug packaging is important both in protecting and informing patients. Some improvements were made in 2013, but many of the products examined by Prescrire still had poor-quality or even dangerous packaging. Problem packaging is a major concern for patients who are more vulnerable to adverse effects, particularly children and pregnant women. Several problems were noted with products intended for self-medication (umbrella brands), oral solutions sold with dosing devices, and injectable drugs. Looking back at 20 years of Red Cards that Prescrire has issued to products with dangerous packaging reveals several improvements, but too many dangers persist. Urgent action needs to be taken by regulatory agencies and drug companies: patient leaflets must be more explicit with regard to adverse effects, especially those of nonsteroidal anti-inflammatory drugs during pregnancy; accidental ingestion by children must be prevented; and companies must design safer dosing devices. Healthcare professionals and patients must remain vigilant and report all packaging issues to the relevant authorities.

  17. [Nanotechnology: a big revolution from the small world].

    Science.gov (United States)

    Bassi, Matteo; Santinello, Irene; Bevilacqua, Andrea; Bassi, Pierfrancesco

    2013-01-01

    Nanotechnology is a multidisciplinary field arising from the interaction of several disciplines, such as engineering, physics, biology and chemistry. New materials and devices interact with the body at the molecular level, yielding a brand new range of highly selective and targeted applications designed to maximize therapeutic efficiency while reducing side effects. Liposomes, quantum dots, carbon nanotubes and superparamagnetic nanoparticles are among the most thoroughly assessed nanotechnologies. Meanwhile, other futuristic platforms are paving the way toward a new scientific paradigm, one able to deeply change the path of research in medical science. The growth of nanotechnology, driven by dramatic advances in science and technology, clearly creates new opportunities for the development of medical science and disease treatment in human health care. Despite the concerns and the ongoing studies about their safety, nanotechnologies clearly emerge as holding the promise of delivering one of the greatest breakthroughs in the history of medical science.

  18. Going Extreme For Small Solutions To Big Environmental Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Bagwell, Christopher E.

    2011-03-31

    This chapter is devoted to the scale, scope, and specific issues confronting the cleanup and long-term disposal of the U.S. nuclear legacy generated during WWII and the Cold War Era. The research reported is aimed at complex microbiological interactions with legacy waste materials generated by past nuclear production activities in the United States. The intended purpose of this research is to identify cost effective solutions to the specific problems (stability) and environmental challenges (fate, transport, exposure) in managing and detoxifying persistent contaminant species. Specifically addressed are high level waste microbiology and bacteria inhabiting plutonium laden soils in the unsaturated subsurface.

  19. Big-picture ecology for a small planet

    Directory of Open Access Journals (Sweden)

    Robert J. Scholes

    2015-11-01

    Full Text Available For a number of years, the extensive ecosystems of southern Africa have been a testing ground for ideas and techniques useful for studying and managing large-scale complex systems everywhere, and in particular for tackling issues of global change. The first contribution has been through making consistent, long-term, large-scale observations on climate, vegetation and animal dynamics and disturbances. These have been crucial in developing and testing hypotheses regarding how the earth system works at large space and timescales. The observational techniques have evolved dramatically over time: from notes kept by individuals, to systematic measurement programmes by organisations, to continuous and sophisticated measurements made by automated systems such as satellites and flux towers. The second contribution has been experimental, developing the notion that ecosystems can be the subject of deliberate experimental manipulation. Sometimes this has taken the form of large-scale treatments, such as fire trials or herbivore exclusion plots. More frequently, it has made use of the ‘experiment’ of the protected area in contrast to its surrounds, or has exploited the information in natural or human-induced gradients. Ecosystem experimentation has required rethinking the fundamentals of experimental design: What is the experimental unit? What is the meaning of a control? What constitutes replication? The third contribution has been theoretical. How does the functioning of warm, dry, species-rich ecosystems differ from the cool, moist, species-poor ecosystem examples that dominate the literature? What are the roles of disturbance and competition in maintaining ecosystem diversity, and of top-down versus bottom-up control in maintaining ecosystem structure? The fourth contribution concerns the management of large-scale complex systems in the face of limited knowledge. How can the gap between science and policy be narrowed? What advantages and challenges does participatory co-management offer? How do you implement adaptive management?

  20. A small jab - a big effect: nonspecific immunomodulation by vaccines.

    Science.gov (United States)

    Benn, Christine S; Netea, Mihai G; Selin, Liisa K; Aaby, Peter

    2013-09-01

    Recent epidemiological studies have shown that, in addition to disease-specific effects, vaccines against infectious diseases have nonspecific effects on the ability of the immune system to handle other pathogens. For instance, in randomized trials tuberculosis and measles vaccines are associated with a substantial reduction in overall child mortality, which cannot be explained by prevention of the target disease. New research suggests that the nonspecific effects of vaccines are related to cross-reactivity of the adaptive immune system with unrelated pathogens, and to training of the innate immune system through epigenetic reprogramming. Hence, epidemiological findings are backed by immunological data. This generates a new understanding of the immune system and about how it can be modulated by vaccines to impact the general resistance to disease. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Infantile Accountability: When Big Data Meet Small Children

    Science.gov (United States)

    Wrigley, Terry; Wormwell, Louise

    2016-01-01

    This article examines a government attempt to impose testing of 4-year-olds as a baseline against which to "hold primary schools accountable" for children's subsequent progress. It examines the various forms of baseline testing in this experiment and analyses the misleading claims made for the "predictive validity" of baseline…

  2. Combine To Create Size or Growing From Small To Big?

    OpenAIRE

    Gan, Thian Han

    2013-01-01

    Many companies seek to acquire or merge to grow. Size has always been the underlying rationale for mergers and acquisitions (“M&A”) as the creation of larger entities will purportedly increase the capacity of the entities hence allowing them to take advantage of economies of scale and scope. Another reason for M&A is to create a “portfolio of core competences” by combining and exploiting the core competences of the two entities to create synergy and to enhance the competitive edge for the enl...

  3. Getting Results: Small Changes, Big Cohorts and Technology

    Science.gov (United States)

    Kenney, Jacqueline L.

    2012-01-01

    This paper presents an example of constructive alignment in practice. Integrated technology supports were deployed to increase the consistency between learning objectives, activities and assessment and to foster student-centred, higher-order learning processes in the unit. Modifications took place over nine iterations of a second-year Marketing…

  4. Seeking common ground: making connections big and small

    Directory of Open Access Journals (Sweden)

    David Lee Keiser

    2017-11-01

    Full Text Available Purpose – This essay furthers cross-cultural exchange and understanding. Written for a general audience by a teacher educator, it argues for accepting all others into the academic conversation. Using varied examples, the purpose of this paper is to illustrate both lifelong learning and the power of connecting across difference. Design/methodology/approach – The author draws upon his experience as a teacher and professor and his engagement with the Kingdom of Saudi Arabia (KSA) for examples of edification and engagement. Findings – The author cites both the current period and a major mid-twentieth-century American event, the civil rights March on Washington, to illustrate possibilities for connection, clarity and symbiosis. Originality/value – Written for this journal, this essay uses an original and skeletal theoretical and empirical frame as well as field examples to argue for inclusiveness, exchange and acceptance of all learners.

  5. Small bugs, big business: the economic power of the microbe.

    Science.gov (United States)

    Demain, A L

    2000-10-01

    The versatility of microbial biosynthesis is enormous. The most industrially important primary metabolites are the amino acids, nucleotides, vitamins, solvents, and organic acids. Millions of tons of amino acids are produced each year, with a total multibillion dollar market. Many synthetic vitamin production processes are being replaced by microbial fermentations. In addition to the multiple reaction sequences of fermentations, microorganisms are extremely useful in carrying out biotransformation processes. These are becoming essential to the fine chemical industry in the production of single-isomer intermediates. Microbially produced secondary metabolites are extremely important to our health and nutrition. As a group, they have tremendous economic importance. The antibiotic market amounts to almost 30 billion dollars and includes about 160 antibiotics and derivatives such as the beta-lactam peptide antibiotics, the macrolide polyketide erythromycin, tetracyclines, aminoglycosides and others. Other important pharmaceutical products produced by microorganisms are hypocholesterolemic agents, enzyme inhibitors, immunosuppressants and antitumor compounds, some having markets of over 1 billion dollars per year. Agriculturally important secondary metabolites include coccidiostats, animal growth promotants, anthelmintics and biopesticides. The modern biotechnology industry has made a major impact in the business world, with biopharmaceuticals (recombinant protein drugs, vaccines and monoclonal antibodies) having a market of 15 billion dollars. Recombinant DNA technology has also produced a revolution in agriculture and has markedly increased markets for microbial enzymes. Molecular manipulations have been added to mutational techniques as a means of increasing the titers and yields of microbial processes and of discovering new drugs. Today, microbiology is a major participant in global industry. The best is yet to come as microbes move into the environmental and energy sectors.

  6. Tracking big and small agriculture with new satellite sensors

    Science.gov (United States)

    Lobell, D. B.; Azzari, G.; Jin, Z.

    2017-12-01

    New sensors from both the public and private sector are opening up exciting possibilities for monitoring agriculture and its use of water. This talk will present selected examples from recent work using data from Planet's Planetscope and Skysat sensors as well as Sentinel-1 and Sentinel-2 missions that are part of Europe's Copernicus program. Among other things, these satellites are now helping to track crop types and productivity for fields in rainfed cropping systems of East Africa and irrigated systems in South Asia. This information should contribute to understanding land and water use decisions throughout the world.

  7. Big Opportunities in Access to "Small Science" Data

    OpenAIRE

    Onsrud, Harlan; Campbell, James

    2007-01-01

    A distributed infrastructure that would enable those who wish to do so to contribute their scientific or technical data to a universal digital commons could allow such data to be more readily preserved and accessible among disciplinary domains. Five critical issues that must be addressed in developing an efficient and effective data commons infrastructure are described. We conclude that creation of a distributed infrastructure meeting the critical criteria and deployable throughout the networ...

  8. Focus small to find big - the microbeam story.

    Science.gov (United States)

    Wu, Jinhua; Hei, Tom K

    2017-08-29

    Even though the first ultraviolet microbeam was described by S. Tschachotin back in 1912, the development of sophisticated micro-irradiation facilities only began to flourish in the late 1980s. In this article, we highlight significant microbeam experiments, describe the latest microbeam irradiator configurations, and review critical discoveries made using the microbeam apparatus. Modern radiological microbeam facilities can produce a beam a few micrometers, or even tens of nanometers, in size, and can deposit radiation with high precision within a cellular target. In the past three decades, a variety of microbeams has been developed to deliver a range of radiation types, including charged particles, X-rays, and electrons. Although microbeams were originally developed to measure the effects of a single radiation track, the ability to target radiation at sub-cellular structures has been used extensively to investigate radiation-induced biological responses within cells. Studies using microbeams to target specific cells in a tissue have elucidated bystander responses, and further studies have shown that reactive oxygen species (ROS) and reactive nitrogen species (RNS) play critical roles in the process. The radiation-induced abscopal effect, which has a profound impact on cancer radiotherapy, further reaffirmed the importance of bystander effects. Finally, by targeting sub-cellular compartments with a microbeam, we have reported cytoplasm-specific biological responses. Despite the common dogma that nuclear DNA is the primary target for radiation-induced cell death and carcinogenesis, studies conducted with microbeams suggest that targeted cytoplasmic irradiation induces mitochondrial dysfunction, cellular stress, and genomic instability. A more recent development in microbeam technology is the application of mouse models to visualize DNA double-strand breaks in vivo. Microbeams are making important contributions towards our understanding of radiation responses in cells and tissue models.

  9. Small Yams, Big Deal; Ñame pequeño, grandes posibilidades

    Energy Technology Data Exchange (ETDEWEB)

    Henriques, Sasha [International Atomic Energy Agency, Division of Public Information, Vienna (Austria)

    2012-09-15

    The Dioscorea esculenta, or Chinese Yam as it's called in Ghana, is one of the smallest varieties still in existence. It's being affected by the destruction of natural ecosystems, as well as socio-economic changes.

  10. Small wins big: analytic pinyin skills promote Chinese word reading.

    Science.gov (United States)

    Lin, Dan; McBride-Chang, Catherine; Shu, Hua; Zhang, Yuping; Li, Hong; Zhang, Juan; Aram, Dorit; Levin, Iris

    2010-08-01

    The present study examined invented spelling of pinyin (a phonological coding system for teaching and learning Chinese words) in relation to subsequent Chinese reading development. Among 296 Chinese kindergartners in Beijing, independent invented pinyin spelling was found to be uniquely predictive of Chinese word reading 12 months later, even with Time 1 syllable deletion, phoneme deletion, and letter knowledge, in addition to the autoregressive effects of Time 1 Chinese word reading, statistically controlled. These results underscore the importance of children's early pinyin representations for Chinese reading acquisition, both theoretically and practically. The findings further support the idea of a universal phonological principle and indicate that pinyin is potentially an ideal measure of phonological awareness in Chinese.

  11. Nanotechnology and nanomedicine: going small means aiming big.

    Science.gov (United States)

    Teli, Mahesh Kumar; Mutalik, Srinivas; Rajanikant, G K

    2010-06-01

    Nanotechnology is an emerging branch of science concerned with designing tools and devices of 1 to 100 nm in size that perform specific functions at the cellular, atomic and molecular levels. The concept of employing nanotechnology in biomedical research and clinical practice is best known as nanomedicine. Nanomedicine is an upcoming field that could potentially make a major impact on human health. Nanomaterials are increasingly used in diagnostics, imaging and targeted drug delivery. Nanotechnology will assist the integration of diagnostics/imaging with therapeutics and facilitate the development of personalized medicine, i.e. the prescription of specific medications best suited to an individual. This review provides an integrated overview of the application of nanotechnology-based molecular diagnostics and drug delivery in the development of nanomedicine and, ultimately, personalized medicine. Finally, we identify critical gaps in our knowledge of nanoparticle toxicity and how these gaps need to be addressed to enable nanotechnology to move safely from bench to bedside.

  12. Research Spotlight: The next big thing is actually small.

    Science.gov (United States)

    Garcia, Carlos D

    2012-07-01

    Recent developments in materials, surface modifications, separation schemes, detection systems and associated instrumentation have allowed significant advances in the performance of lab-on-a-chip devices. These devices, also referred to as micro total analysis systems (µTAS), offer great versatility, high throughput, short analysis times, low cost and, more importantly, performance comparable to standard bench-top instrumentation. To date, µTAS have demonstrated advantages in a significant number of fields, including the biochemical, pharmaceutical, military and environmental. Perhaps most importantly, µTAS represent excellent platforms for introducing students to microfabrication and nanotechnology, bridging chemistry with other fields such as engineering and biology, and enabling the integration of various skills and curricular concepts. Considering the advantages of the technology and its potential impact on society, our research program aims to address the need for simpler, more affordable, faster and portable devices to measure biologically active compounds. Specifically, the program is focused on the development and characterization of a series of novel strategies towards the realization of integrated microanalytical devices. One key aspect of our research projects is that the developed analytical strategies must be compatible with each other, thereby enabling their use in integrated devices. The program combines spectroscopy, surface chemistry, capillary electrophoresis, electrochemical detection and nanomaterials. This article discusses some of the most recent results obtained in two main areas of emphasis: capillary and microchip capillary electrophoresis with electrochemical detection, and the interaction of proteins with nanomaterials.

  13. Finding small molecules in big data (Analytica)

    Science.gov (United States)

    Metabolomics and exposomics are amongst the youngest and most dynamic of the omics disciplines. While the molecules involved are smaller than those of proteomics and the other, larger “omics”, the challenges are in many ways greater. Elements are less constrained, there are no...

  14. Transparent data mining for big and small data

    CERN Document Server

    Quercia, Daniele; Pasquale, Frank

    2017-01-01

    This book focuses on new and emerging data mining solutions that offer a greater level of transparency than existing solutions. Transparent data mining solutions with desirable properties (e.g. effective, fully automatic, scalable) are covered in the book. Experimental findings of transparent solutions are tailored to different domain experts, and experimental metrics for evaluating algorithmic transparency are presented. The book also discusses societal effects of black box vs. transparent approaches to data mining, as well as real-world use cases for these approaches. As algorithms increasingly support different aspects of modern life, a greater level of transparency is sorely needed, not least because discrimination and biases have to be avoided. With contributions from domain experts, this book provides an overview of an emerging area of data mining that has profound societal consequences, and provides the technical background for readers to contribute to the field or to put existing approaches to prac...

  15. Uncertainties in effective dose estimates of adult CT head scans: The effect of head size

    International Nuclear Information System (INIS)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E.

    2009-01-01

    Purpose: This study is an extension of a previous study where the uncertainties in effective dose estimates from adult CT head scans were calculated using four CT effective dose estimation methods, three of which were computer programs (CT-EXPO, CTDOSIMETRY, and IMPACTDOSE) and one that involved the dose length product (DLP). However, that study did not include the uncertainty contribution due to variations in head sizes. Methods: The uncertainties due to head size variations were estimated by first using the computer program data to calculate doses to small and large heads. These doses were then compared with doses calculated for the phantom heads used by the computer programs. An uncertainty was then assigned based on the difference between the small and large head doses and the doses of the phantom heads. Results: The uncertainties due to head size variations alone were found to be between 4% and 26% depending on the method used and the patient gender. When these uncertainties were included with the results of the previous study, the overall uncertainties in effective dose estimates (stated at the 95% confidence interval) were 20%-31% (CT-EXPO), 15%-30% (CTDOSIMETRY), 20%-36% (IMPACTDOSE), and 31%-40% (DLP). Conclusions: For the computer programs, the lower overall uncertainties were still achieved when measured values of CT dose index were used rather than tabulated values. For DLP dose estimates, head size variations made the largest (for males) and second largest (for females) contributions to effective dose uncertainty. An improvement in the uncertainty of the DLP method dose estimates will be achieved if head size variation can be taken into account.

  16. Uncertainties in effective dose estimates of adult CT head scans: The effect of head size

    Energy Technology Data Exchange (ETDEWEB)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E. [Department of Medical Physics, Royal Adelaide Hospital, Adelaide, South Australia 5000 (Australia) and School of Electrical and Information Engineering (Applied Physics), University of South Australia, Mawson Lakes, South Australia 5095 (Australia); Division of Medical Imaging, Women's and Children's Hospital, North Adelaide, South Australia 5006 (Australia) and School of Electrical and Information Engineering (Applied Physics), University of South Australia, Mawson Lakes, South Australia 5095 (Australia); School of Electrical and Information Engineering (Applied Physics), University of South Australia, Mawson Lakes, South Australia 5095 (Australia)

    2009-09-15

    Purpose: This study is an extension of a previous study where the uncertainties in effective dose estimates from adult CT head scans were calculated using four CT effective dose estimation methods, three of which were computer programs (CT-EXPO, CTDOSIMETRY, and IMPACTDOSE) and one that involved the dose length product (DLP). However, that study did not include the uncertainty contribution due to variations in head sizes. Methods: The uncertainties due to head size variations were estimated by first using the computer program data to calculate doses to small and large heads. These doses were then compared with doses calculated for the phantom heads used by the computer programs. An uncertainty was then assigned based on the difference between the small and large head doses and the doses of the phantom heads. Results: The uncertainties due to head size variations alone were found to be between 4% and 26% depending on the method used and the patient gender. When these uncertainties were included with the results of the previous study, the overall uncertainties in effective dose estimates (stated at the 95% confidence interval) were 20%-31% (CT-EXPO), 15%-30% (CTDOSIMETRY), 20%-36% (IMPACTDOSE), and 31%-40% (DLP). Conclusions: For the computer programs, the lower overall uncertainties were still achieved when measured values of CT dose index were used rather than tabulated values. For DLP dose estimates, head size variations made the largest (for males) and second largest (for females) contributions to effective dose uncertainty. An improvement in the uncertainty of the DLP method dose estimates will be achieved if head size variation can be taken into account.
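
    The overall figures above combine the head-size contribution with the method uncertainties from the earlier study. As a rough sketch of how independent relative uncertainties combine, assuming simple quadrature addition (the record does not state the exact propagation method, and the input numbers below are illustrative, not taken from the paper):

```python
import math

def combined_uncertainty(components):
    """Combine independent relative (%) uncertainties in quadrature."""
    return math.sqrt(sum(c ** 2 for c in components))

# Illustrative only: a 26% head-size contribution combined with a 20%
# uncertainty from other sources (scanner output, phantom data, etc.)
overall = combined_uncertainty([26.0, 20.0])
print(round(overall, 1))  # 32.8
```

    Quadrature addition is the standard treatment for independent error sources; note how the larger component dominates the result, which is consistent with head size being a leading contributor for the DLP method.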

  17. Turning big bang into big bounce. I. Classical dynamics

    Science.gov (United States)

    Dzierżak, Piotr; Małkiewicz, Przemysław; Piechocki, Włodzimierz

    2009-11-01

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of the loop geometry underlying loop cosmology. We solve the constraint of the theory at the classical level to identify the physical phase space and find the Lie algebra of the Dirac observables. We express the energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for the BB. The classical energy scale specific to the BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at the quantum level; otherwise the energy scale stays unknown.
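
    The loop-geometry modification responsible for the bounce is commonly summarized by the effective Friedmann equation of loop cosmology, quoted here for orientation (this is the standard textbook form, not a formula displayed in the record itself, which works instead with Dirac observables):

```latex
% Effective Friedmann equation in loop cosmology: the classical source
% term \rho acquires the correction factor (1 - \rho/\rho_c), so the
% Hubble rate H vanishes at the critical density \rho_c and contraction
% turns into expansion -- the big bang is replaced by a big bounce.
H^2 = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right)
```

    The critical density ρ_c plays the role of the unknown energy scale mentioned in the abstract: it is the free parameter to be fixed by cosmological data or by the quantum theory.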

  18. What would be outcome of a Big Crunch?

    CERN Document Server

    Hajdukovic, Dragan Slavkov

    2010-01-01

    I suggest the existence of a still undiscovered interaction: repulsion between matter and antimatter. The simplest and most elegant candidate for such a force is gravitational repulsion between particles and antiparticles. I argue that such a force may give birth to a new Universe by transforming an eventual Big Crunch of our universe into an event similar to the Big Bang. In fact, when a collapsing Universe is reduced to a supermassive black hole of small size, a very strong field of the conjectured force may create particle-antiparticle pairs from the surrounding vacuum. The amount of antimatter created from the physical vacuum is equal to the decrease in mass of the "black hole Universe" and is violently repelled from it. When the size of the black hole is sufficiently small, the creation of antimatter may become so huge and fast that the matter of our Universe may disappear in a fraction of the Planck time. Such a fast transformation of matter to antimatter may look like a Big Bang with the initial size about 30 o...

  19. Interventions for treating osteoarthritis of the big toe joint.

    Science.gov (United States)

    Zammit, Gerard V; Menz, Hylton B; Munteanu, Shannon E; Landorf, Karl B; Gilheany, Mark F

    2010-09-08

    Osteoarthritis of the big toe joint of the foot (hallux limitus or rigidus) is a common and painful condition. Although several treatments have been proposed, few have been adequately evaluated. To identify controlled trials evaluating interventions for osteoarthritis of the big toe joint and to determine the optimum intervention(s), literature searches were conducted across the following electronic databases: CENTRAL, MEDLINE, EMBASE, CINAHL, and PEDro (to 14 January 2010). No language restrictions were applied. Randomised controlled trials, quasi-randomised trials, or controlled clinical trials that assessed treatment outcomes for osteoarthritis of the big toe joint were eligible. Participants of any age or gender with osteoarthritis of the big toe joint (defined either radiographically or clinically) were included. Two authors examined the list of titles and abstracts identified by the literature searches. One content area expert and one methodologist independently applied the pre-determined inclusion and exclusion criteria to the full text of identified trials. To minimise error and reduce potential bias, data were extracted independently by two content experts. Only one trial satisfactorily fulfilled the inclusion criteria and was included in this review. This trial evaluated the effectiveness of two physical therapy programs in 20 individuals with osteoarthritis of the big toe joint. Assessment outcomes included pain levels, big toe joint range of motion and plantar flexion strength of the hallux. Mean differences at four weeks' follow-up were 3.80 points (95% CI 2.74 to 4.86) for self-reported pain, 28.30 degrees (95% CI 21.37 to 35.23) for big toe joint range of motion, and 2.80 kg (95% CI 2.13 to 3.47) for muscle strength. Although differences in outcomes between treatment and control groups were reported, the risk of bias was high. The trial failed to employ appropriate randomisation or adequate allocation concealment, used a relatively small sample and
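
    The confidence intervals reported above can be converted back into standard errors under a normal approximation, which is useful when pooling such results in a meta-analysis. A minimal sketch (the function name is mine, and the 1.96 multiplier assumes a symmetric 95% interval, which the trial report does not guarantee):

```python
def se_from_ci(lower, upper, z=1.96):
    """Recover a standard error from a symmetric 95% confidence interval:
    SE = (upper - lower) / (2 * z)."""
    return (upper - lower) / (2 * z)

# Self-reported pain: mean difference 3.80 points, 95% CI 2.74 to 4.86
se_pain = se_from_ci(2.74, 4.86)
print(round(se_pain, 2))  # 0.54
```

    With only 20 participants, a t-based multiplier slightly larger than 1.96 would arguably be more appropriate, which would make the recovered standard error a little smaller.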

  20. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Computed tomography (CT) of the head uses special x-ray equipment ...