WorldWideScience

Sample records for big moving day

  1. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  2. Big Cities, Big Problems: Reason for the Elderly to Move?

    NARCIS (Netherlands)

    Fokkema, T.; de Jong-Gierveld, J.; Nijkamp, P.

    1996-01-01

    In many European countries, data on geographical patterns of internal elderly migration show that the elderly (55+) are more likely to leave than to move to the big cities. Besides emphasising the attractive features of the destination areas (pull factors), it is often assumed that this negative

  3. Measuring the Distance of Moving Objects from Big Trajectory Data

    Directory of Open Access Journals (Sweden)

    Khaing Phyo Wai

    2017-03-01

    Location-based services have become important in social networking, mobile applications, advertising, traffic monitoring, and many other domains. The growth of location-sensing devices has led to the generation of vast amounts of dynamic spatio-temporal data in the form of moving-object trajectories, which can be characterized as big trajectory data. Big trajectory data opens up opportunities such as analyzing groups of moving objects. To support such analyses, this work seeks a distance measurement method that respects both the geographic distance and the semantic similarity of each trajectory. Measuring similarity between moving objects is difficult because not only do their positions change, but their semantic features vary as well. In this research, a method to measure trajectory similarity based on both the geographical and semantic features of motion is proposed. Finally, the proposed methods are evaluated on a real trajectory dataset.
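    The record gives no code; as a rough illustration of the idea of blending geographic and semantic similarity, here is a minimal Python sketch. The haversine distance, the Jaccard overlap of semantic labels, and the weighting parameters alpha and scale_km are assumptions for illustration, not the authors' method.

    ```python
    import math

    def haversine_km(p, q):
        """Great-circle distance in km between two (lat, lon) points in degrees."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * math.asin(math.sqrt(a))

    def geographic_distance_km(traj_a, traj_b):
        """Mean point-to-point distance between two equally sampled trajectories."""
        pairs = list(zip(traj_a, traj_b))
        return sum(haversine_km(p, q) for p, q in pairs) / len(pairs)

    def semantic_similarity(tags_a, tags_b):
        """Jaccard overlap between the sets of semantic labels visited (e.g. 'home', 'market')."""
        a, b = set(tags_a), set(tags_b)
        return len(a & b) / len(a | b) if (a | b) else 1.0

    def combined_similarity(traj_a, tags_a, traj_b, tags_b, alpha=0.5, scale_km=1.0):
        """Blend geographic closeness and semantic overlap; alpha weights the two parts."""
        geo_sim = 1.0 / (1.0 + geographic_distance_km(traj_a, traj_b) / scale_km)
        return alpha * geo_sim + (1.0 - alpha) * semantic_similarity(tags_a, tags_b)

    # Two short (lat, lon) trajectories with semantic labels attached to their stops.
    t1, tags1 = [(16.80, 96.15), (16.81, 96.16)], ["home", "market"]
    t2, tags2 = [(16.80, 96.15), (16.82, 96.17)], ["home", "school"]
    print(round(combined_similarity(t1, tags1, t2, tags2), 3))
    ```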

  4. Passport to the Big Bang moves across the road

    CERN Document Server

    Corinne Pralavorio

    2015-01-01

    The ATLAS platform of the Passport to the Big Bang circuit has been relocated in front of the CERN Reception.   The ATLAS platform of the Passport to the Big Bang, outside the CERN Reception building. The Passport to the Big Bang platform of the ATLAS Experiment has been moved in front of the CERN Reception to make it more visible and accessible. It had to be dismantled and moved from its previous location in the garden of the Globe of Science and Innovation due to the major refurbishment work in progress on the Globe, and is now fully operational in its new location on the other side of the road, in the Main Reception car-park. The Passport to the Big Bang circuit, inaugurated in 2013, comprises ten platforms installed in front of ten CERN sites and aims to help local residents and visitors to the region understand CERN's research. Dedicated Passport to the Big Bang flyers, containing all necessary information and riddles for you to solve, are available at the CERN Rec...

  5. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe we see now was forged in a Big Bang where conditions, and the rules themselves, were very different, and those early moments were crucial in determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  6. Big Data Caching for Networking: Moving from Cloud to Edge

    OpenAIRE

    Zeydan, Engin; Baştuğ, Ejder; Bennis, Mehdi; Kader, Manhal Abdel; Karatepe, Alper; Er, Ahmet Salih; Debbah, Mérouane

    2016-01-01

    In order to cope with the relentless data tsunami in 5G wireless networks, current approaches such as acquiring new spectrum, deploying more base stations (BSs) and increasing nodes in mobile packet core networks are becoming ineffective in terms of scalability, cost and flexibility. In this regard, context-aware 5G networks with edge/cloud computing and exploitation of big data analytics can yield significant gains to mobile operators. In this article, proactive content caching in...

  7. Big Bang Day: 5 Particles - 3. The Anti-particle

    CERN Multimedia

    Franck Close

    2008-01-01

    Simon Singh looks at the stories behind the discovery of 5 of the universe's most significant subatomic particles: the Electron, the Quark, the Anti-particle, the Neutrino and the "next particle". 3. The Anti-particle. It appears to be the stuff of science fiction. Associated with every elementary particle is an antiparticle which has the same mass and opposite charge. Should the two meet and combine, the result is annihilation - and a flash of light. Thanks to mysterious processes that occurred after the Big Bang there are a vastly greater number of particles than anti-particles. So how could their elusive existence be proved? At CERN particle physicists are crashing together subatomic particles at incredibly high speeds to create antimatter, which they hope will finally reveal what happened at the precise moment of the Big Bang to create the repertoire of elementary particles and antiparticles in existence today.

  8. Big Bang Day : Afternoon Play - Torchwood: Lost Souls

    CERN Multimedia

    2008-01-01

    Martha Jones, ex-time traveller and now working as a doctor for a UN task force, has been called to CERN where they're about to activate the Large Hadron Collider. Once activated, the Collider will fire beams of protons together recreating conditions a billionth of a second after the Big Bang - and potentially allowing the human race a greater insight into what the Universe is made of. But so much could go wrong - it could open a gateway to a parallel dimension, or create a black hole - and now voices from the past are calling out to people and scientists have started to disappear... Where have the missing scientists gone? What is the secret of the glowing man? What is lurking in the underground tunnel? And do the dead ever really stay dead? Lost Souls is a spin-off from the award-winning BBC Wales TV production Torchwood. It stars John Barrowman, Freema Agyeman, Eve Myles, Gareth David-Lloyd, Lucy Montgomery (of Titty Bang Bang) and Stephen Critchlow.

  9. How They Move Reveals What Is Happening: Understanding the Dynamics of Big Events from Human Mobility Pattern

    Directory of Open Access Journals (Sweden)

    Jean Damascène Mazimpaka

    2017-01-01

    The context in which a moving object moves contributes to the movement pattern observed. Likewise, the movement pattern reflects the properties of the movement context. In particular, big events influence human mobility depending on the dynamics of the events. However, this influence has not been explored to understand big events. In this paper, we propose a methodology for learning about big events from human mobility patterns. The methodology involves extracting and analysing the stopping, approaching, and moving-away interactions between public transportation vehicles and the geographic context. The analysis is carried out at two different temporal granularity levels to discover global and local patterns. The results of evaluating this methodology on bus trajectories demonstrate that it can discover occurrences of big events from mobility patterns, roughly estimate the event start and end time, and reveal the temporal patterns of arrival and departure of event attendees. This knowledge can be usefully applied in transportation and event planning and management.
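    The abstract does not specify how the three interaction types are computed; the following Python sketch shows one plausible way to label them from raw GPS fixes. The speed and distance thresholds, the projected coordinates, and the extra 'passing' label are assumptions for illustration, not details taken from the paper.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Fix:
        t: float   # timestamp in seconds
        x: float   # easting in metres (projected coordinates, e.g. UTM)
        y: float   # northing in metres

    def dist_to(f: Fix, venue: tuple) -> float:
        """Straight-line distance from one GPS fix to the event venue."""
        return ((f.x - venue[0]) ** 2 + (f.y - venue[1]) ** 2) ** 0.5

    def classify_interactions(track: list, venue: tuple, stop_speed=0.5, trend_eps=5.0):
        """Label each consecutive pair of fixes as 'stopping', 'approaching',
        'moving-away' or 'passing' relative to an event venue."""
        labels = []
        for a, b in zip(track, track[1:]):
            dt = max(b.t - a.t, 1e-9)
            speed = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5 / dt   # m/s
            delta = dist_to(b, venue) - dist_to(a, venue)               # change in distance to venue
            if speed < stop_speed:
                labels.append("stopping")
            elif delta < -trend_eps:
                labels.append("approaching")
            elif delta > trend_eps:
                labels.append("moving-away")
            else:
                labels.append("passing")
        return labels

    # Counting 'stopping' labels near a venue per time bin would yield the kind of
    # arrival/departure pattern over time that the paper analyses.
    track = [Fix(0, 0, 0), Fix(30, 100, 0), Fix(60, 150, 0), Fix(90, 151, 0)]
    print(classify_interactions(track, venue=(200.0, 0.0)))   # ['approaching', 'approaching', 'stopping']
    ```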

  10. Use and selection of bridges as day roosts by Rafinesque's Big Eared Bats.

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Frances, M.; Loeb, Susan, C.; Bunch, Mary, S.; Bowerman, William, W.

    2008-03-01

    Rafinesque’s big-eared bats (Corynorhinus rafinesquii) use bridges as day roosts in parts of their range, but information on bridge use across their range is lacking. From May to Aug. 2002 we surveyed 1129 bridges (12.5%) within all 46 counties of South Carolina to determine use and selection of bridges as day roosts by big-eared bats and to document their distribution across the state. During summer 2003, we visited 235 bridges in previously occupied areas of the state to evaluate short-term fidelity to bridge roosts. We found colonies and solitary big-eared bats beneath 38 bridges in 2002 and 54 bridges in 2003. Construction type and size of bridges strongly influenced use in both years; bats selected large, concrete girder bridges and avoided flat-bottomed slab bridges. The majority of occupied bridges (94.7%) were in the Upper and Lower Coastal Plains, but a few bridges (5.3%) were located in the Piedmont. Rafinesque’s big-eared bats were absent beneath bridges in the Blue Ridge Mountains. We established new records of occurrence for 10 counties. In the Coastal Plains, big-eared bats exhibited a high degree of short-term fidelity to roosts in highway bridges. For bridges that were occupied at least once, mean frequency of use was 65.9%. Probability of finding bats under a bridge ranged from 0.46 to 0.73 depending on whether the bridge was occupied in the previous year. Thus, bridges should be inspected three to five times in a given year to determine whether they are being used. Regional bridge roost surveys may be a good method for determining the distribution of C. rafinesquii, particularly in the Coastal Plains, and protection of suitable bridges may be a viable conservation strategy where natural roost sites are limited.
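    As a back-of-the-envelope check (not part of the paper), assume each visit to a bridge that bats are using detects them independently with a per-visit probability p in the reported 0.46 to 0.73 range; the chance of detecting use at least once in n visits is then 1 - (1 - p)^n. With p = 0.46, three visits give about 0.84 and five visits about 0.95; with p = 0.73, even three visits give about 0.98, which is consistent with the recommendation to inspect bridges three to five times per year.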

  11. Privacy as virtue : Moving beyond the individual in the age of big data

    NARCIS (Netherlands)

    van der Sloot, B.

    2017-01-01

    This book discusses whether a rights-based approach to privacy regulation still suffices to address the challenges triggered by new data processing techniques such as Big Data and mass surveillance. A rights-based approach generally grants subjective rights to individuals to protect their personal

  12. Artificial intelligence and big data management: the dynamic duo for moving forward data centric sciences

    OpenAIRE

    Vargas Solar, Genoveva

    2017-01-01

    After vivid discussions led by the emergence of the buzzword “Big Data”, it seems that industry and academia have reached an objective understanding about data properties (volume, velocity, variety, veracity and value), the resources and “know how” it requires, and the opportunities it opens. Indeed, new applications promising fundamental changes in society, industry and science, include face recognition, machine translation, digital assistants, self-driving cars, ad-serving, chat-bots, perso...

  13. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  14. Think big, start small, move fast a blueprint for transformation from the Mayo Clinic Center for Innovation

    CERN Document Server

    LaRusso, Nicholas; Farrugia, Gianrico

    2015-01-01

    The Only Innovation Guide You Will Ever Need--from the Award-Winning Minds at Mayo Clinic. A lot of businesspeople talk about innovation, but few companies have achieved the level of truly transformative innovation as brilliantly--or as famously--as the legendary Mayo Clinic. Introducing Think Big, Start Small, Move Fast, the first innovation guide based on the proven, decade-long program that’s made Mayo Clinic one of the most respected and successful organizations in the world. This essential must-have guide shows you how to: Inspire and ignite trailblazing innovation in your workplace Design a new business model that’s creative, collaborative, and sustainable Apply the traditional scientific method to the latest innovations in "design thinking" Build a customized toolkit of the best practices, project portfolios, and strategies Increase your innovation capacity--and watch how quickly you succeed These field-tested techniques grew out of the health care industry but are designed ...

  15. Dog days of summer: Influences on decision of wolves to move pups

    Science.gov (United States)

    Ausband, David E.; Mitchell, Michael S.; Bassing, Sarah B.; Nordhagen, Matthew; Smith, Douglas W.; Stahler, Daniel R.

    2016-01-01

    For animals that forage widely, protecting young from predation can span relatively long time periods due to the inability of young to travel with and be protected by their parents. Moving relatively immobile young to improve access to important resources, limit detection of concentrated scent by predators, and decrease infestations by ectoparasites can be advantageous. Moving young, however, can also expose them to increased mortality risks (e.g., accidents, getting lost, predation). For group-living animals that live in variable environments and care for young over extended time periods, the influence of biotic factors (e.g., group size, predation risk) and abiotic factors (e.g., temperature and precipitation) on the decision to move young is unknown. We used data from 25 satellite-collared wolves (Canis lupus) in Idaho, Montana, and Yellowstone National Park to evaluate how these factors could influence the decision to move pups during the pup-rearing season. We hypothesized that litter size, the number of adults in a group, and perceived predation risk would positively affect the number of times gray wolves moved pups. We further hypothesized that wolves would move their pups more often when it was hot and dry to ensure sufficient access to water. Contrary to our hypothesis, monthly temperature above the 30-year average was negatively related to the number of times wolves moved their pups. Monthly precipitation above the 30-year average, however, was positively related to the amount of time wolves spent at pup-rearing sites after leaving the natal den. We found little relationship between risk of predation (by grizzly bears, humans, or conspecifics) or group and litter sizes and number of times wolves moved their pups. Our findings suggest that abiotic factors most strongly influence the decision of wolves to move pups, although responses to unpredictable biotic events (e.g., a predator encountering pups) cannot be ruled out.

  16. The Ethical Imperative to Move to a Seven-Day Care Model.

    Science.gov (United States)

    Bell, Anthony; McDonald, Fiona; Hobson, Tania

    2016-06-01

    Whilst the nature of human illness is not determined by time of day or day of week, we currently structure health service delivery around a five-day delivery model. At least one country is endeavouring to develop a systems-based approach to planning a transition from five- to seven-day healthcare delivery models, and some services are independently instituting program reorganization to achieve these ends as research, amongst other things, highlights increased mortality and morbidity for weekend and after-hours admissions to hospitals. In this article, we argue that this issue does not merely raise instrumental concerns but also opens up a normative ethical dimension, recognizing that clinical ethical dilemmas are impacted on and created by systems of care. Using health policy ethics, we critically examine whether our health services, as currently structured, are at odds with ethical obligations for patient care and broader collective goals associated with the provision of publicly funded health services. We conclude by arguing that a critical health policy ethics perspective applying relevant ethical values and principles needs to be included when considering whether and how to transition from five-day to seven-day models for health delivery.

  17. Move! Eat better: Do you walk more than 10,000 steps a day?

    CERN Multimedia

    2013-01-01

    All of you who borrowed a pedometer from the Infirmary will confirm that it was a useful experience and that, in order to reach the recommended daily tally of 10,000 paces, you need to add about 30 minutes of exercise to your daily routine, which equates to about 4,000 extra paces. Why 10,000 steps? The slogan “10,000 steps a day” comes from Japan. But the number wasn’t just pulled from a hat: it’s based on a study by Dr Yoshiro Hatano. Walking is sacrosanct in Japanese society, where many people use pedometers to count the number of steps they walk every day. Dr Hatano demonstrated that people who walk enough every day have improved health. In other words, you have to burn at least 200 extra calories per day through exercise in order to start reaping the benefits in terms of improved health. A 30-minute walk roughly corresponds to the 4,000 extra steps which are recommended but rarely taken. Is 10,000 steps an achie...
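    As a rough, hedged sanity check (the conversion factors are assumptions, not from the article): at a typical walking cadence of roughly 130 steps per minute, a 30-minute walk comes to about 3,900 steps, in line with the 4,000 extra paces quoted; and at roughly 0.05 kcal per step, those 4,000 steps burn on the order of 200 extra calories, matching the figure given above.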

  18. Constraining Big Hurricanes: Remotely sensing Galveston Islands' changing coastal landscape from days to millennia

    Science.gov (United States)

    Dougherty, A. J.; Choi, J. H.; Heo, S.; Dosseto, A.

    2017-12-01

    Climate change models forecast increased storm intensity, which will drive coastal erosion as sea-level rise accelerates with global warming. Over the last five years, the largest hurricanes ever recorded in the Pacific (Patricia) and the Atlantic (Irma) occurred, along with the devastation of Harvey. The preceding decade was marked by Super Storm Sandy, Katrina and Ike. A century prior, the deadliest natural disaster in North America occurred as a category 4 hurricane known as 'The 1900 Storm' hit Galveston Island. This research aims to contextualize the impact of storms long before infrastructure and historical/scientific accounts documented erosion. Unlike the majority of barrier islands in the US, Galveston built seaward over the Holocene. As the beach prograded it preserved a history of storms and shoreline change over millennia to the present day. These systems (called prograded barriers) were first studied over 50 years ago using topographic profiles, sediment cores and radiocarbon dating. This research revisits some of these benchmark study sites to augment existing data utilizing state-of-the-art Light Detection and Ranging (LiDAR), Ground Penetrating Radar (GPR), and Optically Stimulated Luminescence (OSL) techniques. In 2016 GPR and OSL data were collected from Galveston Island, with the aim to combine GPR, OSL and LiDAR (GOaL) to extract a high-resolution geologic record spanning 6,000 years. The resulting millennia-scale coastal evolution can be used to contextualize the impact of historic hurricanes over the past century ('The 1900 Storm'), decade (Ike in 2008) and year (now with Harvey). Preliminary results reveal a recent change in shoreline behaviour, and data from Harvey are currently being assessed within the perspective of these initial findings. This dataset will be discussed with respect to the other two benchmark prograded barriers studied in North America: Nayarit Barrier (Mexico) that Hurricane Patricia passed directly over in 2013 and

  19. Astroinformatics: the big data of the universe

    OpenAIRE

    Barmby, Pauline

    2016-01-01

    In astrophysics we like to think that our field was the originator of big data, back when it had to be carried around in big sky charts and books full of tables. These days, it's easier to move astrophysics data around, but we still have a lot of it, and upcoming telescope facilities will generate even more. I discuss how astrophysicists approach big data in general, and give examples from some Western Physics & Astronomy research projects. I also give an overview of ho...

  20. Scientists Have Feelings, Too. We Can Connect Emotions and Intellect to Reach Big Audiences, and Then Move Them.

    Science.gov (United States)

    Tenenbaum, L. F.

    2016-12-01

    Most of us in the science community have been rewarded for our brainpower throughout our lives. Therefore we've learned to rely heavily on our intellectual skills, even though we're clearly much more than disembodied brainiacs. What would happen if we tried to combine our emotions and intellect to communicate with the public? Would it foster deeper connections? NASA's Earth Right Now blog author Laura Faye Tenenbaum noticed that science communication sometimes flat-lines at "Wow," as if that's the only acceptable emotion we can feel. And even though emotions such as anger, frustration, disappointment, or joy are messy and uncomfortable, and even though we sometimes worry that showing too much playfulness might prevent our scientific messages from being taken seriously, we decided that writing personal and emotional science stories would be worth a try. In January 2014, Tenenbaum brought her teaching and public speaking experience and her entertainment industry communication knowledge to the Earth Right Now blog. It quickly became one of NASA's most popular, most heavily commented and most shared blogs. We found that, when we combined "feeling" and "thinking," communication thrived and we were able to reach a much wider audience. We learned that to move people, we could begin by tapping into what we were moved by. We saw how lifting the mask of the starchy, nerdy stereotype to show that scientists are real people who experience deep emotions on the job is a best practice for enhancing the public perception of science. As a case study on the effect of using emotional content in communication, the Earth Right Now blog showed that it's not only possible but actually beneficial to include complex and authentic emotion in solid science communication.

  1. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements on data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  2. Early processing variations in selective attention to the color and direction of moving stimuli during 30 days head-down bed rest

    Science.gov (United States)

    Wang, Lin-Jie; He, Si-Yang; Niu, Dong-Bin; Guo, Jian-Ping; Xu, Yun-Long; Wang, De-Sheng; Cao, Yi; Zhao, Qi; Tan, Cheng; Li, Zhi-Li; Tang, Guo-Hua; Li, Yin-Hui; Bai, Yan-Qiang

    2013-11-01

    Dynamic variations in early selective attention to the color and direction of moving stimuli were explored during a 30-day period of head-down bed rest. Event-related potentials (ERPs) were recorded at F5, F6, P5, P6 scalp locations in seven male subjects who attended to pairs of bicolored light-emitting diodes that flashed sequentially to produce a perception of movement. Subjects were required to attend selectively to a critical feature of the moving target, e.g., color or direction. The tasks included a no-response task, a color-selective response task, a moving-direction-selective response task, and a combined color-direction selective response task. Subjects were asked to perform these four tasks on the 3rd day before bed rest; the 3rd, 15th and 30th days during bed rest; and the 5th day after bed rest. Subjects responded more quickly in the color task than in the moving-direction and combined color-direction tasks, and their reaction times were longer on the 15th and 30th days of bed rest after relatively quicker responses on the 3rd day. Using the event-related potentials technique, we found that in the color-selective response task, the mean amplitudes of P1 and N1 for target ERPs decreased on the 3rd day of bed rest and the 5th day after bed rest in comparison with pre-bed rest and the 15th and 30th days of bed rest. In the combined color-direction selective response task, the P1 latencies for target ERPs on the 3rd and 30th days of bed rest were longer than on the 15th day. As the 3rd day of bed rest falls in the acute adaptation period and the 30th day in the relative adaptation stage of head-down bed rest, the results help to clarify the effects of bed rest on different task loads and patterns of attention. It was suggested that subjects needed more time to make correct decisions in the head-down-tilt bed rest state. A difficulty in the recruitment of brain resources was found in the feature selection task

  3. A 90-day Bundled Payment for Primary Single-level Lumbar Discectomy/Decompression: What Does "Big Data" Say?

    Science.gov (United States)

    Jain, Nikhil; Virk, Sohrab S; Phillips, Frank M; Yu, Elizabeth; Khan, Safdar N

    2018-04-01

    Episode-based bundling may become the major form of reimbursement for many elective spine procedures. As the amount for a 90-day episode of care is not known for a lumbar discectomy, we analyzed the previous reimbursements from Commercial payers (2007-Q2 2015), Medicare Advantage (2007-Q2 2015), and Medicare (2005-2012) for a primary single-level lumbar discectomy/decompression. Distribution of payments among various service providers was studied and a 90-day bundle was simulated. Depending on the payer type, the average facility costs constituted 59.7% to 73.6% of total payments, followed by surgeon's fees, which accounted for 13.7% to 18.5%. Postacute services made up 8.8% to 15.8% of the total reimbursement. Surgeries performed in the inpatient setting were significantly more expensive as compared with surgeries performed in the outpatient setting (P<0.01). The average 90-day bundle amount was estimated at $11,091, $6571, and $6239 for Commercial payers, Medicare Advantage, and Medicare, respectively. Overall, service providers in the Southern region were reimbursed the lowest from Commercial payers and Medicare, compared with other regions. Postacute services are not as major cost drivers after discectomy as after total joint arthroplasty or hip fracture repair.
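    As a purely illustrative calculation (not a table from the paper), the reported payment shares can be applied to the average Commercial 90-day bundle to see the implied dollar ranges per provider type; the shares vary by payer, so pairing the full ranges with the Commercial average is an assumption.

    ```python
    # Hypothetical illustration: implied dollar ranges per provider type for the
    # average Commercial 90-day bundle of $11,091, using the payment shares
    # reported across payers (facility 59.7-73.6%, surgeon 13.7-18.5%,
    # post-acute 8.8-15.8%).
    bundle = 11_091
    shares = {
        "facility":   (0.597, 0.736),
        "surgeon":    (0.137, 0.185),
        "post-acute": (0.088, 0.158),
    }
    for provider, (low, high) in shares.items():
        print(f"{provider:>10}: ${bundle * low:,.0f} - ${bundle * high:,.0f}")
    ```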

  4. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  5. Moving Forward, Looking Back--Historical Perspective, "Big History" and the Return of the "Longue Durée": Time to Develop Our Scale Hopping Muscles

    Science.gov (United States)

    Hawkey, Kate

    2015-01-01

    "Big history" is a term receiving a great deal of attention at present, particularly in North America where considerable sums of money have been invested in designing curricula and assessment tools to help teachers teach history at far larger scales of time than normal. Hawkey considers the pros and cons of incorporating components of…

  6. Moving from reclusion to partial freedom: the experience of family caregivers for disabled elderly persons assisted in a day care center.

    Science.gov (United States)

    Bocchi, Silvia Cristina Mangini; Cano, Karen Cristina Urtado; Baltieri, Lilian; Godoy, Daniele Cristina; Spiri, Wilza Carla; Juliani, Carmen Maria Casquel Monti

    2010-09-01

    This study aimed at understanding the interactional experience between family caregivers and disabled elderly persons supported in a Day Care Center, according to the caregivers' perspective. It also aimed at developing a representative theoretical model for the events experienced by such caregivers. Grounded Theory was used as the methodological framework, whereas Symbolic Interactionism served as the theoretical framework. Observation and interviews were used for data collection. The following phenomenon arose from the results: feeling supported by the Day Care Center, by the strength of the bond with the elderly person and by spirituality in order to continue playing the challenging role of a family caregiver for a disabled elderly person. The study made it possible to understand that, among these three supporting cornerstones for coping with the burden generated by the family caregiver role, the care model promoted by the Day Care Center was the intervening variable in the process of improving the quality of life of the family caregiver and disabled elderly person dyad. This allowed the identification of the main category: moving from reclusion to partial freedom, the experience of family caregivers for disabled elderly persons assisted in a Day Care Center.

  7. Big Bang Day: Engineering Solutions

    CERN Multimedia

    Lyn Evans; Austin Ball; Jim Virdee; Adam Hart-Davis

    2008-01-01

    CERN's Large Hadron Collider is the most complicated scientific apparatus ever built. Many of the technologies it uses hadn't even been invented when scientists started building it. Adam Hart-Davis discovers what it takes to build the world's most intricate discovery machine.

  8. Big Bang Day : Physics Rocks

    CERN Multimedia

    Brian Cox; John Barrowman; Eddie Izzard

    2008-01-01

    Is particle physics the new rock 'n' roll? The fundamental questions about the nature of the universe that particle physics hopes to answer have attracted the attention of some very high profile and unusual fans. Alan Alda, Ben Miller, Eddie Izzard, Dara O'Briain and John Barrowman all have interests in this branch of physics. Brian Cox - CERN physicist, and former member of 90's band D:Ream, tracks down some very well known celebrity enthusiasts and takes a light-hearted look at why this subject can appeal to all of us.

  9. DPF Big One

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  10. DPF Big One

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1993-01-15

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  11. Big moving day for biodiversity? A macroecological assessment of the scope for assisted colonization as a conservation strategy under global warming

    Science.gov (United States)

    Svenning, Jens-Christian; Fløjgaard, Camilla; Morueta-Holme, Naia; Lenoir, Jonathan; Normand, Signe; Skov, Flemming

    2009-11-01

    Future climate change constitutes a major threat to Earth's biodiversity. If anthropogenic greenhouse gas emissions continue unabated, 21st century climate change is likely to exceed the natural adaptive capacity of many natural ecosystems and a large proportion of species may risk extinction. A recurrent finding is that the degree of negative impact depends strongly on the dispersal potential of the species. However, there is a growing realization that many, if not most species would be unlikely to disperse as fast and far as required. As a consequence, it has been proposed that species at risk should be actively translocated into unoccupied, but environmentally suitable areas that are likely to stay suitable over the next 100 or more years (assisted colonization or assisted migration). This solution is controversial, though, reflecting negative experiences with introduced exotics and probably also the traditional emphasis in conservation management on preserving a certain local, often historical situation with a static species composition, and a tendency among ecologists to think of biological communities as generally saturated with species. Using the European flora as a case study, we here estimate the main environmental controls of plant species richness, assess how the maximum observed species richness depends on these environmental controls, and based here on estimate how many species could at least be added to an area before further species additions would perhaps inevitably lead to corresponding losses locally. Our results suggest that there is substantial room for additional plant species across most areas of Europe, indicating that there is considerable scope for implementing assisted colonization as a proactive conservation strategy under global warming without necessarily implicating negative effects on the native flora in the areas targeted for establishment of translocated populations. Notably, our results suggest that 50% of the cells in Northern Europe, the likely target area for many translocations, could harbor at least 1/3 as many additional species as they have native species. However, we also emphasize that other, more traditional conservation strategies should also be strengthened, notably providing more space for nature and reducing nitrogen deposition to increase population resilience and facilitate unassisted colonization. Furthermore, any implementation of assisted colonization should be done cautiously, with a careful analysis on a species-by-species case.

  12. Big moving day for biodiversity? A macroecological assessment of the scope for assisted colonization as a conservation strategy under global warming

    International Nuclear Information System (INIS)

    Svenning, Jens-Christian; Floejgaard, Camilla; Morueta-Holme, Naia; Lenoir, Jonathan; Normand, Signe; Skov, Flemming

    2009-01-01

    Future climate change constitutes a major threat to Earth's biodiversity. If anthropogenic greenhouse gas emissions continue unabated, 21st century climate change is likely to exceed the natural adaptive capacity of many natural ecosystems and a large proportion of species may risk extinction. A recurrent finding is that the degree of negative impact depends strongly on the dispersal potential of the species. However, there is a growing realization that many, if not most species would be unlikely to disperse as fast and far as required. As a consequence, it has been proposed that species at risk should be actively translocated into unoccupied, but environmentally suitable areas that are likely to stay suitable over the next 100 or more years (assisted colonization or assisted migration). This solution is controversial, though, reflecting negative experiences with introduced exotics and probably also the traditional emphasis in conservation management on preserving a certain local, often historical situation with a static species composition, and a tendency among ecologists to think of biological communities as generally saturated with species. Using the European flora as a case study, we here estimate the main environmental controls of plant species richness, assess how the maximum observed species richness depends on these environmental controls, and based here on estimate how many species could at least be added to an area before further species additions would perhaps inevitably lead to corresponding losses locally. Our results suggest that there is substantial room for additional plant species across most areas of Europe, indicating that there is considerable scope for implementing assisted colonization as a proactive conservation strategy under global warming without necessarily implicating negative effects on the native flora in the areas targeted for establishment of translocated populations. Notably, our results suggest that 50% of the cells in Northern Europe, the likely target area for many translocations, could harbor at least 1/3 as many additional species as they have native species. However, we also emphasize that other, more traditional conservation strategies should also be strengthened, notably providing more space for nature and reducing nitrogen deposition to increase population resilience and facilitate unassisted colonization. Furthermore, any implementation of assisted colonization should be done cautiously, with a careful analysis on a species-by-species case.

  13. Big moving day for biodiversity? A macroecological assessment of the scope for assisted colonization as a conservation strategy under global warming

    DEFF Research Database (Denmark)

    Svenning, J.-C.; Fløjgaard, Camilla; Morueta-Holme, Naia

    2009-01-01

    to corresponding losses locally. Our results suggest that there is substantial room for additional plant species across most areas of Europe, indicating that there is considerable scope for implementing assisted colonization as a proactive conservation strategy under global warming without necessarily implicating...

  14. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    ... modern astronomy requires big data know-how; in particular, it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing ... and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications...

  15. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  16. The Big-Wheel TGC-1 being moved against the Barrel Muon Spectrometer. The 216 trigger chambers are supported by a thin structure of 22 m diameter and 0.4 m thickness, weighing 44 tons and supported on two rails.

    CERN Multimedia

    Claudia Marcelloni

    2006-01-01

    The Big-Wheel TGC-1 being moved against the Barrel Muon Spectrometer. The 216 trigger chambers are supported by a thin structure of 22 m diameter and 0.4 m thickness, weighing 44 tons and supported on two rails.

  17. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  18. Age differences in big five behavior averages and variabilities across the adult life span: moving beyond retrospective, global summary accounts of personality.

    Science.gov (United States)

    Noftle, Erik E; Fleeson, William

    2010-03-01

    In 3 intensive cross-sectional studies, age differences in behavior averages and variabilities were examined. Three questions were posed: Does variability differ among age groups? Does the sizable variability in young adulthood persist throughout the life span? Do past conclusions about trait development, based on trait questionnaires, hold up when actual behavior is examined? Three groups participated: young adults (18-23 years), middle-aged adults (35-55 years), and older adults (65-81 years). In 2 experience-sampling studies, participants reported their current behavior multiple times per day for 1- or 2-week spans. In a 3rd study, participants interacted in standardized laboratory activities on 8 occasions. First, results revealed a sizable amount of intraindividual variability in behavior for all adult groups, with average within-person standard deviations ranging from about half a point to well over 1 point on 6-point scales. Second, older adults were most variable in Openness, whereas young adults were most variable in Agreeableness and Emotional Stability. Third, most specific patterns of maturation-related age differences in actual behavior were more greatly pronounced and differently patterned than those revealed by the trait questionnaire method. When participants interacted in standardized situations, personality differences between young adults and middle-aged adults were larger, and older adults exhibited a more positive personality profile than they exhibited in their everyday lives.
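    The intraindividual variability reported above is essentially a within-person standard deviation of repeated state reports. The following Python sketch (not the authors' code; the column names and values are invented) shows how such a quantity could be computed from experience-sampling data with pandas.

    ```python
    import pandas as pd

    # Invented experience-sampling data: repeated 6-point state ratings per person.
    reports = pd.DataFrame({
        "person":    [1, 1, 1, 2, 2, 2],
        "age_group": ["young", "young", "young", "older", "older", "older"],
        "openness":  [4.0, 5.5, 3.0, 4.5, 4.0, 4.5],
    })

    # Within-person standard deviation of the Openness state for each participant...
    within_sd = reports.groupby(["age_group", "person"])["openness"].std()

    # ...then averaged within each age group to compare intraindividual variability.
    print(within_sd.groupby(level="age_group").mean())
    ```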

  19. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  20. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Big data refers to data that exceeds conventional storage capacity and processing power. The term is used for data sets so large or complex that traditional data management tools cannot handle them. The size threshold for big data is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; on social networking sites, for example, the amount of data produced by people grows rapidly every year. Big data is not only data: it has become a complete subject in its own right, encompassing various tools, techniques and frameworks, and it covers the rapid growth and evolution of both structured and unstructured data. Big data denotes a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of massive scale. Such data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, and instead calls for massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to acquire, organize and analyse these various types of data. In this paper we describe applications, problems and tools of big data and give an overview of the field.

  1. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  2. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation preceding Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  3. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  4. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  5. People on the Move

    Science.gov (United States)

    Mohan, Audrey

    2018-01-01

    The purpose of this 2-3 day lesson is to introduce students in Grades 2-4 to the idea that people move around the world for a variety of reasons. In this activity, students explore why people move through class discussion, a guided reading, and interviews. The teacher elicits student ideas using the compelling question (Dimension 1 of the C3…

  6. The Case for "Big History."

    Science.gov (United States)

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  7. The Moving image

    DEFF Research Database (Denmark)

    Hansen, Lennard Højbjerg

    2014-01-01

    Every day we are presented with bodily expressions in audiovisual media – by anchors, journalists and characters in films for instance. This article explores how body language in the moving image has been and can be approached in a scholarly manner.

  8. Big Bang Day : The Great Big Particle Adventure - 1. Atom

    CERN Multimedia

    Steven Weinberg; Terry White; John Ellis; Jim Virdee

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. The notion of atoms dates back to Greek philosophers who sought a natural mechanical explanation of the Universe, as opposed to a divine one. The existence of what we call chemical atoms, the constituents of all we see around us, wasn't proved until a hundred years ago, but almost simultaneously it was realised these weren't the indivisible constituents the Greeks envisaged. Much of the story of physics since then has been the ever-deeper probing of matter until, at the end of the 20th century, a complete list of fundamental ingredients had been identified, apart from one, the much discussed Higgs particle. In this programme, Ben finds out why this last particle is so pivotal, not just to atomic theory, but to our very existence - and how hopeful the scientists are of proving its existence.

  9. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  10. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  11. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  12. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  13. How to Move in a Jostling Crowd

    Indian Academy of Sciences (India)

    For people living in big cities it is an ordeal to walk in a bus stand or a railway station. They get stuck helplessly in a crowd. They are simply pushed around and all their efforts to move forward appear futile. Only the most energetic can wade through the constantly moving sea of people. How about the weaker ones?

  14. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions

  15. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  16. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)]

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors and an attempt is made to tackle the problem that participation in a public inquiry is far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  17. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of the bachelor thesis is to describe the Big Data issue and the OLAP aggregation operations for decision support, which are applied to it using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and the issues involved in their implementation. An overall evaluation of the work follows, together with possibilities for future use of the resulting system.

  18. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative methods and technologies to capture, store, distribute, manage and analyse petabyte-sized or larger sets of data with high velocity and diverse structures. Big data can be structured, semi-structured or unstructured, rendering conventional data management techniques inadequate. Data is generated from many different sources and can arrive in the system at different rates. In order to handle this...

  19. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  20. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  1. Move up,Move out

    Institute of Scientific and Technical Information of China (English)

    Guo Yan

    2007-01-01

    China has already become the world's largest manufacturer of cement, copper and steel. Chinese producers have moved onto the world stage and dominated the global consumer market, from textiles to electronics, with amazing speed and efficiency.

  2. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to contain the seed of new, valuable operational insights for private companies and public organisations. While the optimistic announcements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (Dansk AlmenMedicinsk Database, DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off involving not only economic rationales and quality considerations, but also control over sensitive personal data and the ethical implications for the citizen. In the DAMD case, data is, on the one hand, used 'in the service of a good cause' to...

  3. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  4. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical prediction for the abundances of D, ³He, ⁴He, and ⁷Li is discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and ⁴He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, ⁷Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed
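
    The propagation of cross-section uncertainties into abundance uncertainties described above is, at heart, a Monte Carlo exercise: sample each reaction rate within its quoted error and re-evaluate the predicted abundance. The sketch below illustrates the idea only; the reaction names, sensitivity coefficients and uncertainties are invented placeholders, not values taken from this review.

```python
import random

random.seed(1)

def sample_abundance(baseline, sensitivities, rate_uncertainties, n_samples=10_000):
    """Monte Carlo propagation: perturb each reaction rate within its fractional
    uncertainty and apply a linearized response to the predicted abundance."""
    draws = []
    for _ in range(n_samples):
        y = baseline
        for reaction, s in sensitivities.items():
            delta = random.gauss(0.0, rate_uncertainties[reaction])  # fractional rate shift
            y *= (1.0 + s * delta)                                   # linearized abundance response
        draws.append(y)
    mean = sum(draws) / n_samples
    spread = (sum((d - mean) ** 2 for d in draws) / n_samples) ** 0.5
    return mean, spread

if __name__ == "__main__":
    # Hypothetical numbers for a D/H-like abundance and two key reactions.
    sens = {"d(p,gamma)he3": -0.3, "d(d,n)he3": -0.5}
    sigma = {"d(p,gamma)he3": 0.07, "d(d,n)he3": 0.05}
    mean, spread = sample_abundance(2.5e-5, sens, sigma)
    print(f"predicted abundance: {mean:.3e} +/- {spread:.1e}")
```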

  5. Big Data and HPC: A Happy Marriage

    KAUST Repository

    Mehmood, Rashid

    2016-01-01

    International Data Corporation (IDC) defines Big Data technologies as “a new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data produced every day, by enabling high

  6. Big Bend National Park: Acoustical Monitoring 2010

    Science.gov (United States)

    2013-06-01

    During the summer of 2010 (September–October 2010), the Volpe Center collected baseline acoustical data at Big Bend National Park (BIBE) at four sites deployed for approximately 30 days each. The baseline data collected during this period will he...

  7. Big Data - What is it and why it matters.

    Science.gov (United States)

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers. © 2016 Health Libraries Group.

  8. Speaking sociologically with big data: symphonic social science and the future for big data research

    OpenAIRE

    Halford, Susan; Savage, Mike

    2017-01-01

    Recent years have seen persistent tension between proponents of big data analytics, using new forms of digital data to make computational and statistical claims about ‘the social’, and many sociologists sceptical about the value of big data, its associated methods and claims to knowledge. We seek to move beyond this, taking inspiration from a mode of argumentation pursued by Putnam (2000), Wilkinson and Pickett (2009) and Piketty (2014) that we label ‘symphonic social science’. This bears bot...

  9. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)². At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  10. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    Science.gov (United States)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data analytics is a buzzword nowadays. As data generation capabilities become more demanding and scalable, data acquisition and storage become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend towards "big data-as-a-service" is now talked about everywhere. On the one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve security and other real-time problems of big data migration to cloud-based platforms. This article focuses on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration and the possibility of doing big data analytics on a cloud platform is in demand for a new era of growth. The article also gives information about available technologies and techniques for migrating big data to the cloud.

  11. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. However, more recently, healthcare researchers are exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and, specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to more improved healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  12. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  13. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  14. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  15. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  16. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  17. Will Organization Design Be Affected By Big Data?

    Directory of Open Access Journals (Sweden)

    Giles Slinger

    2014-12-01

    Full Text Available Computing power and analytical methods allow us to create, collate, and analyze more data than ever before. When datasets are unusually large in volume, velocity, and variety, they are referred to as “big data.” Some observers have suggested that in order to cope with big data (a) organizational structures will need to change and (b) the processes used to design organizations will be different. In this article, we differentiate big data from relatively slow-moving, linked people data. We argue that big data will change organizational structures as organizations pursue the opportunities presented by big data. The processes by which organizations are designed, however, will be relatively unaffected by big data. Instead, organization design processes will be more affected by the complex links found in people data.

  18. The SIKS/BiGGrid Big Data Tutorial

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Lammerts, Evert; de Vries, A.P.

    2011-01-01

    The School for Information and Knowledge Systems (SIKS) and the Dutch e-science grid BiG Grid organized a new two-day tutorial on Big Data at the University of Twente on 30 November and 1 December 2011, just preceding the Dutch-Belgian Database Day. The tutorial is on top of some exciting new

  19. Living Day by Day

    Science.gov (United States)

    Kaplan, Rachel L.; Khoury, Cynthia El; Field, Emily R. S.; Mokhbat, Jacques

    2016-01-01

    We examined the meaning of living with HIV/AIDS among women in Lebanon. Ten women living with HIV/AIDS (WLWHA) described their experiences via semistructured in-depth interviews. They navigated a process of HIV diagnosis acceptance that incorporated six overlapping elements: receiving the news, accessing care, starting treatment, navigating disclosure decisions, negotiating stigma, and maintaining stability. Through these elements, we provide a framework for understanding three major themes that were constructed during data analysis: Stand by my side: Decisions of disclosure; Being “sick” and feeling “normal”: Interacting with self, others, and society; and Living day by day: focusing on the present. We contribute to the existing literature by providing a theoretical framework for understanding the process of diagnosis and sero-status acceptance among WLWHA. This was the first study of its kind to examine the meaning of living with HIV/AIDS among women in a Middle Eastern country. PMID:28462340

  20. Living Day by Day

    Directory of Open Access Journals (Sweden)

    Rachel L. Kaplan

    2016-05-01

    Full Text Available We examined the meaning of living with HIV/AIDS among women in Lebanon. Ten women living with HIV/AIDS (WLWHA) described their experiences via semistructured in-depth interviews. They navigated a process of HIV diagnosis acceptance that incorporated six overlapping elements: receiving the news, accessing care, starting treatment, navigating disclosure decisions, negotiating stigma, and maintaining stability. Through these elements, we provide a framework for understanding three major themes that were constructed during data analysis: Stand by my side: Decisions of disclosure; Being “sick” and feeling “normal”: Interacting with self, others, and society; and Living day by day: focusing on the present. We contribute to the existing literature by providing a theoretical framework for understanding the process of diagnosis and sero-status acceptance among WLWHA. This was the first study of its kind to examine the meaning of living with HIV/AIDS among women in a Middle Eastern country.

  1. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  2. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  3. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension, which addresses the aspects of big data management from both a technological and a business perspective.

  4. Environmental effects of the Big Rapids dam remnant removal, Big Rapids, Michigan, 2000-02

    Science.gov (United States)

    Healy, Denis F.; Rheaume, Stephen J.; Simpson, J. Alan

    2003-01-01

    The U.S. Geological Survey (USGS), in cooperation with the city of Big Rapids, investigated the environmental effects of removal of a dam-foundation remnant and downstream cofferdam from the Muskegon River in Big Rapids, Mich. The USGS applied a multidiscipline approach, which determined the water quality, sediment character, and stream habitat before and after dam removal. Continuous water-quality data and discrete water-quality samples were collected, the movement of suspended and bed sediment were measured, changes in stream habitat were assessed, and streambed elevations were surveyed. Analyses of water upstream and downstream from the dam showed that the dam-foundation remnant did not affect water quality. Dissolved-oxygen concentrations downstream from the dam remnant were depressed for a short period (days) during the beginning of the dam removal, in part because of that removal effort. Sediment transport from July 2000 through March 2002 was 13,800 cubic yards more at the downstream site than the upstream site. This increase in sediment represents the remobilized sediment upstream from the dam, bank erosion when the impoundment was lowered, and contributions from small tributaries between the sites. Five habitat reaches were monitored before and after dam-remnant removal. The reaches consisted of a reference reach (A), upstream from the effects of the impoundment; the impoundment (B); and three sites below the impoundment where habitat changes were expected (C, D, and E, in downstream order). Stream-habitat assessment reaches varied in their responses to the dam-remnant removal. Reference reach A was not affected. In impoundment reach B, Great Lakes and Environmental Assessment Section (GLEAS) Procedure 51 ratings went from fair to excellent. For the three downstream reaches, reach C underwent slight habitat degradation, but ratings remained good; reach D underwent slight habitat degradation with ratings changing from excellent to good; and, in an area

  5. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating we may find information, facts, social insights and benchmarks that were once virtually impossible to find or were simply nonexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and the society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  6. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson - the particle produced very soon after the big bang and which vanished from the Universe one-hundredth of a second after the big bang - and the fate of the Universe are all discussed. (U.K.)

  7. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary of the ideas in the book explains that big data is where we use huge quantities of data to make better predictions based on the fact that we identify patterns in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  8. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  9. Big Data Components for Business Process Optimization

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2016-01-01

    Full Text Available In these days, more and more people talk about Big Data, Hadoop, noSQL and so on, but very few technical people have the necessary expertise and knowledge to work with those concepts and technologies. The present issue explains one of the concepts that stands behind two of those keywords: the MapReduce concept. The MapReduce model is the one that makes Big Data and Hadoop so powerful, fast, and versatile for business process optimization. MapReduce is a programming model, with an implementation built to process and generate large data sets. In addition, the benefits of integrating Hadoop in the context of Business Intelligence and Data Warehousing applications are presented. The concepts and technologies behind big data let organizations reach a variety of objectives. Like other new information technologies, one of the most important objectives of big data technology is to bring dramatic cost reduction.
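
    To make the MapReduce model just described concrete, here is a minimal, self-contained word-count sketch in plain Python (no Hadoop involved); the map, shuffle and reduce functions and the tiny in-memory corpus are illustrative assumptions added for this summary, not material from the record above.

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in a document."""
    for word in document.lower().split():
        yield word, 1

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework would between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Reduce: sum the counts collected for one word."""
    return key, sum(values)

if __name__ == "__main__":
    corpus = ["big data and hadoop", "big data for business process optimization"]
    mapped = [pair for doc in corpus for pair in map_phase(doc)]
    counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
    print(counts)  # e.g. {'big': 2, 'data': 2, ...}
```

    In a real Hadoop job the shuffle step would be performed by the framework across machines; here it is simulated with an in-memory dictionary.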

  10. Student-Centered Coaching: The Moves

    Science.gov (United States)

    Sweeney, Diane; Harris, Leanna S.

    2017-01-01

    Student-centered coaching is a highly-effective, evidence-based coaching model that shifts the focus from "fixing" teachers to collaborating with them to design instruction that targets student outcomes. But what does this look like in practice? "Student-Centered Coaching: The Moves" shows you the day-to-day coaching moves that…

  11. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing necessity within practitioners to understand better the phenomenon, starting from a clear stated definition. This book aims to be a starting reading for executives who want (and need) to keep the pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data will be explained, and a series of different strategic approaches will be provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data will be discussed, where some of them are more general - such as ethics, privacy, and ownership – while others concern more specific business situations (e.g., initial public offering, growth st...

  12. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers, which make a leading position possible, but only if companies get themselves ready for the next big data wave.

  13. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  14. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  15. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  16. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  17. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.
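
    The "physiological envelope" mentioned above lends itself to a simple illustration: a rolling-window summary of a continuously sensed signal. The sketch below is only a schematic example assuming an hourly heart-rate series; the window size and readings are invented placeholders rather than anything specified in the paper.

```python
from collections import deque

def rolling_envelope(samples, window=24):
    """Return (min, max) bounds of a trailing window for each new sample."""
    buf = deque(maxlen=window)
    bounds = []
    for x in samples:
        buf.append(x)
        bounds.append((min(buf), max(buf)))
    return bounds

if __name__ == "__main__":
    # Toy hourly heart-rate readings (illustrative values only).
    heart_rate = [62, 64, 66, 58, 61, 110, 63, 65]
    for t, (lo, hi) in enumerate(rolling_envelope(heart_rate, window=4)):
        print(f"hour {t}: reading={heart_rate[t]}, envelope=({lo}, {hi})")
```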

  18. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  19. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  20. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is een niet meer weg te denken fenomeen in onze maatschappij. Het is de hype cycle voorbij en de eerste implementaties van big data-technieken worden uitgevoerd. Maar wat is nu precies big data? Wat houden de vijf V's in die vaak genoemd worden in relatie tot big data? Ter inleiding van

  1. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  2. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  3. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  4. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is and will be used more in the future as a tool for everything that happens both online and offline. Of course, online is its real habitat: Big Data is found in this medium, offering many advantages and being a real help for all consumers. In this paper we talked about Big Data as being a plus in developing new applications, by gathering useful information about the users and their behaviour. We've also presented the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit discussed in this paper is presented in the cloud section.

  5. A proposed framework of big data readiness in public sectors

    Science.gov (United States)

    Ali, Raja Haslinda Raja Mohd; Mohamad, Rosli; Sudin, Suhizaz

    2016-08-01

    Growing interest in big data is mainly linked to its great potential to unveil unforeseen patterns or profiles that support an organisation's key business decisions. Following private sector moves to embrace big data, the government sector is now getting on the bandwagon. Big data has been considered one of the potential tools to enhance service delivery of the public sector within its financial resource constraints. The Malaysian government, in particular, has considered big data one of the main items on the national agenda. Regardless of government commitment to promote big data amongst government agencies, the degree of readiness of the government agencies as well as their employees is crucial in ensuring successful deployment of big data. This paper, therefore, proposes a conceptual framework to investigate perceived readiness for big data potentials amongst Malaysian government agencies. Perceived readiness of 28 ministries and their respective employees will be assessed using both qualitative (interview) and quantitative (survey) approaches. The outcome of the study is expected to offer meaningful insight into factors affecting change readiness among public agencies regarding big data potentials and the expected outcomes from greater or lower change readiness among the public sectors.

  6. Ready, set, move!

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    This year, the CERN Medical Service is launching a new public health campaign. Advertised by the catchphrase “Move! & Eat Better”, the particular aim of the campaign is to encourage people at CERN to take more regular exercise, of whatever kind.   The CERN annual relay race is scheduled on 24 May this year. The CERN Medical Service will officially launch its “Move! & Eat Better” campaign at this popular sporting event. “We shall be on hand on the day of the race to strongly advocate regular physical activity,” explains Rachid Belkheir, one of the Medical Service doctors. "We really want to pitch our campaign and answer any questions people may have. Above all we want to set an example. So we are going to walk the same circuit as the runners to underline to people that they can easily incorporate movement into their daily routine.” An underlying concern has prompted this campaign: during their first few year...

  7. Move! Eat better: news

    CERN Multimedia

    2013-01-01

    Are you curious to know whether you’re doing enough daily exercise…? Test yourself with a pedometer!   Through the Move! Eat better campaign, launched in May 2012, the CERN medical service is aiming to improve the health of members of the personnel by encouraging them to prioritise physical activity in conjunction with a balanced diet. Various successful activities have already taken place: relay race/Nordic walk, Bike2work, Zumba and fitness workshops, two conferences (“Physical activity for health” and “Good nutrition every day”), events in the restaurants, as well as posters and a website. Although everyone has got the message from our various communications that physical activity is good for your health, there is still a relevant question being asked: “What is the minimum amount of exercise recommended?” 10,000 steps per day is the ideal figure, which has been demonstrated as beneficial by scientific studies ...

  8. 77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop

    Science.gov (United States)

    2012-12-18

    ...--Cloud Computing and Big Data Forum and Workshop AGENCY: National Institute of Standards and Technology... Standards and Technology (NIST) announces a Cloud Computing and Big Data Forum and Workshop to be held on... followed by a one-day hands-on workshop. The NIST Cloud Computing and Big Data Forum and Workshop will...

  9. Job Surfing: Move On to Move Up.

    Science.gov (United States)

    Martin, Justin

    1997-01-01

    Looks at the process of switching jobs and changing careers. Discusses when to consider options and make the move as well as the need to be flexible and open minded. Provides a test for determining the chances of promotion and when to move on. (JOW)

  10. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  11. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  12. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Ariel Hamlin, Nabil... (contact: arkady@ll.mit.edu). Chapter 1, Cryptography for Big Data Security, 1.1 Introduction: With the amount

  13. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  14. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact.

  15. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced as distinct from the very attributes of big data, often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  16. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore we compare and contrast the two geometries throughout.

  17. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing takes these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic
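
    To give a rough flavour of the rule-based cleansing such systems distribute, here is a minimal sketch that detects violations of a functional dependency (zip determines city) with a scan-and-group pattern; it is not the BigDansing API, and the rule, field names and records are assumptions added purely for illustration.

```python
from collections import defaultdict

def find_fd_violations(records, lhs, rhs):
    """Group records by the left-hand-side attribute and flag groups whose
    right-hand-side attribute is not unique (a functional-dependency violation)."""
    groups = defaultdict(set)
    for rec in records:
        groups[rec[lhs]].add(rec[rhs])
    return {key: vals for key, vals in groups.items() if len(vals) > 1}

if __name__ == "__main__":
    rows = [
        {"zip": "1000", "city": "Brussels"},
        {"zip": "1000", "city": "Bruxelles"},  # conflicting city for the same zip
        {"zip": "2000", "city": "Antwerp"},
    ]
    print(find_fd_violations(rows, lhs="zip", rhs="city"))
    # e.g. {'1000': {'Brussels', 'Bruxelles'}} (set ordering may vary)
```

    A distributed engine would run the same scan and grouping in parallel across partitions, which is where optimizations such as shared scans pay off.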

  18. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge, the value of the information. The information value is defined not only by the extraction of value from huge data sets, as quickly and optimally as possible, but also by the extraction of value from uncertain and inaccurate data, in an innovative manner, using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that the real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on worldwide processes.

  19. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  20. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

    on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  1. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined 'open data' and 'big data', these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of 'order' and 'relationality'. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two

  2. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Considering the importance that the term Big Data has acquired, this research sought to study and analyse exhaustively the state of the art of Big Data; in addition, and as a second objective, it analysed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so that everything concerning the central topic of the research can be understood. The methodology used included reviewing the state of the art of Big Data and presenting its current situation; getting to know Big Data technologies; presenting some of the NoSQL databases, which are the ones that allow processing of data in unstructured formats; and showing the data models and the technologies for analysing them, finishing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the Big Data environment.

  3. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  4. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  5. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  6. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  7. Revisiting the Battle of the Little Big Horn

    National Research Council Canada - National Science Library

    Burns, Matthew

    2000-01-01

    The Battle of the Little Big Horn has captured the interest of historians, scholars, and military enthusiasts since the day that over 200 United States soldiers under General George Armstrong Custer's...

  8. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
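    The second pitfall, a weak but pervasive dependence inflating the randomness of estimates, can be made concrete with a small simulation. The sketch below is only an illustration (all parameters are assumptions of mine, not values from the paper): it compares how much the sample mean fluctuates when observations are independent versus when they share a weak cluster-level component.

```python
import random
import statistics

def sample_mean(n=2000, cluster_sd=0.0, cluster_size=50):
    """Draw n observations and return their mean. With cluster_sd > 0,
    observations in the same cluster share a common random shift."""
    values = []
    while len(values) < n:
        shared = random.gauss(0, cluster_sd)
        values.extend(shared + random.gauss(0, 1) for _ in range(cluster_size))
    return statistics.mean(values[:n])

def spread_of_means(trials=500, **kwargs):
    """Empirical standard deviation of the sample mean across many trials."""
    return statistics.pstdev([sample_mean(**kwargs) for _ in range(trials)])

if __name__ == "__main__":
    random.seed(0)
    print("independent observations:   ", round(spread_of_means(), 4))
    print("weakly dependent observations:", round(spread_of_means(cluster_sd=0.3), 4))
```

    The dependent case shows a clearly larger spread of the mean, which is the increased variance the authors describe.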

  9. Texting on the Move

    Science.gov (United States)

    ... text. What's the Big Deal? The problem is multitasking. No matter how young and agile we are, ... on something other than the road. In fact, driving while texting (DWT) can be more dangerous than ...

  10. ATLAS starts moving in

    CERN Multimedia

    Della Mussia, S

    2004-01-01

    The first large active detector component was lowered into the ATLAS cavern on 1st March. It consisted of the 8 modules forming the lower part of the central barrel of the tile hadronic calorimeter. The work of assembling the barrel, which comprises 64 modules, started the following day. Two road trailers each with 64 wheels, positioned side by side. This was the solution chosen to transport the lower part of the central barrel of ATLAS' tile hadronic calorimeter from Building 185 to the PX16 shaft at Point 1 (see Figure 1). The transportation, and then the installation of the component in the experimental cavern, which took place over three days were, to say the least, rather spectacular. On 25 February, the component, consisting of eight 6-metre modules, was loaded on to the trailers. The segment of the barrel was transported on a steel support so that it wouldn't move an inch during the journey. On 26 February, once all the necessary safety checks had been carried out, the convoy was able to leave Buildi...

  11. One Second After the Big Bang

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    A new experiment called PTOLEMY (Princeton Tritium Observatory for Light, Early-Universe, Massive-Neutrino Yield) is under development at the Princeton Plasma Physics Laboratory with the goal of challenging one of the most fundamental predictions of the Big Bang – the present-day existence of relic neutrinos produced less than one second after the Big Bang. Using a gigantic graphene surface to hold 100 grams of a single-atomic layer of tritium, low noise antennas that sense the radio waves of individual electrons undergoing cyclotron motion, and a massive array of cryogenic sensors that sit at the transition between normal and superconducting states, the PTOLEMY project has the potential to challenge one of the most fundamental predictions of the Big Bang, to potentially uncover new interactions and properties of the neutrinos, and to search for the existence of a species of light dark matter known as sterile neutrinos.

  12. Relational Databases and Biomedical Big Data.

    Science.gov (United States)

    de Silva, N H Nisansa D

    2017-01-01

    In various biomedical applications that collect, handle, and manipulate data, the amounts of data tend to build up and venture into the range identified as big data. In such cases, a design decision has to be taken as to what type of database should be used to handle the data. More often than not, the default and classical solution in the biomedical domain, according to past research, is the relational database. While this was the norm for a long while, there is an evident trend to move away from relational databases in favor of other types and paradigms of databases. However, it is still of paramount importance to understand the interrelation between biomedical big data and relational databases. This chapter reviews the pros and cons of using relational databases to store biomedical big data, as discussed in previous research.
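    As a concrete reminder of what the relational option looks like in practice, here is a minimal sketch using Python's built-in sqlite3 module; the table layout, column names and sample codes are invented for illustration and are not taken from the chapter.

```python
import sqlite3

# Toy relational schema for biomedical observations (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observation (
                    patient_id  TEXT,
                    loinc_code  TEXT,
                    value       REAL,
                    taken_at    TEXT)""")
rows = [("p1", "718-7", 13.5, "2017-01-03"),
        ("p1", "718-7", 12.9, "2017-02-10"),
        ("p2", "2345-7", 5.4, "2017-01-05")]
conn.executemany("INSERT INTO observation VALUES (?, ?, ?, ?)", rows)

# A strength of the relational model: declarative, ad-hoc queries.
for row in conn.execute("""SELECT patient_id, AVG(value)
                           FROM observation
                           WHERE loinc_code = '718-7'
                           GROUP BY patient_id"""):
    print(row)
```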

  13. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    Full Text Available In everyday terms, the current era of information technology can be called the era of Big Data. The fields of science, engineering and technology are producing data at an exponential rate, leading to exabytes of data every day. Big data helps us explore and re-invent many areas, including but not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis of healthcare applications of big data and analytics. The emphasis is not only on the data that is continuously being stored, which lets us look back at history, but on analysing that data to improve medication and services. Although many big data implementations are in-house developments, the implementation proposed here aims at a broader scope using Hadoop, which is just the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also covers the strengths and drawbacks compared with the conventional techniques available.
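    Hadoop's core abstraction is MapReduce. The sketch below imitates the pattern locally in plain Python, counting how often each diagnosis code appears in a set of records, purely to illustrate the map, shuffle and reduce steps that a Hadoop job distributes across a cluster; the sample data and field layout are invented for illustration.

```python
from collections import defaultdict

records = [
    "p1,2017-01-03,E11.9",   # invented "patient,date,ICD-10 code" lines
    "p2,2017-01-04,I10",
    "p3,2017-01-04,E11.9",
]

# Map: each record is turned into (key, 1) pairs independently,
# so this step can run in parallel on many nodes.
mapped = [(line.split(",")[2], 1) for line in records]

# Shuffle: group all values that share a key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each group; also trivially parallel per key.
counts = {key: sum(values) for key, values in groups.items()}
print(counts)   # {'E11.9': 2, 'I10': 1}
```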

  14. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing takes these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized joins operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
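    To make the kind of rule BigDansing distributes more tangible, here is a tiny single-machine sketch of one classic data-quality rule, a functional dependency "zip determines city", detected by enumerating pairs of tuples; that pairwise enumeration is exactly the costly computation a system like BigDansing parallelizes. The rule, data and function names are illustrative and are not the BigDansing API.

```python
from itertools import combinations

rows = [
    {"id": 1, "zip": "1211", "city": "Geneva"},
    {"id": 2, "zip": "1211", "city": "Geneve"},   # violates zip -> city
    {"id": 3, "zip": "8001", "city": "Zurich"},
]

def fd_violations(rows, lhs, rhs):
    """Return pairs of row ids that agree on `lhs` but disagree on `rhs`.
    Naive O(n^2) enumeration -- the step a distributed cleansing system
    would spread over a cluster and optimize."""
    return [(a["id"], b["id"])
            for a, b in combinations(rows, 2)
            if a[lhs] == b[lhs] and a[rhs] != b[rhs]]

print(fd_violations(rows, "zip", "city"))   # [(1, 2)]
```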

  15. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which...... they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only...... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies....

  16. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  17. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
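    The review distinguishes supervised from unsupervised learning and points to R packages; as a language-agnostic illustration of the same distinction, here is a minimal scikit-learn sketch in Python (the dataset and model choices are mine, not the review's).

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

# Supervised: the labels y are used to train a classifier.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("supervised accuracy on training data:", round(clf.score(X, y), 2))

# Unsupervised: only X is used; the algorithm finds structure on its own.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("unsupervised cluster sizes:", [int((clusters == k).sum()) for k in range(3)])
```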

  19. Dinosaur Day!

    Science.gov (United States)

    Nakamura, Sandra; Baptiste, H. Prentice

    2006-01-01

    In this article, the authors describe how they capitalized on their first-grade students' love of dinosaurs by hosting a fun-filled Dinosaur Day in their classroom. On Dinosaur Day, students rotated through four dinosaur-related learning stations that integrated science content with art, language arts, math, and history in a fun and time-efficient…

  20. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  1. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  2. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  3. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  4. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in the lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  5. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. The space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that past and future are connected in a non-paradoxical manner in the universes described by the new symmetric solutions

  6. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  7. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...

  8. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  9. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  10. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
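    The core idea behind sort-based inequality joins can be illustrated in a few lines. The sketch below is a heavily simplified, single-machine stand-in for IEJoin: it counts pairs (s, r) with s.a < r.a and s.b > r.b by sorting on one attribute and keeping an ordered structure over the other, instead of comparing every pair. The real algorithm uses permutation and bit arrays, handles ties and general predicates, and runs distributed; none of that is reproduced here.

```python
from bisect import bisect_right, insort

def count_pairs(tuples):
    """Count pairs (s, r) with s.a < r.a and s.b > r.b.
    Assumes distinct attribute values, for simplicity."""
    total = 0
    seen_b = []                                   # sorted b-values of tuples with smaller a
    for _, b in sorted(tuples, key=lambda t: t[0]):
        # Every tuple already in seen_b has a smaller a; those whose b
        # exceeds the current b satisfy both inequality predicates.
        total += len(seen_b) - bisect_right(seen_b, b)
        insort(seen_b, b)
    return total

data = [(100, 9), (140, 12), (80, 5), (90, 2)]    # (a, b) pairs
print(count_pairs(data))                          # 1: the pair ((80, 5), (90, 2))
```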

  11. Big Bang Day: The Making of CERN (Episode 1)

    CERN Multimedia

    Lyn Evans; Francis Farley; Maria Fidecaro; Giuseppe Fidecaro; David S Coxill; Gunther Plass; James Gillies

    2008-01-01

    A two-part history of the CERN project. Quentin Cooper explores the fifty-year history of CERN, the European particle physics laboratory in Switzerland. The institution was created to bring scientists together after WW2 .......

  12. Liquid biopsies in gastrointestinal malignancies: when is the big day?

    Science.gov (United States)

    Lopez, Anthony; Harada, Kazuto; Mizrak Kaya, Dilsa; Dong, Xiaochuan; Song, Shumei; Ajani, Jaffer A

    2018-01-01

    Tumor tissue samples are currently the gold standard not only for diagnosing gastrointestinal cancers but also for the genomic/immune component analyses that can help in the selection of therapy. However, this approach of studying a 'representative' sample of the tumor does not address inherent heterogeneity. Liquid biopsies, mainly represented by circulating tumor cells, circulating tumor DNA, tumor exosomes, and microRNAs, have the potential to assess various biomarkers for early detection of cancer and to carry out genomic/immune profiling, not only to select appropriate therapy but also to monitor its effect. Areas covered: This review summarizes the current evidence in the literature on liquid biopsies in gastrointestinal cancers concerning diagnosis, prognosis, and response to therapy. The following terms were used in PubMed: 'esophageal', 'gastric', 'colorectal', 'cancer', 'circulating tumor cells', 'circulating tumor DNA', 'microRNA', 'diagnosis', 'prognosis', 'response', 'resistance'. Expert commentary: Data increasingly support the potential of liquid biopsies for early detection, selection of therapy, and monitoring response to therapy. One major question is whether assaying various components of the blood would accommodate the considerable context-dependent heterogeneity of gastrointestinal tumors. There are many potential strategies to exploit liquid biopsy use. To put them into perspective, well-designed and meticulous prospective studies will be needed to prove their usefulness.

  13. ''Your Big Wedding Day''. Temporal Goal in Church Marriage Rituals

    NARCIS (Netherlands)

    Robinson, R.; Hermans, C.A.M.; Scheepers, P.L.H.; Schilderman, J.B.A.M.

    2007-01-01

    In this contribution the authors explore notions about the origin and destiny of bridal couples’ relationships from participants’ views of church marriage rituals. A church wedding can be a pivotal moment in a bridal couple’s life, and on these occasions people tend to contemplate the past and the

  14. Big Bang Day: 5 Particles - 4. The Neutrino

    CERN Multimedia

    2008-01-01

    Simon Singh looks at the stories behind the discovery of 5 of the universe's most significant subatomic particles: the Electron, the Quark, the Anti-particle, the Neutrino and the "next particle". It's the most populous particle in the universe. Millions of these subatomic particles are passing through each one of us. With no charge and virtually no mass they can penetrate vast thicknesses of matter without any interaction - indeed the sun emits huge numbers that pass through earth at the speed of light. Neutrinos are similar to the more familiar electron, with one crucial difference: neutrinos do not carry electric charge. As a result they're extremely difficult to detect . But like HG Wells' invisible man they can give themselves away by bumping into things at high energy and detectors hidden in mines are exploiting this to observe these rare interactions.

  15. Big Bang Day: 5 Particles - 5. The Next Particle

    CERN Multimedia

    Franck Close

    2008-01-01

    Simon Singh looks at the stories behind the discovery of 5 of the universe's most significant subatomic particles: the Electron, the Quark, the Anti-particle, the Neutrino and the "next particle". 5. The Next Particle The "sparticle" - a super symmetric partner to all the known particles could be the answer to uniting all the known particles and their interactions under one grand theoretical pattern of activity. But how do researchers know where to look for such phenomena and how do they know if they find them? Simon Singh reviews the next particle that physicists would like to find if the current particle theories are to ring true.

  16. Big Bang Day: 5 Particles - 2. The Quark

    CERN Multimedia

    Franck Close

    2008-01-01

    Simon Singh looks at the stories behind the discovery of 5 of the universe's most significant subatomic particles: the Electron, the Quark, the Anti-particle, the Neutrino and the "next particle". 2. The Quark "Three quarks for Muster Mark! Sure he hasn't got much of a bark." James Joyce's Finnegans Wake left its mark on modern physics when physicist Murray Gell-Mann proposed this name for a group of hypothetical subatomic particles that were revealed in the 1960s as the fundamental units of matter. Basic particles, it seems, are made up of even more basic units called quarks that make up 99.9% of the visible material in the universe. But why do we know so little about them? Quarks have never been seen as free particles; instead they are inextricably bound together by the Strong Force that in turn holds the atomic nucleus together. This is the hardest of Nature's fundamental forces to crack, but recent theoretical advances mean that the properties of the quark are at last being revealed.

  17. Big Bang Day: 5 Particles - 1. The Electron

    CERN Multimedia

    Simon Singh

    2008-01-01

    Simon Singh looks at the stories behind the discovery of 5 of the universe's most significant subatomic particles: the Electron, the Quark, the Anti-particle, the Neutrino and the "next particle". 1. The Electron Just over a century ago, British physicist J.J. Thomson, experimenting with electric currents and charged particles inside empty glass tubes, showed that atoms are divisible into indivisible elementary particles. But how could atoms be built up of these so-called "corpuscles"? An exciting 30-year race ensued to grasp the planetary model of the atom with its orbiting electrons, and the view inside the atom was born. Whilst the number of electrons around the nucleus of an atom determines the chemistry of all elements, the power of electrons themselves has been harnessed for everyday use: electron beams for welding, cathode ray tubes and radiation therapy.

  18. Moving and Being Moved: Implications for Practice.

    Science.gov (United States)

    Kretchmar, R. Scott

    2000-01-01

    Uses philosophical writings, a novel about baseball, and a nonfiction work on rowing to analyze levels of meaning in physical activity, showing why three popular methods for enhancing meaning have not succeeded and may have moved some students away from deeper levels of meaning. The paper suggests that using hints taken from the three books could…

  19. Quest for Value in Big Earth Data

    Science.gov (United States)

    Kuo, Kwo-Sen; Oloso, Amidu O.; Rilee, Mike L.; Doan, Khoa; Clune, Thomas L.; Yu, Hongfeng

    2017-04-01

    Among all the V's of Big Data challenges, such as Volume, Variety, Velocity, Veracity, etc., we believe Value is the ultimate determinant, because a system delivering better value has a competitive edge over others. Although it is not straightforward to assess the value of scientific endeavors, we believe the ratio of scientific productivity increase to investment is a reasonable measure. Our research in Big Data approaches to data-intensive analysis for Earth Science has yielded some insights, as well as evidence, as to how optimal value might be attained. The first insight is that we should avoid, as much as possible, moving data through connections with relatively low bandwidth. That is, we recognize that moving data is expensive, albeit inevitable. They must at least be moved from the storage device into computer main memory and then to CPU registers for computation. When data must be moved, it is better to move them via relatively high-bandwidth connections and avoid low-bandwidth ones. For this reason, a technology that can best exploit data locality will have an advantage over others. Data locality is easy to achieve and exploit with only one dataset. With multiple datasets, data colocation becomes important in addition to data locality. However, datasets can only be organized to be co-located for certain types of analyses. It is impossible for them to be co-located for all analyses. Therefore, our second insight is that we need to co-locate the datasets for the most commonly used analyses. In Earth Science, we believe the most common analysis requirement is "spatiotemporal coincidence". For example, when we analyze precipitation systems, we often would like to know the environmental conditions "where and when" (i.e. at the same location and time) there is precipitation. This "where and when" indicates the "spatiotemporal coincidence" requirement. Thus, an associated insight is that datasets need to be partitioned per the physical dimensions, i.e. space
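    One simple way to act on the "spatiotemporal coincidence" requirement is to derive a partition key from coarse space-time bins, so that records from different datasets that coincide in space and time land in the same partition. The sketch below only illustrates that idea; the bin sizes and key layout are arbitrary assumptions, not the partitioning actually used in the authors' system.

```python
from datetime import datetime, timezone

def spacetime_key(lat, lon, when, cell_deg=1.0, window_hours=6):
    """Map a (lat, lon, time) observation to a coarse space-time bin."""
    t_bin = int(when.timestamp() // (window_hours * 3600))
    return (int(lat // cell_deg), int(lon // cell_deg), t_bin)

# Two records from different datasets, taken close together in space and time,
# receive the same key and would therefore be co-located on the same node.
precip = spacetime_key(40.2, -105.3, datetime(2017, 4, 1, 14, 5, tzinfo=timezone.utc))
temp   = spacetime_key(40.7, -105.9, datetime(2017, 4, 1, 15, 30, tzinfo=timezone.utc))
print(precip, temp, precip == temp)   # identical keys -> True
```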

  20. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  1. Infectious Disease Surveillance in the Big Data Era

    DEFF Research Database (Denmark)

    Simonsen, Lone; Gog, Julia R.; Olson, Don

    2016-01-01

    , flexible, and local tracking of infectious diseases, especially for emerging pathogens. In this opinion piece, we reflect on the long and distinguished history of disease surveillance and discuss recent developments related to use of big data. We start with a brief review of traditional systems relying...... of Google Flu Trends. We conclude by advocating for increased use of hybrid systems combining information from traditional surveillance and big data sources, which seems the most promising option moving forward. Throughout the article, we use influenza as an exemplar of an emerging and reemerging infection...

  2. Collaborative Approaches Needed to Close the Big Data Skills Gap

    Directory of Open Access Journals (Sweden)

    Steven Miller

    2014-04-01

    Full Text Available The big data and analytics talent discussion has largely focused on a single role – the data scientist. However, the need is much broader than data scientists. Data has become a strategic business asset. Every professional occupation must adapt to this new mindset. Universities in partnership with industry must move quickly to ensure that the graduates they produce have the required skills for the age of big data. Existing curricula should be reviewed and adapted to ensure relevance. New curricula and degree programs are needed to meet the needs of industry.

  3. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites, such as Terra and OCO-2 among others, have collected data relevant on global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedback of terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from sensors from those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  4. Moving Field Guides

    Science.gov (United States)

    Cassie Meador; Mark Twery; Meagan. Leatherbury

    2011-01-01

    The Moving Field Guides (MFG) project is a creative take on site interpretation. Moving Field Guides provide an example of how scientific and artistic endeavors work in parallel. Both begin with keen observations that produce information that must be analyzed, understood, and interpreted. That interpretation then needs to be communicated to others to complete the...

  5. Big Data, Big Responsibility! Building best-practice privacy strategies into a large-scale neuroinformatics platform

    Directory of Open Access Journals (Sweden)

    Christina Popovich

    2017-04-01

    OBI’s rigorous approach to data sharing in the field of neuroscience maintains the accessibility of research data for big discoveries without compromising patient privacy and security. We believe that Brain-CODE is a powerful and advantageous tool, moving neuroscience research from independent silos to an integrative, system-level approach to improving patient health. OBI’s vision of improved brain health for patients living with neurological disorders, paired with Brain-CODE’s best-practice strategies for protecting the privacy of patient data, offers a novel and innovative approach to “big data” initiatives aimed at improving public health and society worldwide.

  6. Embodied affectivity: On moving and being moved

    Directory of Open Access Journals (Sweden)

    Thomas eFuchs

    2014-06-01

    Full Text Available There is a growing body of research indicating that bodily sensation and behaviour strongly influence one’s emotional reaction towards certain situations or objects. On this background, a framework model of embodied affectivity is suggested: we regard emotions as resulting from the circular interaction between affective qualities or affordances in the environment and the subject’s bodily resonance, be it in the form of sensations, postures, expressive movements or movement tendencies. Motion and emotion are thus intrinsically connected: one is moved by movement (perception; impression; affection) and moved to move (action; expression; e-motion). Through its resonance, the body functions as a medium of emotional perception: it colours or charges self-experience and the environment with affective valences while it remains itself in the background of one’s own awareness. This model is then applied to emotional social understanding, or interaffectivity, which is regarded as an intertwinement of two cycles of embodied affectivity, thus continuously modifying each partner’s affective affordances and bodily resonance. We conclude with considerations of how embodied affectivity is altered in psychopathology and can be addressed in psychotherapy of the embodied self.

  7. Big Data and Biomedical Informatics: A Challenging Opportunity

    Science.gov (United States)

    2014-01-01

    Summary Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations. PMID:24853034

  8. Big data and biomedical informatics: a challenging opportunity.

    Science.gov (United States)

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations.
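    Of the tools named above, "concept drift" algorithms are perhaps the least self-explanatory: they watch a data stream and flag when its underlying distribution changes, so that models trained on older data can be refreshed. The toy monitor below illustrates only the idea, using a crude comparison of window means on simulated laboratory values; real drift detectors in streaming-ML libraries are far more principled.

```python
import random
from collections import deque
from statistics import mean

def drift_monitor(stream, window=200, threshold=0.5):
    """Flag the index at which the mean of the latest `window` values
    drifts away from the mean of the first `window` values."""
    reference, recent = [], deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(reference) < window:
            reference.append(x)
        else:
            recent.append(x)
            if len(recent) == window and abs(mean(recent) - mean(reference)) > threshold:
                return i
    return None

random.seed(1)
# Simulated lab values whose underlying mean shifts from 5.0 to 6.0 at t = 1000.
stream = [random.gauss(5.0, 1.0) for _ in range(1000)] + \
         [random.gauss(6.0, 1.0) for _ in range(1000)]
print("drift flagged near index:", drift_monitor(stream))
```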

  9. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that it becomes hard to process using conventional data-handling applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, anticipate diseases, conflicts and the like, we require bigger data sets compared with smaller ones. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we give an overview of the Hadoop architecture, the different tools used for big data, and its security issues.

  10. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  11. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects......This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  12. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  13. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  14. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  15. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  16. Radiochemistry days

    International Nuclear Information System (INIS)

    1998-09-01

    This document provides the 44 papers (transparencies used during the presentations and posters) presented at the Radiochemistry Days, held September 3-4, 1998 in Nantes, France. The main studied topics were problematic questions concerning the nuclear fuel cycle and in particular the management, storage of radioactive wastes and the environmental impact. (O.M.)

  17. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  18. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be comprehensible and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday and narrative in style. Volume 5 RG covers the fundamentals (systems of units, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  19. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be comprehensible and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday and narrative in style. Volume 8 gives an accessible account of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology and bionics.

  20. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be comprehensible and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday and narrative in style. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  1. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be comprehensible and fun! That is why every chapter in Big Bang opens with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language stays simple, everyday and narrative in style. In addition to an introduction, Volume 7 covers many topical aspects of quantum mechanics (e.g. "beaming", i.e. teleportation) and electrodynamics (e.g. electrosmog), as well as climate issues and chaos theory.

  2. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above Λ_QCD, which generica...

  3. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses, as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  4. Functional connectomics from a "big data" perspective.

    Science.gov (United States)

    Xia, Mingrui; He, Yong

    2017-10-15

    In the last decade, explosive growth regarding functional connectome studies has been observed. Accumulating knowledge has significantly contributed to our understanding of the brain's functional network architectures in health and disease. With the development of innovative neuroimaging techniques, the establishment of large brain datasets and the increasing accumulation of published findings, functional connectomic research has begun to move into the era of "big data", which generates unprecedented opportunities for discovery in brain science and simultaneously encounters various challenging issues, such as data acquisition, management and analyses. Big data on the functional connectome exhibits several critical features: high spatial and/or temporal precision, large sample sizes, long-term recording of brain activity, multidimensional biological variables (e.g., imaging, genetic, demographic, cognitive and clinic) and/or vast quantities of existing findings. We review studies regarding functional connectomics from a big data perspective, with a focus on recent methodological advances in state-of-the-art image acquisition (e.g., multiband imaging), analysis approaches and statistical strategies (e.g., graph theoretical analysis, dynamic network analysis, independent component analysis, multivariate pattern analysis and machine learning), as well as reliability and reproducibility validations. We highlight the novel findings in the application of functional connectomic big data to the exploration of the biological mechanisms of cognitive functions, normal development and aging and of neurological and psychiatric disorders. We advocate the urgent need to expand efforts directed at the methodological challenges and discuss the direction of applications in this field. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 x 10^12 /cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed to a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed to a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  6. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patients' care processes and of single patients' behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. PMID:25910540

  7. Towards Cloud Processing of GGOS Big Data

    Science.gov (United States)

    Weston, Stuart; Kim, Bumjun; Litchfield, Alan; Gulyaev, Sergei; Hall, Dylan; Chorao, Carlos; Ruthven, Andrew; Davies, Glyn; Lagos, Bruno; Christie, Don

    2017-04-01

    We report on our initial steps towards development of a cloud-like correlation infrastructure for geodetic Very Long Baseline Interferometry (VLBI), which in its raw format is of the order of 10-100 TB (big data). Data is generated by multiple VLBI radio telescopes and is then used for geodetic, geophysical, and astrometric research and operational activities through the International VLBI Service (IVS), as well as for corrections of GPS satellite orbits. Currently IVS data is correlated in several international Correlators (Correlation Centres), which receive data from individual radio telescope stations either on hard drives via regular mail service or via fibre using e-transfer mode. The latter is strongly limited by the connectivity of existing correlation centres, which creates bottlenecks and slows down the turnover of the data. This becomes critical in many applications - for example, it currently takes 1-2 weeks to generate the dUT1 parameter for corrections of GNSS orbits, while a delay of less than 1-2 days is desirable. We started with a blade server at the AUT campus to emulate a cloud server using Virtual Machines (VMWare). The New Zealand Data Head node is connected to the high-speed (100 Gbps) network ring circuit courtesy of the Research and Education Advanced Network New Zealand (REANNZ), with the additional nodes at remote physical sites connected via 10 Gbps fibre. We use real Australian Long Baseline Array (LBA) observational data from 6 radio telescopes in Australia, South Africa and New Zealand (15 baselines), 1.5 hours in duration and 8 TB in size, to emulate data transfer from remote locations and to provide a meaningful benchmark dataset for correlation. Data was successfully transferred using bespoke UDT network transfer tools and correlated with a speed-up factor of 0.8 using the DiFX software correlator. In partnership with the New Zealand office of Catalyst IT Ltd we have moved this environment into the Catalyst Cloud and report on the first
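
    To give a rough sense of why e-transfer connectivity matters at these volumes, the back-of-the-envelope sketch below estimates the transfer time of the 8 TB benchmark dataset over the 10 Gbps and 100 Gbps links mentioned above; the assumed link efficiency is purely illustrative, not a figure measured in this work.

        # Rough transfer-time estimate for the 8 TB LBA benchmark dataset.
        DATASET_TB = 8
        EFFICIENCY = 0.7  # assumed fraction of the nominal line rate actually achieved

        def transfer_hours(link_gbps: float) -> float:
            bits = DATASET_TB * 8e12               # terabytes -> bits
            seconds = bits / (link_gbps * 1e9 * EFFICIENCY)
            return seconds / 3600

        for link_gbps in (10, 100):
            print(f"{link_gbps:>3} Gbps link: ~{transfer_hours(link_gbps):.1f} h")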

  8. Pamphlet day

    OpenAIRE

    Eastwood, Phil; Dunne, Chris; Fowler, Stephen

    2017-01-01

    Pamphlet Day: A Political Protest Pamphlet and Zine Event focused around the occupation of Loughborough Public Library, Granby Street, Loughborough, LE11 3DZ, UK. ABSTRACT “Throughout the 20th Century artists have engaged provocatively with text, images and performance, publishing writings, pamphlets, and manifestos that challenge the status quo.” (1) Loughborough Echo, May 2017 https://www.loughboroughecho.net/whats-on/arts-culture-news/pamphlet-art-feature-events-13038989 A s...

  9. Big Bang Day : The Great Big Particle Adventure - 2. Who Ordered That?

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. The atoms that make up our material world are important to us, but it turns out they aren't so significant on the cosmic stage. In fact, early in the search for the stuff of atoms, researchers discovered particles that played no part in Earthly chemistry - for example particles in cosmic rays that resemble electrons (the stuff of electricity and the chemical glue in molecules) in almost all respects except that they weigh 140 times more. "Who ordered that?" one Nobel laureate demanded. They also discovered antimatter - the destructive mirror-image particles that obliterate all matter they come into contact with. In fact, the Universe is mostly made up of particles that could never make atoms, so that we are just the flotsam of the cosmos. But the main constituent of the Universe, what makes up 80% of creation, has never been seen in the lab. Researchers at CERN believe they can create samples of it, down here on Earth...

  10. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  11. Big Data and HPC: A Happy Marriage

    KAUST Repository

    Mehmood, Rashid

    2016-01-25

    International Data Corporation (IDC) defines Big Data technologies as “a new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data produced every day, by enabling high velocity capture, discovery, and/or analysis”. High Performance Computing (HPC) most generally refers to “the practice of aggregating computing power in a way that delivers much higher performance than one could get out of a typical desktop computer or workstation in order to solve large problems in science, engineering, or business”. Big data platforms are built primarily considering the economics and capacity of the system for dealing with the 4V characteristics of data. HPC traditionally has been more focussed on the speed of digesting (computing) the data. For these reasons, the two domains (HPC and Big Data) have developed their own paradigms and technologies. However, recently, these two have grown fond of each other. HPC technologies are needed by Big Data to deal with the ever-increasing Vs of data in order to forecast and extract insights from existing and new domains, faster, and with greater accuracy. Increasingly more data is being produced by scientific experiments from areas such as bioscience, physics, and climate, and therefore, HPC needs to adopt data-driven paradigms. Moreover, there are synergies between them with unimaginable potential for developing new computing paradigms, solving long-standing grand challenges, and making new explorations and discoveries. Therefore, they must get married to each other. In this talk, we will trace the HPC and big data landscapes through time, including their respective technologies, paradigms and major application areas. Subsequently, we will present the factors that are driving the convergence of the two technologies, the synergies between them, as well as the benefits of their convergence to the biosciences field. The opportunities and challenges of the

  12. A cosmological analogy between the big bang and a supernova

    International Nuclear Information System (INIS)

    Sen, S.

    1983-01-01

    The author presents an objection to Brown's (1981) analogy between a supernova and the Big Bang. According to Brown an expanding spherical shell is quite similar to an ejected supernova shell. However, the fragmented shell of a supernova moves outward in pre-existing space. The force of repulsion which makes the fragments of the shell drift apart can be regarded as equivalent to the force of attraction of the rest of the universe on the supernova. By definition, such a force of attraction is absent in the case of the Big Bang. Energy is supposed suddenly to appear simultaneously at all points throughout the universe at the time of the Big Bang. As the universe expands, space expands too. In the relativistic cosmology, the universe cannot expand in pre-existing space. (Auth.)

  13. Cosmological analogy between the big bang and a supernova

    Energy Technology Data Exchange (ETDEWEB)

    Sen, S. (Hamburg, Germany, F.R.)

    1983-10-01

    The author presents an objection to Brown's (1981) analogy between a supernova and the Big Bang. According to Brown an expanding spherical shell is quite similar to an ejected supernova shell. However, the fragmented shell of a supernova moves outward in pre-existing space. The force of repulsion which makes the fragments of the shell drift apart can be regarded as equivalent to the force of attraction of the rest of the universe on the supernova. By definition, such a force of attraction is absent in the case of the Big Bang. Energy is supposed suddenly to appear simultaneously at all points throughout the universe at the time of the Big Bang. As the universe expands, space expands too. In the relativistic cosmology, the universe cannot expand in pre-existing space.

  14. PARALLEL MOVING MECHANICAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Florian Ion Tiberius Petrescu

    2014-09-01

    Moving mechanical systems with parallel structures are solid, fast, and accurate. Among parallel systems, Stewart platforms are notable as the oldest, being fast, solid and precise. The work outlines a few main elements of Stewart platforms, beginning with the platform geometry and its kinematic elements, and then presenting a few items of dynamics. The primary dynamic element is the determination of the kinetic energy of the entire Stewart platform. The kinematics of the mobile platform is then recorded using rotation matrices. If a structural motoelement consists of two moving elements which translate relative to each other, it is more convenient for the drive train, and especially for the dynamics, to represent the motoelement as a single moving component. We thus have seven moving parts (the six motoelements, or feet, plus the mobile platform as the seventh) and one fixed part.
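
    To make the role of the rotation matrix concrete, here is a small sketch of the inverse kinematics of a generic six-legged Stewart platform: given a pose of the mobile platform (translation and orientation), each leg length follows from rotating and translating the platform anchor points. The anchor coordinates and the pose values are invented for illustration and are not taken from the paper.

        import numpy as np

        def rotation_zyx(yaw, pitch, roll):
            """Rotation matrix built from Z-Y-X Euler angles (radians)."""
            cz, sz = np.cos(yaw), np.sin(yaw)
            cy, sy = np.cos(pitch), np.sin(pitch)
            cx, sx = np.cos(roll), np.sin(roll)
            Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
            Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
            Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
            return Rz @ Ry @ Rx

        # Hypothetical anchor points: B on the fixed base, P on the mobile platform (local frame).
        angles = np.deg2rad([0, 60, 120, 180, 240, 300])
        B = np.c_[2.0 * np.cos(angles), 2.0 * np.sin(angles), np.zeros(6)]
        P = np.c_[1.0 * np.cos(angles), 1.0 * np.sin(angles), np.zeros(6)]

        # Example pose of the mobile platform: translation t and orientation R.
        t = np.array([0.10, -0.05, 1.50])
        R = rotation_zyx(0.05, 0.02, -0.03)

        # Inverse kinematics: each leg runs from B_i to the rotated and translated P_i.
        legs = (P @ R.T) + t - B
        print("leg lengths:", np.linalg.norm(legs, axis=1))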

  15. "Big Science" exhibition at Balexert

    CERN Multimedia

    2008-01-01

    CERN is going out to meet those members of the general public who were unable to attend the recent Open Day. The Laboratory will be taking its "Big Science" exhibition from the Globe of Science and Innovation to the Balexert shopping centre from 19 to 31 May 2008. The exhibition, which shows the LHC and its experiments through the eyes of a photographer, features around thirty spectacular photographs measuring 4.5 metres high and 2.5 metres wide. Welcomed and guided around the exhibition by CERN volunteers, shoppers at Balexert will also have the opportunity to discover LHC components on display and watch films. "Fun with Physics" workshops will be held at certain times of the day. Main hall of the Balexert shopping centre, ground floor, from 9.00 a.m. to 7.00 p.m. Monday to Friday and from 10 a.m. to 6 p.m. on the two Saturdays. Call for volunteers All members of the CERN personnel are invited to enrol as volunteers to help welcom...

  16. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education, as well as critically exploring the perils of applying a data-driven approach to education. Despite the claimed value of the...

  18. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  19. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data includes analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Also, the methodology and working of a system that will use this data are briefly described.

  20. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, the relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general-relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  1. Some experiences and opportunities for big data in translational research.

    Science.gov (United States)

    Chute, Christopher G; Ullman-Cullere, Mollie; Wood, Grant M; Lin, Simon M; He, Min; Pathak, Jyotishman

    2013-10-01

    Health care has become increasingly information intensive. The advent of genomic data, integrated into patient care, significantly accelerates the complexity and amount of clinical data. Translational research in the present day increasingly embraces new biomedical discovery in this data-intensive world, thus entering the domain of "big data." The Electronic Medical Records and Genomics consortium has taught us many lessons, while simultaneously advances in commodity computing methods enable the academic community to affordably manage and process big data. Although great promise can emerge from the adoption of big data methods and philosophy, the heterogeneity and complexity of clinical data, in particular, pose additional challenges for big data inferencing and clinical application. However, the ultimate comparability and consistency of heterogeneous clinical information sources can be enhanced by existing and emerging data standards, which promise to bring order to clinical data chaos. Meaningful Use data standards in particular have already simplified the task of identifying clinical phenotyping patterns in electronic health records.

  2. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes project by using the online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They are sharing their information and discussing their ideas/brainstorming the solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that takes place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
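
    A minimal sketch of the calculation the students reproduce: the angle of the noon shadow at one site, together with the north-south distance to a site where the Sun is overhead, fixes the circumference. The stick, shadow and distance values below are roughly the classical Alexandria-Syene figures, used purely as an example.

        import math

        # Shadow measurement at local noon: stick height and shadow length (metres).
        stick_height = 1.000
        shadow_length = 0.126      # gives roughly the 7.2 degree angle of the classical account

        # Angle of the Sun from the vertical, in degrees.
        angle_deg = math.degrees(math.atan(shadow_length / stick_height))

        # North-south distance to the site where the Sun is directly overhead (km).
        distance_km = 800.0        # approximately Alexandria to Syene

        # The shadow angle is the same fraction of 360 degrees as the distance is of the circumference.
        circumference_km = distance_km * 360.0 / angle_deg
        print(f"angle: {angle_deg:.1f} deg, circumference: about {circumference_km:.0f} km")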

  3. 2016 SPD: Day 1

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Editor's note: This week we're in Boulder, Colorado at the 47th meeting of the AAS Solar Physics Division (SPD). Follow along to catch some of the latest news from the field of solar physics! The 2016 SPD meeting was launched this morning from the University of Colorado Boulder campus. Two of the hot topics at this year's meeting include celebration of the recent move of the National Solar Observatory's headquarters to Boulder, and discussion of the future Daniel K. Inouye Solar Telescope (DKIST, formerly the Advanced Technology Solar Telescope, ATST). DKIST, planned for a 2019 completion in Hawaii, is the next big telescope on the horizon for solar physics. Today's press conference had an interesting focus: instruments providing new high-energy observations of the Sun. Representatives from four different instruments were here to talk about some of the latest X-ray solar observations. GRIPS: The GRIPS payload flew at 130,000 ft over Antarctica on a giant balloon in January 2016. [NASA/Albert Shih] First up, Albert Shih (NASA Goddard) described the Gamma-Ray Imager/Polarimeter for Solar flares, or GRIPS. GRIPS is a balloon-borne instrument designed to detect X-rays and gamma rays emitted during solar flares. Up to tens of percent of the energy in solar flares is emitted in the form of accelerated particles, but the physics behind this process is not well understood. GRIPS observes where the highest-energy particles are accelerated, in an effort to learn more about the process. GRIPS was launched on 19 January 2016 and flew for roughly 12 days, gathering ~1 million seconds of data! The logistics of this instrument's flight are especially interesting, since it was launched from Antarctica and carried by a balloon at a whopping elevation of 130,000 ft (to get high enough that the atmosphere doesn't absorb all the photons GRIPS is trying to observe). Though the data from the mission has been retrieved, the bulk of the hardware remains where it landed at the end of January. It must

  4. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  5. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

    Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.
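
    As one small, hedged illustration of the kind of smart-meter visualization discussed here, the sketch below renders a week of simulated half-hourly consumption as a day-by-time heat map; the data are synthetic and the layout is only one of many reasonable choices, not the solution used in the article.

        import numpy as np
        import matplotlib.pyplot as plt

        # Simulated half-hourly smart-meter readings for one week (7 days x 48 intervals).
        rng = np.random.default_rng(1)
        daily_shape = 0.3 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, 48)) ** 2
        demand = daily_shape + 0.1 * rng.standard_normal((7, 48))

        # Day-by-time heat map: rows are days, columns are half-hour intervals.
        fig, ax = plt.subplots()
        image = ax.imshow(demand, aspect="auto", origin="lower")
        ax.set_xlabel("half-hour of day")
        ax.set_ylabel("day of week")
        fig.colorbar(image, ax=ax, label="consumption (kWh)")
        plt.show()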

  6. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    The amount of data traveling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems and so on all use piles of data, which are further used for creating reports in order to ensure continuity regarding the services that they have to offer. The process behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  7. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than the data itself, realizing that there may be only a few generic processes involved in this, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  8. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  9. Moving a House by Moved Participants

    DEFF Research Database (Denmark)

    Axel, Erik

    himself in controlling every detail of the shape of the concrete slaps. He pushed all the other participants of the meetings, asking for details, information, the change of drawings etc. He explained the technical issues he was pursuing, was prepared for problems at the meetings, was well informed, always......? The participant observer believed it was a matter of changing coordinates, but the engineers immediately saw it was an issue of pipes in the ground, could they be moved and still function as planned? To decide the possibility of this suggestion the engineer was given the task of investigating the consequences...... they saw him as a bit pushy. On the other hand they understood why he was so since his firm would be fined if the concrete slabs did not meet specifications. The case will be the basis for a discussion of double motivation of the engineer, his evident interest in his professional work, and the wish...

  10. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  11. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  12. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  13. Research on Technology Innovation Management in Big Data Environment

    Science.gov (United States)

    Ma, Yanhong

    2018-02-01

    With the continuous development and progress of the information age, the demand for information keeps growing, and the processing and analysis of information data are moving toward ever larger scales. The increasing volume of data places higher demands on processing technology, and its explosive growth across society has prompted the advent of the era of big data. At present, producing and processing various kinds of information and data carries more value and significance in people's lives. Using big data technology to process and analyze data quickly, and thereby improving the level of big data management, is an important step in promoting the development of information and data processing technology in our country. To some extent, innovative research on the management methods of information technology in the era of big data can enhance our overall strength and put China in an invincible position in the development of the big data era.

  14. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice has much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Data science methods that are emerging ensure that these data be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator and possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives, and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science has the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  15. MOVES regional level sensitivity analysis

    Science.gov (United States)

    2012-01-01

    The MOVES Regional Level Sensitivity Analysis was conducted to increase understanding of the operations of the MOVES Model in regional emissions analysis and to highlight the following: the relative sensitivity of selected MOVES Model input paramet...

  16. Move of ground water

    International Nuclear Information System (INIS)

    Kimura, Shigehiko

    1983-01-01

    One example of ground water flow that is difficult to explain by Darcy's theory is stagnant water in strata, which is set in motion by pumping and leads to land subsidence; this is now a major problem in Japan. Such movement on an extensive scale has been investigated in detail by means of 3H (tritium), such as that from rainfall, in addition to ordinary measurement. The movement of ground water is divided broadly into that in the unsaturated zone, from the ground surface to the water table, and that in the saturated zone below the water table. The course of the analyses made so far using 3H contained in water, and the future trend of its usage, are described. A flow model that regards water as a plastic fluid and its flow as an assembly of channels may be applicable to flow mechanisms that cannot be explained with Darcy's theory. (Mori, K.)
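
    For reference, the Darcy's law that the stagnant-water behaviour is contrasted with can be written as follows, where q is the specific discharge, K the hydraulic conductivity, h the hydraulic head, A the cross-sectional area and L the flow length; this is the textbook form, quoted here for orientation rather than taken from the paper.

        \[
          \mathbf{q} \;=\; -K\,\nabla h ,
          \qquad
          Q \;=\; -K\,A\,\frac{\Delta h}{L}
        \]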

  17. Moving toroidal limiter

    International Nuclear Information System (INIS)

    Ikuta, Kazunari; Miyahara, Akira.

    1983-06-01

    The concept of the limiter-divertor proposed by Mirnov is extended to a toroidal limiter-divertor (which we call a moving toroidal limiter) using a stream of ferromagnetic balls coated with low-Z materials such as plastics, graphite and ceramics. An important advantage of the use of ferromagnetic materials would be the possibility of soft landing of the balls on a catcher, provided that the temperature of the balls is below the Curie point. Moreover, the moving toroidal limiter would work as a protector of the first wall not only against the vertical movement of the plasma ring but also against the violent inward motion driven by a major disruption, because the orbits of the balls in the case of the moving toroidal limiter are distributed over the small-major-radius side of the toroidal plasma. (author)

  18. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  19. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it also goes hand in hand with all kinds of new ethical and moral issues. To handle big data responsibly, these issues must also be considered, because poor use of data can have adverse consequences for

  20. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external conditions and competitive environment are changing, and this will necessitate a development in the direction of "big business", in which farms become even larger, more industrialised and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  1. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information and used to make 'smart' decisions.

  2. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013, CERN inaugurates the Passport to the Big Bang project at a major public event. Poster and programme. On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and Programme.

  3. Moving to Jobs?

    OpenAIRE

    Dave Maré; Jason Timmins

    2003-01-01

    This paper examines whether New Zealand residents move from low-growth to high-growth regions, using New Zealand census data from the past three inter-censal periods (covering 1986-2001). We focus on the relationship between employment growth and migration flows to gauge the strength of the relationship and the stability of the relationship over the business cycle. We find that people move to areas of high employment growth, but that the probability of leaving a region is less strongly relate...

  4. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  5. Moving from reclusion to partial freedom: the experience of family caregivers for disabled elderly persons assisted in a day care center Movendo-se da reclusão à liberdade parcial: a experiência do cuidador familiar de idoso dependente assistido num centro-dia

    Directory of Open Access Journals (Sweden)

    Silvia Cristina Mangini Bocchi

    2010-09-01

    This study aimed at understanding the interactional experience between family caregivers and disabled elderly persons supported in a Day Care Center according to the caregiver's perspective. It also aimed at developing a representative theoretical model for the events experienced by such caregivers. Grounded Theory was used as the methodological framework, whereas Symbolic Interactionism served as the theoretical framework. Observation and interviews were used for data collection. The following phenomenon arose from the results: feeling supported by the Day Care Center, by the strength of the bond with the elderly person and by spirituality in order to continue playing the challenging role of a family caregiver for a disabled elderly person. The study made it possible to understand that, among these three supporting cornerstones for coping with the burden generated by the family caregiver role, the care model promoted by the Day Care Center was the intervening variable in the process of improving the quality of life of the family caregiver-disabled elderly person dyad. This allowed the identification of the main category - moving from reclusion to partial freedom: the experience of family caregivers for disabled elderly persons assisted in a Day Care Center.

  6. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges), which ultimately seems to be of greater importance than the sheer data volume.

  7. The NOAA Big Data Project

    Science.gov (United States)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  8. ATLAS BigPanDA Monitoring

    CERN Document Server

    Padolski, Siarhei; The ATLAS collaboration; Klimentov, Alexei; Korchuganova, Tatiana

    2017-01-01

    BigPanDA monitoring is a web-based application which provides various processing and representations of the states of Production and Distributed Analysis (PanDA) system objects. Analyzing hundreds of millions of computation entities, such as events or jobs, BigPanDA monitoring builds reports at different scales and levels of abstraction in real-time mode. The information provided allows users to drill down into the reason for a concrete event failure or to observe the bigger picture of the system, such as tracking the performance of the computation nucleus and satellites or the progress of a whole production campaign. The PanDA system was originally developed for the ATLAS experiment and today effectively manages more than 2 million jobs per day distributed over 170 computing centers worldwide. BigPanDA is its core component, commissioned in the middle of 2014, and is now the primary source of information for ATLAS users about the state of their computations and the source of decision-support information for shifters, operators and managers. In this wor...

  9. Significance of Supply Logistics in Big Cities

    Directory of Open Access Journals (Sweden)

    Mario Šafran

    2012-10-01

    The paper considers the concept and importance of supply logistics as an element in improving storage, supply and transport of goods in big cities. There is always room for improvements in this segment of economic activities, and therefore continuous optimisation of the cargo flows from the manufacturer to the end user is important. Due to complex requirements in the cargo supply and the "spoiled" end users, modern cities represent great difficulties and a big challenge for the supply organisers. The consumers' needs in big cities have developed over the recent years in such a way that they require supply of goods several times a day at precisely determined times (orders are received by e-mail, and the information transfer is therefore instantaneous). In order to successfully meet the consumers' needs in advanced economic systems, advanced methods of goods supply have been developed and improved, such as "just in time", "door-to-door", and "desk-to-desk". Regular operation of these systems requires supply logistics which includes the total throughput of materials, from receiving the raw materials or reproduction material to the delivery of final products to the end users.

  10. ATLAS BigPanDA Monitoring

    CERN Document Server

    Padolski, Siarhei; The ATLAS collaboration

    2017-01-01

    BigPanDA monitoring is a web-based application that provides various processing and representations of the states of Production and Distributed Analysis (PanDA) system objects. Analysing hundreds of millions of computation entities, such as events or jobs, BigPanDA monitoring builds reports at different scales and levels of abstraction in real-time mode. The information provided allows users to drill down into the reason for a concrete event failure or to observe the bigger picture of the system, such as tracking the performance of the computation nucleus and satellites or the progress of a whole production campaign. The PanDA system was originally developed for the ATLAS experiment and today effectively manages more than 2 million jobs per day distributed over 170 computing centers worldwide. BigPanDA is its core component, commissioned in the middle of 2014, and is now the primary source of information for ATLAS users about the state of their computations and the source of decision-support information for shifters, operators and managers. In this work...

  11. Opening the Big Black Box: European study reveals visitors' impressions of science laboratories

    CERN Multimedia

    2004-01-01

    "On 29 - 30 March the findings of 'Inside the Big Black Box'- a Europe-wide science and society project - will be revealed during a two-day seminar hosted by CERN*. The principle aim of Inside the Big Black Box (IN3B) is to determine whether a working scientific laboratory can capture the curiosity of the general public through visits" (1 page)

  12. Libraries on the MOVE.

    Science.gov (United States)

    Edgar, Jim; And Others

    1986-01-01

    Presents papers from Illinois State Library and Shawnee Library System's "Libraries on the MOVE" conference focusing on how libraries can impact economic/cultural climate of an area. Topics addressed included information services of rural libraries; marketing; rural library development; library law; information access; interagency…

  13. Sense of moving

    DEFF Research Database (Denmark)

    Christensen, Mark Schram; Grünbaum, Thor

    2017-01-01

    In this chapter, we assume the existence of a sense of “movement activity” that arises when a person actively moves a body part. This sense is usually supposed to be part of sense of agency (SoA). The purpose of the chapter is to determine whether the already existing experimental paradigms can...

  14. Indexing Moving Points

    DEFF Research Database (Denmark)

    Agarwal, Pankaj K.; Arge, Lars Allan; Erickson, Jeff

    2003-01-01

    We propose three indexing schemes for storing a set S of N points in the plane, each moving along a linear trajectory, so that any query of the following form can be answered quickly: Given a rectangle R and a real value t, report all K points of S that lie inside R at time t. We first present an...
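
    The query these indexing schemes are built for can be spelled out directly. Below is a brute-force baseline (no index, a linear scan over all trajectories) under the linear-motion assumption stated in the abstract; it is only a reference implementation of the query semantics, which the proposed structures are designed to beat for large N, and the class and function names are illustrative, not from the paper.

        from dataclasses import dataclass

        @dataclass
        class MovingPoint:
            # Linear trajectory: position(t) = (x0 + vx * t, y0 + vy * t)
            x0: float
            y0: float
            vx: float
            vy: float

            def position(self, t):
                return self.x0 + self.vx * t, self.y0 + self.vy * t

        def range_query(points, rect, t):
            """Report all points lying inside rect = (xmin, ymin, xmax, ymax) at time t.

            Brute-force O(N) scan; the indexing schemes in the paper aim to answer the
            same query substantially faster than visiting every trajectory.
            """
            xmin, ymin, xmax, ymax = rect
            result = []
            for p in points:
                x, y = p.position(t)
                if xmin <= x <= xmax and ymin <= y <= ymax:
                    result.append(p)
            return result

        # Example usage with two hypothetical moving points.
        pts = [MovingPoint(0, 0, 1, 0.5), MovingPoint(5, 5, -1, -1)]
        print(range_query(pts, (0, 0, 4, 4), t=2.0))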

  15. Moving up in industry.

    Science.gov (United States)

    Covell, Charlotte

    2016-01-23

    Charlotte Covell is commercial business manager at Virbac UK, a role that gives her responsibility for the company's sales to corporate practices, some buying groups and internet pharmacies. She began her career as a veterinary nurse, but moved into industry and now has a role in senior business management. British Veterinary Association.

  16. Optics of moving media

    Science.gov (United States)

    Piwnicki, P.; Leonhardt, U.

    2001-01-01

    Light experiences a moving medium as an effective gravitational field. In the limit of low medium velocities the medium flow plays the role of a magnetic vector potential. We review the background of our theory [U. Leonhardt and P. Piwnicki, Phys. Rev. A 60, 4301 (1999); Phys. Rev. Lett. 84, 822 (2000)], including our proposal of making optical black holes.
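
    The analogy mentioned here can be summarised in a single low-velocity formula: light in a medium of refractive index n that moves slowly with velocity u acquires a phase as if it propagated in an effective vector potential proportional to (n^2 - 1)u, the Fresnel drag term. The expression below is the standard first-order form, quoted from memory as a sketch rather than taken verbatim from the cited papers.

        \[
          \Delta\varphi \;=\; \frac{\omega}{c^{2}} \oint \left(n^{2}-1\right)\,\mathbf{u}\cdot d\mathbf{l},
          \qquad
          \mathbf{A}_{\mathrm{eff}} \;\propto\; \left(n^{2}-1\right)\,\mathbf{u} .
        \]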

  17. Making Images That Move

    Science.gov (United States)

    Rennie, Richard

    2015-01-01

    The history of the moving image (the cinema) is well documented in books and on the Internet. This article offers a number of activities that can easily be carried out in a science class. They make use of the phenomenon of "Persistence of Vision." The activities presented herein demonstrate the functionality of the phenakistoscope, the…

  18. Aboard the "Moving School."

    Science.gov (United States)

    Ainscow, Mel; Hopkins, David

    1992-01-01

    In many countries, education legislation embodies contradictory pressures for centralization and decentralization. In the United Kingdom, there is growing government control over policy and direction of schools; schools are also being given more responsibility for resource management. "Moving" schools within Improving the Quality of…

  19. The big data telescope

    International Nuclear Information System (INIS)

    Finkel, Elizabeth

    2017-01-01

    On a flat, red mulga plain in the outback of Western Australia, preparations are under way to build the most audacious telescope astronomers have ever dreamed of - the Square Kilometre Array (SKA). Next-generation telescopes usually aim to double the performance of their predecessors. The Australian arm of SKA will deliver a 168-fold leap on the best technology available today, to show us the universe as never before. It will tune into signals emitted just a million years after the Big Bang, when the universe was a sea of hydrogen gas, slowly percolating with the first galaxies. Their starlight illuminated the fledgling universe in what is referred to as the “cosmic dawn”.

  20. The Big Optical Array

    International Nuclear Information System (INIS)

    Mozurkewich, D.; Johnston, K.J.; Simon, R.S.

    1990-01-01

    This paper describes the design and the capabilities of the Naval Research Laboratory Big Optical Array (BOA), an interferometric optical array for high-resolution imaging of stars, stellar systems, and other celestial objects. There are four important differences between the BOA design and the design of the Mark III Optical Interferometer on Mount Wilson (California). These include a long passive delay line which will be used in BOA to do most of the delay compensation, so that the fast delay line will have a very short travel; the beam combination in BOA will be done in triplets, to allow measurement of closure phase; the same light will be used for both star and fringe tracking; and the fringe tracker will use several wavelength channels

  1. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  2. Nonstandard big bang models

    International Nuclear Information System (INIS)

    Calvao, M.O.; Lima, J.A.S.

    1989-01-01

    The usual FRW hot big-bang cosmologies have been generalized by considering the equation of state ρ = Anm + (γ-1)^-1 p, where m is the rest mass of the fluid particles and A is a dimensionless constant. Explicit analytic solutions are given for the flat case (ε=0). For large cosmological times these extended models behave as the standard Einstein-de Sitter universes regardless of the values of A and γ. Unlike the usual FRW flat case the deceleration parameter q is a time-dependent function and its present value, q ≅ 1, obtained from the luminosity distance versus redshift relation, may be fitted by taking, for instance, A=1 and γ = 5/3 (monatomic relativistic gas with >> k_B T). In all cases the universe cools obeying the same temperature law of the FRW models and it is shown that the age of the universe is only slightly modified. (author) [pt
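
    Written out, the generalized equation of state used in these models, together with the standard flat Friedmann equation it is combined with, reads as follows (n is the particle number density, m the rest mass, A a dimensionless constant); the Friedmann equation is textbook material and is quoted only to make the setup explicit, not reproduced from the paper.

        \[
          \rho \;=\; A\,n\,m \;+\; \frac{p}{\gamma-1},
          \qquad
          \left(\frac{\dot a}{a}\right)^{2} \;=\; \frac{8\pi G}{3}\,\rho .
        \]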

  3. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  4. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  5. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  6. ICRH coupling experiment in Big D

    International Nuclear Information System (INIS)

    Hoffman, D.J.; Baity, F.W.; Owens, T.L.; Jaeger, E.F.; Bryan, W.E.; Hammonds, C.J.

    1985-01-01

    A 10 MW, 40 to 80 MHz ICRH experiment has been proposed for Big D (at General Atomic). Compact loop antennas have been chosen to convey this power. In order to verify that the antenna will have sufficient loading, a prototype low-power antenna has been designed and will be installed in January 1986. The antenna is a cavity antenna that will operate from 30 to 80 MHz with a 50 Ohm match at R = 1 Ohm. The antenna can be moved from a position flush with the wall to flush with the limiter. By these means, we will establish the maximum acceptable gap from the coupler to the plasma. The electrical, mechanical, and thermal characteristics of this antenna system will be discussed. In addition to experimental exploration of coupling, we have investigated wave propagation and absorption in Big D by using a cold collisional plasma model in straight tokamak geometry with rotational transform. Although loading is dependent on the plasma position, both the reactive and real loads (10 to 20 and 1 to 2 Ohms) are comparable to other experiments. Loading and power deposition profiles as a function of frequency, density, and species mix will be presented. The report consists of viewgraphs of the presentation

  7. The Four Day School Week. Research Brief

    Science.gov (United States)

    Muir, Mike

    2013-01-01

    Can four-day school weeks help districts save money? How do districts overcome the barriers of moving to a four-day week? What is the effect of a four-day week on students, staff and the community? This paper enumerates the benefits for students and teachers of four-day school weeks. Recommendations for implementation of a four-day week are also…

  8. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  9. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  10. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  11. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit
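
    By way of orientation only (this sketch is not taken from the book), the query-driven workflow described above looks roughly like the following with the official google-cloud-bigquery Python client; the project ID is a placeholder and the public sample table is simply a commonly used example dataset:

        # Minimal sketch of running a query with the BigQuery Python client.
        # Assumes `pip install google-cloud-bigquery` and working application
        # default credentials; "my-example-project" is a hypothetical project ID.
        from google.cloud import bigquery

        client = bigquery.Client(project="my-example-project")

        sql = """
            SELECT name, SUM(number) AS total
            FROM `bigquery-public-data.usa_names.usa_1910_2013`
            GROUP BY name
            ORDER BY total DESC
            LIMIT 10
        """

        job = client.query(sql)          # starts an asynchronous query job
        for row in job.result():         # blocks until done, then iterates rows
            print(row["name"], row["total"])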

  12. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of the neuroticism factor. They found associations between empathy and the domains of openness, agreeableness, conscientiousness and extraversion. In our data, women score significantly higher both on empathy and on the Big Five...

  13. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  14. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article analyses ‘Big Data’, an issue that has been discussed over the last 10 years. The reasons for and factors behind the issue are identified. It is shown that the factors creating the ‘Big Data’ issue have existed for quite a long time and have, from time to time, caused informational barriers; such barriers were successfully overcome through science and technology. The analysis classifies the “Big Data” issue as a form of informational barrier, one that can be solved and that encourages the development of scientific and computational methods.

  15. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  16. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  17. Autonomous Landing on Moving Platforms

    KAUST Repository

    Mendoza Chavez, Gilberto

    2016-08-01

    This thesis investigates autonomous landing of a micro air vehicle (MAV) on a nonstationary ground platform. Unmanned aerial vehicles (UAVs) and micro air vehicles (MAVs) are becoming every day more ubiquitous. Nonetheless, many applications still require specialized human pilots or supervisors. Current research is focusing on augmenting the scope of tasks that these vehicles are able to accomplish autonomously. Precise autonomous landing on moving platforms is essential for self-deployment and recovery of MAVs, but it remains a challenging task for both autonomous and piloted vehicles. Model Predictive Control (MPC) is a widely used and effective scheme to control constrained systems. One of its variants, output-feedback tube-based MPC, ensures robust stability for systems with bounded disturbances under system state reconstruction. This thesis proposes a MAV control strategy based on this variant of MPC to perform rapid and precise autonomous landing on moving targets whose nominal (uncommitted) trajectory and velocity are slowly varying. The proposed approach is demonstrated on an experimental setup.
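
    As a much-simplified illustration of the receding-horizon idea behind MPC (a toy sketch, not the output-feedback tube-based controller developed in the thesis), the following cvxpy snippet solves one nominal MPC step for a 1-D double-integrator vehicle tracking a constant-velocity landing point; the dynamics, horizon, weights and bounds are all invented for illustration:

        # Toy MPC step: 1-D double integrator chasing a moving reference.
        # All numbers below are illustrative, not taken from the thesis.
        import cvxpy as cp
        import numpy as np

        dt = 0.1
        A = np.array([[1.0, dt], [0.0, 1.0]])        # state: [position, velocity]
        B = np.array([[0.5 * dt**2], [dt]])          # input: acceleration

        N = 20                                       # prediction horizon
        x0 = np.array([5.0, 0.0])                    # current vehicle state
        p_ref = np.array([0.5 * dt * k for k in range(N + 1)])  # assumed platform motion

        x = cp.Variable((2, N + 1))
        u = cp.Variable((1, N))
        cost, constraints = 0, [x[:, 0] == x0]
        for k in range(N):
            cost += cp.square(x[0, k] - p_ref[k]) + 0.1 * cp.square(u[0, k])
            constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                            cp.abs(u[0, k]) <= 3.0]  # actuation bound
        cost += 10 * cp.square(x[0, N] - p_ref[N])   # terminal tracking term

        cp.Problem(cp.Minimize(cost), constraints).solve()
        print("first control move:", float(u.value[0, 0]))
        # In a real controller this first input is applied and the problem
        # is re-solved at the next sampling instant (receding horizon).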

  18. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  19. Moving in Circles

    DEFF Research Database (Denmark)

    Simonsen, Gunvor

    2008-01-01

    The article examines the development of African diaspora history during the last fifty years. It outlines the move from a focus on African survivals to a focus on deep-rooted cultural principles and back again to a revived interest in concrete cultural transfers from Africa to the Americas....... This circular movement can be explained by a combination of elements characterizing African Atlantic and black Atlantic history. Among them is a lack of attention to questions of periodisation and change. Likewise, it has proven difficult to conceptualize Africa and America at one and the same time...... as characterized by cultural diversity and variation. Moreover, the field has been haunted by a tendency of moving too easily from descriptive evidence to conclusions about African identity in the Americas. A promising way to overcome these problems, it is suggested, is to develop research that focuses on single...

  20. Big endothelin changes the cellular miRNA environment in TMOb osteoblasts and increases mineralization.

    Science.gov (United States)

    Johnson, Michael G; Kristianto, Jasmin; Yuan, Baozhi; Konicke, Kathryn; Blank, Robert

    2014-08-01

    Endothelin (ET1) promotes the growth of osteoblastic breast and prostate cancer metastases. Conversion of big ET1 to mature ET1, catalyzed primarily by endothelin converting enzyme 1 (ECE1), is necessary for ET1's biological activity. We previously identified the Ece1, locus as a positional candidate gene for a pleiotropic quantitative trait locus affecting femoral size, shape, mineralization, and biomechanical performance. We exposed TMOb osteoblasts continuously to 25 ng/ml big ET1. Cells were grown for 6 days in growth medium and then switched to mineralization medium for an additional 15 days with or without big ET1, by which time the TMOb cells form mineralized nodules. We quantified mineralization by alizarin red staining and analyzed levels of miRNAs known to affect osteogenesis. Micro RNA 126-3p was identified by search as a potential regulator of sclerostin (SOST) translation. TMOb cells exposed to big ET1 showed greater mineralization than control cells. Big ET1 repressed miRNAs targeting transcripts of osteogenic proteins. Big ET1 increased expression of miRNAs that target transcripts of proteins that inhibit osteogenesis. Big ET1 increased expression of 126-3p 121-fold versus control. To begin to assess the effect of big ET1 on SOST production we analyzed both SOST transcription and protein production with and without the presence of big ET1 demonstrating that transcription and translation were uncoupled. Our data show that big ET1 signaling promotes mineralization. Moreover, the results suggest that big ET1's osteogenic effects are potentially mediated through changes in miRNA expression, a previously unrecognized big ET1 osteogenic mechanism.

  1. Electric moving shadow garden

    OpenAIRE

    Bracey, Andrew

    2010-01-01

    Electric Moving Shadow Garden is a multi-directional exploration of the links between artists and cinema, with multiple reference and contextual points. it accompanied the exhibition, UnSpooling: Artists & Cinema, curated by Bracey and Dave Griffiths at Corernhouse, Manchester, who also edited the publication. Published to accompany the Cornerhouse exhibition, UnSpooling: Artists & Cinema, curated by artists Andrew Bracey and Dave Griffiths. This illustrated catalogue explores how internat...

  2. TCR moves to MCR

    CERN Multimedia

    Peter Sollander, AB/OP/TI

    2005-01-01

    The monitoring of CERN's technical infrastructure has moved from the Technical Control Room in building 212 to the Meyrin Control Room (MCR) in building 354 (see map) and from the TS/CSE group to AB/OP. The operations team as well as the services provided remain the same as before and you can still reach the operator on shift by calling 72201. Peter Sollander, AB/OP/TI

  3. CERN Pension Fund move

    CERN Multimedia

    HR Department

    2007-01-01

    The CERN Pension Fund has moved to new offices on the 5th floor of Building 5. The Benefits Service of the Fund is now located in Offices 5-5-017 - 5-5-021 - 5-5-023. We remind you that the office hours are: Tuesday/Wednesday/Thursday from 10 am to 12 am and from 3 pm to 5 pm. The Fund would like to take this opportunity to warmly thank all the persons involved in the relocation.

  4. Lecture - "Move! Eat better"

    CERN Multimedia

    2012-01-01

    As part of the "Move! Eat better" campaign, Novae’s nutrition adviser, Irène Rolfo, will give a talk on the subject of everyday good nutrition. This will be held in the main building auditorium at 12:30 on Thursday, 20 September 2012. Don’t miss this informative event. For more information, go to http://cern.ch/bpmm            

  5. The Big Bang: UK Young Scientists' and Engineers' Fair 2010

    Science.gov (United States)

    Allison, Simon

    2010-01-01

    The Big Bang: UK Young Scientists' and Engineers' Fair is an annual three-day event designed to promote science, technology, engineering and maths (STEM) careers to young people aged 7-19 through experiential learning. It is supported by stakeholders from business and industry, government and the community, and brings together people from various…

  6. A moving experience !

    CERN Document Server

    2005-01-01

    The Transport Service pulled out all the stops and, more specifically, its fleet of moving and lifting equipment for the Discovery Monday on 6 June - a truly moving experience for all the visitors who took part! Visitors could play at being machine operator, twiddling the controls of a lift truck fitted with a jib to lift a dummy magnet into a wooden mock-up of a beam-line. They had to show even greater dexterity for this game of lucky dip... CERN-style. Those with a head for heights took to the skies 20 m above ground in a telescopic boom lift. Children were allowed to climb up into the operator's cabin - this is one of the cranes used to move the LHC magnets around. Warm thanks to all members of the Transport Service for their participation, especially B. Goicoechea, T. Ilkei, R. Bihery, S. Prodon, S. Pelletier, Y. Bernard, A. Sallot, B. Pigeard, S. Guinchard, B. Bulot, J. Berrez, Y. Grandjean, A. Bouakkaz, M. Bois, F. Stach, T. Mazzarino and S. Fumey.

  7. Ocean Networks Canada's "Big Data" Initiative

    Science.gov (United States)

    Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.

    2013-12-01

    Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system that provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: study of the impact of climate change on the ocean; the exploration and understanding the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution where we deliver simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing. Test results guide future tool design and development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and

  8. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to
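
    To make the design/method/measure decomposition concrete, here is a deliberately tiny, hypothetical Monte Carlo experiment in that spirit (it is not reproduced from Mudelsee 2014): skewed synthetic data play the role of the design, two location estimators are the competing methods, and RMSE is the measure that picks the better one.

        # Tiny Monte Carlo comparison of two estimators of a known mean.
        # The lognormal "design" and the two "methods" are illustrative only.
        import numpy as np

        rng = np.random.default_rng(42)
        n, n_sim = 50, 10_000
        true_mean = np.exp(0.5)                 # mean of a lognormal(0, 1)

        means = np.empty(n_sim)
        medians = np.empty(n_sim)
        for i in range(n_sim):
            x = rng.lognormal(mean=0.0, sigma=1.0, size=n)
            means[i] = x.mean()
            medians[i] = np.median(x)

        rmse = lambda est: np.sqrt(np.mean((est - true_mean) ** 2))
        print("RMSE of sample mean  :", rmse(means))
        print("RMSE of sample median:", rmse(medians))  # biased for the mean, so larger here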

  9. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just

  10. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $1.8 and $3.4 billion a year on annual gross revenues in excess of $30 billion. At present there are several teething problems to overcome such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $125 billion on procurement per year; they hope to save between 5 and 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $10 billion per annum range. e-Energy, a cooperative project between IBM Ericson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in

  11. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  12. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  13. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating; what is dark matter and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support by the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  14. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  15. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially wide-spread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
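
    For reference, the one-term Edgeworth expansion alluded to above, written for the standardized sample mean (a textbook statement, not a result specific to the paper), is:

        \[
          P\!\left(\frac{\sqrt{n}\,(\bar X_n - \mu)}{\sigma} \le x\right)
          = \Phi(x) - \frac{\gamma_1\,(x^{2}-1)}{6\sqrt{n}}\,\phi(x) + O\!\left(n^{-1}\right),
        \]

    where Φ and φ are the standard normal distribution function and density and γ₁ is the skewness of the underlying observations. The correction term matters most in the far tails, which is exactly where multiple-testing procedures evaluate their probabilities, hence the concern about error-rate control at modest sample sizes.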

  16. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.
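
    As a reminder of the Coulomb barrier-tunneling factor being referenced (standard nuclear/BBN physics quoted for context, not a result of this paper), the suppression for two nuclei of charges Z₁, Z₂ and reduced mass μ at relative energy E is:

        \[
          P_{\mathrm{tunnel}} \sim e^{-2\pi\eta}, \qquad
          2\pi\eta = \sqrt{E_G/E}, \qquad
          E_G = 2\mu c^{2}\,(\pi \alpha Z_1 Z_2)^{2},
        \]

    so a dark-sector mediator coupling much weaker than α shrinks the analogue of E_G and exponentially enhances the tunneling probability, which is the enhancement over standard BBN described above.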

  17. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8)×10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  18. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  19. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  20. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  1. Comparative influence of dose rate and radiation nature, on lethality after big mammals irradiation

    International Nuclear Information System (INIS)

    Destombe, C.; Le Fleche, Ph.; Grasseau, A.; Reynal, A.

    1997-01-01

    For the same dose, and with 30-day lethality as the biological criterion, the dose rate has a greater influence than the nature of the radiation on the outcome of a total-body irradiation of big mammals. (authors)

  2. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.

  3. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data, and analyzes the peculiarities of applying the components of the proposed model. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  4. Boosting Big National Lab Data

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data are physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research for example the validity of new drugs, the base cause for diseases, more efficient energy sources, new materials for every day goods, effective methods for environmental cleanup, the optimal ingredients composition for chocolate or determine how to preserve valuable antics. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house size detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion pixel camera detector to high throughput genome sequencing. These developments have led to an exponential increase in data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  5. A Cognitive Adopted Framework for IoT Big-Data Management and Knowledge Discovery Prospective

    OpenAIRE

    Mishra, Nilamadhab; Lin, Chung-Chih; Chang, Hsien-Tsung

    2015-01-01

    In future IoT big-data management and knowledge discovery for large-scale industrial automation applications, the importance of the industrial internet is increasing day by day. Several diversified technologies such as IoT (Internet of Things), computational intelligence, machine-type communication, big data, and sensor technology can be incorporated together to improve the data management and knowledge discovery efficiency of large-scale automation applications. So in this work, we need to propos...

  6. CERN Pension Fund move

    CERN Multimedia

    HR Department

    2007-01-01

    The CERN Pension Fund has moved to new offices on the 5th floor of Building 5. The Benefits Service of the Fund will henceforth receive you in the offices: 5-5-017 - 5-5-021 - 5-5-023. We remind you that the office hours are: Tuesday/Wednesday/Thursday from 10 am to 12 am and from 3 pm to 5 pm. The Fund would like to take this opportunity to warmly thank all the persons involved in the removal.

  7. Moving related to separation : who moves and to what distance

    NARCIS (Netherlands)

    Mulder, Clara H.; Malmberg, Gunnar

    We address the issue of moving from the joint home on the occasion of separation. Our research question is: To what extent can the occurrence of moves related to separation, and the distance moved, be explained by ties to the location, resources, and other factors influencing the likelihood of

  8. AAS 227: Day 2

    Science.gov (United States)

    Kohler, Susanna

    2016-01-01

    Editor's Note: This week we're at the 227th AAS Meeting in Kissimmee, FL. Along with several fellow authors from astrobites.com, I will be writing updates on selected events at the meeting and posting at the end of each day. Follow along here or at astrobites.com, or catch our live-tweeted updates from the @astrobites Twitter account. The usual posting schedule for AAS Nova will resume next week. Welcome to Day 2 of the winter American Astronomical Society (AAS) meeting in Kissimmee! Several of us are attending the conference this year, and we will report highlights from each day here on astrobites. If you'd like to see more timely updates during the day, we encourage you to follow @astrobites on twitter or search the #aas227 hashtag. Plenary Session: Black Hole Physics with the Event Horizon Telescope (by Susanna Kohler) If anyone needed motivation to wake up early this morning, they got it in the form of Feryal Ozel (University of Arizona) enthralling us all with exciting pictures, videos, and words about black holes and the Event Horizon Telescope. Ozel spoke to a packed room (at 8:30am!) about where the project currently stands, and where it's heading in the future. The EHT has pretty much the coolest goal ever: actually image the event horizons of black holes in our universe. The problem is that the largest black hole we can look at (Sgr A*, in the center of our galaxy) has an event horizon angular size of about 50 microarcseconds. For this kind of resolution, roughly equivalent to trying to image a DVD on the Moon, we'd need an Earth-sized telescope. The EHT has solved this problem by linking telescopes around the world, creating one giant, mm-wavelength effective telescope with a baseline the size of Earth. Besides producing awesome images, the EHT will be able to test properties of black-hole spacetime, the no-hair theorem, and general relativity (GR) in new regimes. Ozel walked us through some of the theory prep work we need to do now in order to get the most science out of the EHT, including devising new

  9. Slow light in moving media

    Science.gov (United States)

    Leonhardt, U.; Piwnicki, P.

    2001-06-01

    We review the theory of light propagation in moving media with extremely low group velocity. We intend to clarify the most elementary features of monochromatic slow light in a moving medium and, whenever possible, to give an instructive simplified picture.

  10. When Big Ice Turns Into Water It Matters For Houses, Stores And Schools All Over

    Science.gov (United States)

    Bell, R. E.

    2017-12-01

    When ice in my glass turns to water it is not bad but when the big ice at the top and bottom of the world turns into water it is not good. This new water makes many houses, stores and schools wet. It is really bad during when the wind is strong and the rain is hard. New old ice water gets all over the place. We can not get to work or school or home. We go to the big ice at the top and bottom of the world to see if it will turn to water soon and make more houses wet. We fly over the big ice to see how it is doing. Most of the big ice sits on rock. Around the edge of the big sitting on rock ice, is really low ice that rides on top of the water. This really low ice slows down the big rock ice turning into water. If the really low ice cracks up and turns into little pieces of ice, the big rock ice will make more houses wet. We look to see if there is new water in the cracks. Water in the cracks is bad as it hurts the big rock ice. Water in the cracks on the really low ice will turn the low ice into many little pieces of ice. Then the big rock ice will turn to water. That is water in cracks is bad for the houses, schools and businesses. If water moves off the really low ice, it does not stay in the cracks. This is better for the really low ice. This is better for the big rock ice. We took pictures of the really low ice and saw water leaving. The water was not staying in the cracks. Water leaving the really low ice might be good for houses, schools and stores.

  11. The (Big) Data-security assemblage: Knowledge and critique

    Directory of Open Access Journals (Sweden)

    Claudia Aradau

    2015-10-01

    Full Text Available The Snowden revelations and the emergence of ‘Big Data’ have rekindled questions about how security practices are deployed in a digital age and with what political effects. While critical scholars have drawn attention to the social, political and legal challenges to these practices, the debates in computer and information science have received less analytical attention. This paper proposes to take seriously the critical knowledge developed in information and computer science and reinterpret their debates to develop a critical intervention into the public controversies concerning data-driven security and digital surveillance. The paper offers a two-pronged contribution: on the one hand, we challenge the credibility of security professionals’ discourses in light of the knowledge that they supposedly mobilize; on the other, we argue for a series of conceptual moves around data, human–computer relations, and algorithms to address some of the limitations of existing engagements with the Big Data-security assemblage.

  12. So, You Want to Move out?!--An Awareness Program of the Real Costs of Moving Away from Home

    Science.gov (United States)

    Hines, Steven L.; Hansen, Lyle; Falen, Christi

    2011-01-01

    The So, You Want To Move Out?! program was developed to help teens explore the financial realities of moving away from home. This 3-day camp program allows youth the opportunity to interview for a job, work, earn a paycheck, and pay financial obligations. After paying expenses and trying to put some money away in savings, the participants begin to…

  13. ATLAS BigPanDA Monitoring and Its Evolution

    CERN Document Server

    Wenaus, Torre; The ATLAS collaboration; Korchuganova, Tatiana

    2016-01-01

    BigPanDA is the latest generation of the monitoring system for the Production and Distributed Analysis (PanDA) system. The BigPanDA monitor is a core component of PanDA and also serves the monitoring needs of the new ATLAS Production System Prodsys-2. BigPanDA has been developed to serve the growing computation needs of the ATLAS Experiment and the wider applications of PanDA beyond ATLAS. Through a system-wide job database, the BigPanDA monitor provides a comprehensive and coherent view of the tasks and jobs executed by the system, from high level summaries to detailed drill-down job diagnostics. The system has been in production and has remained in continuous development since mid 2014, today effectively managing more than 2 million jobs per day distributed over 150 computing centers worldwide. BigPanDA also delivers web-based analytics and system state views to groups of users including distributed computing systems operators, shifters, physicist end-users, computing managers and accounting services. Provi...

  14. Big Data: an exploration of research, technologies and application cases

    Directory of Open Access Journals (Sweden)

    Emilcy J. Hernández-Leal

    2017-05-01

    Full Text Available Big Data has become a worldwide trend and, although it still lacks a consensual scientific or academic definition, it portends ever greater growth of the market that surrounds it and of the associated research areas. This paper reports a systematic review of the literature on Big Data, considering the state of the art of the techniques and technologies associated with Big Data, which include data capture, processing, analysis and visualization. The characteristics, strengths, weaknesses and opportunities of some applications and Big Data models, mainly those supporting modeling, analysis and data mining, are explored. Likewise, some future trends for the development of Big Data are introduced, along with the basic aspects, scope and importance of each one. The methodology used for the exploration involves two strategies: the first is a scientometric analysis, and the second is a categorization of documents through a web tool that supports the literature review process. As a result, a summary of and conclusions about the subject are generated, and possible scenarios for research work in the field emerge.

  15. Using exponentially weighted moving average algorithm to defend against DDoS attacks

    CSIR Research Space (South Africa)

    Machaka, P

    2016-11-01

    Full Text Available This paper seeks to investigate the performance of the Exponentially Weighted Moving Average (EWMA) for mining big data and detection of DDoS attacks in Internet of Things (IoT) infrastructure. The paper will investigate the tradeoff between...
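
    As a concrete illustration of the technique named above (a generic EWMA detector, not the paper's specific implementation or parameter choices), the sketch below flags intervals whose request counts deviate too far from an exponentially weighted moving average:

        # Minimal EWMA anomaly detector for per-interval request counts.
        # alpha, L and the traffic series are illustrative choices, not from the paper.
        def ewma_detect(counts, alpha=0.2, L=3.0):
            """Return indices of intervals flagged as anomalous."""
            s = counts[0]          # EWMA of the traffic level
            var = 0.0              # EWMA of squared residuals (rough variance proxy)
            flagged = []
            for i, x in enumerate(counts[1:], start=1):
                resid = x - s
                if var > 0 and abs(resid) > L * var**0.5:
                    flagged.append(i)                    # deviation beyond L sigmas
                var = alpha * resid**2 + (1 - alpha) * var
                s = alpha * x + (1 - alpha) * s          # update the moving average
            return flagged

        if __name__ == "__main__":
            normal = [100, 104, 98, 101, 99, 103, 97, 102]
            attack = [100, 104, 98, 101, 350, 420, 390, 102]   # synthetic burst
            print(ewma_detect(normal))   # []
            print(ewma_detect(attack))   # flags the onset of the burst (index 4)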

  16. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  17. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  18. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  19. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  20. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  1. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  2. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete, with equally spaced levels defining a quantum of volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level, which may be detectable in astronomical and cosmological observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  3. "Our federalism" moves indoors.

    Science.gov (United States)

    Ruger, Theodore W

    2013-04-01

    A great deal of the US Supreme Court's federalism jurisprudence over the past two decades has focused on the outer limits of federal power, suggesting a mutually exclusive division of jurisdiction between the states and the federal government, where subjects are regulated by one sovereign or the other but not both. This is not an accurate picture of American governance as it has operated over the past half century - most important areas of American life are regulated concurrently by both the federal government and the states. The Supreme Court's June 2012 decision clearing the way for the Patient Protection and Affordable Care Act (PPACA) to move forward thus should not be regarded as an affront to state sovereignty but as a realistic embrace of state power in its active, modern form. The PPACA is infused with multiple major roles for the states, and as the statute goes into operation over the next few years, states retain, and are already exercising, substantial policy discretion.

  4. Moving Spatial Keyword Queries

    DEFF Research Database (Denmark)

    Wu, Dingming; Yiu, Man Lung; Jensen, Christian S.

    2013-01-01

    State-of-the-art solutions for moving queries employ safe zones that guarantee the validity of reported results as long as the user remains within the safe zone associated with a result. However, existing safe-zone methods focus solely on spatial locations and ignore text relevancy. We propose two algorithms for computing safe zones that guarantee correct results at any time and that aim to optimize the server-side computation as well as the communication between the server and the client. We exploit tight and conservative approximations of safe zones and aggressive computational space pruning. We present techniques that aim to compute the next safe zone efficiently, and we present two types of conservative safe zones that aim to reduce the communication cost. Empirical studies with real data suggest that the proposals are efficient. To understand the effectiveness of the proposed safe zones, ...
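
    The record above describes the general safe-zone idea: the client only re-contacts the server once it leaves the zone attached to its last result. Below is a minimal sketch of that client-side loop, assuming a simplified circular zone and a hypothetical query_server callable; the paper's actual safe zones also account for text relevancy and are not circles.

```python
import math

def inside_safe_zone(pos, centre, radius):
    """True while the moving user is still inside the circular safe zone."""
    return math.hypot(pos[0] - centre[0], pos[1] - centre[1]) <= radius

def track(user_positions, query_server):
    """Re-issue the spatial keyword query only when the safe zone has been left."""
    results, centre, radius = query_server(user_positions[0])
    for pos in user_positions[1:]:
        if not inside_safe_zone(pos, centre, radius):
            results, centre, radius = query_server(pos)  # zone expired: new query
    return results

# Toy server: a top result plus a fixed 1-unit safe zone around the query point
def query_server(pos):
    return ["nearest cafe"], pos, 1.0

print(track([(0, 0), (0.5, 0.2), (2.0, 0.0)], query_server))
```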

  5. What moves us?

    DEFF Research Database (Denmark)

    2015-01-01

    Catalogue for the exhibition at Museum Jorn - What moves us? Le Corbusier & Asger Jorn - 12 September to 13 December 2015. The catalogue examines Le Corbusier's shift from a rationally founded approach to architecture to a poetic, materialist approach in the post-war period. It shows his influence on the younger Asger Jorn and describes the Dane's initial admiration, which later turned into sharp criticism. The catalogue, richly illustrated with images of Le Corbusier's and Asger Jorn's art and architecture, also contains reprints of original texts as well as contributions in words and images from prominent experts. It includes a series of articles by international writers under the following headings: Le Corbusier - the artist-architect in post-war Europe; Le Corbusier and Asger Jorn - David versus Goliath; Revisiting Le Corbusier - traces in Danish architecture and urban space...

  6. Mechanics of moving materials

    CERN Document Server

    Banichuk, Nikolay; Neittaanmäki, Pekka; Saksa, Tytti; Tuovinen, Tero

    2014-01-01

    This book deals with theoretical aspects of modelling the mechanical behaviour of manufacturing, processing, transportation or other systems in which the processed or supporting material is travelling through the system. Examples of such applications include paper making, transmission cables, band saws, printing presses, manufacturing of plastic films and sheets, and extrusion of aluminium foil, textiles and other materials. The work focuses on out-of-plane dynamics and stability analysis for isotropic and orthotropic travelling elastic and viscoelastic materials, with and without fluid-structure interaction, using analytical and semi-analytical approaches. Topics such as fracturing and fatigue are also discussed in the context of moving materials. The last part of the book deals with optimization problems involving physical constraints arising from the stability and fatigue analyses, including uncertainties in the parameters. The book is intended for researchers and specialists in the field, providing...

  7. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. With regard to the big data characteristics, some directions on suitable and promising open-source distributed data-processing software platforms are given.

  8. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has recently been applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space are still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  9. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  10. Moving In, Moving Through, and Moving Out: The Transitional Experiences of Foster Youth College Students

    Science.gov (United States)

    Gamez, Sara I.

    2017-01-01

    The purpose of this qualitative study was to explore the transitional experiences of foster youth college students. The study explored how foster youth experienced moving into, moving through, and moving out of the college environment and what resources and strategies they used to thrive during their college transitions. In addition, this study…

  11. Lemaitre's Big Bang

    OpenAIRE

    Luminet, Jean-Pierre

    2015-01-01

    I give an epistemological analysis of the developments of relativistic cosmology from 1917 to 1966, based on the seminal articles by Einstein, de Sitter, Friedmann, Lemaitre, Hubble, Gamow and other historical figures of the field. It appears that most of the ingredients of the present-day standard cosmological model, including the acceleration of the expansion due to a repulsive dark energy, the interpretation of the cosmological constant as vacuum energy or the possible non-trivial topology...

  12. ATLAS starts moving in

    CERN Multimedia

    2004-01-01

    The first large active detector component was lowered into the ATLAS cavern on 1 March. It consisted of the 8 modules forming the lower part of the central barrel of the tile hadronic calorimeter. The work of assembling the barrel, which comprises 64 modules, started the following day.

  13. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  14. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term that is still unfamiliar to many. The research is motivated by the scarcity of studies on the topic, the complete absence of research in Finnish, and the potentially central role that social customer relationship management may play in companies' operations in the future. Studies on big data often concentrate on its technical side rather than on its applications...

  15. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press, 2013). The book is fascinating but problematic with regard to causality, atheism and stereotypes about hunter-gatherers.

  16. Big Data and Social Media

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    A critical analysis of the "keep everything" Big Data era, the impact on our lives of the information, at first glance "convenient for future use" that we make known about ourselves on the network. NB! The lecture will be recorded like all Academic Training lectures. Lecturer's biography: Father of the Internet, see https://internethalloffame.org/inductees/vint-cerf or https://en.wikipedia.org/wiki/Vint_Cerf The video on slide number 9 is from page https://www.gapminder.org/tools/#$state$time$value=2018&value;;&chart-type=bubbles   Keywords: Big Data, Internet, History, Applications, tools, privacy, technology, preservation, surveillance, google, Arpanet, CERN, Web  

  17. Baryon symmetric big bang cosmology

    International Nuclear Information System (INIS)

    Stecker, F.W.

    1978-01-01

    It is stated that the framework of baryon symmetric big bang (BSBB) cosmology offers our greatest potential for deducing the evolution of the Universe because its physical laws and processes involve the minimum number of arbitrary assumptions about initial conditions in the big bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the Universe and how galaxies and galaxy clusters are formed. BSBB cosmology also provides the only acceptable explanation at present for the origin of the cosmic γ-ray background radiation. (author)

  18. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of "Big Pete," which was used in the removal of "Spacers" from the N-Reactor. Prior to performing surveys on the rear end portion of "Big Pete," it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions.

  19. Small quarks make big nuggets

    International Nuclear Information System (INIS)

    Deligeorges, S.

    1985-01-01

    After a brief recap of the classification of subatomic particles, this paper deals with quark nuggets: particles with more than three quarks in a single big bag, called a "nuclearite". Neutron stars, in fact, are big sacks of quarks, gigantic nuggets. Physicists are now trying to calculate which type of strange-quark-matter nugget is stable and what influence quark nuggets had on primordial nucleosynthesis. At present, it is thought that if these "nuggets" exist, and in a large proportion, they may be candidates for the missing mass. [fr]

  20. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by connecting, processing and analysing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions towards solutions are also presented.

  1. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  2. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  3. Surface urban heat island across 419 global big cities.

    Science.gov (United States)

    Peng, Shushi; Piao, Shilong; Ciais, Philippe; Friedlingstein, Pierre; Ottle, Catherine; Bréon, François-Marie; Nan, Huijuan; Zhou, Liming; Myneni, Ranga B

    2012-01-17

    Urban heat island is among the most evident aspects of human impacts on the earth system. Here we assess the diurnal and seasonal variation of surface urban heat island intensity (SUHII), defined as the surface temperature difference between urban area and suburban area measured from MODIS data. Differences in SUHII are analyzed across 419 global big cities, and we assess several potential biophysical and socio-economic driving factors. Across the big cities, we show that the average annual daytime SUHII (1.5 ± 1.2 °C) is higher than the annual nighttime SUHII (1.1 ± 0.5 °C) (P < 0.001). However, no correlation is found between daytime and nighttime SUHII across big cities (P = 0.84), suggesting different driving mechanisms between day and night. The distribution of nighttime SUHII correlates positively with the difference in albedo and nighttime light between urban area and suburban area, while the distribution of daytime SUHII correlates negatively across cities with the difference of vegetation cover and activity between urban and suburban areas. Our results emphasize the key role of vegetation feedbacks in attenuating SUHII of big cities during the day, in particular during the growing season, further highlighting that increasing urban vegetation cover could be one effective way to mitigate the urban heat island effect.
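
    For illustration, the SUHII definition quoted above (mean urban minus mean suburban land-surface temperature) can be computed directly from a temperature grid and two masks. The array names, masks and toy data below are assumptions for the example, not the study's code.

```python
import numpy as np

def suhii(lst, urban_mask, suburban_mask):
    """Surface urban heat island intensity, in the same units as `lst` (e.g. °C)."""
    return float(np.nanmean(lst[urban_mask]) - np.nanmean(lst[suburban_mask]))

# Toy daytime and nighttime land-surface temperature grids (°C)
rng = np.random.default_rng(0)
lst_day = rng.normal(30.0, 1.0, (100, 100))
lst_night = rng.normal(20.0, 0.5, (100, 100))
urban = np.zeros((100, 100), dtype=bool)
urban[40:60, 40:60] = True          # a square "city core" for the example
suburban = ~urban
lst_day[urban] += 1.5               # mimic a stronger daytime island
lst_night[urban] += 1.1
print(round(suhii(lst_day, urban, suburban), 2),
      round(suhii(lst_night, urban, suburban), 2))
```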

  4. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  5. Role of moving planes and moving spheres following Dupin cyclides

    KAUST Repository

    Jia, Xiaohong

    2014-03-01

    We provide explicit representations of three moving planes that form a μ-basis for a standard Dupin cyclide. We also show how to compute μ-bases for Dupin cyclides in general position and orientation from their implicit equations. In addition, we describe the role of moving planes and moving spheres in bridging between the implicit and rational parametric representations of these cyclides. © 2014 Elsevier B.V.

  7. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data ... Concept drifting: concept drifting means that the classifier ... transactions generated by a prefix-tree structure. EstDec ...
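
    The fragment above mentions concept drift, where the data distribution of a stream changes and a previously trained classifier degrades. As a generic illustration (not taken from the surveyed paper), a minimal windowed drift check might compare accuracy on a reference window with accuracy on the most recent window:

```python
import random
from collections import deque

def drift_monitor(correct_flags, window=200, threshold=0.10):
    """Yield stream positions where accuracy on the latest window drops more
    than `threshold` below the accuracy on a reference window."""
    reference, recent = deque(maxlen=window), deque(maxlen=window)
    for i, correct in enumerate(correct_flags):
        if len(reference) < window:
            reference.append(bool(correct))      # fill the reference window first
        else:
            recent.append(bool(correct))
            if len(recent) == window:
                ref_acc = sum(reference) / window
                rec_acc = sum(recent) / window
                if ref_acc - rec_acc > threshold:
                    yield i                      # drift signalled at this position
                    reference = deque(recent, maxlen=window)
                    recent = deque(maxlen=window)

# Toy stream: accuracy drops from ~95% to ~60% halfway through
random.seed(0)
stream = [random.random() < 0.95 for _ in range(2000)] + \
         [random.random() < 0.60 for _ in range(2000)]
print(list(drift_monitor(stream)))
```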

  8. Move and eat better

    CERN Document Server

    2012-01-01

    CERN has many traditions, but in a week that’s seen the launch of the Medical Service’s  ‘Move & eat better’ campaign, it’s refreshing to note that among the oldest is a sporting one.  The CERN relay race dates back to 15 October 1971 when 21 pioneering teams set off to pound the pavements of CERN. Back then, the Focus users group came in first with a time of 12 minutes and 42 seconds. Today’s route is slightly different, and the number of teams has risen to over 100, with a new category of Nordic Walking introduced, as part of the campaign, for the first time.   The relay has provided some memorable events, and perhaps one of the longest-standing records in the history of sport, with the UA1 strollers’ 10 minutes and 13 seconds unbeaten for thirty years. In the women’s category, the UN Gazelles set the fastest time of 13 minutes and 16 seconds in 1996, while in the veterans category, you wi...

  9. Seven remarkable days

    CERN Document Server

    This has been a truly remarkable seven days for CERN. Things have moved so fast that it has sometimes been hard to separate fact from fiction – all the more so since facts have often seemed too good to be true. It’s been a week of many firsts. Monday was the first time we’ve had two captured beams in the LHC. It’s the first time the LHC has functioned as a particle accelerator, boosting particles to the highest beam energy so far achieved at CERN. And it’s been a week in which we’ve seen the highest energy proton-proton collisions ever produced at CERN: our last hadron collider, the SPS was a proton-antiproton collider, a technically simpler machine than the LHC. This week’s successes are all the more remarkable precisely because of the complexity of the LHC. Unlike the SPS collider, it is two accelerators not one, making the job of commissioning nearly twice as difficult. I’d like to express my heartfelt thanks and congra...

  10. The Telecom Lab is moving

    CERN Multimedia

    IT Department

    2009-01-01

    As of 2nd March 2009, the Telecom Lab will move to Building 58 R-017. The Telecom Lab is the central point for all support questions regarding CERN mobile phone services (provision of SIM cards, requests for modifications of subscriptions, diagnostics for mobile phone problems, etc.). The opening hours as well as the contact details for the Telecom Lab remain unchanged: New location: Building 58 R-017 Opening hours: Every week day, from 11 a.m. to 12 noon Phone number: 72480 Email address: labo.telecom@cern.ch This change has no impact on support requests for mobile services. Users can still submit their requests concerning mobile phone subscriptions using the usual EDH form (https://edh.cern.ch/Document/GSM). The automatic message sent to inform users of their SIM card availability will be updated to indicate the new Telecom Lab location. You can find all information related to CERN mobile phone services at the following link: http://cern.ch/gsm CS Section - IT/CS group

  11. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  12. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    This thesis aims to explore the concept of big data and to create an understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  13. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  14. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which presents challenges as to how the city’s functions are seen and

  15. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  16. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first look at the main basic concepts of equal treatment and discrimination (Section 6.2). We then turn to the Dutch and European legal framework on non-discrimination (Sections 6.3-6.5) and how those rules should be applied to big

  17. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  18. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presents the basic concepts of Big Data and the new field it has given rise to, Data Science. Within Data Science, the notion of reducing the dimensionality of data is discussed and illustrated with examples.

  19. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  20. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller (Lincoln ...). ... process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and ... categories. These include edge sampling methods, where edges are selected by a predetermined criterion, and snowball sampling methods, where algorithms start ...
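
    Of the sampling categories named in the fragment above, snowball sampling is the easiest to sketch: start from seed vertices and repeatedly add their neighbours up to a fixed depth. The following is a generic illustration on an adjacency-list graph, not the implementation described in the report.

```python
from collections import deque

def snowball_sample(adj, seeds, k=2, max_nodes=1000):
    """Breadth-first 'snowball': start from seeds and add neighbours up to k hops."""
    sampled = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier and len(sampled) < max_nodes:
        node, depth = frontier.popleft()
        if depth == k:
            continue                      # do not expand beyond k hops
        for nbr in adj.get(node, ()):
            if nbr not in sampled:
                sampled.add(nbr)
                frontier.append((nbr, depth + 1))
    return sampled

# Toy adjacency-list graph; vertices within 2 hops of seed 1 are {1, 2, 3, 4}
graph = {1: [2, 3], 2: [1, 4], 3: [1], 4: [2, 5], 5: [4]}
print(snowball_sample(graph, seeds=[1], k=2))
```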

  1. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  2. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the Task Force, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  3. Big Math for Little Kids

    Science.gov (United States)

    Greenes, Carole; Ginsburg, Herbert P.; Balfanz, Robert

    2004-01-01

    "Big Math for Little Kids," a comprehensive program for 4- and 5-year-olds, develops and expands on the mathematics that children know and are capable of doing. The program uses activities and stories to develop ideas about number, shape, pattern, logical reasoning, measurement, operations on numbers, and space. The activities introduce the…

  4. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large volumes of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. Companies that turn to Big Data therefore have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on current software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  5. From Big Bang to Eternity?

    Indian Academy of Sciences (India)

    at different distances (that is, at different epochs in the past) to come to this ... that the expansion started billions of years ago from an explosive Big Bang. Recent research sheds new light on the key cosmological question about the distant ...

  6. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  7. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  8. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use it. Big data is the subject of many publications: some pay attention to its many possibilities, while others warn us about its consequences. This special

  9. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  10. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (according to Google Trends). Policy makers are also actively engaging with Big Data. Neelie Kroes, Vice-President of the European Commission, speaks of the 'Big Data

  11. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  12. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
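
    As a quick arithmetic illustration of the quoted resolution R = λ/Δλ (not part of the proposal itself), the width of a resolution element at a few wavelengths in the instrument's range, assuming a representative R = 4000 within the quoted 3000-4800, is:

```python
def resolution_element(wavelength_nm, R):
    """Width of a resolution element, delta_lambda = lambda / R, in nm."""
    return wavelength_nm / R

for lam in (340.0, 700.0, 1060.0):
    print(f"lambda = {lam:6.1f} nm -> delta_lambda ~ {resolution_element(lam, 4000):.2f} nm at R = 4000")
```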

  13. Distribution and movement of Big Spring spinedace (Lepidomeda mollispinis pratensis) in Condor Canyon, Meadow Valley Wash, Nevada

    Science.gov (United States)

    Jezorek, Ian G.; Connolly, Patrick J.

    2013-01-01

    Big Spring spinedace (Lepidomeda mollispinis pratensis) is a cyprinid whose entire population occurs within a section of Meadow Valley Wash, Nevada. Other spinedace species have suffered population and range declines (one species is extinct). Managers, concerned about the vulnerability of Big Spring spinedace, have considered habitat restoration actions or translocation, but they have lacked data on distribution or habitat use. Our study occurred in an 8.2-km section of Meadow Valley Wash, including about 7.2 km in Condor Canyon and 0.8 km upstream of the canyon. Big Spring spinedace were present upstream of the currently listed critical habitat, including in the tributary Kill Wash. We found no Big Spring spinedace in the lower 3.3 km of Condor Canyon. We tagged Big Spring spinedace ≥70 mm fork length (range 70–103 mm) with passive integrated transponder tags during October 2008 (n = 100) and March 2009 (n = 103) to document movement. At least 47 of these individuals moved from their release location (up to 2 km). Thirty-nine individuals moved to Kill Wash or the confluence area with Meadow Valley Wash. Ninety-three percent of movement occurred in spring 2009. Fish moved both upstream and downstream. We found no movement downstream over a small waterfall at river km 7.9 and recorded only one fish that moved downstream over Delmue Falls (a 12-m drop) at river km 6.1. At the time of tagging, there was no significant difference in fork length or condition between Big Spring spinedace that were later detected moving and those not detected moving. Kill Wash and its confluence area appeared important to Big Spring spinedace; connectivity with these areas may be key to species persistence. These areas may provide a habitat template for restoration or translocation. The lower 3.3 km of

  14. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  15. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  16. Big Pharma: a former insider's view.

    Science.gov (United States)

    Badcott, David

    2013-05-01

    There is no lack of criticisms frequently levelled against the international pharmaceutical industry (Big Pharma): excessive profits, dubious or even dishonest practices, exploiting the sick and selective use of research data. Neither is there a shortage of examples used to support such opinions. A recent book by Brody (Hooked: Ethics, the Medical Profession and the Pharmaceutical Industry, 2008) provides a précis of the main areas of criticism, adopting a twofold strategy: (1) An assumption that the special nature and human need for pharmaceutical medicines requires that such products should not be treated like other commodities and (2) A multilevel descriptive approach that facilitates an ethical analysis of relationships and practices. At the same time, Brody is fully aware of the nature of the fundamental dilemma: the apparent addiction to (and denial of) the widespread availability of gifts and financial support for conferences etc., but recognises that 'Remove the industry and its products, and a considerable portion of scientific medicine's power to help the patient vanishes' (Brody 2008, p. 5). The paper explores some of the relevant issues, and argues that despite the identified shortcomings and a need for rigorous and perhaps enhanced regulation, and realistic price control, the commercially competitive pharmaceutical industry remains the best option for developing safer and more effective medicinal treatments. At the same time, adoption of a broader ethical basis for the industry's activities, such as a triple bottom line policy, would register an important move in the right direction and go some way toward answering critics.

  17. Radiation by moving charges

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2017-04-01

    It is generally accepted that in order to describe the dynamics of relativistic particles in the laboratory (lab) frame it is sufficient to take into account the relativistic dependence of the particle momenta on the velocity. This solution of the dynamics problem in the lab frame makes no reference to Lorentz transformations. For this reason they are not discussed in particle tracking calculations in accelerator and plasma physics. It is generally believed that the electrodynamics problem can be treated within the same "single inertial frame" description without reference to Lorentz transformations. In particular, in order to evaluate radiation fields arising from charged particles in motion we need to know their velocities and positions as a function of the lab frame time t. The relativistic motion of a particle in the lab frame is described by Newton's second law "corrected" for the relativistic dependence of momentum on velocity. It is assumed in all standard derivations that one can identify the trajectories in the source part of the usual Maxwell's equations with the trajectories x(t) measured (or calculated using the corrected Newton's second law) in the lab frame. This way of coupling fields and particles has been regarded for more than a century as the relativistically correct procedure. We argue that this procedure needs to be changed, and we demonstrate the following, completely counterintuitive statement: the results of the conventional theory of radiation by relativistically moving charges are not consistent with the principle of relativity. In order to find the trajectory of a particle in the lab frame consistent with the usual Maxwell's equations, one needs to solve the dynamic equation in manifestly covariant form by using the coordinate-independent proper time τ to parameterize the particle world-line in space-time. We show that there is a difference between the "true" particle trajectory x(t) calculated or measured in

  18. SDN Low Latency for Medical Big Data Using Wavelets

    Directory of Open Access Journals (Sweden)

    Fadia Shah

    2017-06-01

    Full Text Available The new era is the age of 5G. Networks have moved from simple internet connections towards advanced LTE connections and transmission, and information and communication technology has reshaped telecommunications. Among the many types of big data, medical big data is one of the most sensitive. Wavelets are a technical tool for reducing the size of such data, making it available to users for longer and enabling low-latency, high-speed transmission over the network. The key concern is that medical big data must remain accurate and reliable enough that the recommended treatment is the appropriate one. This paper proposes a scheme that supports data availability without losing crucial information: medical data compression via wavelets, combined with a supportive SDN architecture that makes the data available over the wireless network. Such a scheme favours the efficient use of technology for the benefit of human beings in support of medical treatment.
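
    As a rough illustration of the wavelet-compression step described above, the sketch below thresholds small wavelet coefficients of a 1-D signal and reconstructs it. The use of the PyWavelets library, the db4 wavelet and the 10% retention rate are assumptions made for the example; the paper does not prescribe a specific implementation.

```python
import numpy as np
import pywt

def compress(signal, wavelet="db4", level=4, keep=0.10):
    """Keep only (approximately) the largest `keep` fraction of wavelet coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat = np.concatenate([np.abs(c) for c in coeffs])
    thresh = np.quantile(flat, 1.0 - keep)                  # magnitude cut-off
    kept = [pywt.threshold(c, thresh, mode="hard") for c in coeffs]
    return kept, wavelet

def decompress(coeffs, wavelet):
    return pywt.waverec(coeffs, wavelet)

# Toy "medical" signal: a smooth waveform plus noise
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)
coeffs, wav = compress(signal)
reconstructed = decompress(coeffs, wav)[: t.size]
print("max reconstruction error:", float(np.max(np.abs(reconstructed - signal))))
```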

  19. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  20. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  1. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  2. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  3. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  4. Nowcasting using news topics Big Data versus big bank

    OpenAIRE

    Thorsrud, Leif Anders

    2016-01-01

    The agents in the economy use a plethora of high-frequency information, including news media, to guide their actions and thereby shape aggregate economic fluctuations. Traditional nowcasting approaches have made relatively little use of such information. In this paper, I show how unstructured textual information in a business newspaper can be decomposed into daily news topics and used to nowcast quarterly GDP growth. Compared with a big bank of experts, here represented by official c...
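
    As a hedged sketch of the general pipeline described above (newspaper text decomposed into daily topic intensities that could then feed a nowcasting regression), using scikit-learn's LDA as a stand-in for the paper's own topic model and a purely illustrative toy corpus:

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

articles = [  # toy (date, headline) pairs, not real data
    ("2016-01-04", "oil prices fall as markets open lower"),
    ("2016-01-04", "central bank signals steady interest rates"),
    ("2016-01-05", "exports rise on weaker currency"),
]
dates = [d for d, _ in articles]
X = CountVectorizer(stop_words="english").fit_transform(t for _, t in articles)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
doc_topics = lda.transform(X)                 # per-article topic weights

# Aggregate to daily topic intensities by averaging over each day's articles
daily = {}
for d, w in zip(dates, doc_topics):
    daily.setdefault(d, []).append(w)
daily_topics = {d: np.mean(ws, axis=0) for d, ws in daily.items()}
print(daily_topics)                           # daily series a nowcast model could use
```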

  5. When Every Day Is Professional Development Day

    Science.gov (United States)

    Tienken, Christopher H.; Stonaker, Lew

    2007-01-01

    In the Monroe Township (New Jersey) Public Schools, teachers' learning occurs daily, not just on one day in October and February. Central office and school-level administrators foster job-embedded teacher growth. Every day is a professional development day in the district, but that has not always been so. How did the district become a system with…

  6. Social Service has moved

    CERN Multimedia

    2007-01-01

    The offices of the Social Service are now on the 1st floor of Building 33 (Reception), exactly one floor above the old location. We remind you that the team, consisting of two social workers, a psychologist (external consultant, 1 day/week) and an administrative assistant, is at the disposal of all members of the personnel, whatever their status, as well as to their family members. Advice and support in the following areas are offered : · information on integration in the local area; · assistance in dealing with the relevant authorities/services; · consultations with a view to resolving problems of a personal, family or professional nature, such as problems of dependency (alcohol, drugs) relationship or behavioral problems (stress, depression, eating disorders), etc.; · support in facing new situations (maternity, divorce, bereavement, job change, separation from family/familiar surroundings); · assistance with decision making relating to family, personal or profes...

  7. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Objectives: To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID: 25123716

  8. Questioning Big Data: Crowdsourcing crisis data towards an inclusive humanitarian response

    Directory of Open Access Journals (Sweden)

    Femke Mulder

    2016-08-01

    Full Text Available The aim of this paper is to critically explore whether crowdsourced Big Data enables an inclusive humanitarian response at times of crisis. We argue that all data, including Big Data, are socially constructed artefacts that reflect the contexts and processes of their creation. To support our argument, we qualitatively analysed the process of ‘Big Data making’ that occurred by way of crowdsourcing through open data platforms, in the context of two specific humanitarian crises, namely the 2010 earthquake in Haiti and the 2015 earthquake in Nepal. We show that the process of creating Big Data from local and global sources of knowledge entails the transformation of information as it moves from one distinct group of contributors to the next. The implication of this transformation is that locally based, affected people and often the original ‘crowd’ are excluded from the information flow, and from the interpretation process of crowdsourced crisis knowledge, as used by formal responding organizations, and are marginalized in their ability to benefit from Big Data in support of their own means. Our paper contributes a critical perspective to the debate on participatory Big Data, by explaining the process of inclusion and exclusion during data making, towards more responsive humanitarian relief.

  9. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  10. STS-95 Day 06 Highlights

    Science.gov (United States)

    1998-01-01

    On this sixth day of the STS-95 mission, the flight crew, Cmdr. Curtis L. Brown, Pilot Steven W. Lindsey, Mission Specialists Scott E. Parazynski, Stephen K. Robinson, and Pedro Duque, and Payload Specialists Chiaki Mukai and John H. Glenn, test a device called the Video Guidance Sensor, a component of an automated docking system being prepared for use on the International Space Station. As Discovery closes in on Spartan, the astronauts will use a laser system that provides precise measurements of how far away the shuttle is from a target and how fast it is moving toward or away from the target.

  11. Big Bayou Creek and Little Bayou Creek Watershed Monitoring Program

    Energy Technology Data Exchange (ETDEWEB)

    Kszos, L.A.; Peterson, M.J.; Ryon; Smith, J.G.

    1999-03-01

    Biological monitoring of Little Bayou and Big Bayou creeks, which border the Paducah Site, has been conducted since 1987. Biological monitoring was conducted by the University of Kentucky from 1987 to 1991 and by staff of the Environmental Sciences Division (ESD) at Oak Ridge National Laboratory (ORNL) from 1991 through March 1999. In March 1998, renewed Kentucky Pollutant Discharge Elimination System (KPDES) permits were issued to the US Department of Energy (DOE) and US Enrichment Corporation. The renewed DOE permit requires that a watershed monitoring program be developed for the Paducah Site within 90 days of the effective date of the renewed permit. This plan outlines the sampling and analysis that will be conducted for the watershed monitoring program. The objectives of the watershed monitoring are to (1) determine whether discharges from the Paducah Site and the Solid Waste Management Units (SWMUs) associated with the Paducah Site are adversely affecting instream fauna, (2) assess the ecological health of Little Bayou and Big Bayou creeks, (3) assess the degree to which abatement actions ecologically benefit Big Bayou Creek and Little Bayou Creek, (4) provide guidance for remediation, (5) provide an evaluation of changes in potential human health concerns, and (6) provide data which could be used to assess the impact of inadvertent spills or fish kills. The cleanup is expected to result in these watersheds [Big Bayou and Little Bayou creeks] achieving compliance with the applicable water quality criteria.

  12. Radiation by moving charges

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany)]; Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)]

    2017-04-15

    It is generally accepted that, in order to describe the dynamics of relativistic particles in the laboratory (lab) frame, it is sufficient to take into account the relativistic dependence of the particle momenta on the velocity. This solution of the dynamics problem in the lab frame makes no reference to Lorentz transformations. For this reason they are not discussed in particle tracking calculations in accelerator and plasma physics. It is generally believed that the electrodynamics problem can be treated within the same "single inertial frame" description without reference to Lorentz transformations. In particular, in order to evaluate radiation fields arising from charged particles in motion we need to know their velocities and positions as a function of the lab frame time t. The relativistic motion of a particle in the lab frame is described by Newton's second law "corrected" for the relativistic dependence of momentum on velocity. It is assumed in all standard derivations that one can identify the trajectories in the source part of the usual Maxwell's equations with the trajectories x(t) measured (or calculated by using the corrected Newton's second law) in the lab frame. This way of coupling fields and particles has been considered for more than a century as the relativistically correct procedure. We argue that this procedure needs to be changed, and we demonstrate the following, completely counterintuitive statement: the results of the conventional theory of radiation by relativistically moving charges are not consistent with the principle of relativity. In order to find the trajectory of a particle in the lab frame consistent with the usual Maxwell's equations, one needs to solve the dynamic equation in manifestly covariant form by using the coordinate-independent proper time τ to parameterize the particle world-line in space-time. We show that there is a difference between ...

  13. Moving On (Editorial

    Directory of Open Access Journals (Sweden)

    Lindsay Glynn

    2008-12-01

    Full Text Available Well, here we are with another full volume published. I have watched this journal grow from an idea to a sustainable, reputable journal with readers and contributors from all over the world. It has grown from a small editorial group to a collection of peer reviewers, Evidence Summary writers, copyeditors, etc., that is eighty-four strong (and counting). I have wavered between wondering whether promoting evidence based librarianship is akin to flogging a dead horse and feeling secure in knowing that we are making a difference. Being involved with this journal has made me look at what I do in a different light and I now approach decisions and change with what I refer to as “structured flexibility”. Following the EBL framework is the structured part, and when it works, it works well. But we all know that there is not always an answer in the literature nor is there a guarantee that implemented evidence based change will work similarly in different environments. That’s where the flexibility comes in. Three years as Editor-in-Chief has been a challenging, enjoyable, time-consuming, and fascinating learning experience. It has provided me with numerous opportunities in terms of speaking engagements, workshop offerings, and valuable discussion and discourse. It has also provided me with research and project ideas that I have had to place on the back burner until a time when I have enough hours in the day. Recognizing that adding additional hours to the day is, well, impossible, I have decided to step down from my editorial role to pursue other activities. This was a bittersweet decision, but a necessary one. I am pleased to announce that Denise Koufogiannakis will be taking on the role of Editor-in-Chief for the next term. Denise, as many of you know, has also been involved with this journal since its inception and, after a brief period of reduced involvement, has eagerly stepped up to the plate. This journal would not be what it is today without

  14. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  15. Big Data hvor N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

    Research on the use of 'big data' in health has only just begun and may, in time, become a great help in organizing a more personal and holistic health effort for patients with multiple chronic conditions. Personal health technology, which is briefly presented in this chapter, holds a ... great potential for carrying out 'big data' analyses for the individual person, that is, where N=1. There are major technological challenges in developing technologies and methods to collect and handle personal data that can be shared across systems in a standardized, responsible, robust, secure and not...

  16. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  17. Did the Big Bang begin?

    International Nuclear Information System (INIS)

    Levy-Leblond, J.

    1990-01-01

    It is argued that the age of the universe may well be numerically finite (20 billion years or so) and conceptually infinite. A new and natural time scale is defined on a physical basis using group-theoretical arguments. An additive notion of time is obtained according to which the age of the universe is indeed infinite. In other words, never did the Big Bang begin. This new time scale is not supposed to replace the ordinary cosmic time scale, but to supplement it (in the same way as rapidity has taken a place by the side of velocity in Einsteinian relativity). The question is discussed within the framework of conventional (big-bang) and classical (nonquantum) cosmology, but could easily be extended to more elaborate views, as the purpose is not so much to modify present theories as to reach a deeper understanding of their meaning
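
    A minimal worked illustration of the abstract's point, under my assumption (not necessarily the author's group-theoretical construction) that the new additive time scale is a logarithmic reparameterization of cosmic time t:

        \tau(t) = t_0 \ln\!\left(\frac{t}{t_0}\right), \qquad \lim_{t \to 0^{+}} \tau(t) = -\infty .

    Measured in τ, the finite cosmic age of roughly 20 billion years corresponds to an infinitely long past, which is one concrete way a time scale can make the statement "never did the Big Bang begin" precise.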

  18. Big Data in Drug Discovery.

    Science.gov (United States)

    Brown, Nathan; Cambruzzi, Jean; Cox, Peter J; Davies, Mark; Dunbar, James; Plumbley, Dean; Sellwood, Matthew A; Sim, Aaron; Williams-Jones, Bryn I; Zwierzyna, Magdalena; Sheppard, David W

    2018-01-01

    Interpretation of Big Data in the drug discovery community should enhance project timelines and reduce clinical attrition through improved early decision making. The issues we encounter start with the sheer volume of data and how we first ingest it before building an infrastructure to house it to make use of the data in an efficient and productive way. There are many problems associated with the data itself including general reproducibility, but often, it is the context surrounding an experiment that is critical to success. Help, in the form of artificial intelligence (AI), is required to understand and translate the context. On the back of natural language processing pipelines, AI is also used to prospectively generate new hypotheses by linking data together. We explain Big Data from the context of biology, chemistry and clinical trials, showcasing some of the impressive public domain sources and initiatives now available for interrogation. © 2018 Elsevier B.V. All rights reserved.

  19. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  20. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the "energy dominance" of the energy density of vacuum fluctuations in curved space-time and the presence of a singularity is discussed. It is pointed out that a de Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedmann solution may describe the history of the Universe before the Big Bang. (P.L.)

  1. Big bang is not needed

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.D.

    1976-02-01

    Recent computer simulations indicate that a system of n gravitating masses breaks up, even when the total energy is negative. As a result, almost any initial phase-space distribution results in a universe that eventually expands under the Hubble law. Hence Hubble expansion implies little regarding an initial cosmic state. In particular, it does not imply the singularly dense superposed state used in the big bang model.

  2. Small decisions with big impact on data analytics

    Directory of Open Access Journals (Sweden)

    Jana Diesner

    2015-11-01

    Full Text Available Big social data have enabled new opportunities for evaluating the applicability of social science theories that were formulated decades ago and were often based on small- to medium-sized samples. Big Data coupled with powerful computing has the potential to replace the statistical practice of sampling and estimating effects by measuring phenomena based on full populations. Preparing these data for analysis and conducting analytics involves a plethora of decisions, some of which are already embedded in previously collected data and built tools. These decisions refer to the recording, indexing and representation of data and the settings for analysis methods. While these choices can have tremendous impact on research outcomes, they are not often obvious, not considered or not being made explicit. Consequently, our awareness and understanding of the impact of these decisions on analysis results and derived implications are highly underdeveloped. This might be attributable to occasional high levels of over-confidence in computational solutions as well as the possible yet questionable assumption that Big Data can wash out minor data quality issues, among other reasons. This article provides examples for how to address this issue. It argues that checking, ensuring and validating the quality of big social data and related auxiliary material is a key ingredient for empowering users to gain reliable insights from their work. Scrutinizing data for accuracy issues, systematically fixing them and diligently documenting these processes can have another positive side effect: Closely interacting with the data, thereby forcing ourselves to understand their idiosyncrasies and patterns, can help us to move from being able to precisely model and formally describe effects in society to also understand and explain them.
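
    The article's point about making small preparation decisions explicit can be made concrete with a short, hedged sketch: a cleaning step for social-media data in which every choice (deduplication, missing-value handling, normalization) is written to a log. The column names and input file are hypothetical and are not taken from the article.

        # Hedged sketch: document each small data-preparation decision explicitly.
        # Assumes a DataFrame of posts with hypothetical 'user_id' and 'text' columns.
        import pandas as pd

        def clean_social_data(df, log):
            n0 = len(df)
            df = df.drop_duplicates(subset=["user_id", "text"])
            log.append(f"dropped {n0 - len(df)} exact duplicate posts")
            n1 = len(df)
            df = df.dropna(subset=["text"])
            log.append(f"dropped {n1 - len(df)} posts with missing text")
            df = df.assign(text=df["text"].str.lower().str.strip())  # normalization is itself a choice
            log.append("lower-cased and stripped whitespace from post text")
            return df

        decisions = []
        # posts = pd.read_csv("posts.csv")              # hypothetical input file
        # cleaned = clean_social_data(posts, decisions)
        # print("\n".join(decisions))                   # the documented trail of small decisions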

  3. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  4. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  5. EnviroAtlas - Big Game Hunting Recreation Demand by 12-Digit HUC in the Conterminous United States

    Data.gov (United States)

    U.S. Environmental Protection Agency — This EnviroAtlas dataset includes the total number of recreational days per year demanded by people ages 18 and over for big game hunting by location in the...

  6. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-11-09

    We present a general framework and method for detection of an object in a video based on apparent motion. The object moves relative to background motion at some unknown time in the video, and the goal is to detect and segment the object as soon as it moves, in an online manner. Due to unreliability of motion between frames, more than two frames are needed to reliably detect the object. Our method is designed to detect the object(s) with minimum delay, i.e., the fewest frames after the object moves, while constraining the false alarms. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.
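
    The record above frames detection as a trade-off between delay (frames after motion onset) and false alarms. The sketch below is a generic quickest-detection rule over a per-frame motion statistic, included only to illustrate that trade-off; it is not the authors' method, which detects and segments objects from estimated motion across multiple frames.

        # Hedged sketch: CUSUM-style onset detection on a frame-difference statistic.
        # A higher threshold gives fewer false alarms but a longer detection delay.
        import numpy as np

        def frame_motion_score(prev, curr):
            # Mean absolute intensity difference between consecutive grayscale frames.
            return float(np.mean(np.abs(curr.astype(np.float32) - prev.astype(np.float32))))

        def detect_motion_onset(frames, drift=0.5, threshold=20.0):
            # Return the first frame index at which accumulated motion evidence
            # exceeds the threshold, or None if no onset is declared.
            cusum, prev = 0.0, frames[0]
            for t, curr in enumerate(frames[1:], start=1):
                cusum = max(0.0, cusum + frame_motion_score(prev, curr) - drift)
                prev = curr
                if cusum > threshold:
                    return t
            return None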

  7. Imaging of Moving Ground Vehicles

    National Research Council Canada - National Science Library

    Rihaczek, A

    1996-01-01

    ... requires that use be made of the complex image. The yaw/pitch/roll/bounce/flex motion of a moving ground vehicle demands that different motion compensations be applied to different parts of the vehicle...

  8. Nordic Seniors on the Move

    DEFF Research Database (Denmark)

    ”I believe that all people need to move about. Actually, some have difficulties in doing so. They stay in their home neighbourhoods where they’ve grown up and feel safe. I can understand that, but my wife and I, we didn’t want that. We are more open to new ideas.” This anthology is about seniors...... on the move. In seven chapters, Nordic researchers from various disciplines, by means of ethnographic methods, attempt to comprehend the phenomenon of Nordic seniors who move to leisure areas in their own or in other countries. The number of people involved in this kind of migratory movement has grown...... above gives voice to one of these seniors, stressing the necessity of moving. The anthology contributes to the international body of literature about later life migration, specifically representing experiences made by Nordic seniors. As shown here, mobility and migration in later life have implications...

  9. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-01-08

    We present a general framework and method for detection of an object in a video based on apparent motion. The object moves relative to background motion at some unknown time in the video, and the goal is to detect and segment the object as soon as it moves, in an online manner. Due to unreliability of motion between frames, more than two frames are needed to reliably detect the object. Our method is designed to detect the object(s) with minimum delay, i.e., the fewest frames after the object moves, while constraining the false alarms. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.

  10. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong; Sundaramoorthi, Ganesh

    2017-01-01

    We present a general framework and method for detection of an object in a video based on apparent motion. The object moves relative to background motion at some unknown time in the video, and the goal is to detect and segment the object as soon as it moves, in an online manner. Due to unreliability of motion between frames, more than two frames are needed to reliably detect the object. Our method is designed to detect the object(s) with minimum delay, i.e., the fewest frames after the object moves, while constraining the false alarms. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.

  11. Turning big bang into big bounce. I. Classical dynamics

    Science.gov (United States)

    Dzierżak, Piotr; Małkiewicz, Przemysław; Piechocki, Włodzimierz

    2009-11-01

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.
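
    For context, the widely used effective Friedmann equation of loop quantum cosmology shows how a loop-geometry correction turns the big bang into a bounce; it is quoted here as a hedged illustration of the mechanism, not necessarily the exact classical formulation used in the paper:

        H^{2} = \left(\frac{\dot{a}}{a}\right)^{2}
              = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_{c}}\right),

    so the expansion rate vanishes and reverses when the energy density ρ reaches the critical value ρ_c (the free energy-scale parameter mentioned in the abstract), instead of diverging into a singularity.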

  12. 76 FR 55779 - Labor Day, 2011

    Science.gov (United States)

    2011-09-09

    ... Proclamation Every day, hard-working men and women across America prove that, even in difficult times, our... we commit to supporting their efforts in moving our economy forward. The right to organize and... Americans today are given opportunities because their parents and grandparents fought for these basic rights...

  13. Autowaves in moving excitable media

    Directory of Open Access Journals (Sweden)

    V.A.Davydov

    2004-01-01

    Full Text Available Within the framework of the kinematic theory of autowaves, we suggest a method for the analytic description of stationary autowave structures appearing at the boundary between moving and fixed excitable media. The front breakdown phenomenon is predicted for such structures. Autowave refraction and, particularly, one-sided "total reflection" at the boundary are considered. The obtained analytical results are confirmed by computer simulations. Prospects of the proposed method for further studies of autowave dynamics in moving excitable media are discussed.

  14. Infinite games with uncertain moves

    Directory of Open Access Journals (Sweden)

    Nicholas Asher

    2013-03-01

    Full Text Available We study infinite two-player games where one of the players is unsure about the set of moves available to the other player. In particular, the set of moves of the other player is a strict superset of what she assumes it to be. We explore what happens to sets in various levels of the Borel hierarchy under such a situation. We show that the sets at every alternate level of the hierarchy jump to the next higher level.

  15. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  16. Analytical Cost Metrics : Days of Future Past

    Energy Technology Data Exchange (ETDEWEB)

    Prajapati, Nirmal [Colorado State Univ., Fort Collins, CO (United States); Rajopadhye, Sanjay [Colorado State Univ., Fort Collins, CO (United States); Djidjev, Hristo Nikolov [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-20

    As we move towards the exascale era, new architectures must be capable of running massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, and image/signal processing to computational science and bioinformatics. With Moore’s law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore the major challenge that we face in computing systems research is: “how to solve massive-scale computational problems in the most time/power/energy efficient manner?”
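
    One common way to make the combined time/power/energy question concrete is the energy-delay product and its generalizations; the tiny sketch below illustrates such a figure of merit under my own assumed numbers and is not the authors' cost model.

        # Hedged sketch: energy-delay product E * T^n as a combined time/energy metric.
        def energy_delay_product(runtime_s, avg_power_w, n=1):
            energy_j = avg_power_w * runtime_s          # energy in joules
            return energy_j * runtime_s ** n            # EDP (n=1) or ED^2P-style (n=2)

        # Hypothetical example: a tuned kernel that is slower but much less power-hungry
        # can still win on EDP, which is exactly the trade-off the metric is meant to expose.
        baseline = energy_delay_product(runtime_s=120.0, avg_power_w=250.0)   # 3.60e6 J*s
        tuned    = energy_delay_product(runtime_s=150.0, avg_power_w=150.0)   # 3.38e6 J*s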

  17. Installation of the first of the big wheels of the ATLAS muon spectrometer, a thin gap chamber (TGC) wheel

    CERN Multimedia

    Claudia Marcelloni

    2006-01-01

    The muon spectrometer will include four big moving wheels at each end, each measuring 25 metres in diameter. Of the eight wheels in total, six will be composed of thin gap chambers for the muon trigger system and the other two will consist of monitored drift tubes (MDTs) to measure the position of the muons

  18. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. Analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big Data presents new methods to reverse this trend and improve profitability. The benefits of Big Data and ...

  19. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints on the use of big data include the cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to the effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  20. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches

  1. Between Anzac Day and Waitangi Day

    Directory of Open Access Journals (Sweden)

    Czerwińska Anna

    2017-12-01

    Full Text Available This paper discusses the historical background and significance of the two most important national holidays in New Zealand: Waitangi Day and Anzac Day. Waitangi Day is celebrated on the 6th February and it commemorates the signing of the Treaty of Waitangi between British representatives and a number of Māori chiefs in 1840. Following the signing of the treaty New Zealand became effectively a British colony. Anzac Day is celebrated on 25th April, i.e., on the anniversary of the landing of soldiers of the Australian and New Zealand Army Corps (ANZAC on the Gallipoli peninsula in Turkey in 1915, during World War One. There are three major differences between these two holidays: the process of those days becoming national holidays, the level of contestation, and the changing messages they have carried. The present study analyzes the national discourse around Anzac Day and Waitangi Day in New Zealand, and attempts to reveal how the official New Zealand government rhetoric about national unity becomes deconstructed. The following analysis is based on a selection of online articles from the New Zealand Herald and Stuff published in Auckland and Wellington, respectively. Both cities are populated by multi-ethnic groups, with Auckland featuring the largest Māori population.

  2. BIG hydrogen: hydrogen technology in the oil and gas sector

    International Nuclear Information System (INIS)

    2006-01-01

    The BIG Hydrogen workshop was held in Calgary, Alberta, Canada on February 13, 2006. About 60 representatives of industry, academia and government attended this one-day technical meeting on hydrogen production for the oil and gas industry. The following themes were identified from the presentations and discussion: the need to find a BIG hydrogen replacement for steam methane reforming (SMR), given uncertainty regarding the cost and availability of natural gas, although the maturity of the SMR process (reliability, known capital cost) raises the question of how high H2 prices would have to rise; the need for a national strategy linking the near-term and longer-term hydrogen production requirements, which can take hydrogen from chemical feedstock to energy carrier; and the view that, in the near term, Canada should get involved in demonstrations and build expertise in large hydrogen systems, including production and carbon capture and sequestration.

  3. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as some applications of big data optimization, for interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  4. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.
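
    The propensity-score adjustment mentioned at the end of the abstract can be sketched briefly: the code below estimates propensity scores with logistic regression and forms an inverse-probability-weighted difference in outcomes. It is a hedged, generic sketch; the column names, binary treatment coding and the use of scikit-learn are my assumptions, not the review's prescription.

        # Hedged sketch: propensity scores + inverse-probability weighting (IPW).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def ipw_effect(df, covariates, treatment="treated", outcome="outcome"):
            # Weighted difference in mean outcome between treated and control patients.
            ps = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
            p = ps.predict_proba(df[covariates])[:, 1]            # P(treated | covariates)
            t = df[treatment].to_numpy() == 1
            y = df[outcome].to_numpy()
            w = np.where(t, 1.0 / p, 1.0 / (1.0 - p))             # inverse-probability weights
            return np.average(y[t], weights=w[t]) - np.average(y[~t], weights=w[~t])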

  5. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

    The big data environment creates data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  6. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association, rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.

  7. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates data conditions for improving the quality of traffic information services. The target of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  8. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  9. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop.This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. This book is also aimed at those who know Hadoop and want to build some intelligent applications over Big data with R packages. It would be helpful if readers have basic knowledge of R.
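
    The book integrates R with Hadoop; as a language-neutral illustration of the MapReduce pattern that such R-on-Hadoop packages wrap, here is a hedged Hadoop Streaming-style word count written in Python (the example language used throughout this section). A real cluster run would ship the mapper and reducer as separate scripts to Hadoop Streaming; the snippet below only simulates both phases locally on standard input.

        # Hedged sketch: the map and reduce phases behind a streaming word count.
        import sys
        from itertools import groupby

        def mapper(lines):
            for line in lines:
                for word in line.split():
                    yield word.lower(), 1                       # emit (key, value) pairs

        def reducer(pairs):
            for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
                yield word, sum(count for _, count in group)    # sum counts per key

        if __name__ == "__main__":
            # Local simulation: cat some_text_file | python wordcount.py
            for word, total in reducer(mapper(sys.stdin)):
                print(f"{word}\t{total}")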

  10. On Study of Application of Big Data and Cloud Computing Technology in Smart Campus

    Science.gov (United States)

    Tang, Zijiao

    2017-12-01

    We live in an era of networks and information, which means we produce and face a lot of data every day; however, it is not easy for databases in the traditional sense to store, process and analyze such mass data, and therefore big data was born at the right moment. Meanwhile, the development and operation of big data rest on cloud computing, which provides sufficient space and resources to process and analyze data with big data technology. Nowadays, the proposal of smart campus construction aims at improving the informatization of colleges and universities; it is therefore necessary to consider combining big data and cloud computing technology in the construction of the smart campus, so that the campus database system and campus management system are integrated rather than isolated, and to serve smart campus construction through integrating, storing, processing and analyzing mass data.

  11. Supporting diagnosis and treatment in medical care based on Big Data processing.

    Science.gov (United States)

    Lupşe, Oana-Sorina; Crişan-Vida, Mihaela; Stoicu-Tivadar, Lăcrămioara; Bernard, Elena

    2014-01-01

    With information and data in all domains growing every day, it is difficult to manage and extract useful knowledge for specific situations. This paper presents an integrated system architecture to support the activity in Ob-Gyn departments, with further developments in using new technology to manage Big Data processing - using Google BigQuery - in the medical domain. The data collected and processed with Google BigQuery come from different sources: two Obstetrics & Gynaecology departments, the TreatSuggest application - an application for suggesting treatments - and a home foetal surveillance system. Data are uploaded to Google BigQuery from Bega Hospital Timişoara, Romania. The analysed data are useful for medical staff, researchers and statisticians in the public health domain. The current work describes the technological architecture and its processing possibilities, which will in the future be validated against quality criteria so as to lead to a better decision process in diagnosis and public health.
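
    As a hedged sketch of the kind of aggregate query such an architecture might run, the snippet below uses the official google-cloud-bigquery Python client. The project, dataset, table and column names are hypothetical placeholders and do not describe the hospital system in the record above.

        # Hedged sketch: an aggregate query with the BigQuery Python client.
        from google.cloud import bigquery

        def births_per_month(project_id="example-project", table="example-project.obgyn.births"):
            client = bigquery.Client(project=project_id)        # uses default credentials
            sql = f"""
                SELECT EXTRACT(MONTH FROM delivery_date) AS month, COUNT(*) AS births
                FROM `{table}`
                GROUP BY month
                ORDER BY month
            """
            return {row.month: row.births for row in client.query(sql).result()}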

  12. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unattainable. Big data is generally characterized by three factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  13. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  14. The faces of Big Science.

    Science.gov (United States)

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  15. Big Data and central banks

    Directory of Open Access Journals (Sweden)

    David Bholat

    2015-04-01

    Full Text Available This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  16. Inhomogeneous Big Bang Nucleosynthesis Revisited

    OpenAIRE

    Lara, J. F.; Kajino, T.; Mathews, G. J.

    2006-01-01

    We reanalyze the allowed parameters for inhomogeneous big bang nucleosynthesis in light of the WMAP constraints on the baryon-to-photon ratio and a recent measurement which has set the neutron lifetime to be 878.5 +/- 0.7 +/- 0.3 seconds. For a set baryon-to-photon ratio the new lifetime reduces the mass fraction of He4 by 0.0015 but does not significantly change the abundances of other isotopes. This enlarges the region of concordance between He4 and deuterium in the parameter space of the b...

  17. Moving event and moving participant in aspectual conceptions

    Directory of Open Access Journals (Sweden)

    Izutsu Katsunobu

    2016-06-01

    Full Text Available This study advances an analysis of the event conception of aspectual forms in four East Asian languages: Ainu, Japanese, Korean, and Ryukyuan. As earlier studies point out, event conceptions can be divided into two major types: the moving-event type and the moving-participant type. All aspectual forms in Ainu and Korean, and most forms in Japanese and Ryukyuan, are based on one or the other type of event conception. Moving-participant oriented Ainu and moving-event oriented Japanese occupy two extremes, between which Korean and Ryukyuan stand. Notwithstanding the geographical relationships among the four languages, Ryukyuan is closer to Ainu than to Korean, whereas Korean is closer to Ainu than to Japanese.

  18. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  19. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  20. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions and the various questions related to big data. In this first part I will try to set out a few things concerning Big Data theory and

  1. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  2. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  3. CGH Supports World Cancer Day Every Day

    Science.gov (United States)

    We celebrate World Cancer Day every year on February 4th. This year the theme “We can. I can.” invites us to think not only about how we can work with one another to reduce the global burden of cancer, but how we as individuals can make a difference. Every day the staff at CGH work to establish and build upon programs that are aimed at improving the lives of people affected by cancer.

  4. Hepatic Lipidosis in a Research Colony of Big Brown Bats (Eptesicus fuscus)

    OpenAIRE

    Snyder, Jessica M; Treuting, Piper M; Brabb, Thea; Miller, Kimberly E; Covey, Ellen; Lencioni, Karen L

    2015-01-01

    During a nearby construction project, a sudden decrease in food intake and guano production occurred in an outdoor colony of big brown bats (Eptesicus fuscus), and one animal was found dead. Investigation revealed that the project was generating a large amount of noise and vibration, which disturbed the bats’ feeding. Consequently the bats were moved into an indoor enclosure away from the construction noises, and the colony resumed eating. Over the next 3 wk, additional animals presented with...

  5. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  6. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-05-14

    This thesis presents a general framework and method for detection of an object in a video based on apparent motion. The object moves, at some unknown time, differently than the “background” motion, which can be induced from camera motion. The goal of the proposed method is to detect and segment the object as soon as it moves, in an online manner. Since motion estimation can be unreliable between frames, more than two frames are needed to reliably detect the object. Observing more frames before declaring a detection may lead to a more accurate detection and segmentation, since more motion may be observed, leading to a stronger motion cue. However, this leads to greater delay. The proposed method is designed to detect the object(s) with minimum delay, i.e., the fewest frames after the object moves, while constraining the false alarms, defined as declarations of detection before the object moves or incorrect or inaccurate segmentation at the detection time. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than existing state-of-the-art methods.

  7. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high-temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as it pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  8. Georges et le big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2011-01-01

Georges and Annie, his best friend, are about to witness one of the most important scientific experiments of all time: exploring the first instants of the Universe, the Big Bang! Thanks to Cosmos, their super-computer, and to the Large Hadron Collider built by Éric, Annie's father, they will finally be able to answer the essential question: why do we exist? But Georges and Annie discover that a diabolical plot is being hatched. Worse still, all of scientific research is in peril! Swept into incredible adventures, Georges will travel to the far reaches of the galaxy to save his friends... A thrilling plunge into the heart of the Big Bang, drawing on the latest theories of Stephen Hawking and today's leading scientists.

  9. Astronomical Surveys and Big Data

    Directory of Open Access Journals (Sweden)

    Mickaelian Areg M.

    2016-03-01

Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-rays; ROSAT, XMM and Chandra in X-rays; GALEX in the UV; SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range; 2MASS in the NIR; WISE and AKARI IRC in the MIR; IRAS and AKARI FIS in the FIR; NVSS and FIRST in the radio range; and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  10. Big data in oncologic imaging.

    Science.gov (United States)

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2017-06-01

    Cancer is a complex disease and unfortunately understanding how the components of the cancer system work does not help understand the behavior of the system as a whole. In the words of the Greek philosopher Aristotle "the whole is greater than the sum of parts." To date, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand and a large part of the data are in the form of medical images. The opportunity is now to draw insight on the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is at the beginning but several useful applications can be envisaged including development of imaging biomarkers to predict disease outcome, assessing the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimizing patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact on the diagnostic pathway of the oncologic patient.

  11. Infectious Disease Surveillance in the Big Data Era: Towards Faster and Locally Relevant Systems

    Science.gov (United States)

    Simonsen, Lone; Gog, Julia R.; Olson, Don; Viboud, Cécile

    2016-01-01

    While big data have proven immensely useful in fields such as marketing and earth sciences, public health is still relying on more traditional surveillance systems and awaiting the fruits of a big data revolution. A new generation of big data surveillance systems is needed to achieve rapid, flexible, and local tracking of infectious diseases, especially for emerging pathogens. In this opinion piece, we reflect on the long and distinguished history of disease surveillance and discuss recent developments related to use of big data. We start with a brief review of traditional systems relying on clinical and laboratory reports. We then examine how large-volume medical claims data can, with great spatiotemporal resolution, help elucidate local disease patterns. Finally, we review efforts to develop surveillance systems based on digital and social data streams, including the recent rise and fall of Google Flu Trends. We conclude by advocating for increased use of hybrid systems combining information from traditional surveillance and big data sources, which seems the most promising option moving forward. Throughout the article, we use influenza as an exemplar of an emerging and reemerging infection which has traditionally been considered a model system for surveillance and modeling. PMID:28830112

  12. Infectious Disease Surveillance in the Big Data Era: Towards Faster and Locally Relevant Systems.

    Science.gov (United States)

    Simonsen, Lone; Gog, Julia R; Olson, Don; Viboud, Cécile

    2016-12-01

    While big data have proven immensely useful in fields such as marketing and earth sciences, public health is still relying on more traditional surveillance systems and awaiting the fruits of a big data revolution. A new generation of big data surveillance systems is needed to achieve rapid, flexible, and local tracking of infectious diseases, especially for emerging pathogens. In this opinion piece, we reflect on the long and distinguished history of disease surveillance and discuss recent developments related to use of big data. We start with a brief review of traditional systems relying on clinical and laboratory reports. We then examine how large-volume medical claims data can, with great spatiotemporal resolution, help elucidate local disease patterns. Finally, we review efforts to develop surveillance systems based on digital and social data streams, including the recent rise and fall of Google Flu Trends. We conclude by advocating for increased use of hybrid systems combining information from traditional surveillance and big data sources, which seems the most promising option moving forward. Throughout the article, we use influenza as an exemplar of an emerging and reemerging infection which has traditionally been considered a model system for surveillance and modeling. Published by Oxford University Press for the Infectious Diseases Society of America 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  13. Moving Horizon Estimation and Control

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp

successful and applied methodology beyond PID-control for control of industrial processes. The main contribution of this thesis is the introduction and definition of the extended linear quadratic optimal control problem for the solution of numerical problems arising in moving horizon estimation and control problems. Chapter 1 motivates moving horizon estimation and control as a paradigm for control of industrial processes. It introduces the extended linear quadratic control problem and discusses its central role in moving horizon estimation and control, including its introduction, application and efficient solution ... It provides an algorithm for computation of the maximal output admissible set for linear model predictive control. Appendix D provides results concerning linear regression. Appendix E discusses prediction error methods for identification of linear models tailored for model predictive control.
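
For orientation, the kind of problem the extended formulation builds on is the standard finite-horizon linear-quadratic optimal control problem; a generic sketch (the thesis' extended problem adds further terms and constraints) is:

```latex
\min_{u_0,\dots,u_{N-1}}\;
\frac{1}{2}\sum_{k=0}^{N-1}\left( x_k^{\top} Q\, x_k + u_k^{\top} R\, u_k \right)
+ \frac{1}{2}\, x_N^{\top} P\, x_N
\quad\text{s.t.}\quad
x_{k+1} = A x_k + B u_k,\qquad x_0 \ \text{given,}
```

which is solved repeatedly, over a receding horizon, in both the estimation and the control steps of a moving horizon scheme.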

  14. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  15. Automatic Moving Object Segmentation for Freely Moving Cameras

    Directory of Open Access Journals (Sweden)

    Yanli Wan

    2014-01-01

This paper proposes a new moving object segmentation algorithm for freely moving cameras, which are very common in outdoor surveillance systems, car built-in surveillance systems, and robot navigation systems. A two-layer affine transformation model optimization method is proposed for camera compensation, where the outer-layer iteration filters out non-background feature points and the inner-layer iteration estimates a refined affine model based on the RANSAC method. The feature points are then classified into foreground and background according to the detected motion information. A geodesic-based graph cut algorithm is then employed to extract the moving foreground based on the classified features. Unlike existing methods based on global optimization or long-term feature point tracking, our algorithm operates on only two successive frames to segment the moving foreground, which makes it suitable for online video processing applications. The experimental results demonstrate the effectiveness of our algorithm in terms of both high accuracy and fast speed.
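
As a rough illustration of the camera-compensation idea (a RANSAC-fitted global affine model, with tracked features split into background inliers and candidate foreground outliers), here is a minimal OpenCV sketch. It is not the paper's two-layer optimization or geodesic graph cut, and the feature settings and threshold are arbitrary.

```python
# Minimal sketch: estimate a global affine (camera) motion model between two
# frames with RANSAC and label tracked features as background (inliers) or
# candidate foreground (outliers). Thresholds and settings are illustrative.
import cv2
import numpy as np

def classify_features(prev_gray, curr_gray, ransac_thresh=3.0):
    # Track sparse corners from the previous frame into the current one.
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
    pts_curr, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      pts_prev, None)
    ok = status.ravel() == 1
    p0 = pts_prev[ok].reshape(-1, 2)
    p1 = pts_curr[ok].reshape(-1, 2)

    # RANSAC affine fit: inliers follow the dominant (camera-induced) motion.
    affine, inlier_mask = cv2.estimateAffine2D(
        p0, p1, method=cv2.RANSAC, ransacReprojThreshold=ransac_thresh)
    inliers = inlier_mask.ravel().astype(bool)
    background_pts = p1[inliers]    # consistent with the global affine model
    foreground_pts = p1[~inliers]   # candidate moving-object features
    return affine, background_pts, foreground_pts
```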

  16. Chemistry--The Big Picture

    Science.gov (United States)

    Cassell, Anne

    2011-01-01

    Chemistry produces materials and releases energy by ionic or electronic rearrangements. Three structure types affect the ease with which a reaction occurs. In the Earth's crust, "solid crystals" change chemically only with extreme heat and pressure, unless their fixed ions touch moving fluids. On the other hand, in living things, "liquid crystals"…

  17. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  18. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  19. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.

  20. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

The term Big Data has been coined to refer to the extensive volume of data that cannot be managed by traditional data handling methods or techniques. The field of Big Data plays an indispensable role in many areas, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, health care and stocks. Big data analytics is the process of examining big data to reveal hidden patterns, unknown correlations and other important information that can be utilized to make better decisions. There has been a perpetually expanding interest in big data because of its fast growth and because it covers many areas of application. The open-source Apache Hadoop technology, written in Java and running on the Linux operating system, was used. The primary contribution of this work is to present an effective and free solution for big data applications in a distributed environment, with its advantages and an indication of its ease of use. Looking ahead, there appears to be a need for an analytical review of new developments in big data technology. Healthcare is one of the greatest concerns of the world. Big data in healthcare refers to electronic health data sets related to patient healthcare and well-being. Data in the healthcare domain is growing beyond the managing capacity of healthcare organizations and is expected to increase significantly in the coming years.
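
As a small illustration of the kind of distributed processing Hadoop enables (a generic MapReduce word count, not the healthcare application discussed above), a Hadoop Streaming-style job can be written as a mapper and a reducer; the file name and local test command are hypothetical.

```python
# word_count.py -- minimal Hadoop Streaming-style mapper/reducer sketch.
# Local test: cat input.txt | python word_count.py map | sort | python word_count.py reduce
import sys

def mapper():
    # Emit "word\t1" for every word on stdin.
    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

def reducer():
    # Input is sorted by key, so counts for a word arrive consecutively.
    current, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word == current:
            count += int(value)
        else:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```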

  1. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

Big Data Analytics and Software Defined Networking (SDN) are helping to manage the data usage that comes with the extraordinary increase in computer processing power provided by Cloud Data Centres (CDCs). This new book investigates areas where Big Data and SDN can help each other in delivering more efficient services.

  2. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  3. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

Recently the Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  4. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  5. Probing the pre-big bang universe

    International Nuclear Information System (INIS)

    Veneziano, G.

    2000-01-01

Superstring theory suggests a new cosmology whereby a long inflationary phase preceded a non-singular big bang-like event. After discussing how pre-big bang inflation naturally arises from an almost trivial initial state of the Universe, I will describe how present or near-future experiments can provide sensitive probes of how the Universe behaved in the pre-bang era.

  6. Starting Small, Thinking Big - Continuum Magazine | NREL

    Science.gov (United States)

Web excerpt (navigation text removed): NREL helps agencies target new federal sustainability goals and supports solar power in the territory; the story describes how NREL's cross-organizational work helps optimize energy use by starting small and thinking big. Photo by Don Buchanan, VIEO.

  7. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, has opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  8. Practice variation in Big-4 transparency reports

    NARCIS (Netherlands)

    Girdhar, Sakshi; Jeppesen, K.K.

    2018-01-01

    Purpose The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach The study draws on a

  9. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

In this report we describe results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a showcase. This includes both

  10. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  11. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  12. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; bench-marking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  13. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  14. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  15. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  16. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  17. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

General relativity predicts that space-time comes to an end and physics comes to a halt at the big bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: the big bang is replaced by a quantum bounce; the 'horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in the presence of a standard inflation potential, initial conditions are naturally set for a long, slow-roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  18. Inhomogeneous neutrino degeneracy and big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Whitmire, Scott E.; Scherrer, Robert J.

    2000-01-01

We examine big bang nucleosynthesis (BBN) in the case of inhomogeneous neutrino degeneracy, in the limit where the fluctuations are sufficiently small on large length scales that the present-day element abundances are homogeneous. We consider two representative cases: degeneracy of the electron neutrino alone and equal chemical potentials for all three neutrinos. We use a linear programming method to constrain an arbitrary distribution of the chemical potentials. For the current set of (highly restrictive) limits on the primordial element abundances, homogeneous neutrino degeneracy barely changes the allowed range of the baryon-to-photon ratio η. Inhomogeneous degeneracy allows for little change in the lower bound on η, but the upper bound in this case can be as large as η = 1.1×10^-8 (only ν_e degeneracy) or η = 1.0×10^-9 (equal degeneracies for all three neutrinos). For the case of inhomogeneous neutrino degeneracy, we show that there is no BBN upper bound on the neutrino energy density, which is bounded in this case only by limits from structure formation and the cosmic microwave background. (c) 2000 The American Physical Society
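
In the spirit of the linear programming method mentioned above, the sketch below checks whether any non-negative mixture of degeneracy "zones" can reproduce assumed abundance ranges. The yield values and observational bounds are placeholders, not BBN network outputs or the paper's actual constraints.

```python
# Toy linear-programming feasibility check over discrete neutrino-degeneracy
# "zones". The yield values below are placeholders, NOT actual BBN outputs.
import numpy as np
from scipy.optimize import linprog

# Hypothetical primordial yields (He4 mass fraction, D/H x 1e5) for a grid of
# chemical-potential values at a fixed baryon-to-photon ratio.
yields = np.array([
    [0.230, 4.0],
    [0.242, 3.2],
    [0.251, 2.6],
    [0.260, 2.1],
])
obs_lo = np.array([0.235, 2.5])   # assumed lower bounds on observed abundances
obs_hi = np.array([0.248, 3.8])   # assumed upper bounds

n = len(yields)
# Variables: volume fractions f_i >= 0 with sum f_i = 1.
# Constraints: obs_lo <= yields^T f <= obs_hi, written as A_ub f <= b_ub.
A_ub = np.vstack([yields.T, -yields.T])
b_ub = np.concatenate([obs_hi, -obs_lo])
res = linprog(c=np.zeros(n), A_ub=A_ub, b_ub=b_ub,
              A_eq=np.ones((1, n)), b_eq=[1.0], bounds=[(0, None)] * n)
print("feasible mixture exists:", res.success, res.x if res.success else None)
```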

  19. Big data for risk analysis: The future of safe railways

    Energy Technology Data Exchange (ETDEWEB)

    Figueres Esteban, M.

    2016-07-01

New technology brings ever more data to support decision-making for intelligent transport systems. Big Data is no longer a futuristic challenge; it is happening right now: modern railway systems have countless sources of data providing a massive quantity of diverse information on every aspect of operations, such as train position and speed, brake applications, passenger numbers, status of the signaling system or reported incidents. The traditional approaches to safety management on the railways have relied on static data sources to populate traditional safety tools such as bow-tie models and fault trees. The Big Data Risk Analysis (BDRA) program for railways at the University of Huddersfield is investigating how the many Big Data sources from the railway can be combined in a meaningful way to provide a better understanding of the GB railway systems and the environment within which they operate. Moving to BDRA is not simply a matter of scaling up existing analysis techniques: BDRA has to coordinate and combine a wide range of sources with different types of data and accuracy, and that is not straightforward. BDRA is structured around three components: data, ontology and visualisation. Each of these components is critical to support the overall framework. This paper describes how these three components are used to extract safety knowledge from two text-document data sources by means of ontologies. This is part of the ongoing BDRA research that is looking at integrating many large and varied data sources to support railway safety and decision-makers. (Author)
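
A very reduced sketch of the text-to-ontology step (tagging free-text reports with concepts from a safety ontology by keyword matching) is shown below; the concepts, synonyms and reports are invented, and the BDRA programme's actual ontology tooling is far richer.

```python
# Minimal sketch: tag free-text railway incident reports with concepts from a
# tiny hand-made ontology. Concepts, synonyms and reports are invented.
SAFETY_ONTOLOGY = {
    "SignalPassedAtDanger": ["spad", "passed at danger", "red signal"],
    "BrakeFault":           ["brake fault", "braking fault", "brake application failure"],
    "TrackObstruction":     ["obstruction", "object on track", "debris"],
}

def tag_report(text):
    """Return the ontology concepts whose synonyms occur in the report."""
    lowered = text.lower()
    return [concept for concept, synonyms in SAFETY_ONTOLOGY.items()
            if any(term in lowered for term in synonyms)]

reports = [
    "Driver reported a red signal passed at danger near junction 4.",
    "Debris found on track after overnight storm; brake fault also noted.",
]
for r in reports:
    print(tag_report(r), "<-", r)
```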

  20. Kazakhstan's Environment-Health system, a Big Data challenge

    Science.gov (United States)

Vitolo, Claudia; Gazdiyeva, Bella; Tucker, Allan; Russell, Andrew; Ali, Maged; Althonayan, Abraham

    2016-04-01

Kazakhstan has witnessed a remarkable economic development in the past 15 years, becoming an upper-middle-income country. However, it is still widely regarded as a developing nation, partially because of its population's low life expectancy, which is 5 years below the average in similar economies. The environment is in a rather fragile state, affected by soil, water and air pollution, radioactive contamination and climate change. However, Kazakhstan's government is moving towards clean energy and environmental protection and calling on scientists to help prioritise investments. The British Council-funded "Kazakhstan's Environment-Health Risk Analysis (KEHRA)" project is one of the recently launched initiatives to support a healthier future for Kazakhstan. The underlying hypothesis of this research is that the above-mentioned factors (air/water/soil pollution, etc.) affecting public health almost certainly do not act independently but rather trigger and exacerbate each other. Exploring the environment-health links in a multi-dimensional framework is a typical Big Data problem, in which the volume and variety of the data needed poses technical as well as scientific challenges. In Kazakhstan, the complexities related to managing and analysing Big Data are worsened by a number of obstacles at the data acquisition step: most of the data is not in digital form, spatial and temporal attributes are often ambiguous, and the re-use and re-purposing of the information is subject to restrictive licenses and other mechanisms of control. In this work, we document the first steps taken towards building an understanding of the complex environment-health system in Kazakhstan, using interactive visualisation tools to identify and compare hot-spots of pollution and poor health outcomes, and Big Data and web technologies to collect, manage and explore available information. In the future, the knowledge acquired will be modelled to develop evidence-based recommendation systems for decision makers in

  1. Carlson Wagonlit Travel is moving

    CERN Multimedia

    2013-01-01

    The renovation of the Main Building continues!   Because of this, Carlson Wagonlit Travel will move from building 62 to building 510 on 4 October and the agency will be closed in the afternoon. An emergency service will be organised for official travels only. Phone: 022 799 75 73 & 022 799 75 78 / e-mail: cern@carlsonwagonlit.ch

  2. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-01-01

    This thesis presents a general framework and method for detection of an object in a video based on apparent motion. The object moves, at some unknown time, differently than the “background” motion, which can be induced from camera motion. The goal

  3. Congestion and residential moving behaviour

    DEFF Research Database (Denmark)

    Larsen, Morten Marott; Pilegaard, Ninette; Van Ommeren, Jos

    2008-01-01

    to congestion. We focus on the equilibrium in which some workers currently living in one region accept jobs in the other, with a fraction of them choosing to commute from their current residence to the new job in the other region and the remainder choosing to move to the region in which the new job is located...

  4. Moving ring reactor 'Karin-1'

    International Nuclear Information System (INIS)

    1983-12-01

The conceptual design of the moving ring reactor "Karin-1" has been carried out to advance fusion system design, to clarify the research and development problems, and to decide their priority. In order to attain these objectives, a D-T reactor with a tritium breeding blanket is designed, a commercial reactor with a net power output of 500 MWe is designed, the compatibility of plasma physics with fusion engineering is demonstrated, and some other guidelines are indicated. A moving ring reactor is composed mainly of three parts. In the first, the formation section, a plasma ring is formed and heated up to ignition temperature. The compact-torus plasma ring is transported from the formation section through the next, the burning section, to generate fusion power. The plasma ring then moves into the last, the recovery section, where the energy and particles of the plasma ring are recovered. The outline of the moving ring reactor "Karin-1" is described. As a candidate material for the first wall, SiC was adopted to reduce the MHD effect and to minimize the interaction with neutrons and charged particles. A thin metal lining was applied to the SiC surface to solve the problem of compatibility with the lithium blanket. Plasma physics, the engineering aspects and the items of research and development are described. (Kako, I.)

  5. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong; Sundaramoorthi, Ganesh

    2017-01-01

    We present a general framework and method for detection of an object in a video based on apparent motion. The object moves relative to background motion at some unknown time in the video, and the goal is to detect and segment the object as soon

  6. Day Care Centers

    Data.gov (United States)

    Department of Homeland Security — This database contains locations of day care centers for 50 states and Washington D.C. and Puerto Rico. The dataset only includes center based day care locations...

  7. AAS 227: Day 4

    Science.gov (United States)

    Kohler, Susanna

    2016-01-01

Editor's Note: This week we're at the 227th AAS Meeting in Kissimmee, FL. Along with several fellow authors from astrobites.com, I will be writing updates on selected events at the meeting and posting at the end of each day. Follow along here or at astrobites.com, or catch our live-tweeted updates from the @astrobites Twitter account. The usual posting schedule for AAS Nova will resume next week. Welcome to Day 4 of the winter American Astronomical Society (AAS) meeting in Kissimmee! Several of us are attending the conference this year, and we will report highlights from each day here on astrobites. If you'd like to see more timely updates during the day, we encourage you to follow @astrobites on twitter or search the #aas227 hashtag. Helen B. Warner Prize: Origins of Structure in Planetary Systems (by Erika Nesvold) Another excellent prize lecture started off today's sessions. The Helen B. Warner Prize is awarded for achievement in observational or theoretical astrophysics by a young researcher (no more than eight years after their Ph.D.). This year's Warner Prize was presented to Ruth Murray-Clay of UC Santa Barbara. For her award lecture, Murray-Clay told us all about planetary system architecture: the number, masses, and orbits of planets in a given system. Ruth Murray-Clay [photo from http://web.physics.ucsb.edu/~murray/biocv.html] The underlying question motivating this type of research is: How rare is the Solar System? In other words, how likely is it that a given planetary system will have rocky planets close to their star, gas giants farther out, and ice giants at the outer reaches of the system? Answering this question will help us solve the physics problem of how and where planets form, and will also help us on our search for other planets like Earth. The data on exoplanet populations from transit and radial velocity observations and from direct imaging tell us that our Solar System is not common (many systems we observe have much more eccentric gas giants), but that doesn't

  8. Development of imaging biomarkers and generation of big data.

    Science.gov (United States)

    Alberich-Bayarri, Ángel; Hernández-Navarro, Rafael; Ruiz-Martínez, Enrique; García-Castro, Fabio; García-Juan, David; Martí-Bonmatí, Luis

    2017-06-01

Several image processing algorithms have emerged to cover unmet clinical needs, but their application to the radiological routine with a clear clinical impact is still not straightforward. Moving from local to big infrastructures, such as Medical Imaging Biobanks (millions of studies) or, even more, Federations of Medical Imaging Biobanks (in some cases totaling hundreds of millions of studies), requires the integration of automated pipelines for fast analysis of pooled data to extract clinically relevant conclusions, not uniquely linked to medical imaging but in combination with other information such as genetic profiling. A general strategy for the development of imaging biomarkers and their integration in the cloud for quantitative management and exploitation in large databases is herein presented. The proposed platform has been successfully launched and is currently being validated among the early adopters' community of radiologists, clinicians, and medical imaging researchers.

  9. Intelligent search in Big Data

    Science.gov (United States)

    Birialtsev, E.; Bukharaev, N.; Gusenkov, A.

    2017-10-01

An approach to data integration, aimed at ontology-based intelligent search in Big Data, is considered for the case when information objects are represented in the form of relational databases (RDBs) structurally marked by their schemas. The source of information for constructing an ontology and, later on, for organizing the search is natural-language text treated as semi-structured data; for the RDBs, these are the comments on the names of tables and their attributes. A formal definition of the RDB integration model in terms of ontologies is given. Within the framework of the model, a universal RDB representation ontology, an oil-production subject-domain ontology and a linguistic thesaurus of the subject-domain language are built. A technique for the automatic generation of SQL queries for subject-domain specialists is proposed. On this basis, an information system for the RDBs of the TATNEFT oil-producing company was implemented. Exploitation of the system showed good relevance for the majority of queries.
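
In a very reduced form, ontology-driven SQL generation can be sketched as a mapping from domain terms to tables and columns from which a SELECT statement is assembled. The mapping and schema below are hypothetical and are not the TATNEFT databases; a real generator would also derive join conditions from the ontology.

```python
# Highly simplified sketch of ontology-driven SQL generation: domain terms are
# mapped to (table, column) pairs and a SELECT statement is assembled from a
# user request. The mapping and schema are hypothetical, not the TATNEFT RDBs.
ONTOLOGY_MAP = {
    "well":           ("wells", "well_id"),
    "oil production": ("production", "oil_tonnes"),
    "date":           ("production", "prod_date"),
}

def generate_sql(select_terms, filter_term=None, filter_value=None):
    columns = [f"{ONTOLOGY_MAP[t][0]}.{ONTOLOGY_MAP[t][1]}" for t in select_terms]
    tables = {ONTOLOGY_MAP[t][0] for t in select_terms}
    where, params = "", ()
    if filter_term is not None:
        table, column = ONTOLOGY_MAP[filter_term]
        tables.add(table)
        where = f" WHERE {table}.{column} = %s"
        params = (filter_value,)
    # NOTE: a real generator would also add join conditions between the tables,
    # derived from the ontology, instead of this bare cross join.
    sql = f"SELECT {', '.join(columns)} FROM {', '.join(sorted(tables))}{where}"
    return sql, params

# "Show oil production and date for a given well"
print(generate_sql(["oil production", "date"], filter_term="well", filter_value=42))
```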

  10. Big Data in Transport Geography

    DEFF Research Database (Denmark)

    Reinau, Kristian Hegner; Agerholm, Niels; Lahrmann, Harry Spaabæk

for studies that explicitly compare the quality of this new type of data to traditional data sources. With the current focus on Big Data in the transport field, public transport planners are increasingly looking towards smart card data to analyze and optimize flows of passengers. However, in many cases it is not all public transport passengers in a city, region or country with a smart card system who use the system, and in such cases it is important to know what biases smart card data has in relation to giving a complete view of passenger flows. This paper therefore analyses the quality and biases of smart card data in Denmark, where public transport passengers may use a smart card, may pay with cash for individual trips or may hold a season ticket for a certain route. By analyzing smart card data collected in Denmark in relation to data on sales of cash tickets, sales of season tickets, manual

  11. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing, scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...
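
The analogy between re-sent TCP/IP packets and re-tried Grid jobs can be made concrete with a small retry wrapper; this is a generic sketch, not the ATLAS production system, and the job and failure model are invented.

```python
# Generic sketch of automatic re-tries for transient job failures, analogous
# to TCP/IP re-sending dropped packets. Not the ATLAS/Grid implementation.
import random
import time

class TransientFailure(Exception):
    pass

def run_with_retries(job, max_retries=3, backoff_seconds=1.0):
    """Run `job` (a callable), re-trying on transient failures."""
    for attempt in range(1, max_retries + 1):
        try:
            return job()
        except TransientFailure:
            if attempt == max_retries:
                raise
            time.sleep(backoff_seconds * attempt)

def flaky_job():
    # Hypothetical job that fails transiently about half of the time.
    if random.random() < 0.5:
        raise TransientFailure("worker node lost")
    return "output.root"

print(run_with_retries(flaky_job))
```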

  12. Was the Big Bang hot?

    Science.gov (United States)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.
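
The frequency dependence of the dipole anisotropy mentioned above comes from Doppler-shifting a Planck spectrum. As a hedged sketch of the textbook relation (not WR's own analytical formulae), for a small observer velocity v at angle θ:

```latex
\frac{\Delta T}{T} = \frac{v}{c}\cos\theta,
\qquad
\frac{\Delta I_\nu}{I_\nu}
  = \frac{\partial \ln B_\nu}{\partial \ln T}\,\frac{\Delta T}{T}
  = \frac{x\,e^{x}}{e^{x}-1}\,\frac{v}{c}\cos\theta,
\qquad x \equiv \frac{h\nu}{kT},
```

so measuring the dipole amplitude as a function of frequency tests whether the background follows a pure Planck curve or a distorted one.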

  13. Big Bang nucleosynthesis in crisis?

    International Nuclear Information System (INIS)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P.; Bludman, S.; Langacker, P.

    1995-01-01

A new evaluation of the constraint on the number of light neutrino species (N_ν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial 4He abundance has been underestimated by 0.014±0.004 (1σ) or less than 10% (95% C.L.) of 3He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is N_ν = 2.1±0.3 (1σ), and the upper limit excludes the standard value (N_ν = 3) at the 98.6% C.L. copyright 1995 The American Physical Society

  14. Is day surgery safe?

    DEFF Research Database (Denmark)

    Majholm, Birgitte; Engbæk, J; Bartholdy, Jens

    2012-01-01

Day surgery is expanding in several countries, and it is important to collect information about quality. The aim of this study was to assess morbidity and unanticipated hospital visits 0-30 days post-operatively in a large cohort.

  15. On the Geocentric Nature of the Big Bang Theory

    Science.gov (United States)

    Wang, Ling Jun

    2008-04-01

An expanding universe with all heavenly bodies moving isotropically away from the earth seems to suggest a geocentric theory, which is evidently false. To defend the Big Bang Theory (BBT) from falling into a geocentric theory, it is argued that if the universe is expanding linearly from the singularity, the heavenly bodies would appear to be moving away from each other with an isotropic velocity distribution with respect to any observer. In this presentation we will prove rigorously, with both classical and relativistic analysis, that even strict linearity of Hubble's law would not save the Big Bang from falling into a geocentric theory. The key of the analysis rests on two crucial necessary conditions for the raisin-pudding model: 1) the velocities and the positions of the earth and the galaxies must be measured simultaneously; 2) the velocity transformation between the reference frame of the earth and that of the singularity must be linear. The first condition cannot be satisfied due to the speed limit of light, and the second condition cannot be satisfied due to the nonlinear velocity transformation of relativity. The whole problem originates from the Doppler shift explanation of the red shift. Wang's Dispersive Extinction Theory (DET), however, interprets the red shift as being caused by the dispersive extinction of the starlight by the space medium, and therefore does not lead to a geocentric universe. This lends strong support to DET over the BBT.
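
The 'raisin-pudding' argument that the abstract refers to (and that the paper sets out to refute) can be summarized in one line: if every galaxy obeys a strictly linear Hubble law about one origin, it obeys the same law about any comoving observer. The sketch below states that standard argument, not the paper's counter-analysis:

```latex
\mathbf{v}_i = H\,\mathbf{r}_i
\quad\Longrightarrow\quad
\mathbf{v}_i - \mathbf{v}_j = H\,(\mathbf{r}_i - \mathbf{r}_j),
```

so observer j sees every other galaxy i recede with the same Hubble constant, isotropically about itself. The paper's claim is that the two necessary conditions listed above (simultaneous measurement and a linear velocity transformation) cannot be met relativistically, which undercuts this argument.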

  16. Day Care: Other Countries.

    Science.gov (United States)

    Hjartarson, Freida; And Others

    This collection of 5 bilingual papers on day care programs in foreign countries (China, the Soviet Union, and 3 Scandinavian countries) is part of a series of papers on various aspects of day care published by the Canadian Department of Health and Welfare. Each paper is presented in both English and French. Paper I considers day care services in…

  17. Every Day Is Mathematical

    Science.gov (United States)

    Barger, Rita H.; Jarrah, Adeeb M.

    2012-01-01

    March 14 is special because it is Pi Day. Mathematics is celebrated on that day because the date, 3-14, replicates the first three digits of pi. Pi-related songs, websites, trivia facts, and more are at the fingertips of interested teachers and students. Less celebrated, but still fairly well known, is National Metric Day, which falls on October…

  18. Moving from Administrivia Overload to Leadership Competency Development

    Science.gov (United States)

    Hogan, Amy M.

    2018-01-01

    Virginia Tech has intentionally moved beyond the traditional policy-heavy approach to department head orientation to employ a robust leadership development program for new and aspiring academic leaders and faculty. The annual program begins with a full-day workshop in mid-August, and continues throughout the fall semester with a series of shorter…

  19. Video Vortex reader II: moving images beyond YouTube

    NARCIS (Netherlands)

    Lovink, G.; Somers Miles, R.

    2011-01-01

    Video Vortex Reader II is the Institute of Network Cultures' second collection of texts that critically explore the rapidly changing landscape of online video and its use. With the success of YouTube ('2 billion views per day') and the rise of other online video sharing platforms, the moving image

  20. Slimmed May Day Holiday

    Institute of Scientific and Technical Information of China (English)

    Liu Xinwen

    2008-01-01

Last November the State Council of China decided to renew its holiday system by reducing the seven-day May Day holiday to three days and introducing three new one-day public holidays, namely the Qingming Festival, Dragon Boat Festival and Moon Festival. By doing so, the three golden-week holidays that were introduced in 1999, namely the Spring Festival, May Day and National Day, could be better distributed. The New Year's Eve holiday would remain one day. The new holiday plan was supposed to take effect in 2008.

  1. Psychophysiological responses to competition and the big five personality traits.

    Science.gov (United States)

    Binboga, Erdal; Guven, Senol; Catıkkaş, Fatih; Bayazıt, Onur; Tok, Serdar

    2012-06-01

This study examines the relationship between psychophysiological arousal, cognitive anxiety, and personality traits in young taekwondo athletes. A total of 20 male and 10 female taekwondo athletes (mean age = 18.6 ± 1.8 years) volunteered for the study. The Five Factor Personality Inventory and the state scale of the Spielberger State-Trait Anxiety Inventory (STAI) were used to measure personality and cognitive state anxiety. Electrodermal activity (EDA) was measured twice, one day and approximately one hour prior to the competition, to determine psychophysiological arousal. Descriptive statistics, Pearson product-moment correlations, and stepwise regression were used to analyze the data. Several "Big Five" facets were related to the EDA delta scores that were measured both one day and one hour before the competition. Two stepwise regressions were conducted to examine whether personality traits could significantly predict both EDA delta scores. The final model, containing only neuroticism from the Big Five factors, can significantly explain the variations in the EDA delta scores measured one day before the competition. Agreeableness can significantly explain variations in the EDA delta scores measured one hour before the competition. No relationship was found between cognitive anxiety and the EDA delta scores measured one hour before the competition. In conclusion, personality traits, especially agreeableness and neuroticism, might be useful in understanding arousal responses to competition.

  2. Big Data viewed through the lens of management fashion theory

    OpenAIRE

    Madsen, Dag Øivind; Stenheim, Tonny

    2016-01-01

Big Data (BD) is currently one of the most talked-about management ideas in the business community. Many call it the “buzzword of the day.” In books and media articles, BD has been referred to as a “revolution” and a “new era.” There is a lot of optimistic and upbeat rhetoric surrounding BD. This has led some to question whether BD is a hyped-up management fashion. In this paper, the BD phenomenon is viewed through the lens of management fashion theory. Management fashion provides an analytical ...

  3. DB2 11 the database for big data & analytics

    CERN Document Server

    Molaro, Cristian; Purcell, Terry

    2013-01-01

The landscape of today's business is shaped by the mountains of data being produced, with rapid growth in the volume, variety, and velocity of data due to the explosion of smart devices, mobile applications, cloud computing, and social media. Much of this growth has been in unstructured data; however, by 2020, internet business transactions (business-to-business and business-to-consumer) are predicted to reach 450 billion per day. Smart organizations are seeking innovative ways to turn this explosion of data, called big data, i

  4. Outcome Prediction after Radiotherapy with Medical Big Data.

    Science.gov (United States)

    Magome, Taiki

    2016-01-01

Data science is becoming more important in many fields. In the medical physics field, we face huge amounts of data every day. Treatment outcomes after radiation therapy are determined by complex interactions between clinical, biological, and dosimetric factors. A key concept of recent radiation oncology research is to predict the outcome based on medical big data for personalized medicine. Here, some reports that analyze medical databases with machine learning techniques are reviewed, and the feasibility of outcome prediction after radiation therapy is discussed. In addition, some strategies for reducing the manual labor needed to analyze huge data sets in medical physics are discussed.
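
A minimal, hedged sketch of outcome prediction from tabular clinical and dosimetric features with a machine-learning model is shown below; the features, synthetic cohort and model choice are purely illustrative and do not correspond to any of the reviewed studies.

```python
# Minimal sketch: predict a binary treatment outcome from clinical and
# dosimetric features. Synthetic data stands in for a real radiotherapy cohort.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 200
# Hypothetical features: age, tumour volume (cc), mean dose (Gy), smoker flag.
X = np.column_stack([
    rng.normal(65, 10, n),
    rng.lognormal(3.0, 0.5, n),
    rng.normal(60, 5, n),
    rng.integers(0, 2, n),
])
# Synthetic outcome loosely tied to dose, volume and smoking (illustration only).
logit = 0.05 * (X[:, 2] - 60) - 0.01 * (X[:, 1] - 20) + 0.5 * X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```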

  5. AAS 227: Day 3

    Science.gov (United States)

    Kohler, Susanna

    2016-01-01

Editor's Note: This week we're at the 227th AAS Meeting in Kissimmee, FL. Along with several fellow authors from astrobites.com, I will be writing updates on selected events at the meeting and posting at the end of each day. Follow along here or at astrobites.com, or catch our live-tweeted updates from the @astrobites Twitter account. The usual posting schedule for AAS Nova will resume next week. Welcome to Day 3 of the winter American Astronomical Society (AAS) meeting in Kissimmee! Several of us are attending the conference this year, and we will report highlights from each day here on astrobites. If you'd like to see more timely updates during the day, we encourage you to follow @astrobites on twitter or search the #aas227 hashtag. Henry Norris Russell Lecture: Viewing the Universe with Infrared Eyes: The Spitzer Space Telescope (by Erika Nesvold) The Henry Norris Russell Award is the highest honor given by the AAS, for a lifetime of eminence in astronomy research. This year's award went to Giovanni Fazio of the Harvard-Smithsonian Center for Astrophysics. Fazio became a leader in gamma-ray astronomy before switching mid-career to the study of infrared astronomy, and he gave his award lecture on the latter subject, specifically on the Spitzer Space Telescope, one of the most successful infrared telescopes of all time. Artist's rendering of the Spitzer space telescope. [NASA/JPL-Caltech] Spitzer has been operating for more than twelve years, and has resulted in over six thousand papers in refereed journals in that time. The telescope sits in an Earth-trailing orbit around the Sun, and is now farther from the Earth (1.4 AU) than the Earth is from the Sun. Fazio gave the audience a fascinating overview of the science done by Spitzer over more than a decade. One of the most productive areas of research for Spitzer is the study of exoplanets, which hadn't even been discovered when the Spitzer Telescope was first conceived. Spitzer's high sensitivity and ability to observe exoplanets over

  6. Day-to-day variability of the latitudes of the Sq foci

    International Nuclear Information System (INIS)

    Schlapp, D.M.

    1976-01-01

    Day-to-day changes in the latitudes of the Sq foci have been studied for several longitudes and at both sunspot maximum and minimum. A small but significant correlation has been found indicating a tendency for the foci to move poleward or equatorward together. There is little correlation between the strength of the electrojet and the separation of the foci; if anything, the electrojet is weaker when the foci are closer together. (author)

  7. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous, and changes too quickly, to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves (e.g. in social networks), and digitalization keeps increasing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data, and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and to organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions are coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  8. [Relevance of big data for molecular diagnostics].

    Science.gov (United States)

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable, vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or development of new ones. For these steps, structuring and evaluating the data according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economics or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data from the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, also in individual patients, generates personal big data and requires new management strategies in order to develop data-driven, individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  9. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have their own tribune on the topic. Perspectives and debates are flourishing, while a consensual definition of big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On the one hand, techniques usually presented as specific to big data, such as machine learning, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields, such as artificial intelligence, are often underestimated if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers, as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a true interdisciplinary approach in forensic science and medicine that is mainly based on evidence. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  10. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions such as Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
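
    To make the MapReduce schema concrete, here is a minimal single-machine word-count sketch (a hypothetical illustration; a real cluster such as Hadoop or Spark would distribute the map and reduce phases across machines):

```python
# Minimal MapReduce-style word count, run locally for illustration only.
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Emit (key, value) pairs: one (word, 1) per word."""
    return [(word.lower(), 1) for word in document.split()]

def shuffle_phase(mapped_pairs):
    """Group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Aggregate all values emitted for one key."""
    return key, sum(values)

documents = ["big data needs parallel processing",
             "hadoop and spark process big data"]
mapped = chain.from_iterable(map_phase(d) for d in documents)
counts = dict(reduce_phase(k, v) for k, v in shuffle_phase(mapped).items())
print(counts)  # e.g. {'big': 2, 'data': 2, 'needs': 1, ...}
```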

  11. Big Data as Governmentality in International Development

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    2017-01-01

    Statistics have long shaped the field of visibility for the governance of development projects. The introduction of big data has altered the field of visibility. Employing Dean's “analytics of government” framework, we analyze two cases—malaria tracking in Kenya and monitoring of food prices...... in Indonesia. Our analysis shows that big data introduces a bias toward particular types of visualizations. What problems are being made visible through big data depends to some degree on how the underlying data is visualized and who is captured in the visualizations. It is also influenced by technical factors...

  12. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management, big data and information governance, by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  13. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  14. Hot big bang or slow freeze?

    Science.gov (United States)

    Wetterich, C.

    2014-09-01

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze: a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple "crossover model" without a big bang singularity. In the infinite past space-time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  15. Smart Information Management in Health Big Data.

    Science.gov (United States)

    Muteba A, Eustache

    2017-01-01

    The smart information management system (SIMS) is concerned with the organization of anonymous patient records in big data and their extraction in order to provide the needed real-time intelligence. The purpose of the present study is to highlight the design and implementation of the smart information management system. We emphasize, on the one hand, the organization of big data in flat files simulating a NoSQL database and, on the other hand, the extraction of information based on a lookup table and a cache mechanism. In health big data, the SIMS aims at the identification of new therapies and approaches to delivering care.
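
    As a rough illustration of the lookup-table-and-cache idea (a hypothetical sketch, not the authors' implementation; the file layout and field names are invented), a flat file of anonymous records can be indexed into an in-memory lookup table, with a memoizing cache in front of the extraction function:

```python
# Hypothetical sketch: flat file standing in for a NoSQL store, indexed into
# a lookup table, with a cache in front of the extraction function.
import csv
from functools import lru_cache

RECORDS_FILE = "patients.csv"  # illustrative: columns patient_id, diagnosis, therapy

def build_lookup_table(path):
    """Index records by patient_id for constant-time retrieval."""
    with open(path, newline="") as f:
        return {row["patient_id"]: row for row in csv.DictReader(f)}

LOOKUP = build_lookup_table(RECORDS_FILE)

@lru_cache(maxsize=1024)
def extract(patient_id):
    """Cached extraction: repeated queries are served from the cache."""
    record = LOOKUP.get(patient_id)
    if record is None:
        return None
    return (record["diagnosis"], record["therapy"])
```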

  16. Moving Manifolds in Electromagnetic Fields

    Directory of Open Access Journals (Sweden)

    David V. Svintradze

    2017-08-01

    Full Text Available We propose dynamic non-linear equations for moving surfaces in an electromagnetic field. The field is induced by a material body whose boundary is the surface. Correspondingly, the potential energy set by the field at the boundary can be written as the sum of the four-potential times the four-current and a contraction of the electromagnetic tensor. Proper application of the minimal action principle to the system Lagrangian yields dynamic non-linear equations for moving three-dimensional manifolds in electromagnetic fields. Under different conditions the equations simplify to the Maxwell equations for massless three-surfaces, to the Euler equations for a dynamic fluid, to the magneto-hydrodynamic equations, and to the Poisson-Boltzmann equation.

  17. Big questions, big science: meeting the challenges of global ecology.

    Science.gov (United States)

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  18. Stand up and move forward

    OpenAIRE

    de Jong, Johan; Shokoohi, Roya

    2017-01-01

    Insufficient physical activity or being inactive is one of the leading risk factors for non-communicable diseases worldwide. Globally, between 6 and 10% of premature mortality caused by non-communicable diseases could be avoided if people adhered to general physical activity guidelines. Besides that, studies link sitting for prolonged periods of time with many serious health concerns. The solution seems simple: stand up and move forward. However, human behavior is difficult to change – due to th...

  19. Autoregressive Moving Average Graph Filtering

    OpenAIRE

    Isufi, Elvin; Loukas, Andreas; Simonetto, Andrea; Leus, Geert

    2016-01-01

    One of the cornerstones of the field of signal processing on graphs is graph filters, direct analogues of classical filters but intended for signals defined on graphs. This work brings forth new insights on the distributed graph filtering problem. We design a family of autoregressive moving average (ARMA) recursions, which (i) are able to approximate any desired graph frequency response, and (ii) give exact solutions for tasks such as graph signal denoising and interpolation. The design phi...
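
    To make the ARMA recursion idea concrete, here is a minimal first-order sketch (illustrative shift operator and coefficients, not the authors' design procedure): iterating a feedback recursion on a graph signal converges to a rational, ARMA-type graph frequency response.

```python
# Illustrative first-order ARMA graph filter: y_{k+1} = psi * S @ y_k + phi * x.
# The graph, coefficients, and scaling are made up for the example.
import numpy as np

# Small undirected graph: adjacency, Laplacian, and a scaled shift operator.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
S = L / np.linalg.norm(L, 2)         # scale so the recursion below converges

psi, phi = 0.5, 1.0                  # |psi| * ||S|| < 1 guarantees convergence
x = np.array([1.0, 0.0, 0.0, 0.0])   # graph signal to be filtered

y = np.zeros_like(x)
for _ in range(50):                  # distributed-style iterations
    y = psi * S @ y + phi * x

# Steady state matches the rational response y = phi * (I - psi*S)^{-1} x.
y_direct = np.linalg.solve(np.eye(4) - psi * S, phi * x)
print(np.allclose(y, y_direct))      # True
```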

  20. Sustainable Development Plan for Korea through Expansion of Green IT: Policy Issues for the Effective Utilization of Big Data

    Directory of Open Access Journals (Sweden)

    Hyun Baek

    2015-01-01

    Full Text Available The South Korean government is providing full support for green IT as one of the growth engines of Korea. The purpose of this study is to derive the policy issues needed for the sustainable development of Korea through utilizing Big Data by applying green IT. The analysis is done using a Delphi technique. Results show that the establishment of computing platforms that can easily share data and generate value is the priority for the effective use of Big Data from the environment. In addition, the government-led publication of genetic information and electronic medical records for research purposes has been derived as an important policy issue for the use of bio-Big Data. Furthermore, a guideline concerning the standardization of machine-to-machine and Internet of Things communication and of data security is needed to effectively use Big Data from machines and things. Moreover, a review of legislation related to the utilization of Big Data from digital media has been derived as an important policy issue. The results of this study propose the direction in which the Korean government should move for green growth through effective utilization of Big Data. The results can also be useful resources for establishing relevant policies in various countries that are accelerating sustainable development.