WorldWideScience

Sample records for big brains small

  1. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  2. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  3. Big data from small data: data-sharing in the ‘long tail’ of neuroscience

    Science.gov (United States)

    Ferguson, Adam R; Nielson, Jessica L; Cragin, Melissa H; Bandrowski, Anita E; Martone, Maryann E

    2016-01-01

    The launch of the US BRAIN and European Human Brain Projects coincides with growing international efforts toward transparency and increased access to publicly funded research in the neurosciences. The need for data-sharing standards and neuroinformatics infrastructure is more pressing than ever. However, ‘big science’ efforts are not the only drivers of data-sharing needs, as neuroscientists across the full spectrum of research grapple with the overwhelming volume of data being generated daily and a scientific environment that is increasingly focused on collaboration. In this commentary, we consider the issue of sharing of the richly diverse and heterogeneous small data sets produced by individual neuroscientists, so-called long-tail data. We consider the utility of these data, the diversity of repositories and options available for sharing such data, and emerging best practices. We provide use cases in which aggregating and mining diverse long-tail data convert numerous small data sources into big data for improved knowledge about neuroscience-related disorders. PMID:25349910

  4. Statistical Challenges in Modeling Big Brain Signals

    KAUST Repository

    Yu, Zhaoxia; Pluta, Dustin; Shen, Tong; Chen, Chuansheng; Xue, Gui; Ombao, Hernando

    2017-01-01

    Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible

  5. Starting Small, Thinking Big - Continuum Magazine | NREL

    Science.gov (United States)

    Starting Small, Thinking Big: NREL helps agencies target new federal sustainability goals and optimize their energy use, supported by NREL's cross-organizational work; related stories cover student engagements and solar power in the territory (photo by Don Buchanan, VIEO).

  6. Statistical Challenges in Modeling Big Brain Signals

    KAUST Repository

    Yu, Zhaoxia

    2017-11-01

    Brain signal data are inherently big: massive in amount, complex in structure, and high in dimensions. These characteristics impose great challenges for statistical inference and learning. Here we review several key challenges, discuss possible solutions, and highlight future research directions.

  7. Effect of furosemide and dietary sodium on kidney and plasma big and small renin

    International Nuclear Information System (INIS)

    Iwao, H.; Michelakis, A.M.

    1981-01-01

    Renin was found in mouse plasma in high-molecular-weight forms (big big renin, big renin) and a low-molecular-weight form (small renin). These were measured by a radioimmunoassay procedure for the direct measurement of renin. In the kidney, 89% of total renin was small renin and the rest was big big and big renin. This distribution pattern of renins was not changed when the kidney tissue was homogenized in the presence of protease inhibitors. Low-sodium or high-sodium diets changed renal renin content, but not the distribution pattern of renins in the kidney. Acute stimulation of renin release by furosemide increased small renin but not big big and big renin in plasma. However, dietary sodium depletion for 2 weeks significantly increased big big, big, and small renin in the plasma of mice with or without submaxillary glands. In contrast, high-sodium intake significantly decreased big big, big, and small renin in the plasma of mice with or without submaxillary glands

  8. PAGANI Toolkit: Parallel graph-theoretical analysis package for brain network big data.

    Science.gov (United States)

    Du, Haixiao; Xia, Mingrui; Zhao, Kang; Liao, Xuhong; Yang, Huazhong; Wang, Yu; He, Yong

    2018-05-01

    The recent collection of unprecedented quantities of neuroimaging data with high spatial resolution has led to brain network big data. However, a toolkit for fast and scalable computational solutions is still lacking. Here, we developed the PArallel Graph-theoretical ANalysIs (PAGANI) Toolkit based on a hybrid central processing unit-graphics processing unit (CPU-GPU) framework with a graphical user interface to facilitate the mapping and characterization of high-resolution brain networks. Specifically, the toolkit provides flexible parameters for users to customize computations of graph metrics in brain network analyses. As an empirical example, the PAGANI Toolkit was applied to individual voxel-based brain networks with ∼200,000 nodes that were derived from a resting-state fMRI dataset of 624 healthy young adults from the Human Connectome Project. Using a personal computer, this toolbox completed all computations in ∼27 h for one subject, which is markedly less than the 118 h required with a single-thread implementation. The voxel-based functional brain networks exhibited prominent small-world characteristics and densely connected hubs, which were mainly located in the medial and lateral fronto-parietal cortices. Moreover, the female group had significantly higher modularity and nodal betweenness centrality mainly in the medial/lateral fronto-parietal and occipital cortices than the male group. Significant correlations between the intelligence quotient and nodal metrics were also observed in several frontal regions. Collectively, the PAGANI Toolkit shows high computational performance and good scalability for analyzing connectome big data and provides a friendly interface without the complicated configuration of computing environments, thereby facilitating high-resolution connectomics research in health and disease. © 2018 Wiley Periodicals, Inc.
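    Per-node graph metrics of the kind described above are naturally parallelizable, since each node's value depends only on its local neighbourhood. The following is a minimal sketch of that idea in Python, not the PAGANI Toolkit's actual API; the toy network, the choice of metric, and the worker pool are illustrative assumptions.

```python
# Sketch: compute a per-node graph metric (clustering coefficient) in
# parallel across worker threads. Toy network and worker count are invented.
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

def clustering_coefficient(adj, v):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return 2.0 * links / (k * (k - 1))

# Toy undirected network: a triangle (0, 1, 2) plus a pendant node 3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}

with ThreadPoolExecutor(max_workers=4) as pool:
    coeffs = list(pool.map(lambda v: clustering_coefficient(adj, v), sorted(adj)))

print(coeffs)
```

    On a voxel-based network with ~200,000 nodes the same per-node map is what makes a hybrid CPU-GPU implementation worthwhile.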

  9. Small Area Model-Based Estimators Using Big Data Sources

    Directory of Open Access Journals (Sweden)

    Marchetti Stefano

    2015-06-01

    Full Text Available The timely, accurate monitoring of social indicators, such as poverty or inequality, on a fine-grained spatial and temporal scale is a crucial tool for understanding social phenomena and policymaking, but it poses a great challenge to official statistics. This article argues that an interdisciplinary approach, combining the body of statistical research in small area estimation with the body of research in social data mining based on Big Data, can provide novel means to tackle this problem successfully. Big Data derived from the digital crumbs that humans leave behind in their daily activities are in fact providing ever more accurate proxies of social life. Social data mining from these data, coupled with advanced model-based techniques for fine-grained estimates, has the potential to provide a novel microscope through which to view and understand social complexity. This article suggests three ways to use Big Data together with small area estimation techniques, and shows how Big Data has the potential to mirror aspects of well-being and other socioeconomic phenomena.

  10. Small decisions with big impact on data analytics

    OpenAIRE

    Jana Diesner

    2015-01-01

    Big social data have enabled new opportunities for evaluating the applicability of social science theories that were formulated decades ago and were often based on small- to medium-sized samples. Big Data coupled with powerful computing has the potential to replace the statistical practice of sampling and estimating effects by measuring phenomena based on full populations. Preparing these data for analysis and conducting analytics involves a plethora of decisions, some of which are already em...

  11. "small problems, Big Trouble": An Art and Science Collaborative Exhibition Reflecting Seemingly small problems Leading to Big Threats

    Science.gov (United States)

    Waller, J. L.; Brey, J. A.

    2014-12-01

    "small problems, Big Trouble" (spBT) is an exhibition of artist Judith Waller's paintings accompanied by text panels written by Earth scientist Dr. James A. Brey and several science researchers and educators. The text panels' message is as much the focus of the show as the art--true interdisciplinarity! Waller and Brey's history of art and earth science collaborations includes the successful exhibition "Layers: Places in Peril". New in spBT is an extended collaboration with other scientists in order to create awareness of geoscience and other subjects (e.g., soil, parasites, dust, pollutants, invasive species, carbon, ground water contaminants, solar wind) small in scale which pose significant threats. The paintings are the size of a mirror, a symbol suggesting the problems depicted are those we increasingly need to face, noting our collective reflections of shared current and future reality. Naturalistic rendering and abstract form in the art help reach a broad audience, including those familiar with art and those familiar with science. The goal is that gallery visitors gain greater appreciation and understanding of both—and of the sober content of the show as a whole. "small problems, Big Trouble" premieres in Wisconsin in April 2015. As in previous collaborations, Waller and Brey actively utilize art and science (specifically geoscience) as an educational vehicle for active student learning. Planned are interdisciplinary university and area high school activities linked through spBT. The exhibition in a public gallery offers a means to enhance community awareness of and action on scientific issues through art's power to engage people on an emotional level. This AGU presentation includes a description of past Waller and Brey activities: incorporating art and earth science in lab and studio classrooms, producing gallery and museum exhibitions, and delivering workshops and other presentations. They also describe how walking the paths of several past earth science

  12. Small data in the era of big data

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2015-01-01

    Academic knowledge building has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will however, we argue, continue to be popular and valuable in the fut...

  13. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those produced by genomic studies, present specific challenges that affect the reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition, linking the number of tests and the sample size, that is necessary to guarantee error rate control at practical sample sizes; this condition, however, is rarely satisfied. Using methods based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor contributing to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher-order approximations to the sampling distribution, offer a promising direction for data analysis that could improve the reliability of studies relying on large numbers of comparisons with modest sample sizes.
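    To see why multiple testing pushes inference far into the tails of a test statistic's distribution, consider a Bonferroni correction at a genome-scale number of tests. This is a hedged illustration using standard numbers, not figures taken from the article itself.

```python
# With m tests controlled at family-wise level alpha, each test must be
# assessed at alpha/m (Bonferroni), i.e. far out in the distribution's tail.
from statistics import NormalDist

alpha, m = 0.05, 1_000_000      # genome-scale number of tests (illustrative)
per_test = alpha / m            # per-test significance level, 5e-8
z = NormalDist().inv_cdf(1 - per_test / 2)  # two-sided normal critical value
print(per_test, round(z, 2))
```

    At a per-test level of 5e-8, inference depends on the extreme tail of the sampling distribution, exactly where normal approximations are least reliable at modest sample sizes; this is why higher-order Edgeworth approximations matter.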

  14. BIG1 is required for the survival of deep layer neurons, neuronal polarity, and the formation of axonal tracts between the thalamus and neocortex in developing brain.

    Directory of Open Access Journals (Sweden)

    Jia-Jie Teoh

    Full Text Available BIG1, an activator protein of the small GTPase Arf, encoded by the Arfgef1 gene, is one of the candidate genes for epileptic encephalopathy. To examine the involvement of BIG1 in epileptic encephalopathy, we analyzed BIG1-deficient mice and found that BIG1 regulates neurite outgrowth and brain development in vitro and in vivo. The loss of BIG1 decreased the size of the neocortex and hippocampus. In BIG1-deficient mice, the neuronal progenitor cells (NPCs) and the interneurons were unaffected. However, Tbr1+ and Ctip2+ deep layer (DL) neurons showed spatial-temporally dependent apoptosis. This apoptosis gradually progressed from the piriform cortex (PIR), peaked in the neocortex, and then progressed into the hippocampus from embryonic day 13.5 (E13.5) to E17.5. The upper layer (UL) and DL order in the neocortex was maintained in BIG1-deficient mice, but the excitatory neurons tended to accumulate before reaching their destination layers. A further pulse-chase migration assay showed that the migration defect was non-cell-autonomous and secondary to the progression of apoptosis into the BIG1-deficient neocortex after E15.5. In BIG1-deficient mice, we observed an ectopic projection of corticothalamic axons from the primary somatosensory cortex (S1) into the dorsal lateral geniculate nucleus (dLGN). The thalamocortical axons were unable to cross the diencephalon-telencephalon boundary (DTB). In vitro, BIG1-deficient neurons showed a delay in neuronal polarization. BIG1-deficient neurons were also hypersensitive to low-dose glutamate (5 μM) and died via apoptosis. This study shows the role of BIG1 in the survival of DL neurons in the developing embryonic brain and in the generation of neuronal polarity.

  15. Bigger data for big data: from Twitter to brain-computer interfaces.

    Science.gov (United States)

    Roesch, Etienne B; Stahl, Frederic; Gaber, Mohamed Medhat

    2014-02-01

    We are sympathetic with Bentley et al.'s attempt to encompass the wisdom of crowds in a generative model, but posit that a successful attempt at using big data will include more sensitive measurements, more varied sources of information, and will also build from the indirect information available through technology, from ancillary technical features to data from brain-computer interfaces.

  16. Big data, open science and the brain: lessons learned from genomics

    Directory of Open Access Journals (Sweden)

    Suparna eChoudhury

    2014-05-01

    Full Text Available The BRAIN Initiative aims to break new ground in the scale and speed of data collection in neuroscience, requiring tools to handle data on the scale of yottabytes (10²⁴ bytes). Its scale, investment and organization are being compared to the Human Genome Project (HGP), which has exemplified ‘big science’ for biology. In line with the trend towards Big Data in genomic research, the promise of the BRAIN Initiative, as well as the European Human Brain Project, rests on the possibility of amassing vast quantities of data to model the complex interactions between the brain and behaviour and to inform the diagnosis and prevention of neurological disorders and psychiatric disease. Advocates of this ‘data-driven’ paradigm in neuroscience argue that harnessing the large quantities of data generated across laboratories worldwide has numerous methodological, ethical and economic advantages, but it requires the neuroscience community to adopt a culture of data sharing and open access in order to benefit from them. In this article, we examine the rationale for data sharing among advocates and briefly exemplify it in terms of new ‘open neuroscience’ projects. Then, drawing on the frequently invoked model of data sharing in genomics, we go on to demonstrate the complexities of data sharing, shedding light on the sociological and ethical challenges within the realms of institutions, researchers and participants, namely dilemmas around public/private interests in data, (lack of) motivation to share in the academic community, and potential loss of participant anonymity. Our paper serves to highlight some foreseeable tensions around data sharing relevant to the emergent ‘open neuroscience’ movement.

  17. Mapping Cortical Laminar Structure in the 3D BigBrain.

    Science.gov (United States)

    Wagstyl, Konrad; Lepage, Claude; Bludau, Sebastian; Zilles, Karl; Fletcher, Paul C; Amunts, Katrin; Evans, Alan C

    2018-07-01

    Histological sections offer high spatial resolution to examine laminar architecture of the human cerebral cortex; however, they are restricted by being 2D, hence only regions with sufficiently optimal cutting planes can be analyzed. Conversely, noninvasive neuroimaging approaches are whole brain but have relatively low resolution. Consequently, correct 3D cross-cortical patterns of laminar architecture have never been mapped in histological sections. We developed an automated technique to identify and analyze laminar structure within the high-resolution 3D histological BigBrain. We extracted white matter and pial surfaces, from which we derived histologically verified surfaces at the layer I/II boundary and within layer IV. Layer IV depth was strongly predicted by cortical curvature but varied between areas. This fully automated 3D laminar analysis is an important requirement for bridging high-resolution 2D cytoarchitecture and in vivo 3D neuroimaging. It lays the foundation for in-depth, whole-brain analyses of cortical layering.

  18. More Differences or More Similarities Regarding Education in Big, Middle-sized and Small Companies

    Directory of Open Access Journals (Sweden)

    Marjana Merkač

    2001-12-01

    Full Text Available The article presents the results of research on the education and qualifying of employees in small, middle-sized and big Slovenian companies. The research shows some differences regarding the attitude to the development of employees as a part of a company's business strategy, some obstacles to developing their abilities, and connections between job satisfaction and motivation for learning. It also shows how much it matters, where education and qualifying are concerned, whether an individual works for a big, middle-sized or small company.

  19. Small data, data infrastructures and big data (Working Paper 1)

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2014-01-01

    The production of academic knowledge has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will, however, continue to be important in the future because of their utility in answer...

  20. Small decisions with big impact on data analytics

    Directory of Open Access Journals (Sweden)

    Jana Diesner

    2015-11-01

    Full Text Available Big social data have enabled new opportunities for evaluating the applicability of social science theories that were formulated decades ago and were often based on small- to medium-sized samples. Big Data coupled with powerful computing has the potential to replace the statistical practice of sampling and estimating effects by measuring phenomena based on full populations. Preparing these data for analysis and conducting analytics involves a plethora of decisions, some of which are already embedded in previously collected data and built tools. These decisions refer to the recording, indexing and representation of data and the settings for analysis methods. While these choices can have tremendous impact on research outcomes, they are not often obvious, not considered or not being made explicit. Consequently, our awareness and understanding of the impact of these decisions on analysis results and derived implications are highly underdeveloped. This might be attributable to occasional high levels of over-confidence in computational solutions as well as the possible yet questionable assumption that Big Data can wash out minor data quality issues, among other reasons. This article provides examples for how to address this issue. It argues that checking, ensuring and validating the quality of big social data and related auxiliary material is a key ingredient for empowering users to gain reliable insights from their work. Scrutinizing data for accuracy issues, systematically fixing them and diligently documenting these processes can have another positive side effect: Closely interacting with the data, thereby forcing ourselves to understand their idiosyncrasies and patterns, can help us to move from being able to precisely model and formally describe effects in society to also understand and explain them.

  1. Guide for the 2 infinities - the infinitely big and the infinitely small

    International Nuclear Information System (INIS)

    Armengaud, E.; Arnaud, N.; Aubourg, E.; Bassler, U.; Binetruy, P.; Bouquet, A.; Boutigny, D.; Brun, P.; Chassande-Mottin, E.; Chardin, G.; Coustenis, A.; Descotes-Genon, S.; Dole, H.; Drouart, A.; Elbaz, D.; Ferrando, Ph.; Glicenstein, J.F.; Giraud-Heraud, Y.; Halloin, H.; Kerhoas-Cavata, S.; De Kerret, H.; Klein, E.; Lachieze-Rey, M.; Lagage, P.O.; Langer, M.; Lebrun, F.; Lequeux, J.; Meheut, H.; Moniez, M.; Palanque-Delabrouille, N.; Paul, J.; Piquemal, F.; Polci, F.; Proust, D.; Richard, F.; Robert, J.L.; Rosnet, Ph.; Roudeau, P.; Royole-Degieux, P.; Sacquin, Y.; Serreau, J.; Shifrin, G.; Sida, J.L.; Smith, D.; Sordini, V.; Spiro, M.; Stolarczyk, Th.; Suomijärvi, T.; Tagger, M.; Vangioni, E.; Vauclair, S.; Vial, J.C.; Viaud, B.; Vignaud, D.

    2010-01-01

    This book is to be read from both ends: one is dedicated to the path towards the infinitely big and the other to the infinitely small. Each path is made of a series of subject entries illustrating important concepts or achievements in the quest to understand the infinity concerned. For instance, the part concerning the infinitely small includes entries like quarks, Higgs bosons, radiation detection, and Chooz neutrinos, while the part for the infinitely big includes the universe, cosmic radiation, dark matter, antimatter, and a series of experiments such as HESS, INTEGRAL, ANTARES, JWST, LOFAR, Planck, LSST, SOHO, Virgo, VLT, and XMM-Newton. This popularization work also includes an extensive glossary that explains the scientific terms used in the entries. (A.C.)

  2. Big words, halved brains and small worlds: complex brain networks of figurative language comprehension.

    Science.gov (United States)

    Arzouan, Yossi; Solomon, Sorin; Faust, Miriam; Goldstein, Abraham

    2011-04-27

    Language comprehension is a complex task that involves a wide network of brain regions. We used topological measures to qualify and quantify the functional connectivity of the networks used under various comprehension conditions. To that aim we developed a technique to represent functional networks based on EEG recordings, taking advantage of their excellent time resolution in order to capture the fast processes that occur during language comprehension. Networks were created by searching for a specific causal relation between areas, the negative feedback loop, which is ubiquitous in many systems. This method is a simple way to construct directed graphs using event-related activity, which can then be analyzed topologically. Brain activity was recorded while subjects read expressions of various types and indicated whether they found them meaningful. Slightly different functional networks were obtained for event-related activity evoked by each expression type. The differences reflect the special contribution of specific regions in each condition and the balance of hemispheric activity involved in comprehending different types of expressions and are consistent with the literature in the field. Our results indicate that representing event-related brain activity as a network using a simple temporal relation, such as the negative feedback loop, to indicate directional connectivity is a viable option for investigation which also derives new information about aspects not reflected in the classical methods for investigating brain activity.

  3. Small values in big data: The continuing need for appropriate metadata

    Science.gov (United States)

    Stow, Craig A.; Webster, Katherine E.; Wagner, Tyler; Lottig, Noah R.; Soranno, Patricia A.; Cha, YoonKyung

    2018-01-01

    Compiling data from disparate sources to address pressing ecological issues is increasingly common. Many ecological datasets contain left-censored data – observations below an analytical detection limit. Studies from single and typically small datasets show that common approaches for handling censored data — e.g., deletion or substituting fixed values — result in systematic biases. However, no studies have explored the degree to which the documentation and presence of censored data influence outcomes from large, multi-sourced datasets. We describe left-censored data in a lake water quality database assembled from 74 sources and illustrate the challenges of dealing with small values in big data, including detection limits that are absent, range widely, and show trends over time. We show that substitutions of censored data can also bias analyses using ‘big data’ datasets, that censored data can be effectively handled with modern quantitative approaches, but that such approaches rely on accurate metadata that describe treatment of censored data from each source.
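    The substitution bias described above is easy to demonstrate. Below is a small sketch with an invented detection limit and made-up concentration values, not the lake water quality data from the article.

```python
# Illustration of how substituting fixed values for left-censored
# observations biases summary statistics. Data are invented.
values = [0.2, 0.4, 0.7, 1.1, 1.5, 2.0, 3.2, 4.8]  # true concentrations
dl = 1.0  # analytical detection limit: anything below is reported as "<1.0"

true_mean = sum(values) / len(values)

# Common ad hoc fixes: replace censored values with 0, dl/2, or dl.
def substituted_mean(fill):
    return sum(v if v >= dl else fill for v in values) / len(values)

for fill in (0.0, dl / 2, dl):
    print(fill, round(substituted_mean(fill), 3), round(true_mean, 3))
```

    Each substitution rule yields a different mean, and none recovers the true one; censored-data likelihood methods avoid this, provided the metadata record each source's detection limits.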

  4. Eating on nightshift: A big vs small snack impairs glucose response to breakfast

    Directory of Open Access Journals (Sweden)

    Stephanie Centofanti

    2018-01-01

    Full Text Available Shift work is a risk factor for chronic diseases such as Type 2 diabetes. Food choice may play a role; however, simply eating at night, when the body is primed for sleep, may have implications for health. This study examined the impact of consuming a big versus a small snack at night on glucose metabolism. N = 31 healthy subjects (21–35 y; 18 F) participated in a simulated nightshift laboratory study that included one baseline night of sleep (22:00 h–07:00 h) and one night awake, with allocation to either a big snack (2100 kJ) or small snack (840 kJ) group. The snack was consumed between 00:00–00:30 h and consisted of low-fat milk, a sandwich, chips and fruit (big snack) or half a sandwich and fruit (small snack). Subjects ate an identical mixed-meal breakfast (2100 kJ) at 08:30 h after one full night of sleep and after a simulated nightshift. Interstitial glucose was measured continuously during the entire study using Medtronic continuous glucose monitors. Only subjects with identical breakfast consumption and complete datasets were analysed (N = 20). Glucose data were averaged into 5-minute bins, and the area under the curve (AUC) was calculated for 90 min post-breakfast. Pre-breakfast glucose levels were not significantly different between Day 1 and Day 2, nor between snack groups (p > 0.05). A snack group by day interaction effect was found (F(1,16) = 5.36, p = 0.034), and post-hoc tests revealed that in the big snack group, the AUC response to breakfast was significantly higher following the nightshift (Day 2) compared to Day 1 (p = 0.001). This translated to a 20.8% (SEM 5.6) increase. AUC was not significantly different between days in the small snack group. Consuming a big snack at 00:00 h impaired the glucose response to breakfast at 08:30 h compared to a smaller snack. Further research in this area will inform dietary advice for shift workers, which could include recommendations on how much to eat as well as content.
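    The post-breakfast AUC described above can be sketched as a trapezoidal sum over 5-minute glucose bins. The glucose values below are invented for illustration, not the study's data.

```python
# Sketch: area under the curve (AUC) over 90 min post-breakfast from
# glucose readings averaged into 5-minute bins, via the trapezoidal rule.
def trapezoid_auc(y, dt):
    """AUC of evenly spaced samples y with spacing dt (trapezoidal rule)."""
    return sum((a + b) / 2 * dt for a, b in zip(y, y[1:]))

# 19 samples = 0..90 min in 5-min bins (mmol/L), an invented rise-and-fall curve.
glucose = [5.0, 5.4, 6.1, 6.9, 7.6, 8.0, 8.1, 7.9, 7.5, 7.0,
           6.6, 6.2, 5.9, 5.7, 5.5, 5.4, 5.3, 5.2, 5.1]
auc = trapezoid_auc(glucose, dt=5)  # units: mmol/L x min over 90 min
print(round(auc, 2))
```

    Comparing such AUC values between days and snack groups is what the reported interaction test formalizes.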

  5. Theorizing the narrative dimension of psychotherapy and counseling: A big and small story approach

    NARCIS (Netherlands)

    Sools, Anna Maria; Schuhmann, Carmen

    2014-01-01

    In this article, we develop a theoretically substantiated narrative framework for assessing psychotherapy practices, based on a big and small story approach. This approach stretches the narrative scope of these practices by making explicit and advancing small story counseling. We demonstrate how

  6. Small quarks make big nuggets

    International Nuclear Information System (INIS)

    Deligeorges, S.

    1985-01-01

    After a brief recap of the classification of subatomic particles, this paper deals with quark nuggets: particles with more than three quarks, a big bag called a 'nuclearite'. Neutron stars, in fact, are big sacks of quarks, gigantic nuggets. Physicists are now trying to calculate which type of nugget of strange quark matter is stable, and what influence quark nuggets had on primordial nucleosynthesis. At present, it is thought that if these 'nuggets' exist, and in large numbers, they may be candidates for the missing mass. [fr]

  7. Big Data, Big Responsibility! Building best-practice privacy strategies into a large-scale neuroinformatics platform

    Directory of Open Access Journals (Sweden)

    Christina Popovich

    2017-04-01

    OBI’s rigorous approach to data sharing in the field of neuroscience maintains the accessibility of research data for big discoveries without compromising patient privacy and security. We believe that Brain-CODE is a powerful and advantageous tool; moving neuroscience research from independent silos to an integrative system approach for improving patient health. OBI’s vision for improved brain health for patients living with neurological disorders paired with Brain-CODE’s best-practice strategies in privacy protection of patient data offer a novel and innovative approach to “big data” initiatives aimed towards improving public health and society world-wide.

  8. Big History or the 13800 million years from the Big Bang to the Human Brain

    Science.gov (United States)

    Gústafsson, Ludvik E.

    2017-04-01

    structures like plants, animals and fungi. 3. Matter starts to think. A comet or an asteroid crashed into Earth about 66 million years ago, ending the dominance of the dinosaurs. Small animals giving birth to live offspring were now able to evolve into a multitude of species, among them the primates. A group of primates migrated from Africa to the other continents less than 100,000 years ago. Their brain developed a special quality: self-consciousness. This ability to reflect on oneself boosted their survival considerably. Man (Homo sapiens) had entered the scene, becoming one of the dominant species of this planet. Due to his immense ability to handle matter and energy, he has become something of a caretaker of planet Earth. Man is responsible for sustainable development for the good of his society and of the whole biosphere. If there is a fourth step in the history of the universe, discoveries in astrobiology may provide us with some clues in the coming decades.

  9. How big is big and how small is small the sizes of everything and why

    CERN Document Server

    Smith, Timothy Paul

    2013-01-01

    This book is about how big is the universe and how small are quarks, and what are the sizes of dozens of things between these two extremes. It describes the sizes of atoms and planets, quarks and galaxies, cells and sequoias. It is a romp through forty-five orders of magnitude from the smallest sub-nuclear particles we have measured, to the edge of the observed universe. It also looks at time, from the epic age of the cosmos to the fleeting lifetimes of ethereal particles. It is a narrative that trips its way from stellar magnitudes to the clocks on GPS satellites, from the nearly logarithmic scales of a piano keyboard through a system of numbers invented by Archimedes and on to the measurement of the size of an atom. Why do some things happen at certain scales? Why are cells a hundred thousandths of a meter across? Why are stars never smaller than about 100 million meters in diameter? Why are trees limited to about 120 meters in height? Why are planets spherical, but asteroids not? Often the size of an objec...

  10. The Efficiency of a Small-World Functional Brain Network

    Institute of Scientific and Technical Information of China (English)

    ZHAO Qing-Bai; ZHANG Xiao-Fei; SUI Dan-Ni; ZHOU Zhi-Jin; CHEN Qi-Cai; TANG Yi-Yuan

    2012-01-01

    We investigate whether the small-world topology of a functional brain network means high information processing efficiency by calculating the correlation between the small-world measures of a functional brain network and behavioral reaction during an imagery task. Functional brain networks are constructed by multichannel event-related potential data, in which the electrodes are the nodes and the functional connectivities between them are the edges. The results show that the correlation between small-world measures and reaction time is task-specific, such that in global imagery, there is a positive correlation between the clustering coefficient and reaction time, while in local imagery the average path length is positively correlated with the reaction time. This suggests that the efficiency of a functional brain network is task-dependent.
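    The two small-world measures the study correlates with reaction time, the clustering coefficient and the average path length, can be computed directly from an adjacency structure. A self-contained sketch on a toy network follows (illustrative only; the study's graphs are derived from ERP electrode connectivity, not constructed like this):

```python
from collections import deque
from itertools import combinations

# Toy stand-in for a functional brain network: a ring lattice with
# next-nearest-neighbour edges plus one long-range shortcut, the
# classic small-world ingredient (illustrative only).
n = 12
edges = {(i, (i + 1) % n) for i in range(n)}   # ring
edges |= {(i, (i + 2) % n) for i in range(n)}  # next-nearest neighbours
edges.add((0, 6))                              # one long-range shortcut
adj = {v: set() for v in range(n)}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def clustering_coefficient(adj):
    # Fraction of each node's neighbour pairs that are themselves
    # connected, averaged over all nodes (local segregation).
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def average_path_length(adj):
    # Mean shortest-path length over all node pairs (global
    # integration), computed by BFS from every node.
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(d for v, d in dist.items() if v != src)
        pairs += len(dist) - 1
    return total / pairs

C = clustering_coefficient(adj)
L = average_path_length(adj)
print(f"clustering coefficient: {C:.3f}, average path length: {L:.3f}")
```

A high C with a low L is the small-world signature; the study asks how each of these quantities, not their combination, tracks behavioral reaction time.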

  11. Evaluation research of small and medium-sized enterprise informatization on big data

    Science.gov (United States)

    Yang, Na

    2017-09-01

    Against the background of big data, raising the informatization level of small and medium-sized enterprises is a key task, but information construction is costly, even though the investment can bring benefits to these enterprises. This paper established a small and medium-sized enterprise informatization evaluation system covering hardware and software security level, information organization level, information technology application and profit level, and information ability level. Rough set theory was used to reduce the indexes, and evaluation was then carried out with a support vector machine (SVM) model. Finally, examples were used to verify the theory and prove the effectiveness of the method.
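    The evaluation pipeline described (index reduction followed by SVM scoring) can be sketched with scikit-learn. The features, labels and kernel choice below are hypothetical stand-ins, and the rough-set reduction step is assumed to have already selected four indexes:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical evaluation indexes after rough-set reduction, e.g.
# [security level, organization level, application/profit level, ability level]
X_high = rng.normal(loc=0.7, scale=0.1, size=(40, 4))  # well-informatized SMEs
X_low = rng.normal(loc=0.3, scale=0.1, size=(40, 4))   # poorly informatized SMEs
X = np.vstack([X_high, X_low])
y = np.array([1] * 40 + [0] * 40)  # 1 = high informatization level

# SVM evaluation model; the RBF kernel is an assumption, not from the paper.
model = SVC(kernel="rbf").fit(X, y)
accuracy = model.score(X, y)
print(f"training accuracy: {accuracy:.2f}")
```

On real data one would of course evaluate on held-out enterprises rather than the training set; this sketch only shows the shape of the reduce-then-classify pipeline.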

  12. Small-world human brain networks: Perspectives and challenges.

    Science.gov (United States)

    Liao, Xuhong; Vasilakos, Athanasios V; He, Yong

    2017-06-01

    Modelling the human brain as a complex network has provided a powerful mathematical framework to characterize the structural and functional architectures of the brain. In the past decade, the combination of non-invasive neuroimaging techniques and graph theoretical approaches has enabled us to map human structural and functional connectivity patterns (i.e., connectome) at the macroscopic level. One of the most influential findings is that human brain networks exhibit prominent small-world organization. Such a network architecture in the human brain facilitates efficient information segregation and integration at low wiring and energy costs, which presumably results from natural selection under the pressure of a cost-efficiency balance. Moreover, the small-world organization undergoes continuous changes during normal development and ageing and exhibits dramatic alterations in neurological and psychiatric disorders. In this review, we survey recent advances regarding the small-world architecture in human brain networks and highlight the potential implications and applications in multidisciplinary fields, including cognitive neuroscience, medicine and engineering. Finally, we highlight several challenging issues and areas for future research in this rapidly growing field. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potential harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and, specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to more improved healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than it solves; in short, when it comes to the use of data in healthcare, "size isn't everything."

  14. A small frog that makes a big difference: brain wave testing of TV advertisements.

    Science.gov (United States)

    Ohme, Rafal; Matukin, Michal

    2012-01-01

    It is important for the marketing industry to better understand the role of the unconscious and emotions in advertising communication and shopping behavior. Yet traditional consumer research is not enough for such a purpose. Conventional paper-and-pencil or verbal declarations favor conscious pragmatism and functionality as the principles underlying consumer decisions and motives. These approaches should be combined with an emerging discipline (consumer neuroscience, or neuromarketing) to examine the brain and its functioning in the context of consumer choices. It has been widely acknowledged that patterns of brain activity are closely related to consumers' cognition and behavior. Thus, the analysis of consumers' neurophysiology may increase the understanding of how consumers process incoming information and how they use their memory and react emotionally (see "Three Types of Brain Wave Research on TV Advertisements"). Moreover, as the majority of consumer mental processes occur below the level of conscious awareness, observations of brain reactions enable researchers to reach the very core, consciously inaccessible foundations of consumer decisions, emotions, motivations, and preferences.

  15. Small Bodies, Big Concepts: Engaging Teachers and Their Students in Visual Analysis of Comets and Asteroids

    Science.gov (United States)

    Cobb, W. H.; Buxner, S.; Lebofsky, L. A.; Ristvey, J.; Weeks, S.; Zolensky, M.

    2011-12-01

    Small Bodies, Big Concepts is a multi-disciplinary professional development project that engages 5th-8th grade teachers in high-end planetary science using a research-based pedagogical framework, Designing Effective Science Instruction (DESI). In addition to developing sound background knowledge with a focus on visual analysis, teachers' awareness of the process of learning new content is heightened, and they use that experience to deepen their science teaching practice. Culling from NASA E/PO educational materials, activities are sequenced to enhance conceptual understanding of big ideas in space science: what do we know, how do we know it, why do we care? Helping teachers develop a picture of the history and evolution of our understanding of the solar system, and homing in on the place of comets and asteroids in helping us answer old questions and discover new ones, teachers see the power and excitement underlying planetary science as a human endeavor. Research indicates that science inquiry is powerful in the classroom, and mission scientists are real-life models of science inquiry in action. Using guest scientist facilitators from the Planetary Science Institute, NASA Johnson Space Center, Lockheed Martin, and NASA E/PO professionals from McREL and NASA AESP, teachers practice framing scientific questions, using current visual data, and adapting NASA E/PO activities related to current exploration of asteroids and comets in our Solar System. Cross-curricular elements included examining research-based strategies for enhancing English language learners' ability to engage in higher-order questions, and a professional astronomy artist's insight into how visual analysis requires not just our eyes but our brains engaged: comparing, synthesizing, questioning, evaluating, and wondering. This summer we pilot tested the SBBC curriculum with thirteen 5th-10th grade teachers, modeling a variety of instructional approaches over eight days. Each teacher developed lesson plans

  16. Kaleido: Visualizing Big Brain Data with Automatic Color Assignment for Single-Neuron Images.

    Science.gov (United States)

    Wang, Ting-Yuan; Chen, Nan-Yow; He, Guan-Wei; Wang, Guo-Tzau; Shih, Chi-Tin; Chiang, Ann-Shyn

    2018-03-03

    Effective 3D visualization is essential for connectomics analysis, where the number of neural images easily reaches over tens of thousands. A formidable challenge is to simultaneously visualize a large number of distinguishable single-neuron images, with reasonable processing time and memory for file management and 3D rendering. In the present study, we proposed an algorithm named "Kaleido" that can visualize up to at least ten thousand single neurons from the Drosophila brain using only a fraction of the memory traditionally required, without increasing computing time. Adding more brain neurons increases memory only nominally. Importantly, Kaleido maximizes color contrast between neighboring neurons so that individual neurons can be easily distinguished. Colors can also be assigned to neurons based on biological relevance, such as gene expression, neurotransmitters, and/or development history. For cross-lab examination, the identity of every neuron is retrievable from the displayed image. To demonstrate the effectiveness and tractability of the method, we applied Kaleido to visualize the 10,000 Drosophila brain neurons obtained from the FlyCircuit database ( http://www.flycircuit.tw/modules.php?name=kaleido ). Thus, Kaleido visualization requires only sensible computer memory for manual examination of big connectomics data.
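    Kaleido's central constraint, that neighboring neurons receive contrasting colors, is in essence a graph-coloring problem. The sketch below illustrates that constraint with a simple greedy colorer over a hypothetical neuron adjacency; it is an illustration of the idea, not the actual Kaleido algorithm:

```python
# Hypothetical adjacency: which neuron pairs lie close enough in the
# rendered volume to be visually confused (names are made up).
neighbors = {
    "n1": {"n2", "n3"},
    "n2": {"n1", "n3", "n4"},
    "n3": {"n1", "n2"},
    "n4": {"n2"},
}

def assign_hues(neighbors, n_hues=4):
    # Evenly spaced hues (degrees on the HSV wheel) maximize pairwise
    # contrast; greedily give each neuron the first hue unused by any
    # already-colored neighbor.
    hues = [i * 360 // n_hues for i in range(n_hues)]
    assignment = {}
    for neuron in sorted(neighbors):  # deterministic order
        used = {assignment[m] for m in neighbors[neuron] if m in assignment}
        assignment[neuron] = next(h for h in hues if h not in used)
    return assignment

colors = assign_hues(neighbors)
print(colors)
```

Non-adjacent neurons may legitimately share a hue, which is what lets a fixed palette scale to thousands of neurons without the memory cost of one unique color per cell.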

  17. Big data in small steps : Assessing the value of data

    NARCIS (Netherlands)

    Veenstra, A.F.E. van; Bakker, T.P.; Esmeijer, J.

    2013-01-01

    Data is seen as the new oil: an important driver of innovation and economic growth. At the same time, many find it difficult to determine the value of big data for their organization. TNO presents a stepwise big data model that supports private and public organizations to assess the potential of big

  18. On the equivalence between small-step and big-step abstract machines: a simple application of lightweight fusion

    DEFF Research Database (Denmark)

    Danvy, Olivier; Millikin, Kevin

    2008-01-01

    -step specification. We illustrate this observation here with a recognizer for Dyck words, the CEK machine, and Krivine’s machine with call/cc. The need for such a simple proof is motivated by our current work on small-step abstract machines as obtained by refocusing a function implementing a reduction semantics (a syntactic correspondence), and big-step abstract machines as obtained by CPS-transforming and then defunctionalizing a function implementing a big-step semantics (a functional correspondence). © 2007 Elsevier B.V. All rights reserved.
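    The paper's first example, a recognizer for Dyck words (balanced parenthesis strings), makes the two styles concrete. A sketch in Python follows; the paper itself works with functional abstract machines, so this only illustrates big-step versus small-step evaluation, not the fusion proof:

```python
# Big-step style: one function decides the whole input at once.
def dyck_big_step(s):
    depth = 0
    for c in s:
        depth += 1 if c == "(" else -1
        if depth < 0:
            return False
    return depth == 0

# Small-step style: a transition function consumes one symbol per step;
# a driver loop iterates transitions until a final state is reached.
def step(state):
    s, depth = state
    if not s:
        return "accept" if depth == 0 else "reject"
    c, rest = s[0], s[1:]
    depth += 1 if c == "(" else -1
    return "reject" if depth < 0 else (rest, depth)

def dyck_small_step(s):
    state = (s, 0)
    while state not in ("accept", "reject"):
        state = step(state)
    return state == "accept"

# The two machines agree on every input; proving such agreements once
# and for all, rather than case by case, is what lightweight fusion buys.
for w in ["", "()", "(())()", "(()", ")("]:
    assert dyck_big_step(w) == dyck_small_step(w)
```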

  19. Transforming fragments into candidates: small becomes big in medicinal chemistry.

    Science.gov (United States)

    de Kloe, Gerdien E; Bailey, David; Leurs, Rob; de Esch, Iwan J P

    2009-07-01

    Fragment-based drug discovery (FBDD) represents a logical and efficient approach to lead discovery and optimisation. It can draw on structural, biophysical and biochemical data, incorporating a wide range of inputs, from precise mode-of-binding information on specific fragments to wider ranging pharmacophoric screening surveys using traditional HTS approaches. It is truly an enabling technology for the imaginative medicinal chemist. In this review, we analyse a representative set of 23 published FBDD studies that describe how low molecular weight fragments are being identified and efficiently transformed into higher molecular weight drug candidates. FBDD is now becoming warmly endorsed by industry as well as academia and the focus on small interacting molecules is making a big scientific impact.

  20. Pocket data mining big data on small devices

    CERN Document Server

    Gaber, Mohamed Medhat; Gomes, Joao Bartolo

    2014-01-01

    Owing to continuous advances in the computational power of handheld devices like smartphones and tablet computers, it has become possible to perform Big Data operations, including modern data mining processes, onboard these small devices. A decade of research has proved the feasibility of what has been termed Mobile Data Mining, with a focus on one mobile device running data mining processes. However, it was not until 2010 that the authors of this book initiated the Pocket Data Mining (PDM) project, exploiting the seamless communication among handheld devices performing data analysis tasks that were infeasible until recently. PDM is the process of collaboratively extracting knowledge from distributed data streams in a mobile computing environment. This book provides the reader with an in-depth treatment of this emerging area of research. Details of the techniques used and thorough experimental studies are given. More importantly and exclusive to this book, the authors provide a detailed practical guide on the depl...

  1. [Cultivation strategy and path analysis on big brand Chinese medicine for small and medium-sized enterprises].

    Science.gov (United States)

    Wang, Yong-Yan; Yang, Hong-Jun

    2014-03-01

    Small and medium-sized enterprises (SMEs) are important components of the Chinese medicine industry. However, the lack of big brands is becoming an urgent problem that is critical to the survival of SMEs. This article discusses the concept and traits of big-brand Chinese medicine from three aspects: clinical value, scientific value and market value. Guided by market value, highlighting clinical value, and aiming at improving scientific value during big-brand cultivation, we put forward the key points of cultivation: obtaining branded Chinese medicines with widely recognized efficacy, a good quality control system and a well-explained mechanism, which can meanwhile bring innovative improvement to the theory of Chinese medicine. According to the characteristics of SMEs, we hold the view that building a multidisciplinary research union is the basic path, and we probe the implementation strategy in three stages: top-level design, skill upgrading and application.

  2. The art of being small : brain-body size scaling in minute parasitic wasps

    NARCIS (Netherlands)

    Woude, van der Emma

    2017-01-01

    Haller’s rule states that small animals have relatively larger brains than large animals. This brain-body size relationship may enable small animals to maintain similar levels of brain performance as large animals. However, it also causes small animals to spend an exceptionally large proportion

  3. Small-world organization of self-similar modules in functional brain networks

    Science.gov (United States)

    Sigman, Mariano; Gallos, Lazaros; Makse, Hernan

    2012-02-01

    The modular organization of the brain implies the parallel nature of brain computations. These modules have to remain functionally independent, but at the same time they need to be sufficiently connected to guarantee the unitary nature of brain perception. Small-world architectures have been suggested as probable structures explaining this behavior. However, there is intrinsic tension between shortcuts generating small-worlds and the persistence of modularity. In this talk, we study correlations between the activity in different brain areas. We suggest that the functional brain network formed by the percolation of strong links is highly modular. Contrary to the common view, modules are self-similar and therefore are very far from being small-world. Incorporating the weak ties into the network converts it into a small-world preserving an underlying backbone of well-defined modules. Weak ties are shown to follow a pattern that maximizes information transfer with minimal wiring costs. This architecture is reminiscent of the concept of the strength of weak ties in social networks and provides a natural solution to the puzzle of efficient information flow in the highly modular structure of the brain.

  4. Functional connectomics from a "big data" perspective.

    Science.gov (United States)

    Xia, Mingrui; He, Yong

    2017-10-15

    In the last decade, explosive growth in functional connectome studies has been observed. Accumulating knowledge has significantly contributed to our understanding of the brain's functional network architectures in health and disease. With the development of innovative neuroimaging techniques, the establishment of large brain datasets and the increasing accumulation of published findings, functional connectomic research has begun to move into the era of "big data", which generates unprecedented opportunities for discovery in brain science and simultaneously encounters various challenging issues, such as data acquisition, management and analyses. Big data on the functional connectome exhibits several critical features: high spatial and/or temporal precision, large sample sizes, long-term recording of brain activity, multidimensional biological variables (e.g., imaging, genetic, demographic, cognitive and clinical) and/or vast quantities of existing findings. We review studies regarding functional connectomics from a big data perspective, with a focus on recent methodological advances in state-of-the-art image acquisition (e.g., multiband imaging), analysis approaches and statistical strategies (e.g., graph theoretical analysis, dynamic network analysis, independent component analysis, multivariate pattern analysis and machine learning), as well as reliability and reproducibility validations. We highlight the novel findings in the application of functional connectomic big data to the exploration of the biological mechanisms of cognitive functions, normal development and aging and of neurological and psychiatric disorders. We advocate the urgent need to expand efforts directed at the methodological challenges and discuss the direction of applications in this field. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  6. Big ambitions for small reactors as investors size up power options

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, John [nuclear24, Redditch (United Kingdom)]

    2016-04-15

    Earlier this year, US nuclear developer NuScale Power completed a study for the UK's National Nuclear Laboratory (NNL) that supported the suitability of NuScale's small modular reactor (SMR) technology for the effective disposition of plutonium. The UK is a frontrunner to compete in the SMR marketplace, both in terms of technological capabilities, trade and political commitment. Industry observers are openly speculating whether SMR design and construction could start to move ahead faster than 'big and conventional' nuclear construction projects - not just in the UK but worldwide. Economies of scale could increase the attraction of SMRs to investors and the general public.

  7. Big data challenges in decoding cortical activity in a human with quadriplegia to inform a brain computer interface.

    Science.gov (United States)

    Friedenberg, David A; Bouton, Chad E; Annetta, Nicholas V; Skomrock, Nicholas; Mingming Zhang; Schwemmer, Michael; Bockbrader, Marcia A; Mysiw, W Jerry; Rezai, Ali R; Bresler, Herbert S; Sharma, Gaurav

    2016-08-01

    Recent advances in Brain Computer Interfaces (BCIs) have created hope that one day paralyzed patients will be able to regain control of their paralyzed limbs. As part of an ongoing clinical study, we have implanted a 96-electrode Utah array in the motor cortex of a paralyzed human. The array generates almost 3 million data points from the brain every second. This presents several big data challenges towards developing algorithms that not only process the data in real time (for the BCI to be responsive) but are also robust to temporal variations and non-stationarities in the sensor data. We demonstrate an algorithmic approach to analyze such data and present a novel method to evaluate such algorithms. We present our methodology with examples of decoding human brain data in real time to inform a BCI.
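    The quoted figure of almost 3 million data points per second is consistent with each of the 96 electrodes being sampled at 30 kHz, a typical rate for Utah-array recordings; the sampling rate is an assumption here, as it is not stated in the abstract:

```python
# Back-of-envelope check of the stated data rate (sampling rate assumed).
electrodes = 96
sampling_rate_hz = 30_000  # typical for Utah-array neural recordings
samples_per_second = electrodes * sampling_rate_hz
# 96 x 30,000 = 2,880,000 samples/s, i.e. "almost 3 million" per second
print(samples_per_second)
```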

  8. Elective brain irradiation in patients with small-cell carcinoma of the lung: preliminary report

    International Nuclear Information System (INIS)

    Katsenis, A.T.; Karpasitis, N.; Giannakakis, D.; Maragoudakis, N.; Kiparissiadis, P.

    1982-01-01

    The brain is a common site of metastases in small-cell carcinoma of the lung. Prophylactic brain irradiation with doses of 4000-4500 rads in 3-4 weeks appears to decrease the occurrence of brain metastases, although it does not prevent them completely. In a group of patients with small-cell carcinoma of the lung and without evidence of brain metastases, the authors review the site and extent of the primary, the methods of treatment, the techniques of brain irradiation, and the relapse rate in relation to the status of the primary, as well as the rate of brain metastases in another group without prophylactic brain irradiation. They further attempt to investigate combined modalities of treatment that would prolong life and prevent neurological complications in the small number of long survivors with small-cell carcinoma of the lung. (Auth.)

  9. Brain networks: small-worlds, after all?

    International Nuclear Information System (INIS)

    Muller, Lyle; Destexhe, Alain; Rudolph-Lilith, Michelle

    2014-01-01

    Since its introduction, the ‘small-world’ effect has played a central role in network science, particularly in the analysis of the complex networks of the nervous system. From the cellular level to that of interconnected cortical regions, many analyses have revealed small-world properties in the networks of the brain. In this work, we revisit the quantification of small-worldness in neural graphs. We find that neural graphs fall into the ‘borderline’ regime of small-worldness, residing close to that of a random graph, especially when the degree sequence of the network is taken into account. We then apply recently introduced analytical expressions for clustering and distance measures to study this borderline small-worldness regime. We derive theoretical bounds for the minimal and maximal small-worldness index for a given graph, and by semi-analytical means, study the small-worldness index itself. With this approach, we find that graphs with small-worldness equivalent to that observed in experimental data are dominated by their random component. These results provide the first thorough analysis suggesting that neural graphs may reside far away from the maximally small-world regime. (paper)

  10. Brain networks: small-worlds, after all?

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Lyle; Destexhe, Alain; Rudolph-Lilith, Michelle [Unité de Neurosciences, Information et Complexité (UNIC), Centre National de la Recherche Scientifique (CNRS), 1 Avenue de la Terrasse, Gif-sur-Yvette (France)]

    2014-10-01

    Since its introduction, the ‘small-world’ effect has played a central role in network science, particularly in the analysis of the complex networks of the nervous system. From the cellular level to that of interconnected cortical regions, many analyses have revealed small-world properties in the networks of the brain. In this work, we revisit the quantification of small-worldness in neural graphs. We find that neural graphs fall into the ‘borderline’ regime of small-worldness, residing close to that of a random graph, especially when the degree sequence of the network is taken into account. We then apply recently introduced analytical expressions for clustering and distance measures to study this borderline small-worldness regime. We derive theoretical bounds for the minimal and maximal small-worldness index for a given graph, and by semi-analytical means, study the small-worldness index itself. With this approach, we find that graphs with small-worldness equivalent to that observed in experimental data are dominated by their random component. These results provide the first thorough analysis suggesting that neural graphs may reside far away from the maximally small-world regime. (paper)
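    The small-worldness index discussed in this record is conventionally defined relative to a degree-matched random graph (this is the standard formulation; the paper derives bounds on the index rather than simply computing it):

```latex
\sigma \;=\; \frac{C / C_{\mathrm{rand}}}{L / L_{\mathrm{rand}}},
\qquad \sigma > 1 \ \text{indicating small-worldness},
```

    where C and L are the clustering coefficient and characteristic path length of the observed graph, and C_rand and L_rand are those of a random graph with the same degree sequence. A graph "dominated by its random component" has C ≈ C_rand and L ≈ L_rand, hence σ ≈ 1, the borderline regime the authors describe.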

  11. Small Core, Big Network: A Comprehensive Approach to GIS Teaching Practice Based on Digital Three-Dimensional Campus Reconstruction

    Science.gov (United States)

    Cheng, Liang; Zhang, Wen; Wang, Jiechen; Li, Manchun; Zhong, Lishan

    2014-01-01

    Geographic information science (GIS) features a wide range of disciplines and has broad applicability. Challenges associated with rapidly developing GIS technology and the currently limited teaching and practice materials hinder universities from cultivating highly skilled GIS graduates. Based on the idea of "small core, big network," a…

  12. Value of brain computed tomography in small cell lung cancers

    International Nuclear Information System (INIS)

    Fernet, M.; Breau, J.L.; Goldlust, D.; Israel, L.

    1988-01-01

    88 patients with small cell lung cancer were studied. Brain scans were performed first at initial staging and repeated at regular intervals during follow-up. The results confirm the limited value of brain scans in the detection of metastases in neurologically asymptomatic patients.

  13. The lesioned brain: still a small world?

    Directory of Open Access Journals (Sweden)

    Linda Douw

    2010-11-01

    The intra-arterial amobarbital procedure (IAP or Wada test) is used to determine language lateralization and contralateral memory functioning in patients eligible for neurosurgery because of pharmaco-resistant epilepsy. During unilateral sedation, functioning of the contralateral hemisphere is assessed by means of neuropsychological tests. We use the IAP as a reversible model for the effect of lesions on brain network topology. Three artifact free epochs (4096 samples were selected from each EEG record before and after amobarbital injection. Functional connectivity was assessed by means of the synchronization likelihood (SL. The resulting functional connectivity matrices were constructed for all six epochs per patient in four frequency bands, and weighted network analysis was performed. The clustering coefficient, average path length, small-world-index, and edge weight correlation were calculated. Recordings of 33 patients were available. Network topology changed significantly after amobarbital injection: clustering decreased in all frequency bands, while path length decreased in the theta and lower alpha band, indicating a shift towards a more random network topology. Likewise, the edge weight correlation decreased after injection of amobarbital in the theta and beta bands. Network characteristics after injection of amobarbital were correlated with memory score: higher theta band small-world-index and increased upper alpha path length were related to better memory score. The whole-brain network topology in patients eligible for epilepsy surgery becomes more random and less optimally organized after selective sedation of one hemisphere, as has been reported in studies with brain tumor patients. Furthermore, memory functioning after injection seems related to network topology, indicating that functional performance is related to topological network properties of the brain.

  14. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  15. Small millets, big potential

    International Development Research Centre (IDRC) Digital Library (Canada)

    consumption of small millets, mainly due to limited productivity, high ... for effective integration of small millets in the ... replicated in other cities. ... to micro-, small- and medium-entrepreneurs producing millet-based ... and Activities Network,.

  16. Relapsing pattern of brain metastasis after brain irradiation in small cell lung cancer

    International Nuclear Information System (INIS)

    Murakami, Masao; Kuroda, Yasumasa; Okamoto, Yoshiaki; Kono, Koichi; Yoden, Eisaku; Mori, Takeki

    1997-01-01

    Many reports concerning radiation therapy for brain metastasis have been published, but which of the various methods advocated in these reports provides optimal control remains controversial. As the diagnosis of CNS metastasis improves, the associated therapeutic problems need to be reconsidered. We reviewed 67 patients with small cell lung cancer and brain metastasis who underwent brain irradiation (average 47 Gy over 5 weeks), including all 15 patients with brain relapse after the irradiation. Relapsing patterns in this clinical setting were divided into local regrowth in the same lesions and re-metastasis (reseeding) in other regions, by reviewing follow-up CT and MRI studies. Overall survival, measured from the initial brain irradiation, was longer in the 15 patients with brain relapse than in the 52 without relapse: 1- and 2-year survival rates were 47%/19% and 13%/8%, with median survival times of 10.8/5.7 months. Significant prognostic factors were limited to younger age, low LDH values, and improvement of neurological function (NF). Of the 15 patients with brain relapse, 4 developed local regrowth and 11 developed re-metastasis, with remission periods after brain irradiation of 172±94.4 and 393±281 days, respectively. Patients with re-metastasis had fewer brain metastases and lower LDH values. At the time of brain relapse, 11 patients had carcinomatous meningitis, and 4 patients were treated with whole-brain re-irradiation. All patients died of cancer, including 12 of relapsing CNS disease and 3 of the primary lesion or hepatic metastasis. Leukoencephalopathy developed in 2 patients. Survival after brain relapse ranged from 2 to 238 days, with no significant difference between local regrowth and re-metastasis. Given that, in our data on relapsing patterns after conventional fractionated brain irradiation to an objective dose of 50 Gy, 75% of brain relapses were re-metastases, we consider this irradiation appropriate for initial brain metastasis when disease is limited to the brain. (author)

  17. Systemic Chemotherapy for Progression of Brain Metastases in Extensive-Stage Small Cell Lung Cancer

    Directory of Open Access Journals (Sweden)

    Nagla Abdel Karim

    2015-01-01

    Full Text Available Lung cancer is the most common cause of cancer-related mortality in men and women. Approximately 15% of lung cancers are of the small cell type. Chemotherapy and radiation are the mainstay treatments. Currently, the standard chemotherapy regimen includes platinum/etoposide. For extensive-stage small cell lung cancer, irinotecan and cisplatin have also been used. Patients with relapsed small cell lung cancer have a very poor prognosis, and morbidity increases with brain metastases. Approximately 10%–14% of small cell lung cancer patients exhibit brain metastases at the time of diagnosis, which increases to 50%–80% as the disease progresses. Mean survival with brain metastases is reported to be less than six months, thus calling for improved regimens. Here we present a case series of patients treated with irinotecan for progressive brain metastases in small cell lung cancer, which serves as a reminder of the role of systemic chemotherapy in this setting.

  18. New solar telescope in Big Bear: evidence for super-diffusivity and small-scale solar dynamos?

    International Nuclear Information System (INIS)

    Goode, Philip R; Abramenko, Valentyna; Yurchyshyn, Vasyl

    2012-01-01

    The 1.6 m clear-aperture New Solar Telescope (NST) in Big Bear Solar Observatory (BBSO) is now providing the highest-resolution solar data ever. These data have revealed surprises about the Sun on small scales, including observations of bright points (BPs), which can be used as proxies for the intense, compact magnetic elements apparent in photospheric intergranular lanes. The BPs are ever more numerous on ever smaller spatial scales, as though there were no limit to how small the BPs can be. Here we discuss high-resolution NST data on BPs that provide support for the ideas that a turbulent regime of super-diffusivity dominates in the quiet Sun and that there are local dynamos operating near the solar surface. (comment)

  19. Big Bath as a Determinant of Creative Accounting in Small and Micro Enterprises

    Directory of Open Access Journals (Sweden)

    Lenka Zemánková

    2015-01-01

    Full Text Available Creative accounting is a 21st century phenomenon and, in the context of the economic crisis and deficit budgets, it has been receiving increasing attention, in particular in the area of prevention and detection of accounting manipulation. The focus of the research on small and micro-enterprises stems from the little attention paid to these enterprises and the undeniable importance of small and micro-enterprises for the economy. Primary research is based on a phenomenological paradigm, i.e. it focuses on understanding human behaviour on the basis of a reference framework for research participants. The main research method used is a comparative case study, which is one of the few methods that allow research into this sensitive topic. The research will focus on the existence of a big bath in the company’s ratio of profit to turnover as a determinant of a change in the company’s approach to creative accounting.

  20. The Astronaut Glove Challenge: Big Innovation from a (Very) Small Team

    Science.gov (United States)

    Homer, Peter

    2008-01-01

    Many measurements were taken by test engineers from Hamilton Sundstrand, the prime contractor for the current EVA suit. Because the raw measurements needed to be converted to torques and combined into a final score, it was impossible to keep track of who was ahead in this phase. The final comfort and dexterity test was performed in a depressurized glove box to simulate real on-orbit conditions. Each competitor was required to exercise the glove through a defined set of finger, thumb, and wrist motions without any sign of abrasion or bruising of the competitor's hand. I learned a lot about arm fatigue! This was a pass-fail event, and both of the remaining competitors came through intact. After taking what seemed like an eternity to tally the final scores, the judges announced that I had won the competition. My glove was the only one to have achieved lower finger-bending torques than the Phase VI glove. Looking back, I see three sources of the success of this project that I believe also operate in other programs where small teams have broken new ground in aerospace technologies. These are awareness, failure, and trust. By remaining aware of the big picture, continuously asking myself, "Am I converging on a solution?" and "Am I converging fast enough?" I was able to see that my original design was not going to succeed, leading to the decision to start over. I was also aware that, had I lingered over this choice or taken time to analyze it, I would not have been ready on the first day of competition. Failure forced me to look outside conventional thinking and opened the door to innovation. Choosing to make incremental failures enabled me to rapidly climb the learning curve. Trusting my "gut" feelings-which are really an internalized accumulation of experiences-and my newly acquired skills allowed me to devise new technologies rapidly and complete both gloves just in time. 
Awareness, failure, and trust are intertwined: failure provides experiences that inform awareness

  1. Graph analysis of structural brain networks in Alzheimer's disease: beyond small world properties.

    Science.gov (United States)

    John, Majnu; Ikuta, Toshikazu; Ferbinteanu, Janina

    2017-03-01

    Changes in brain connectivity in patients with early Alzheimer's disease (AD) have been investigated using graph analysis. However, these studies were based on small data sets, explored a limited range of network parameters, and did not focus on more restricted sub-networks, where neurodegenerative processes may introduce more prominent alterations. In this study, we constructed structural brain networks out of 87 regions using data from 135 healthy elders and 100 early AD patients selected from the Open Access Series of Imaging Studies (OASIS) database. We evaluated the graph properties of these networks by investigating metrics of network efficiency, small world properties, segregation, product measures of complexity, and entropy. Because degenerative processes take place at different rates in different brain areas, analysis restricted to sub-networks may reveal changes otherwise undetected. Therefore, we first analyzed the graph properties of a network encompassing all brain areas considered together, and then repeated the analysis after dividing the brain areas into two sub-networks constructed by applying a clustering algorithm. At the level of the large-scale network, the analysis did not reveal differences between AD patients and controls. In contrast, the same analysis performed on the two sub-networks revealed that small worldness diminished with AD only in the sub-network containing the areas of the medial temporal lobe known to be the earliest and most heavily affected. The second sub-network, which did not present significant AD-induced modifications of 'classical' small world parameters, nonetheless showed a trend towards an increase in small world propensity, a novel metric that unbiasedly quantifies small world structure. Beyond small world properties, complexity and entropy measures indicated that the intricacy of connection patterns and structural diversity decreased in both sub-networks. These results show that neurodegenerative processes impact volumetric

  2. Driving and driven architectures of directed small-world human brain functional networks.

    Directory of Open Access Journals (Sweden)

    Chaogan Yan

    Full Text Available Recently, increasing attention has been focused on the investigation of the human brain connectome, which describes the patterns of structural and functional connectivity networks of the human brain. Many studies of the human connectome have demonstrated that the brain network follows a small-world topology with an intrinsically cohesive modular structure and includes several network hubs in the medial parietal regions. However, most of these studies have only focused on undirected connections between regions, in which the directions of information flow are not taken into account. How the brain regions causally influence each other and how the directed network of the human brain is topologically organized remain largely unknown. Here, we applied linear multivariate Granger causality analysis (GCA) and graph theoretical approaches to a resting-state functional MRI dataset with a large cohort of young healthy participants (n = 86) to explore connectivity patterns of the population-based whole-brain functional directed network. This directed brain network exhibited prominent small-world properties, a clear improvement over previous functional MRI studies that found only weak small-world properties in directed brain networks using kernel-based GCA and individual-level analysis. This brain network also showed significant modular structures associated with 5 well-known subsystems: fronto-parietal, visual, paralimbic/limbic, subcortical and primary systems. Importantly, we identified several driving hubs predominantly located in components of the attentional network (e.g., the inferior frontal gyrus, supplementary motor area, insula and fusiform gyrus) and several driven hubs predominantly located in components of the default mode network (e.g., the precuneus, posterior cingulate gyrus, medial prefrontal cortex and inferior parietal lobule). Further split-half analyses indicated that our results were highly reproducible between two
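    The distinction between "driving" and "driven" hubs can be made concrete with degrees in a directed graph: a driving hub sends more connections than it receives, a driven hub the reverse. The sketch below uses a made-up binary connectivity matrix; the region names and links are illustrative assumptions, not the study's data:

    ```python
    # conn[i][j] = 1 means region i drives (e.g. Granger-causes) region j.
    regions = ["IFG", "SMA", "Insula", "Precuneus", "PCC", "mPFC"]
    conn = [
        [0, 1, 1, 1, 1, 0],
        [0, 0, 1, 1, 0, 1],
        [0, 1, 0, 1, 1, 0],
        [0, 0, 0, 0, 1, 0],
        [0, 0, 0, 1, 0, 0],
        [0, 0, 0, 1, 1, 0],
    ]

    out_deg = [sum(row) for row in conn]                                      # connections sent
    in_deg = [sum(conn[i][j] for i in range(len(conn))) for j in range(len(conn))]  # received

    # "Driving" hubs send more than they receive; "driven" hubs the reverse.
    driving = [r for r, o, i in zip(regions, out_deg, in_deg) if o > i]
    driven = [r for r, o, i in zip(regions, out_deg, in_deg) if i > o]
    print(driving, driven)
    ```

    In practice the matrix would come from thresholded pairwise GCA values, and hub status would additionally be tested against degree distributions rather than a simple out-versus-in comparison.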

  3. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features impact paradigm changes in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
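    One of the challenges listed, spurious correlation, is easy to demonstrate: with many independent random features and few samples, some feature will appear strongly correlated with an unrelated target purely by chance. A minimal sketch (the sample and feature counts are arbitrary choices, not from the article):

    ```python
    import math
    import random

    rng = random.Random(42)
    n, p = 50, 1000  # 50 samples, 1000 features, all independent noise

    def corr(x, y):
        """Pearson correlation of two equal-length sequences."""
        mx, my = sum(x) / len(x), sum(y) / len(y)
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / math.sqrt(sxx * syy)

    target = [rng.gauss(0, 1) for _ in range(n)]
    features = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(p)]

    # Every feature is independent of the target, yet the best-looking one
    # appears substantially correlated: the cost of many comparisons at small n.
    best = max(abs(corr(f, target)) for f in features)
    print(best)
    ```

    The maximum absolute correlation grows with the number of features screened, which is why naive variable selection on high-dimensional data produces false discoveries.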

  4. Functional magnetic resonance imaging of divergent and convergent thinking in Big-C creativity.

    Science.gov (United States)

    Japardi, Kevin; Bookheimer, Susan; Knudsen, Kendra; Ghahremani, Dara G; Bilder, Robert M

    2018-02-15

    The cognitive and physiological processes underlying creativity remain unclear, and very few studies to date have attempted to identify the behavioral and brain characteristics that distinguish exceptional ("Big-C") from everyday ("little-c") creativity. The Big-C Project examined functional brain responses during tasks demanding divergent and convergent thinking in 35 Big-C Visual Artists (VIS), 41 Big-C Scientists (SCI), and 31 individuals in a "smart comparison group" (SCG) matched to the Big-C groups on parental educational attainment and estimated IQ. Functional MRI (fMRI) scans included two activation paradigms widely used in prior creativity research, the Alternate Uses Task (AUT) and Remote Associates Task (RAT), to assess brain function during divergent and convergent thinking, respectively. Task performance did not differ between groups. Functional MRI activation in Big-C and SCG groups differed during the divergent thinking task. No differences in activation were seen during the convergent thinking task. Big-C groups had less activation than SCG in frontal pole, right frontal operculum, left middle frontal gyrus, and bilaterally in occipital cortex. SCI displayed lower frontal and parietal activation relative to the SCG when generating alternate uses in the AUT, while VIS displayed lower frontal activation than SCI and SCG when generating typical qualities (the control condition in the AUT). VIS showed more activation in right inferior frontal gyrus and left supramarginal gyrus relative to SCI. All groups displayed considerable overlapping activation during the RAT. The results confirm substantial overlap in functional activation across groups, but suggest that exceptionally creative individuals may depend less on task-positive networks during tasks that demand divergent thinking. Published by Elsevier Ltd.

  5. Statistical complexity is maximized in a small-world brain.

    Directory of Open Access Journals (Sweden)

    Teck Liang Tan

    Full Text Available In this paper, we study a network of Izhikevich neurons to explore what it means for a brain to be at the edge of chaos. To do so, we first constructed the phase diagram of a single Izhikevich excitatory neuron, and identified a small region of the parameter space where we find a large number of phase boundaries to serve as our edge of chaos. We then couple the outputs of these neurons directly to the parameters of other neurons, so that the neuron dynamics can drive transitions from one phase to another on an artificial energy landscape. Finally, we measure the statistical complexity of the parameter time series, while the network is tuned from a regular network to a random network using the Watts-Strogatz rewiring algorithm. We find that the statistical complexity of the parameter dynamics is maximized when the neuron network is most small-world-like. Our results suggest that the small-world architecture of neuron connections in brains is not accidental, but may be related to the information processing that they do.
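    The Watts-Strogatz rewiring used in this record to tune a network from regular to random can be sketched in plain Python. This is only the rewiring and a clustering measure; the paper's Izhikevich neuron dynamics and statistical complexity measure are not reproduced, and the parameters below are arbitrary:

    ```python
    import random
    from itertools import combinations

    def watts_strogatz(n, k, p, rng):
        """Ring lattice of n nodes, each joined to its k nearest neighbours on
        either side; each edge is rewired to a random non-neighbour with
        probability p (p=0: regular lattice, p=1: essentially random)."""
        adj = {i: set() for i in range(n)}
        for i in range(n):
            for j in range(1, k + 1):
                adj[i].add((i + j) % n)
                adj[(i + j) % n].add(i)
        for i in range(n):
            for j in range(1, k + 1):
                old = (i + j) % n
                if rng.random() < p and old in adj[i]:
                    candidates = [t for t in range(n) if t != i and t not in adj[i]]
                    if candidates:
                        new = rng.choice(candidates)
                        adj[i].discard(old); adj[old].discard(i)
                        adj[i].add(new); adj[new].add(i)
        return adj

    def mean_clustering(adj):
        """Average fraction of each node's neighbour pairs that are themselves linked."""
        total = 0.0
        for node, nbrs in adj.items():
            pairs = list(combinations(nbrs, 2))
            if pairs:
                total += sum(1 for a, b in pairs if b in adj[a]) / len(pairs)
        return total / len(adj)

    rng = random.Random(1)
    c_regular = mean_clustering(watts_strogatz(60, 3, 0.0, rng))  # 0.6 exactly for k=3
    c_random = mean_clustering(watts_strogatz(60, 3, 1.0, rng))   # much lower
    print(c_regular, c_random)
    ```

    At intermediate p the lattice keeps most of its clustering while the shortcuts collapse path lengths, which is the small-world regime the paper sweeps through.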

  6. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    Full Text Available In this research, some of the most promising Big Data usage domains are connected with distinct player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the major business-ecosystem player types disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players were explained through new Big Data opportunities and threats and by players’ responsive strategies. System dynamics was used to visualize relationships in the provided model.

  7. CCK-5: sequence analysis of a small cholecystokinin from canine brain and intestine

    International Nuclear Information System (INIS)

    Shively, J.; Reeve, J.R. Jr.; Eysselein, V.E.; Ben-Avram, C.; Vigna, S.R.; Walsh, J.H.

    1987-01-01

    The purpose of this study was to purify and chemically characterize cholecystokinin (CCK)-like peptides present in brain and gut extracts that elute from gel filtration after the octapeptide. Canine small intestinal mucosa and brain were boiled in water and then extracted in cold trifluoroacetic acid, and cholecystokinin-like immunoreactivity was determined by a carboxyl-terminal-specific radioimmunoassay. Gel permeation chromatography on Sephadex G-50 revealed a form of CCK apparently smaller than CCK-8. Microsequence analysis showed that the amino-terminal primary sequence of this small CCK was Gly-Trp-Met-Asp. Immunochemical and chromatographic analysis indicated that the carboxyl-terminal residue was Phe-NH2, and thus the full sequence is Gly-Trp-Met-Asp-Phe-NH2. An antibody that recognizes synthetic CCK-8, CCK-5, and CCK-4 equally did not reveal the presence of significant amounts of CCK-4. These results indicate that CCK-5 is the major CCK form smaller than the octapeptide present in brain and small intestine. This finding, coupled with the demonstration by others that CCK-5 interacts with high-affinity brain CCK receptors, indicates that CCK-5 may play a physiological role in brain function.

  8. Complex Behavior in a Selective Aging Neuron Model Based on Small World Networks

    International Nuclear Information System (INIS)

    Zhang Guiqing; Chen Tianlun

    2008-01-01

    Complex behavior in a simple selective-aging neuron model based on small-world networks is investigated. The basic elements of the model are endowed with the main features of neuron function. The structure of the selective-aging neuron model is discussed. We also give some properties of the new network and find that the neuron model displays power-law behavior. If the brain network is a small-world-like network, the mean avalanche size is almost unchanged unless the aging parameter is big enough.

  9. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  10. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line. Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportunities.

  11. Statistical Challenges in "Big Data" Human Neuroimaging.

    Science.gov (United States)

    Smith, Stephen M; Nichols, Thomas E

    2018-01-17

    Smith and Nichols discuss "big data" human neuroimaging studies, with very large subject numbers and amounts of data. These studies provide great opportunities for making new discoveries about the brain but raise many new analytical challenges and interpretational risks. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Small wormholes change our picture of the big bang

    CERN Multimedia

    1990-01-01

    Matt Visser has studied tiny wormholes, which may be produced on a subatomic scale by quantum fluctuations in the energy of the vacuum. He believes these quantum wormholes could change our picture of the origin of the Universe in the big bang (1/2 p)

  13. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  14. Big Cat Coalitions: A Comparative Analysis of Regional Brain Volumes in Felidae.

    Science.gov (United States)

    Sakai, Sharleen T; Arsznov, Bradley M; Hristova, Ani E; Yoon, Elise J; Lundrigan, Barbara L

    2016-01-01

    Broad-based species comparisons across mammalian orders suggest a number of factors that might influence the evolution of large brains. However, the relationship between these factors and total and regional brain size remains unclear. This study investigated the relationship between relative brain size and regional brain volumes and sociality in 13 felid species in hopes of revealing relationships that are not detected in more inclusive comparative studies. In addition, a more detailed analysis was conducted of four focal species: lions (Panthera leo), leopards (Panthera pardus), cougars (Puma concolor), and cheetahs (Acinonyx jubatus). These species differ markedly in sociality and behavioral flexibility, factors hypothesized to contribute to increased relative brain size and/or frontal cortex size. Lions are the only truly social species, living in prides. Although cheetahs are largely solitary, males often form small groups. Both leopards and cougars are solitary. Of the four species, leopards exhibit the most behavioral flexibility, readily adapting to changing circumstances. Regional brain volumes were analyzed using computed tomography. Skulls (n = 75) were scanned to create three-dimensional virtual endocasts, and regional brain volumes were measured using either sulcal or bony landmarks obtained from the endocasts or skulls. Phylogenetic least squares regression analyses found that sociality does not correspond with larger relative brain size in these species. However, the sociality/solitary variable significantly predicted anterior cerebrum (AC) volume, a region that includes frontal cortex. This latter finding is despite the fact that the two social species in our sample, lions and cheetahs, possess the largest and smallest relative AC volumes, respectively. Additionally, an ANOVA comparing regional brain volumes in four focal species revealed that lions and leopards, while not significantly different from one another, have relatively larger AC

  15. Big Cat Coalitions: A comparative analysis of regional brain volumes in Felidae

    Directory of Open Access Journals (Sweden)

    Sharleen T Sakai

    2016-10-01

    Full Text Available Broad-based species comparisons across mammalian orders suggest a number of factors that might influence the evolution of large brains. However, the relationship between these factors and total and regional brain size remains unclear. This study investigated the relationship between relative brain size and regional brain volumes and sociality in 13 felid species in hopes of revealing relationships that are not detected in more inclusive comparative studies. In addition, a more detailed analysis was conducted of 4 focal species: lions (Panthera leo), leopards (Panthera pardus), cougars (Puma concolor), and cheetahs (Acinonyx jubatus). These species differ markedly in sociality and behavioral flexibility, factors hypothesized to contribute to increased relative brain size and/or frontal cortex size. Lions are the only truly social species, living in prides. Although cheetahs are largely solitary, males often form small groups. Both leopards and cougars are solitary. Of the four species, leopards exhibit the most behavioral flexibility, readily adapting to changing circumstances. Regional brain volumes were analyzed using computed tomography (CT). Skulls (n=75) were scanned to create three-dimensional virtual endocasts, and regional brain volumes were measured using either sulcal or bony landmarks obtained from the endocasts or skulls. Phylogenetic least squares (PGLS) regression analyses found that sociality does not correspond with larger relative brain size in these species. However, the sociality/solitary variable significantly predicted anterior cerebrum (AC) volume, a region that includes frontal cortex. This latter finding is despite the fact that the two social species in our sample, lions and cheetahs, possess the largest and smallest relative AC volumes, respectively. Additionally, an ANOVA comparing regional brain volumes in 4 focal species revealed that lions and leopards, while not significantly different from one another, have relatively

  16. Metabolic and hemodynamic evaluation of brain metastases from small cell lung cancer with positron emission tomography

    DEFF Research Database (Denmark)

    Lassen, U; Andersen, P; Daugaard, G

    1998-01-01

    Brain metastases from small cell lung cancer respond to chemotherapy, but response duration is short and the intracerebral concentration of chemotherapy may be too low because of the characteristics of the blood-brain barrier. Positron emission tomography has been applied in a variety of tumors for studies of metabolic and hemodynamic features. This study was performed to determine the regional cerebral metabolic rate of glucose (rCMRglu), regional cerebral blood flow (rCBF), and regional cerebral blood volume (rCBV) in brain metastases from small cell lung cancer and the surrounding brain. Tumor r...

  17. "Small Steps, Big Rewards": You Can Prevent Type 2 Diabetes

    Science.gov (United States)

    "Small Steps, Big Rewards": You Can Prevent Type 2 Diabetes. Past Issues / Winter 2008 Table of Contents. Fifty-four million Americans are at risk for type 2 diabetes.

  18. [Microsurgery assisted by intraoperative magnetic resonance imaging and neuronavigation for small lesions in deep brain].

    Science.gov (United States)

    Song, Zhi-jun; Chen, Xiao-lei; Xu, Bai-nan; Sun, Zheng-hui; Sun, Guo-chen; Zhao, Yan; Wang, Fei; Wang, Yu-bo; Zhou, Ding-biao

    2012-01-03

    To explore the practicability and clinical efficacy of resecting small lesions in the deep brain with microsurgery assisted by intraoperative magnetic resonance imaging (iMRI) and neuronavigation. A total of 42 patients with small deep-brain lesions underwent iMRI- and neuronavigation-assisted microsurgery. Drift of the neuronavigation system was corrected with images acquired by intraoperative MR rescanning. All lesions were successfully identified, and total removal was achieved in 40 cases, without mortality. Only 3 patients developed new neurological deficits post-operatively, 2 of whom returned to normal neurological function over a follow-up of 3 months to 2 years. Intraoperative MRI can effectively correct neuronavigation drift and enhance the accuracy of microsurgical neuronavigation for small deep-brain lesions.

  19. Specific Regional and Age-Related Small Noncoding RNA Expression Patterns Within Superior Temporal Gyrus of Typical Human Brains Are Less Distinct in Autism Brains.

    Science.gov (United States)

    Stamova, Boryana; Ander, Bradley P; Barger, Nicole; Sharp, Frank R; Schumann, Cynthia M

    2015-12-01

    Small noncoding RNAs play a critical role in regulating messenger RNA throughout brain development and, when altered, could have profound effects leading to disorders such as autism spectrum disorders (ASD). We assessed small noncoding RNAs, including microRNA and small nucleolar RNA, in superior temporal sulcus association cortex and primary auditory cortex in typical and ASD brains from early childhood to adulthood. Typical small noncoding RNA expression profiles were less distinct in ASD, both between regions and in changes with age. Typical microRNA coexpression associations were absent in ASD brains. The microRNAs miR-132, miR-103, and miR-320 were dysregulated in ASD and have previously been associated with autism spectrum disorders. These diminished region- and age-related microRNA expression profiles are in line with previously reported findings of attenuated messenger RNA and long noncoding RNA in ASD brain. This study demonstrates alterations in superior temporal sulcus in ASD, a region implicated in social impairment, and is the first to demonstrate molecular alterations in the primary auditory cortex. © The Author(s) 2015.

  20. Prediction of brain target site concentrations on the basis of CSF PK : impact of mechanisms of blood-to-brain transport and within brain distribution

    NARCIS (Netherlands)

    Westerhout, J.

    2014-01-01

    In the development of drugs for the treatment of central nervous system (CNS) disorders, the prediction of human CNS drug action is a big challenge. Direct measurement of brain extracellular fluid (brainECF) concentrations is highly restricted in human. Therefore, unbound drug concentrations in

  1. A little big history of Tiananmen

    OpenAIRE

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why people built the gate the way they did can be found. These explanations are useful in their own right and may also be used to deepen our understanding of more traditional explanations of why Tiananmen ...

  2. The Study of “big data” to support internal business strategists

    Science.gov (United States)

    Ge, Mei

    2018-01-01

    How is big data different from previous data analysis systems? The primary purpose behind traditional small-data analytics, which most managers are more or less familiar with, is to support internal business strategies. But big data also offers a promising new dimension: discovering new opportunities to offer customers high-value products and services. This study introduces some of the strategies that big data can support. Business decisions using big data can also draw on several areas of analytics, including customer satisfaction, customer journeys, supply chains, risk management, competitive intelligence, pricing, and discovery and experimentation.

  3. Making a Big Bang on the small screen

    Science.gov (United States)

    Thomas, Nick

    2010-01-01

    While the quality of some TV sitcoms can leave viewers feeling cheated out of 30 minutes of their lives, audiences and critics are raving about the science-themed US comedy The Big Bang Theory. First shown on the CBS network in 2007, the series focuses on two brilliant postdoc physicists, Leonard and Sheldon, who are totally absorbed by science. Adhering to the stereotype, they also share a fanatical interest in science fiction, video-gaming and comic books, but unfortunately lack the social skills required to connect with their 20-something nonacademic contemporaries.

  4. Big Data solutions on a small scale: Evaluating accessible high-performance computing for social research

    OpenAIRE

    Murthy, Dhiraj; Bowman, S. A.

    2014-01-01

    Though full of promise, Big Data research success is often contingent on access to the newest, most advanced, and often expensive hardware systems and the expertise needed to build and implement such systems. As a result, the accessibility of the growing number of Big Data-capable technology solutions has often been the preserve of business analytics. Pay as you store/process services like Amazon Web Services have opened up possibilities for smaller scale Big Data projects. There is high demand for this type of research in the digital humanities and digital sociology, for example. ...

  5. A numerical simulation of pre-big bang cosmology

    CERN Document Server

    Maharana, J P; Veneziano, Gabriele

    1998-01-01

    We analyse numerically the onset of pre-big bang inflation in an inhomogeneous, spherically symmetric Universe. Adding a small dilatonic perturbation to a trivial (Milne) background, we find that suitable regions of space undergo dilaton-driven inflation and quickly become spatially flat ($\Omega \to 1$). Numerical calculations are pushed close enough to the big bang singularity to allow cross-checks against previously proposed analytic asymptotic solutions.

  6. Handedness- and brain size-related efficiency differences in small-world brain networks: a resting-state functional magnetic resonance imaging study.

    Science.gov (United States)

    Li, Meiling; Wang, Junping; Liu, Feng; Chen, Heng; Lu, Fengmei; Wu, Guorong; Yu, Chunshui; Chen, Huafu

    2015-05-01

    The human brain has been described as a complex network, which integrates information with high efficiency. However, the relationships between the efficiency of human brain functional networks and handedness and brain size remain unclear. Twenty-one left-handed and 32 right-handed healthy subjects underwent a resting-state functional magnetic resonance imaging scan. The whole brain functional networks were constructed by thresholding Pearson correlation matrices of 90 cortical and subcortical regions. Graph theory-based methods were employed to further analyze their topological properties. As expected, all participants demonstrated small-world topology, suggesting a highly efficient topological structure. Furthermore, we found that smaller brains showed higher local efficiency, whereas larger brains showed higher global efficiency, reflecting a suitable efficiency balance between local specialization and global integration of brain functional activity. Compared with right-handers, significant alterations in nodal efficiency were revealed in left-handers, involving the anterior and median cingulate gyrus, middle temporal gyrus, angular gyrus, and amygdala. Our findings indicated that the functional network organization in the human brain was associated with handedness and brain size.
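    The global and local efficiency measures used in the study above can be illustrated on a toy graph. The sketch below is a minimal illustration with an invented 6-node network, not the study's pipeline (which thresholds Pearson correlation matrices over 90 cortical and subcortical regions); it computes both measures with plain breadth-first search:

    ```python
    from collections import deque

    def shortest_path_lengths(adj, src):
        """BFS distances from src in an unweighted graph (adjacency dict)."""
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    def global_efficiency(adj):
        """Mean inverse shortest-path length over all ordered node pairs."""
        nodes = list(adj)
        n = len(nodes)
        total = 0.0
        for u in nodes:
            dist = shortest_path_lengths(adj, u)
            for v in nodes:
                if v != u and v in dist:
                    total += 1.0 / dist[v]
        return total / (n * (n - 1))

    def local_efficiency(adj):
        """Mean global efficiency of each node's neighbourhood subgraph."""
        effs = []
        for u, nbrs in adj.items():
            if len(nbrs) < 2:
                effs.append(0.0)
                continue
            sub = {v: [w for w in adj[v] if w in nbrs] for v in nbrs}
            effs.append(global_efficiency(sub))
        return sum(effs) / len(adj)

    # Toy 6-node network standing in for a parcellated brain graph
    adj = {
        0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4],
        3: [1, 4, 5], 4: [2, 3, 5], 5: [3, 4],
    }
    print(round(global_efficiency(adj), 3))  # → 0.756
    print(round(local_efficiency(adj), 3))   # → 0.556
    ```

    In this formalism, "larger brains showed higher global efficiency" means shorter average paths between regions, while "smaller brains showed higher local efficiency" means denser, more clustered neighbourhoods.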

  7. Genetics Home Reference: COL4A1-related brain small-vessel disease

    Science.gov (United States)

    Related resources include the Johns Hopkins Medicine Department of Neurology and Neurosurgery pages on intracerebral hemorrhage and stroke, and the MalaCards entry for COL4A1-related brain small-vessel disease.

  8. Big is not always beautiful - small can be a short cut to blue oceans

    DEFF Research Database (Denmark)

    Kvistgaard, Peter

    2007-01-01

    Often it is claimed that big investments are the only way to success in tourism and the experience economy: only by building some of the world's biggest hotels, like the ones in Dubai or Las Vegas where hotels with 3-4,000 rooms are not uncommon, can success be achieved. It is understandable that hotels have to be big in Las Vegas in order to secure a good return on investment, and that they build big hotels when 37 million people came to visit and 22,000 conventions were held in Las Vegas in 2004, according to the official website of Las Vegas (www.lasvegasnevada.gov/factsstatistics/funfacts.htm).

  9. Big Data solutions on a small scale: Evaluating accessible high-performance computing for social research

    Directory of Open Access Journals (Sweden)

    Dhiraj Murthy

    2014-11-01

    Full Text Available Though full of promise, Big Data research success is often contingent on access to the newest, most advanced, and often expensive hardware systems and the expertise needed to build and implement such systems. As a result, the accessibility of the growing number of Big Data-capable technology solutions has often been the preserve of business analytics. Pay as you store/process services like Amazon Web Services have opened up possibilities for smaller scale Big Data projects. There is high demand for this type of research in the digital humanities and digital sociology, for example. However, scholars are increasingly finding themselves at a disadvantage as available data sets of interest continue to grow in size and complexity. Without a large amount of funding or the ability to form interdisciplinary partnerships, only a select few find themselves in the position to successfully engage Big Data. This article identifies several notable and popular Big Data technologies typically implemented using large and extremely powerful cloud-based systems and investigates the feasibility and utility of development of Big Data analytics systems implemented using low-cost commodity hardware in basic and easily maintainable configurations for use within academic social research. Through our investigation and experimental case study (in the growing field of social Twitter analytics), we found that not only are solutions like Cloudera’s Hadoop feasible, but that they can also enable robust, deep, and fruitful research outcomes in a variety of use-case scenarios across the disciplines.

  10. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  11. The importance of hunting and hunting grounds for big and small game for tourism development in the basin of Crna Reka the Republic of Macedonia

    OpenAIRE

    Koteski, Cane; Jakovlev, Zlatko; Mitreva, Elizabeta; Angelkova, Tanja; Kitanov, Vladimir

    2012-01-01

    The paper presents the hunting grounds for big and small game, the structure of the areas of individual hunting grounds, fishing-water objects, fish species, and fishponds up to 20 years old, shown by municipality and by individual farms with ponds in the basin of Crna Reka.

  12. Radioactivity: ''small users, big problems''

    International Nuclear Information System (INIS)

    McDonnell, C.

    1993-01-01

    In the United Kingdom there are at least one thousand small users of radioactivity in industry, in medicine, in higher education establishments and even schools. These users of small amounts of radioactivity, covering a wide variety of forms and applications, have difficulty in disposing of their wastes. Disposal provisions for users outside the nuclear industry, the practical problems they encounter and the future developments likely are discussed. (UK)

  13. Targeting brain metastases in ALK-rearranged non-small-cell lung cancer.

    Science.gov (United States)

    Zhang, Isabella; Zaorsky, Nicholas G; Palmer, Joshua D; Mehra, Ranee; Lu, Bo

    2015-10-01

    The incidence of brain metastases has increased as a result of improved systemic control and advances in imaging. However, development of novel therapeutics with CNS activity has not advanced at the same rate. Research on molecular markers has revealed many potential targets for antineoplastic agents, and a particularly important aberration is translocation in the ALK gene, identified in non-small-cell lung cancer (NSCLC). ALK inhibitors have shown systemic efficacy against ALK-rearranged NSCLC in many clinical trials, but the effectiveness of crizotinib in CNS disease is limited by poor blood-brain barrier penetration and acquired drug resistance. In this Review, we discuss potential pathways to target ALK-rearranged brain metastases, including next generation ALK inhibitors with greater CNS penetration and mechanisms to overcome resistance. Other important mechanisms to control CNS disease include targeting pathways downstream of ALK phosphorylation, increasing the permeability of the blood-brain barrier, modifying the tumour microenvironment, and adding concurrent radiotherapy. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Strategies for transporting nanoparticles across the blood-brain barrier.

    Science.gov (United States)

    Zhang, Tian-Tian; Li, Wen; Meng, Guanmin; Wang, Pei; Liao, Wenzhen

    2016-02-01

    The existence of the blood-brain barrier (BBB) hampers the effective treatment of central nervous system (CNS) diseases. Almost all macromolecular drugs and more than 98% of small-molecule drugs cannot pass the BBB. Therefore, the BBB remains a big challenge for the delivery of therapeutics to the central nervous system. With the structural and mechanistic elucidation of the BBB under both physiological and pathological conditions, it is now possible to design delivery systems that can cross the BBB effectively. Because of their advantageous properties, nanoparticles have been widely deployed for brain-targeted delivery. This review presents the current understanding of the BBB under physiological and pathological conditions, and summarizes strategies and systems for BBB crossing with a focus on nanoparticle-based drug delivery systems. In summary, with wider applications and broader prospects in brain-targeted therapy, nano-medicines have proved to be more potent, more specific and less toxic than traditional drug therapy.

  15. A review of structural and functional brain networks: small world and atlas.

    Science.gov (United States)

    Yao, Zhijun; Hu, Bin; Xie, Yuanwei; Moore, Philip; Zheng, Jiaxiang

    2015-03-01

    Brain networks can be divided into two categories: structural and functional networks. Many studies of neuroscience have reported that the complex brain networks are characterized by small-world or scale-free properties. The identification of nodes is the key factor in studying the properties of networks on the macro-, micro- or mesoscale in both structural and functional networks. In the study of brain networks, nodes are always determined by atlases. Therefore, the selection of atlases is critical, and appropriate atlases are helpful to combine the analyses of structural and functional networks. Currently, some problems still exist in the establishment or usage of atlases, which are often caused by the segmentation or the parcellation of the brain. We suggest that quantification of brain networks might be affected by the selection of atlases to a large extent. In the process of building atlases, the influences of single subjects and groups should be balanced. In this article, we focused on the effects of atlases on the analysis of brain networks and the improved divisions based on the tractography or connectivity in the parcellation of atlases.

  16. Prediction of brain target site concentrations on the basis of CSF PK : impact of mechanisms of blood-to-brain transport and within brain distribution

    OpenAIRE

    Westerhout, J.

    2014-01-01

    In the development of drugs for the treatment of central nervous system (CNS) disorders, the prediction of human CNS drug action is a big challenge. Direct measurement of brain extracellular fluid (brainECF) concentrations is highly restricted in human. Therefore, unbound drug concentrations in human cerebrospinal fluid (CSF) are used as a surrogate for human brainECF concentrations. Due to qualitative and quantitative differences in processes that govern the pharmacokinetics (PK) of drugs in...

  17. In a world of big data, small effects can still matter: a reply to Boyce, Daly, Hounkpatin, and Wood (2017)

    OpenAIRE

    Matz, SC; Gladstone, JJ; Stillwell, David John

    2017-01-01

    We make three points in response to Boyce, Daly, Hounkpatin, and Wood (2017). First, we clarify a misunderstanding of the goal of our analyses, which was to investigate the links between life satisfaction and spending patterns, rather than spending volume. Second, we report a simulation study we ran to demonstrate that our results were not driven by the proposed statistical artifact. Finally, we discuss the broader issue of why, in a world of big data, small but reliable effect sizes can be v...

  18. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  19. Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives

    Science.gov (United States)

    Miron-Shatz, T.; Lau, A. Y. S.; Paton, C.

    2014-01-01

    Summary Objectives As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives of the evolving use of Big Data in science and healthcare and to examine some of the opportunities and challenges. Methods A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Results Scientists and healthcare providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts The concept of Big Data and associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and healthcare. Future exploration of issues surrounding data privacy, confidentiality, and education are needed. A greater focus on data from social media, the quantified-self movement, and the application of analytics to “small data” would also be useful. PMID:25123717

  20. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon in our society that can no longer be ignored. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's that are often mentioned in relation to big data entail? As an introduction to

  1. Small Data

    NARCIS (Netherlands)

    S. Pemberton (Steven)

    2014-01-01

    The term “Open Data” often goes hand in hand with the term “Big Data”, where large data sets get released allowing for analysis, but the Cinderella of the Open Data ball is Small Data, small amounts of data, nonetheless possibly essential, that are too small to be put in some database or

  2. Abnormal small-world brain functional networks in obsessive-compulsive disorder patients with poor insight.

    Science.gov (United States)

    Lei, Hui; Cui, Yan; Fan, Jie; Zhang, Xiaocui; Zhong, Mingtian; Yi, Jinyao; Cai, Lin; Yao, Dezhong; Zhu, Xiongzhao

    2017-09-01

    There are limited data on the neurobiological correlates of poor insight in obsessive-compulsive disorder (OCD). This study explored whether specific changes occur in small-world network (SWN) properties in the brain functional network of OCD patients with poor insight. Resting-state electroencephalograms (EEGs) were recorded for 12 medication-free OCD patients with poor insight, 50 medication-free OCD patients with good insight, and 36 healthy controls. Both OCD groups exhibited topological alterations in the brain functional network, characterized by abnormal small-world parameters in the beta band. However, alterations in the theta band existed only in the OCD patients with poor insight. A limitation is the relatively small sample size; in addition, subjects were naïve to medications and those with Axis I comorbidity were excluded, perhaps limiting generalizability. Disrupted functional integrity in the beta band of the brain functional network may be related to OCD, while disrupted functional integrity in the theta band may be associated with poor insight in OCD patients; thus this study might provide novel insight into our understanding of the pathophysiology of OCD. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. What would be outcome of a Big Crunch?

    CERN Document Server

    Hajdukovic, Dragan Slavkov

    2010-01-01

    I suggest the existence of a still undiscovered interaction: repulsion between matter and antimatter. The simplest and most elegant candidate for such a force is gravitational repulsion between particles and antiparticles. I argue that such a force may give birth to a new Universe by transforming an eventual Big Crunch of our Universe into an event similar to the Big Bang. In fact, when a collapsing Universe is reduced to a supermassive black hole of a small size, a very strong field of the conjectured force may create particle-antiparticle pairs from the surrounding vacuum. The amount of antimatter created from the physical vacuum is equal to the decrease in mass of the "black hole Universe" and is violently repelled from it. When the size of the black hole is sufficiently small, the creation of antimatter may become so huge and fast that the matter of our Universe may disappear in a fraction of the Planck time. Such a fast transformation of matter to antimatter may look like a Big Bang with the initial size about 30 o...

  4. Prolonged survival after resection and radiotherapy for solitary brain metastases from non-small-cell lung cancer

    International Nuclear Information System (INIS)

    Chee, R. J.; Bydder, S.; Cameron, F.

    2007-01-01

    Selected patients with brain metastases from non-small-cell lung cancer benefit from aggressive treatment. This report describes three patients who developed solitary brain metastases after previous resection of primary adenocarcinoma of the lung. Each underwent surgical resection of their brain metastasis followed by cranial irradiation and remain disease free 10 or more years later. Two patients developed cognitive impairment approximately 8 years after treatment of their brain metastasis, which was felt to be due to their previous brain irradiation. Here we discuss the treatment of solitary brain metastasis, particularly the value of combined method approaches in selected patients and dose-volume considerations

  5. Does Implementation of Big Data Analytics Improve Firms’ Market Value? Investors’ Reaction in Stock Market

    Directory of Open Access Journals (Sweden)

    Hansol Lee

    2017-06-01

    Full Text Available Recently, due to the development of social media, multimedia, and the Internet of Things (IoT), various types of data have increased. As existing data analytics tools cannot cover this huge volume of data, big data analytics has become one of the emerging technologies for business today. Considering that big data analytics is an up-to-date term, the present study investigated the short-term impact of implementing big data analytics. We used an event study methodology to investigate the changes in stock price caused by announcements of big data analytics solution investments. A total of 54 investment announcements of firms publicly traded on NASDAQ and NYSE from 2010 to 2015 were collected. Our results empirically demonstrate that the announcement of a firm's investment in a big data solution leads to positive stock market reactions. In addition, we also found that investments in small vendors' solutions with industry-oriented functions tend to result in higher abnormal returns than those in big vendors' solutions with general functions. Finally, our results also suggest that stock market investors evaluate the big data analytics investments of big firms more highly than those of small firms.
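    The event-study logic above can be sketched in a few lines. This is a hypothetical, minimal market-model illustration with invented daily returns; the paper's actual estimation windows, data, and test statistics are not reproduced here:

    ```python
    # Minimal market-model event study: estimate alpha/beta over an
    # estimation window, then measure the cumulative abnormal return (CAR)
    # around an announcement. All numbers are synthetic for illustration.

    def ols_alpha_beta(market, stock):
        """Ordinary least squares fit of stock = alpha + beta * market."""
        n = len(market)
        mx = sum(market) / n
        my = sum(stock) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(market, stock))
        var = sum((x - mx) ** 2 for x in market)
        beta = cov / var
        alpha = my - beta * mx
        return alpha, beta

    # Estimation window: daily returns before the announcement
    market_est = [0.010, -0.005, 0.002, 0.007, -0.003, 0.004]
    stock_est  = [0.012, -0.004, 0.001, 0.009, -0.002, 0.006]
    alpha, beta = ols_alpha_beta(market_est, stock_est)

    # Event window: returns around the big-data investment announcement
    market_evt = [0.003, 0.001, -0.002]
    stock_evt  = [0.010, 0.008, 0.001]

    # Abnormal return = actual return minus the market-model expectation
    abnormal = [s - (alpha + beta * m) for s, m in zip(stock_evt, market_evt)]
    car = sum(abnormal)
    print(f"CAR over event window: {car:.4f}")
    ```

    A positive CAR is the kind of "positive stock market reaction" the study reports; a full analysis would also test whether the CAR is statistically distinguishable from zero.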

  6. From Big Data to Big Displays High-Performance Visualization at Blue Brain

    KAUST Repository

    Eilemann, Stefan; Abdellah, Marwan; Antille, Nicolas; Bilgili, Ahmet; Chevtchenko, Grigory; Dumusc, Raphael; Favreau, Cyrille; Hernando, Juan; Nachbaur, Daniel; Podhajski, Pawel; Villafranca, Jafet; Schürmann, Felix

    2017-01-01

    Blue Brain has pushed high-performance visualization (HPV) to complement its HPC strategy since its inception in 2007. In 2011, this strategy has been accelerated to develop innovative visualization solutions through increased funding and strategic

  7. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  8. Gender Differences in Personality across the Ten Aspects of the Big Five.

    Science.gov (United States)

    Weisberg, Yanna J; Deyoung, Colin G; Hirsh, Jacob B

    2011-01-01

    This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness, and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscientiousness, the gender differences were found to diverge at the aspect level, rendering them either small or undetectable at the Big Five level. These findings clarify the nature of gender differences in personality and highlight the utility of measuring personality at the aspect level.

  9. The Problem with Big Data: Operating on Smaller Datasets to Bridge the Implementation Gap.

    Science.gov (United States)

    Mann, Richard P; Mushtaq, Faisal; White, Alan D; Mata-Cervantes, Gabriel; Pike, Tom; Coker, Dalton; Murdoch, Stuart; Hiles, Tim; Smith, Clare; Berridge, David; Hinchliffe, Suzanne; Hall, Geoff; Smye, Stephen; Wilkie, Richard M; Lodge, J Peter A; Mon-Williams, Mark

    2016-01-01

    Big datasets have the potential to revolutionize public health. However, there is a mismatch between the political and scientific optimism surrounding big data and the public's perception of its benefit. We suggest a systematic and concerted emphasis on developing models derived from smaller datasets to illustrate to the public how big data can produce tangible benefits in the long term. In order to highlight the immediate value of a small data approach, we produced a proof-of-concept model predicting hospital length of stay. The results demonstrate that existing small datasets can be used to create models that generate a reasonable prediction, facilitating health-care delivery. We propose that greater attention (and funding) needs to be directed toward the utilization of existing information resources in parallel with current efforts to create and exploit "big data."
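    A proof-of-concept predictor of the kind described above can be sketched on a small dataset. The features, values, and method below are invented for illustration (a k-nearest-neighbours average over age and comorbidity count), not the authors' actual model:

    ```python
    # Hypothetical small-data sketch: predict hospital length of stay (days)
    # from (age, number of comorbidities) using a k-nearest-neighbours mean.
    # All training records are synthetic.

    def knn_predict(train, query, k=3):
        """Predict length of stay as the mean over the k nearest patients.

        train: list of ((age, comorbidities), days)
        query: (age, comorbidities)
        """
        def dist(a, b):
            # Scale age so both features contribute on a similar scale
            return ((a[0] - b[0]) / 10.0) ** 2 + (a[1] - b[1]) ** 2

        nearest = sorted(train, key=lambda rec: dist(rec[0], query))[:k]
        return sum(days for _, days in nearest) / k

    train = [
        ((34, 0), 2.0), ((51, 1), 3.5), ((67, 2), 6.0),
        ((72, 3), 8.5), ((45, 1), 3.0), ((80, 2), 7.0),
    ]
    print(round(knn_predict(train, (70, 2)), 2))  # → 7.17
    ```

    Even a model this simple can generate a "reasonable prediction" for planning purposes, which is the point of the small-data argument: useful outputs do not have to wait for big-data infrastructure.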

  10. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  11. Možnosti využitia Big Data pre Competitive Inteligence

    OpenAIRE

    Verníček, Marek

    2016-01-01

    The main purpose of this thesis is to investigate the use of Big Data for the methods and procedures of Competitive Intelligence. Among the goals of the work is a toolkit for small and large businesses to support the whole process of working with Big Data. Another goal is to design an effective solution for processing Big Data to gain a competitive advantage in business. The theoretical part of the thesis reviews the scientific literature available in the Czech Republic a...

  12. Implementing the “Big Data” Concept in Official Statistics

    Directory of Open Access Journals (Sweden)

    О. V.

    2017-02-01

    Full Text Available Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open economies. Statistical science should take into account such phenomena as the gig economy, the common economy, institutional factors, etc. The concepts of “Big Data” and open data are analyzed, and problems of implementing “Big Data” in official statistics are shown. Ways of implementing “Big Data” in the official statistics of Ukraine, through active use of the technological opportunities of mobile operators, navigation systems, surveillance cameras, social networks, etc., are presented. The possibilities of using “Big Data” in different sectors of the economy, including at the level of individual companies, are shown. The problems of storing large volumes of data are highlighted. The study shows that “Big Data” is a huge resource that should be used across the Ukrainian economy.

  13. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  14. CXCR4/CXCL12 in Non-Small-Cell Lung Cancer Metastasis to the Brain

    Directory of Open Access Journals (Sweden)

    Sebastiano Cavallaro

    2013-01-01

Lung cancer represents the leading cause of cancer-related mortality throughout the world. Patients die of local progression, disseminated disease, or both. At least one third of the people with lung cancer develop brain metastases at some point during their disease, often even before the diagnosis of lung cancer is made. The high rate of brain metastasis makes lung cancer the most common type of tumor to spread to the brain. It is critical to understand the biologic basis of brain metastases to develop novel diagnostic and therapeutic approaches. This review will focus on the emerging data supporting the involvement of the chemokine CXCL12 and its receptor CXCR4 in the brain metastatic evolution of non-small-cell lung cancer (NSCLC) and the pharmacological tools that may be used to interfere with this signaling axis.

  15. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  16. Propeptide big-endothelin, N-terminal-pro brain natriuretic peptide and mortality. The Ludwigshafen risk and cardiovascular health (LURIC) study.

    Science.gov (United States)

    Gergei, Ingrid; Krämer, Bernhard K; Scharnagl, Hubert; Stojakovic, Tatjana; März, Winfried; Mondorf, Ulrich

    The endothelin system (Big-ET-1) is a key regulator in cardiovascular (CV) disease and congestive heart failure (CHF). We have examined the incremental value of Big-ET-1 in predicting total and CV mortality next to the well-established CV risk marker N-Terminal Pro-B-Type Natriuretic Peptide (NT-proBNP). Big-ET-1 and NT-proBNP were determined in 2829 participants referred for coronary angiography (follow-up 9.9 years). Big-ET-1 is an independent predictor of total and CV mortality and of death due to CHF. The conjunct use of Big-ET-1 and NT-proBNP improves the risk stratification of patients with intermediate to high risk of CV death and CHF. Big-ET-1 improves risk stratification in patients referred for coronary angiography.

  17. An object-based approach for detecting small brain lesions: application to Virchow-Robin spaces.

    Science.gov (United States)

    Descombes, Xavier; Kruggel, Frithjof; Wollny, Gert; Gertz, Hermann Josef

    2004-02-01

    This paper is concerned with the detection of multiple small brain lesions from magnetic resonance imaging (MRI) data. A model based on the marked point process framework is designed to detect Virchow-Robin spaces (VRSs). These tubular-shaped spaces are due to retraction of the brain parenchyma from its supplying arteries. VRSs are described by simple geometrical objects that are introduced as small tubular structures. Their radiometric properties are embedded in a data term. A prior model includes interactions describing the clustering property of VRSs. A Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm optimizes the proposed model, obtained by multiplying the prior and the data model. Example results are shown on T1-weighted MRI datasets of elderly subjects.
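The optimization described (a posterior obtained by multiplying a prior and a data term, explored with birth and death moves) can be sketched in miniature. Everything below is hypothetical: a 1-D toy with a hard-core prior, whereas the paper's actual model uses 3-D tubular marks, MRI radiometry, and a prior that *encourages* clustering:

```python
import math
import random
from itertools import combinations

random.seed(1)

# Synthetic 1-D "image": bright blobs at the true object locations.
TRUE = [2.0, 4.5, 7.0]

def data_term(x):
    # Radiometric stand-in: strong response near a bright blob.
    return 20.0 * max(math.exp(-8.0 * (x - t) ** 2) for t in TRUE)

def prior(points):
    # Hard-core interaction (no two objects closer than 0.4) plus a
    # Poisson-like penalty per object.
    if any(abs(a - b) < 0.4 for a, b in combinations(points, 2)):
        return 0.0
    return math.exp(-1.0 * len(points))

def posterior(points):
    p = prior(points)
    for x in points:
        p *= 0.05 + data_term(x)
    return p

points = []
for _ in range(20000):
    if random.random() < 0.5 or not points:   # birth proposal
        cand = points + [random.uniform(0.0, 10.0)]
    else:                                     # death proposal
        cand = list(points)
        cand.pop(random.randrange(len(cand)))
    # Metropolis acceptance (dimension-matching constants omitted in this toy).
    if random.random() < min(1.0, posterior(cand) / posterior(points)):
        points = cand

print(sorted(round(x, 1) for x in points))
```

After many sweeps the surviving points concentrate near the bright blobs, illustrating how birth and death moves let the chain jump between configurations with different numbers of objects.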

  18. Resources available for autism research in the big data era: a systematic review

    Directory of Open Access Journals (Sweden)

    Reem Al-jawahiri

    2017-01-01

Recently, there has been a move encouraged by many stakeholders towards generating big, open data in many areas of research. One area where big, open data is particularly valuable is in research relating to complex heterogeneous disorders such as Autism Spectrum Disorder (ASD). The inconsistencies of findings and the great heterogeneity of ASD necessitate the use of big and open data to tackle important challenges such as understanding and defining the heterogeneity and potential subtypes of ASD. To this end, a number of initiatives have been established that aim to develop big and/or open data resources for autism research. In order to provide a useful data reference for autism researchers, a systematic search for ASD data resources was conducted using the Scopus database, the Google search engine, and the pages on ‘recommended repositories’ by key journals, and the findings were translated into a comprehensive list focused on ASD data. The aim of this review is to systematically search for all available ASD data resources providing the following data types: phenotypic, neuroimaging, human brain connectivity matrices, human brain statistical maps, biospecimens, and ASD participant recruitment. A total of 33 resources were found containing different types of data from varying numbers of participants. Description of the data available from each data resource, and links to each resource is provided. Moreover, key implications are addressed and underrepresented areas of data are identified.

  19. Small Bodies, Big Discoveries: NASA's Small Bodies Education Program

    Science.gov (United States)

    Mayo, L.; Erickson, K. J.

    2014-12-01

    2014 is turning out to be a watershed year for celestial events involving the solar system's unsung heroes, small bodies. This includes the close flyby of comet C/2013 A1 / Siding Spring with Mars in October and the historic Rosetta mission with its Philae lander to comet 67P/Churyumov-Gerasimenko. Beyond 2014, the much anticipated 2015 Pluto flyby by New Horizons and the February Dawn Mission arrival at Ceres will take center stage. To deliver the excitement and wonder of our solar system's small bodies to worldwide audiences, NASA's JPL and GSFC education teams in partnership with NASA EDGE will reach out to the public through multiple venues including broadcast media, social media, science and math focused educational activities, observing challenges, interactive visualization tools like "Eyes on the Solar System" and more. This talk will highlight NASA's focused education effort to engage the public in small bodies mission science and the role these objects play in our understanding of the formation and evolution of the solar system.

  20. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  1. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  2. Brain anatomical networks in early human brain development.

    Science.gov (United States)

    Fan, Yong; Shi, Feng; Smith, Jeffrey Keith; Lin, Weili; Gilmore, John H; Shen, Dinggang

    2011-02-01

    Recent neuroimaging studies have demonstrated that human brain networks have economic small-world topology and modular organization, enabling efficient information transfer among brain regions. However, it remains largely unknown how the small-world topology and modular organization of human brain networks emerge and develop. Using longitudinal MRI data of 28 healthy pediatric subjects, collected at their ages of 1 month, 1 year, and 2 years, we analyzed development patterns of brain anatomical networks derived from morphological correlations of brain regional volumes. The results show that the brain network of 1-month-olds has the characteristically economic small-world topology and nonrandom modular organization. The network's cost efficiency increases with the brain development to 1 year and 2 years, so does the modularity, providing supportive evidence for the hypothesis that the small-world topology and the modular organization of brain networks are established during early brain development to support rapid synchronization and information transfer with minimal rewiring cost, as well as to balance between local processing and global integration of information. Copyright © 2010. Published by Elsevier Inc.
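The "economic small-world" properties referred to here combine high local clustering with short characteristic path lengths. A generic sketch of both metrics on a toy ring lattice with one long-range shortcut (not the study's morphological-correlation pipeline):

```python
from itertools import combinations
from collections import deque

def clustering(adj):
    # Mean local clustering coefficient: fraction of each node's
    # neighbor pairs that are themselves connected.
    cs = []
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            cs.append(0.0)
            continue
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        cs.append(2 * links / (k * (k - 1)))
    return sum(cs) / len(cs)

def avg_path_length(adj):
    # Characteristic path length via breadth-first search from each node.
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        for v, d in dist.items():
            if v != src:
                total += d
                pairs += 1
    return total / pairs

# Ring lattice (each node linked to its 4 nearest neighbors) plus one
# shortcut: clustering stays high while average path length drops.
n = 12
adj = {i: {(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n} for i in range(n)}
adj[0].add(6)
adj[6].add(0)

print(round(clustering(adj), 3), round(avg_path_length(adj), 3))
```

Even a single shortcut shortens paths noticeably while barely disturbing local clustering, which is the intuition behind small-world (and "economic small-world") topology.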

  3. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  4. Implementing the “Big Data” Concept in Official Statistics

    OpenAIRE

    О. V.

    2017-01-01

    Big data is a huge resource that needs to be used at all levels of economic planning. The article is devoted to the study of the development of the concept of “Big Data” in the world and its impact on the transformation of statistical simulation of economic processes. Statistics at the current stage should take into account the complex system of international economic relations, which functions in the conditions of globalization and brings new forms of economic development in small open ec...

  5. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  6. [Timing of Brain Radiation Therapy Impacts Outcomes in Patients with Non-small Cell Lung Cancer Who Develop Brain Metastases].

    Science.gov (United States)

    Wang, Yang; Fang, Jian; Nie, Jun; Dai, Ling; Hu, Weiheng; Zhang, Jie; Ma, Xiangjuan; Han, Jindi; Chen, Xiaoling; Tian, Guangming; Wu, Di; Han, Sen; Long, Jieran

    2016-08-20

    Radiotherapy combined with chemotherapy or molecular targeted therapy remains the standard of treatment for brain metastases from non-small cell lung cancer (NSCLC). The aim of this study is to determine whether the deferral of brain radiotherapy impacts patient outcomes. Between May 2003 and December 2015, a total of 198 patients with brain metastases from NSCLC who received both brain radiotherapy and systemic therapy (chemotherapy or targeted therapy) were identified. 127 patients received concurrent brain radiotherapy and systemic therapy, and 71 patients received deferred brain radiotherapy after at least two cycles of chemotherapy or targeted therapy. Disease-specific graded prognostic assessment was similar in the early radiotherapy and deferred radiotherapy groups, and the rate of grade 3-4 adverse reactions related to chemotherapy and radiotherapy showed no significant difference between the two groups. Median overall survival (OS) was longer in the early radiotherapy group than in the deferred radiotherapy group (17.9 months vs 12.6 months; P=0.038). Progression-free survival (PFS) was also improved in patients receiving early radiotherapy compared to those receiving deferred radiotherapy (4.0 months vs 3.0 months). Radiotherapy for brain metastases given as any line of therapy improved OS (20.0 months vs 10.7 months), and deferral of brain radiotherapy may result in inferior OS in patients with NSCLC who develop brain metastases. A prospective multi-center randomized study is urgently needed.
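Median OS and PFS figures like those reported in this record come from Kaplan-Meier survival estimates. A minimal pure-Python sketch of the estimator, run on made-up follow-up data rather than this study's cohort:

```python
def kaplan_meier(times, events):
    # times: follow-up in months; events: 1 = death observed, 0 = censored.
    order = sorted(zip(times, events))
    at_risk, surv, curve = len(order), 1.0, []
    i = 0
    while i < len(order):
        t = order[i][0]
        deaths = ties = 0
        while i < len(order) and order[i][0] == t:
            deaths += order[i][1]
            ties += 1
            i += 1
        if deaths:
            # Step the survival curve down at each event time.
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= ties  # events and censorings both leave the risk set
    return curve

def median_survival(curve):
    # First time at which estimated survival falls to 50% or below.
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached

# Hypothetical follow-up times (months) for a small group of patients.
early = kaplan_meier([3, 6, 9, 14, 18, 22, 25], [1, 1, 0, 1, 1, 0, 1])
print(early)
print(median_survival(early))
```

Comparing two such curves (e.g., early vs deferred radiotherapy) with a log-rank test is what produces the P-values quoted in abstracts like this one.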

  7. String Theory and Pre-big bang Cosmology

    CERN Document Server

    Gasperini, M.

    In string theory, the traditional picture of a Universe that emerges from the inflation of a very small and highly curved space-time patch is a possibility, not a necessity: quite different initial conditions are possible, and not necessarily unlikely. In particular, the duality symmetries of string theory suggest scenarios in which the Universe starts inflating from an initial state characterized by very small curvature and interactions. Such a state, being gravitationally unstable, will evolve towards higher curvature and coupling, until string-size effects and loop corrections make the Universe "bounce" into a standard, decreasing-curvature regime. In such a context, the hot big bang of conventional cosmology is replaced by a "hot big bounce" in which the bouncing and heating mechanisms originate from the quantum production of particles in the high-curvature, large-coupling pre-bounce phase. Here we briefly summarize the main features of this inflationary scenario, proposed a quarter century ago. In its si...

  8. Brain development, intelligence and cognitive outcome in children born small for gestational age.

    Science.gov (United States)

    de Bie, H M A; Oostrom, K J; Delemarre-van de Waal, H A

    2010-01-01

    Intrauterine growth restriction (IUGR) can lead to infants being born small for gestational age (SGA). SGA is associated with increased neonatal morbidity and mortality as well as short stature, cardiovascular disease, insulin resistance, diabetes mellitus type 2, dyslipidemia and end-stage renal disease in adulthood. In addition, SGA children have decreased levels of intelligence and cognition, although the effects are mostly subtle. The overall outcome of each child is the result of a complex interaction between intrauterine and extrauterine factors. Animal and human studies show structural alterations in the brains of individuals with IUGR/SGA. The presence of growth hormone (GH) receptors in the brain implies that the brain is also a target for GH. Exogenous GH theoretically has the ability to act on the brain. This is exemplified by the effects of GH on cognition in GH-deficient adults. In SGA children, data on the effect of exogenous GH on intelligence and cognition are scant and contradictory.

  9. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation that preceded Big Data, it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  10. Controlling the brain: How electrical stimulation can be used as an effective treatment for many brain disorders

    OpenAIRE

    Van Dongen, M.

    2010-01-01

    Our brain is the center of our nervous system. Literally everything we do, from eating an apple to solving a Schrödinger equation, is controlled by it. Usually we don't give much thought to the fact that our brain is so utterly important, but imagine that something starts to go drastically wrong inside the brain. Not being able to solve a Schrödinger equation might not be a big problem for the majority of people, but eating an apple is.

  11. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small: not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction, which are important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  12. Gamma Knife irradiation method based on dosimetric controls to target small areas in rat brains

    International Nuclear Information System (INIS)

    Constanzo, Julie; Paquette, Benoit; Charest, Gabriel; Masson-Côté, Laurence; Guillot, Mathieu

    2015-01-01

    Purpose: Targeted and whole-brain irradiation in humans can result in significant side effects causing decreased patient quality of life. To adequately investigate structural and functional alterations after stereotactic radiosurgery, preclinical studies are needed. The purpose of this work is to establish a robust standardized method of targeted irradiation on small regions of the rat brain. Methods: Euthanized male Fischer rats were imaged in a stereotactic bed, by computed tomography (CT), to estimate positioning variations relative to the bregma skull reference point. Using a rat brain atlas and the stereotactic bregma coordinates obtained from CT images, different regions of the brain were delimited and a treatment plan was generated. A single isocenter treatment plan delivering ≥100 Gy in 100% of the target volume was produced by Leksell GammaPlan using the 4 mm diameter collimator of sectors 4, 5, 7, and 8 of the Gamma Knife unit. Impact of positioning deviations of the rat brain on dose deposition was simulated by GammaPlan and validated with dosimetric measurements. Results: The authors’ results showed that 90% of the target volume received 100 ± 8 Gy and the maximum of deposited dose was 125 ± 0.7 Gy, which corresponds to an excellent relative standard deviation of 0.6%. This dose deposition calculated with GammaPlan was validated with dosimetric films resulting in a dose-profile agreement within 5%, both in X- and Z-axes. Conclusions: The authors’ results demonstrate the feasibility of standardizing the irradiation procedure of a small volume in the rat brain using a Gamma Knife
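Coverage figures of the kind quoted above (percentage of the target volume at or above the prescription dose, maximum dose, relative standard deviation) can be computed from a voxel-wise dose list. The sketch below uses a small set of hypothetical doses, not the authors' GammaPlan export:

```python
import statistics

def coverage_stats(doses, prescription):
    # Fraction of voxels at or above the prescription dose.
    covered = sum(d >= prescription for d in doses) / len(doses)
    # Relative standard deviation of the dose distribution, in percent.
    rel_sd = statistics.stdev(doses) / statistics.mean(doses) * 100.0
    return covered, max(doses), rel_sd

# Hypothetical per-voxel doses (Gy) inside the target volume.
doses = [98, 102, 104, 110, 118, 120, 122, 124, 125, 123]
covered, dmax, rel_sd = coverage_stats(doses, 100)
print(f"{covered:.0%} of voxels >= 100 Gy, max {dmax} Gy, RSD {rel_sd:.1f}%")
```

A real workflow would run the same arithmetic over the full 3-D dose grid exported from the treatment planning system, restricted to the delineated target volume.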

  13. Start small, dream big: Experiences of physical activity in public spaces in Colombia.

    Science.gov (United States)

    Díaz Del Castillo, Adriana; González, Silvia Alejandra; Ríos, Ana Paola; Páez, Diana C; Torres, Andrea; Díaz, María Paula; Pratt, Michael; Sarmiento, Olga L

    2017-10-01

    Multi-sectoral strategies to promote active recreation and physical activity in public spaces are crucial to building a "culture of health". However, studies on the sustainability and scalability of these strategies are limited. This paper identifies the factors related to the sustainability and scaling up of two community-based programs offering physical activity classes in public spaces in Colombia: Bogotá's Recreovía and Colombia's "Healthy Habits and Lifestyles Program-HEVS". Both programs have been sustained for more than 10 years, and have benefited 1455 communities. We used a mixed-methods approach including semi-structured interviews, document review and an analysis of data regarding the programs' history, characteristics, funding, capacity building and challenges. Interviews were conducted between May and October 2015. Based on the sustainability frameworks of Shediac-Rizkallah and Bone and Scheirer, we developed categories to independently code each interview. All information was independently analyzed by four of the authors and cross-compared between programs. Findings showed that these programs underwent adaptation processes to address the challenges that threatened their continuation and growth. The primary strategies included flexibility/adaptability, investing in the working conditions and training of instructors, allocating public funds and requesting accountability, diversifying resources, having community support and champions at different levels and positions, and carrying out continuous advocacy to include physical activity in public policies. Recreovía and HEVS illustrate sustainability as an incremental process operating at multiple levels. Lessons learned for similar initiatives include the importance of individual actions and small events, a willingness to start small while dreaming big, being flexible, and prioritizing the human factor. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Radiolabeled cetuximab plus whole-brain irradiation (WBI) for the treatment of brain metastases from non-small cell lung cancer (NSCLC)

    International Nuclear Information System (INIS)

    Rades, Dirk; Nadrowitz, Roger; Buchmann, Inga; Meller, Birgit; Hunold, Peter; Noack, Frank; Schild, Steven E.

    2010-01-01

    Background and Purpose: The addition of systemic drugs to whole-brain irradiation has not improved the survival of patients with multiple brain metastases, most likely because the agents did not readily cross the blood-brain barrier (BBB). Radiolabeling of cetuximab was performed to investigate whether this antibody crosses the BBB. Case Report: A patient with multiple brain lesions from non-small cell lung cancer was investigated. The largest metastasis (40 x 33 x 27 mm) was selected as the reference lesion. On day 1, 200 mg/m 2 cetuximab (0.25% hot and 99.75% cold antibody) were given. On day 3, 200 mg/m 2 cetuximab (cold antibody) were given. Weekly doses of 250 mg/m 2 cetuximab were administered for 3 months. Results: The reference lesion showed enhancement of radiolabeled cetuximab ( 123 I-Erbi) on scintigraphy; 123 I-Erbi crossed the BBB and accumulated in the lesion. The reference lesion measured 31 x 22 x 21 mm at 4 months. Enhancement of contrast medium was less pronounced. Conclusion: This is the first demonstration of cetuximab crossing the BBB and accumulating in brain metastasis. (orig.)

  15. Radiolabeled cetuximab plus whole-brain irradiation (WBI) for the treatment of brain metastases from non-small cell lung cancer (NSCLC)

    Energy Technology Data Exchange (ETDEWEB)

    Rades, Dirk; Nadrowitz, Roger [Dept. of Radiation Oncology, Univ. of Luebeck (Germany); Buchmann, Inga; Meller, Birgit [Section of Nuclear Medicine, Univ. of Luebeck (Germany); Hunold, Peter [Dept. of Radiology, Univ. of Luebeck (Germany); Noack, Frank [Inst. of Pathology, Univ. of Luebeck (Germany); Schild, Steven E. [Dept. of Radiation Oncology, Mayo Clinic, Scottsdale, AZ (United States)

    2010-08-15

    Background and Purpose: The addition of systemic drugs to whole-brain irradiation has not improved the survival of patients with multiple brain metastases, most likely because the agents did not readily cross the blood-brain barrier (BBB). Radiolabeling of cetuximab was performed to investigate whether this antibody crosses the BBB. Case Report: A patient with multiple brain lesions from non-small cell lung cancer was investigated. The largest metastasis (40 x 33 x 27 mm) was selected as the reference lesion. On day 1, 200 mg/m{sup 2} cetuximab (0.25% hot and 99.75% cold antibody) were given. On day 3, 200 mg/m{sup 2} cetuximab (cold antibody) were given. Weekly doses of 250 mg/m{sup 2} cetuximab were administered for 3 months. Results: The reference lesion showed enhancement of radiolabeled cetuximab ({sup 123}I-Erbi) on scintigraphy; {sup 123}I-Erbi crossed the BBB and accumulated in the lesion. The reference lesion measured 31 x 22 x 21 mm at 4 months. Enhancement of contrast medium was less pronounced. Conclusion: This is the first demonstration of cetuximab crossing the BBB and accumulating in brain metastasis. (orig.)

  16. HOW THE BRAIN MAY WORK

    OpenAIRE

    Bartlett, Rodney

    2016-01-01

    The brain's functioning has always been a great mystery. Despite the remarkable progress of neuroscience since the 19th century, it remains a puzzle. While continued study of the nervous system is obviously absolutely essential to comprehending how the brain works, that alone may be insufficient. The input of other scientific disciplines appears necessary: fields like physics, holography, and even astronomy. Albert Einstein, one of the world's greatest physicists, regretted not making a big...

  17. Small Data

    OpenAIRE

    Pemberton, Steven

    2014-01-01

    The term “Open Data” often goes hand in hand with the term “Big Data”, where large data sets are released for analysis; but the Cinderella of the Open Data ball is Small Data: small amounts of data, nonetheless possibly essential, that are too small to be put into a database or online dataset and put to use. RDFa is a technology that allows Cinderella to go to the ball.

  18. Crowd-funded micro-grants for genomics and "big data": an actionable idea connecting small (artisan) science, infrastructure science, and citizen philanthropy.

    Science.gov (United States)

    Özdemir, Vural; Badr, Kamal F; Dove, Edward S; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N; Sabra, Ramzi; Sarkissian, Christineh N; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K; Kickbusch, Ilona

    2013-04-01

    Biomedical science in the 21st century is embedded in, and draws from, a digital commons and "Big Data" created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., "the lone genius" or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists, if only some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the "bottom one billion": the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while sharing similar disease

  19. Population and harvest trends of big game and small game species: a technical document supporting the USDA Forest Service Interim Update of the 2000 RPA Assessment

    Science.gov (United States)

    Curtis H. Flather; Michael S. Knowles; Stephen J. Brady

    2009-01-01

    This technical document supports the Forest Service's requirement to assess the status of renewable natural resources as mandated by the Forest and Rangeland Renewable Resources Planning Act of 1974 (RPA). It updates past reports on national and regional trends in population and harvest estimates for species classified as big game and small game. The trends...

  20. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  1. Therapeutic potential of brain-derived neurotrophic factor (BDNF) and a small molecular mimics of BDNF for traumatic brain injury

    Directory of Open Access Journals (Sweden)

    Mary Wurzelmann

    2017-01-01

    Full Text Available Traumatic brain injury (TBI) is a major health problem worldwide. Following primary mechanical insults, a cascade of secondary injuries often leads to further neural tissue loss. Thus far there is no cure to rescue the damaged neural tissue. Current therapeutic strategies primarily target the secondary injuries focusing on neuroprotection and neuroregeneration. The neurotrophin brain-derived neurotrophic factor (BDNF) has significant effect in both aspects, promoting neuronal survival, synaptic plasticity and neurogenesis. Recently, the flavonoid 7,8-dihydroxyflavone (7,8-DHF), a small TrkB agonist that mimics BDNF function, has shown similar effects as BDNF in promoting neuronal survival and regeneration following TBI. Compared to BDNF, 7,8-DHF has a longer half-life and much smaller molecular size, capable of penetrating the blood-brain barrier, which makes it possible for non-invasive clinical application. In this review, we summarize functions of the BDNF/TrkB signaling pathway and studies examining the potential of BDNF and 7,8-DHF as a therapy for TBI.

  2. Therapeutic potential of brain-derived neurotrophic factor (BDNF) and a small molecular mimics of BDNF for traumatic brain injury.

    Science.gov (United States)

    Wurzelmann, Mary; Romeika, Jennifer; Sun, Dong

    2017-01-01

    Traumatic brain injury (TBI) is a major health problem worldwide. Following primary mechanical insults, a cascade of secondary injuries often leads to further neural tissue loss. Thus far there is no cure to rescue the damaged neural tissue. Current therapeutic strategies primarily target the secondary injuries focusing on neuroprotection and neuroregeneration. The neurotrophin brain-derived neurotrophic factor (BDNF) has significant effect in both aspects, promoting neuronal survival, synaptic plasticity and neurogenesis. Recently, the flavonoid 7,8-dihydroxyflavone (7,8-DHF), a small TrkB agonist that mimics BDNF function, has shown similar effects as BDNF in promoting neuronal survival and regeneration following TBI. Compared to BDNF, 7,8-DHF has a longer half-life and much smaller molecular size, capable of penetrating the blood-brain barrier, which makes it possible for non-invasive clinical application. In this review, we summarize functions of the BDNF/TrkB signaling pathway and studies examining the potential of BDNF and 7,8-DHF as a therapy for TBI.

  3. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  4. Small dose... big poison.

    Science.gov (United States)

    Braitberg, George; Oakley, Ed

    2010-11-01

    It is not possible to identify all toxic substances in a single journal article. However, there are some exposures that in small doses are potentially fatal. Many of these exposures are particularly toxic to children. Using data from poison control centres, it is possible to recognise this group of exposures. This article provides information to assist the general practitioner to identify potential toxic substance exposures in children. In this article the authors report the signs and symptoms of toxic exposures and identify the time of onset. Where clear recommendations on the period of observation and known fatal dose are available, these are provided. We do not discuss management or disposition, and advise readers to contact the Poison Information Service or a toxicologist for this advice.

  5. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  6. SU-E-T-457: Design and Characterization of An Economical 192Ir Hemi-Brain Small Animal Irradiator

    International Nuclear Information System (INIS)

    Grams, M; Wilson, Z; Sio, T; Beltran, C; Tryggestad, E; Gupta, S; Blackwell, C; McCollough, K; Sarkaria, J; Furutani, K

    2014-01-01

    Purpose: To describe the design and dosimetric characterization of a simple and economical small animal irradiator. Methods: A high dose rate 192Ir brachytherapy source from a commercially available afterloader was used with a 1.3 centimeter thick tungsten collimator to provide sharp beam penumbra suitable for hemi-brain irradiation of mice. The unit is equipped with continuous gas anesthesia to allow robust animal immobilization. Dosimetric characterization of the device was performed with Gafchromic film. The penumbra from the small animal irradiator was compared under similar collimating conditions to the penumbra from 6 MV photons, 6 MeV electrons, and 20 MeV electrons from a linear accelerator as well as 300 kVp photons from an orthovoltage unit and Monte Carlo simulated 90 MeV protons. Results: The tungsten collimator provides a sharp penumbra suitable for hemi-brain irradiation, and dose rates on the order of 200 cGy/minute were achieved. The sharpness of the penumbra attainable with this device compares favorably to those measured experimentally for 6 MV photons, and 6 and 20 MeV electron beams from a linear accelerator. Additionally, the penumbra was comparable to those measured for a 300 kVp orthovoltage beam and a Monte Carlo simulated 90 MeV proton beam. Conclusions: The small animal irradiator described here can be built for under $1,000 and used in conjunction with any commercial brachytherapy afterloader to provide a convenient and cost-effective option for small animal irradiation experiments. The unit offers high dose rate delivery and sharp penumbra, which is ideal for hemi-brain irradiation of mice. With slight modifications to the design, irradiation of sites other than the brain could be accomplished easily. Due to its simplicity and low cost, the apparatus described is an attractive alternative for small animal irradiation experiments requiring a sharp penumbra

  7. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  8. Understanding brains: details, intuition, and big data.

    Science.gov (United States)

    Marder, Eve

    2015-05-01

    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  9. Understanding Brains: Details, Intuition, and Big Data

    OpenAIRE

    Marder, Eve

    2015-01-01

    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  10. Understanding brains: details, intuition, and big data.

    Directory of Open Access Journals (Sweden)

    Eve Marder

    2015-05-01

    Full Text Available Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  11. Mapping fetal brain development in utero using magnetic resonance imaging: the Big Bang of brain mapping.

    Science.gov (United States)

    Studholme, Colin

    2011-08-15

    The development of tools to construct and investigate probabilistic maps of the adult human brain from magnetic resonance imaging (MRI) has led to advances in both basic neuroscience and clinical diagnosis. These tools are increasingly being applied to brain development in adolescence and childhood, and even to neonatal and premature neonatal imaging. Even earlier in development, parallel advances in clinical fetal MRI have led to its growing use as a tool in challenging medical conditions. This has motivated new engineering developments encompassing optimal fast MRI scans and techniques derived from computer vision, the combination of which allows full 3D imaging of the moving fetal brain in utero without sedation. These promise to provide a new and unprecedented window into early human brain growth. This article reviews the developments that have led us to this point, examines the current state of the art in the fields of fast fetal imaging and motion correction, and describes the tools to analyze dynamically changing fetal brain structure. New methods to deal with developmental tissue segmentation and the construction of spatiotemporal atlases are examined, together with techniques to map fetal brain growth patterns.

  12. Crowd-Funded Micro-Grants for Genomics and “Big Data”: An Actionable Idea Connecting Small (Artisan) Science, Infrastructure Science, and Citizen Philanthropy

    Science.gov (United States)

    Badr, Kamal F.; Dove, Edward S.; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J.; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N.; Sabra, Ramzi; Sarkissian, Christineh N.; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K.; Kickbusch, Ilona

    2013-01-01

    Abstract Biomedical science in the 21st century is embedded in, and draws from, a digital commons and “Big Data” created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., “the lone genius” or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists—only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the “bottom one billion”—the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while

  13. A Big Year for Small Bodies

    Science.gov (United States)

    Mayo, Louis; Erickson, K.

    2013-10-01

    2013 is a watershed year for celestial events involving the solar system's unsung heroes, small bodies: the Cosmic Valentine of Asteroid 2012 DA14, which passed within ~3.5 Earth radii of the Earth's surface (February 15, 2013); Comet C/2011 L4 PANSTARRS; and the Thanksgiving 2013 pass of Comet ISON, which will pass less than 0.012 AU (1.8 million km) from the solar surface and could be visible during the day. All this, in addition to Comet Lemmon and a host of meteor showers, makes 2013 a landmark year for delivering the excitement of planetary science to audiences worldwide. To deliver the excitement and wonder of our solar system's small bodies to worldwide audiences, NASA's JPL and GSFC education teams, in partnership with NASA EDGE, will reach out to the public through multiple venues including broadcast media, social media, science- and math-focused educational activities, observing challenges, interactive visualization tools like "Eyes on the Solar System" and more, culminating in the Thanksgiving Day Comet ISON perihelion passage. This talk will highlight NASA's focused education effort to engage the public in small bodies science and the role these objects play in our understanding of the formation and evolution of the solar system.

  14. A Dictionary Learning Approach for Signal Sampling in Task-Based fMRI for Reduction of Big Data

    Science.gov (United States)

    Ge, Bao; Li, Xiang; Jiang, Xi; Sun, Yifei; Liu, Tianming

    2018-01-01

    The exponential growth of fMRI big data offers researchers an unprecedented opportunity to explore functional brain networks. However, this opportunity has not been fully explored yet due to the lack of effective and efficient tools for handling such fMRI big data. One major challenge is that computing capabilities still lag behind the growth of large-scale fMRI databases, e.g., it takes many days to perform dictionary learning and sparse coding of whole-brain fMRI data for an fMRI database of average size. Therefore, how to reduce the data size but without losing important information becomes a more and more pressing issue. To address this problem, we propose a signal sampling approach for significant fMRI data reduction before performing structurally-guided dictionary learning and sparse coding of whole brain's fMRI data. We compared the proposed structurally guided sampling method with no sampling, random sampling and uniform sampling schemes, and experiments on the Human Connectome Project (HCP) task fMRI data demonstrated that the proposed method can achieve more than 15 times speed-up without sacrificing the accuracy in identifying task-evoked functional brain networks. PMID:29706880

  15. A Dictionary Learning Approach for Signal Sampling in Task-Based fMRI for Reduction of Big Data.

    Science.gov (United States)

    Ge, Bao; Li, Xiang; Jiang, Xi; Sun, Yifei; Liu, Tianming

    2018-01-01

    The exponential growth of fMRI big data offers researchers an unprecedented opportunity to explore functional brain networks. However, this opportunity has not been fully explored yet due to the lack of effective and efficient tools for handling such fMRI big data. One major challenge is that computing capabilities still lag behind the growth of large-scale fMRI databases, e.g., it takes many days to perform dictionary learning and sparse coding of whole-brain fMRI data for an fMRI database of average size. Therefore, how to reduce the data size but without losing important information becomes a more and more pressing issue. To address this problem, we propose a signal sampling approach for significant fMRI data reduction before performing structurally-guided dictionary learning and sparse coding of whole brain's fMRI data. We compared the proposed structurally guided sampling method with no sampling, random sampling and uniform sampling schemes, and experiments on the Human Connectome Project (HCP) task fMRI data demonstrated that the proposed method can achieve more than 15 times speed-up without sacrificing the accuracy in identifying task-evoked functional brain networks.
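    The sample-then-factorize idea in the abstract above can be illustrated with a small synthetic sketch. This is not the authors' structurally-guided scheme: the data are random, the sizes are arbitrary, and the uniform sampling here stands in for their structure-guided selection. It uses scikit-learn's MiniBatchDictionaryLearning to learn temporal dictionary atoms from a sampled subset of voxel time series, then sparse-codes every voxel against that dictionary.

    ```python
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    rng = np.random.RandomState(0)

    # Synthetic "fMRI" matrix: 200 time points x 5000 voxels.
    X = rng.randn(200, 5000)

    # Sample a subset of voxel signals before learning (uniform random
    # sampling here; the paper samples guided by brain structure).
    n_sampled = 500
    cols = rng.choice(X.shape[1], size=n_sampled, replace=False)
    X_sampled = X[:, cols]

    # Learn a temporal dictionary from the sampled signals only; each
    # voxel time series is one training sample for the dictionary.
    dico = MiniBatchDictionaryLearning(n_components=20, alpha=1.0,
                                       batch_size=50, random_state=0)
    dico.fit(X_sampled.T)

    # Sparse-code ALL voxels against the dictionary learned on the sample.
    codes = dico.transform(X.T)

    print(dico.components_.shape)  # (20, 200): 20 temporal atoms
    print(codes.shape)             # (5000, 20): atom loadings per voxel
    ```

    The speed-up claimed in the abstract comes from fitting the dictionary on the sampled minority of signals; only the cheaper coding step touches the full data.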

  16. Phase I Study of Concurrent Whole Brain Radiotherapy and Erlotinib for Multiple Brain Metastases From Non-Small-Cell Lung Cancer

    International Nuclear Information System (INIS)

    Lind, Joline S.W.; Lagerwaard, Frank J.; Smit, Egbert F.; Senan, Suresh

    2009-01-01

    Purpose: Erlotinib has shown activity in patients with brain metastases from non-small-cell lung cancer. The present dose-escalation Phase I trial evaluated the toxicity of whole brain radiotherapy (WBRT) with concurrent and maintenance erlotinib in this patient group. Methods and Materials: Erlotinib (Cohort 1, 100 mg/d; Cohort 2, 150 mg/d) was started 1 week before, and continued during, WBRT (30 Gy in 10 fractions). Maintenance erlotinib (150 mg/d) was continued until unacceptable toxicity or disease progression. Results: A total of 11 patients completed WBRT, 4 in Cohort 1 and 7 in Cohort 2. The median duration of erlotinib treatment was 83 days. No treatment-related neurotoxicity was observed. No treatment-related Grade 3 or greater toxicity occurred in Cohort 1. In Cohort 2, 1 patient developed a Grade 3 acneiform rash and 1 patient had Grade 3 fatigue. Two patients in Cohort 2 developed erlotinib-related interstitial lung disease, contributing to death during maintenance therapy. The median overall survival and interval to progression was 133 and 141 days, respectively. Six patients developed extracranial progression; only 1 patient had intracranial progression. In 7 patients with follow-up neuroimaging at 3 months, 5 had a partial response and 2 had stable disease. Conclusion: WBRT with concurrent erlotinib is well tolerated in patients with brain metastases from non-small-cell lung cancer. The suggestion of a high intracranial disease control rate warrants additional study.

  17. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TPC), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  18. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  19. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  20. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this research sought to study and analyze exhaustively the state of the art of Big Data; as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics of Big Data management, so as to cover everything concerning the central topic of the research. The methodology included reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; presenting some of the NoSQL databases, which are those that allow processing of data in unstructured formats; and showing data models and the technologies for analyzing them, closing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research is a first step toward understanding the Big Data environment.

  1. Four cases of small, traumatic hemorrhage in the deep midline portion of the brain

    International Nuclear Information System (INIS)

    Kim, Suho; Tsukahara, Tetsuya; Iwama, Mitsuru; Nishikawa, Michio

    1981-01-01

    Four cases recently encountered are presented in which computerized tomography (CT) demonstrated a small, traumatic hemorrhage in the deep midline portion of the brain. The lesions of hemorrhage revealed by CT were: Case 1, in the septum pellucidum and left lateral ventricle; Case 2, in the Monro's foramen and right lateral ventricle; and Case 3, in the midbrain. These three cases had no other abnormal findings. In addition, a hemorrhage of the corpus callosum and diffuse brain damage were seen in Case 4. These small hemorrhages might be caused not only by direct damage, but also by a local tendency to bleed due to histological fragility or the existence of a vascular anomaly, such as an AVM or cryptic angioma. The prognoses quoad vitam of our cases were relatively better than in previous reports of these hemorrhages, but the prognoses quoad functionem were poor. The patients have shown prolonged psychoneurological disorder; these symptoms might be caused by damage to the limbic system. (author)

  2. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
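    The kind of rule BigDansing distributes can be sketched in miniature. The following is a hypothetical single-machine toy, not BigDansing's actual API: a functional-dependency-style quality rule (same zip implies same city) whose naive evaluation enumerates tuple pairs, exactly the quadratic computation that the system's shared scans and specialized join operators are designed to scale.

    ```python
    from itertools import combinations

    # Toy relation; the quality rule says: same zip implies same city.
    rows = [
        {"id": 1, "zip": "10001", "city": "New York"},
        {"id": 2, "zip": "10001", "city": "New York"},
        {"id": 3, "zip": "10001", "city": "Boston"},   # violates the rule
        {"id": 4, "zip": "60601", "city": "Chicago"},
    ]

    def violations(rows):
        """Enumerate tuple pairs violating zip -> city. This naive O(n^2)
        scan is the cost a distributed cleansing system must optimize."""
        out = []
        for a, b in combinations(rows, 2):
            if a["zip"] == b["zip"] and a["city"] != b["city"]:
                out.append((a["id"], b["id"]))
        return out

    print(violations(rows))  # [(1, 3), (2, 3)]
    ```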

  3. A small world of weak ties provides optimal global integration of self-similar modules in functional brain networks.

    Science.gov (United States)

    Gallos, Lazaros K; Makse, Hernán A; Sigman, Mariano

    2012-02-21

    The human brain is organized in functional modules. Such an organization presents a basic conundrum: Modules ought to be sufficiently independent to guarantee functional specialization and sufficiently connected to bind multiple processors for efficient information transfer. It is commonly accepted that small-world architecture of short paths and large local clustering may solve this problem. However, there is intrinsic tension between shortcuts generating small worlds and the persistence of modularity, a global property unrelated to local clustering. Here, we present a possible solution to this puzzle. We first show that a modified percolation theory can define a set of hierarchically organized modules made of strong links in functional brain networks. These modules are "large-world" self-similar structures and, therefore, are far from being small-world. However, incorporating weaker ties to the network converts it into a small world preserving an underlying backbone of well-defined modules. Remarkably, weak ties are precisely organized as predicted by theory maximizing information transfer with minimal wiring cost. This trade-off architecture is reminiscent of the "strength of weak ties" crucial concept of social networks. Such a design suggests a natural solution to the paradox of efficient information flow in the highly modular structure of the brain.
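    The strong-tie/weak-tie architecture described above can be sketched on a toy graph with networkx (the two modules and the edge weights are illustrative assumptions, not the paper's brain data): thresholding away weak ties leaves disconnected, tightly clustered modules, while a single weak tie restores global integration at almost no cost to clustering.

    ```python
    import networkx as nx

    # Two tightly connected "modules" of strong ties (weight 1.0)...
    G = nx.Graph()
    for offset in (0, 10):
        for i in range(10):
            for j in range(i + 1, 10):
                G.add_edge(offset + i, offset + j, weight=1.0)

    # ...plus a single weak tie bridging them.
    G.add_edge(0, 10, weight=0.1)

    # Keeping only strong ties leaves two disconnected modules.
    strong_edges = [(u, v) for u, v, w in G.edges(data="weight") if w >= 0.5]
    strong = G.edge_subgraph(strong_edges)
    print(nx.number_connected_components(strong))  # 2

    # With the weak tie included the network is globally integrated
    # while local clustering stays essentially intact.
    print(nx.number_connected_components(G))       # 1
    print(round(nx.average_clustering(G), 2))      # 0.98
    ```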

  4. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension that is related to storage, analytics and visualization of big data; the human aspects of big data; and, in addition, the process management dimension that involves in a technological and business approach the aspects of big data management.

  5. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  6. Big-data-based edge biomarkers: study on dynamical drug sensitivity and resistance in individuals.

    Science.gov (United States)

    Zeng, Tao; Zhang, Wanwei; Yu, Xiangtian; Liu, Xiaoping; Li, Meiyi; Chen, Luonan

    2016-07-01

    Big-data-based edge biomarker is a new concept to characterize disease features based on biomedical big data in a dynamical and network manner, which also provides alternative strategies to indicate disease status in single samples. This article gives a comprehensive review on big-data-based edge biomarkers for complex diseases in an individual patient, which are defined as biomarkers based on network information and high-dimensional data. Specifically, we firstly introduce the sources and structures of biomedical big data accessible in public for edge biomarker and disease study. We show that biomedical big data are typically 'small-sample size in high-dimension space', i.e. small samples but with high dimensions on features (e.g. omics data) for each individual, in contrast to traditional big data in many other fields characterized as 'large-sample size in low-dimension space', i.e. big samples but with low dimensions on features. Then, we demonstrate the concept, model and algorithm for edge biomarkers and further big-data-based edge biomarkers. Dissimilar to conventional biomarkers, edge biomarkers, e.g. module biomarkers in module network rewiring-analysis, are able to predict the disease state by learning differential associations between molecules rather than differential expressions of molecules during disease progression or treatment in individual patients. In particular, in contrast to using the information of the common molecules or edges (i.e. molecule-pairs) across a population in traditional biomarkers including network and edge biomarkers, big-data-based edge biomarkers are specific for each individual and thus can accurately evaluate the disease state by considering the individual heterogeneity. Therefore, the measurement of big data in a high-dimensional space is required not only in the learning process but also in the diagnosing or predicting process of the tested individual. Finally, we provide a case study on analyzing the temporal expression
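    The core contrast the abstract draws, differential association of a molecule pair versus differential expression of single molecules, can be sketched with a toy computation (synthetic data; the pair, sample sizes, and scoring are illustrative assumptions, not the authors' algorithm):

    ```python
    import numpy as np

    rng = np.random.RandomState(1)
    n = 100  # samples per condition

    # Genes A and B: co-expressed in controls, decoupled in patients.
    a_ctrl = rng.randn(n)
    b_ctrl = a_ctrl + 0.3 * rng.randn(n)   # strong association
    a_case = rng.randn(n)
    b_case = rng.randn(n)                  # association lost

    def edge_score(x1, y1, x2, y2):
        """Differential correlation of one molecule pair between two
        conditions: the 'edge' feature, as opposed to comparing mean
        expression of single molecules."""
        r1 = np.corrcoef(x1, y1)[0, 1]
        r2 = np.corrcoef(x2, y2)[0, 1]
        return abs(r1 - r2)

    score = edge_score(a_ctrl, b_ctrl, a_case, b_case)
    print(score > 0.5)  # the A-B edge rewires between conditions
    ```

    Note that the mean expression of each gene is near zero in both conditions, so a single-molecule biomarker would miss this rewiring entirely.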

  7. CREDIT SCORING MODELS IN ESTIMATING THE CREDITWORTHINESS OF SMALL AND MEDIUM AND BIG ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Robert Zenzerović

    2011-02-01

    Full Text Available This paper focuses on estimating credit scoring models for companies operating in the Republic of Croatia. Given the level of economic and legal development, especially in the areas of bankruptcy regulation and business ethics in the Republic of Croatia, the derived models can also be applied in the wider region, particularly in South-eastern European countries that transitioned from state-directed to free-market economies twenty years ago. The purpose of this paper is to emphasize the relevance and usefulness of particular financial ratios in estimating the creditworthiness of business entities, which was realized through research on 110 companies. Alongside the commonly used research methods of description, analysis and synthesis, induction, deduction and surveys, the mathematical-statistical method of logistic regression took the central part in this research. The designed sample of 110 business entities represented the structure of firms operating in the Republic of Croatia by activity as well as by size. The sample was divided into two subsamples, the first consisting of small and medium enterprises (SMEs) and the second of big business entities. In the next phase, the logistic regression method was applied to 50 independent variables – financial ratios calculated for each sample unit – in order to find the ones that best discriminate financially stable from unstable companies. As the result of the logistic regression analysis, two credit scoring models were derived. The first model includes liquidity, solvency and profitability ratios and is applicable to SMEs. With its classification accuracy of 97%, the model has high predictive ability and can be used as an effective decision-support tool. The second model is applicable to big companies and includes only two independent variables – liquidity and solvency ratios. The classification accuracy of this model is 92.5% and, according to criteria of
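As a toy sketch of the logistic-regression step described above, the following separates financially "stable" (1) from "unstable" (0) firms using two financial ratios. The data, ratio values and training scheme are invented for illustration; the paper used 50 candidate ratios on real Croatian firms.

```python
import math
import random

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=500):
    """Stochastic gradient descent on the logistic log-loss."""
    w = [0.0] * (len(X[0]) + 1)  # w[0] is the intercept
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = sigmoid(z) - yi  # gradient factor of the log-loss
            w[0] -= lr * err
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * err * xj
    return w

def predict(w, xi):
    return int(sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))) >= 0.5)

# invented data: stable firms tend to have higher liquidity/solvency ratios
random.seed(0)
stable = [[1.5 + random.random(), 0.6 + 0.3 * random.random()] for _ in range(20)]
unstable = [[0.4 + 0.5 * random.random(), 0.1 + 0.2 * random.random()] for _ in range(20)]
X, y = stable + unstable, [1] * 20 + [0] * 20
w = fit_logistic(X, y)
acc = sum(predict(w, xi) == yi for xi, yi in zip(X, y)) / len(X)
print(f"classification accuracy on the training sample: {acc:.0%}")
```

On such cleanly separable toy data the classification accuracy approaches 100%; on real financial ratios it must of course be validated on held-out firms.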

  8. Bevacizumab and gefitinib enhanced whole-brain radiation therapy for brain metastases due to non-small-cell lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Yang, R.F.; Yu, B.; Zhang, R.Q.; Wang, X.H.; Li, C.; Wang, P.; Zhang, Y.; Han, B.; Gao, X.X.; Zhang, L. [Taian City Central Hospital, Taian, Shandong (China); Jiang, Z.M., E-mail: dmyh2436@126.com [Qianfoshan Hospital of Shandong Province, Shandong University, Ji’nan, Shandong (China)

    2018-02-01

    Non-small-cell lung cancer (NSCLC) patients who develop brain metastases usually have poor prognoses. This retrospective study aimed to assess whether bevacizumab or gefitinib can improve the effectiveness of whole-brain radiotherapy (WBRT) in managing patients with brain metastases. A total of 218 NSCLC patients with multiple brain metastases were retrospectively included in this study and were randomly allocated to a bevacizumab-gefitinib-WBRT group (n=76), a gefitinib-WBRT group (n=77) and a WBRT group (n=75). Tumor responses were evaluated every 2 months based on Response Evaluation Criteria in Solid Tumors version 1.0. Karnofsky performance status and neurologic examinations were documented every 6 months after treatment. Compared to standard WBRT, bevacizumab and gefitinib significantly enhanced the response rate (RR) and disease control rate (DCR) of WBRT (P<0.001). At the same time, the RR and DCR of patients who received bevacizumab-gefitinib-WBRT were higher than those of patients who received gefitinib-WBRT. The overall survival (OS) rates and progression-free survival (PFS) rates also differed significantly among the bevacizumab-gefitinib-WBRT (48.6 and 29.8%), gefitinib-WBRT (36.7 and 29.6%) and WBRT (9.8 and 14.6%) groups (P<0.05). Although bevacizumab-gefitinib-WBRT was slightly more toxic than gefitinib-WBRT, the toxicity was tolerable. As suggested by the prolonged PFS and OS, bevacizumab substantially improved the overall efficacy of WBRT in the management of patients with NSCLC. (author)

  9. Negative-mass lagging cores of the big bang

    International Nuclear Information System (INIS)

    Miller, B.D.

    1976-01-01

    Examples are given of spherically symmetric cosmological models containing space-sections with the following properties: at large values of the geometrically defined coordinate R, the mass is positive, while at small values of R, the mass is negative. The negative-mass region of spacetime has local properties similar to those of the negative-mass Schwarzschild solution. The big bang in these models is partially spacelike and partially timelike, so the spacetimes do not obey the strong form of the cosmic censorship hypothesis. The timelike, negative-mass segments of the big bang are unlimited sources of electromagnetic and gravitational radiation, and as such may be attractive as ''lagging core'' models of highly energetic astrophysical phenomena

  10. Negative-mass lagging cores of the big bang

    Energy Technology Data Exchange (ETDEWEB)

    Miller, B.D.

    1976-09-01

    Examples are given of spherically symmetric cosmological models containing space-sections with the following properties: at large values of the geometrically defined coordinate R, the mass is positive, while at small values of R, the mass is negative. The negative-mass region of spacetime has local properties similar to those of the negative-mass Schwarzschild solution. The big bang in these models is partially spacelike and partially timelike, so the spacetimes do not obey the strong form of the cosmic censorship hypothesis. The timelike, negative-mass segments of the big bang are unlimited sources of electromagnetic and gravitational radiation, and as such may be attractive as ''lagging core'' models of highly energetic astrophysical phenomena. (AIP)

  11. Experimental feeding of DDE and PCB to female big brown bats (Eptesicus fuscus)

    Science.gov (United States)

    Clark, D.R.; Prouty, R.M.

    1977-01-01

    Twenty-two female big brown bats (Eptesicus fuscus) were collected in a house attic in Montgomery County, Maryland. Seventeen were fed mealworms (Tenebrio molitor larvae) that contained 166 ppm DDE; the other five were fed uncontaminated mealworms. After 54 days of feeding, six dosed bats were frozen and the remaining 16 were starved to death. In a second experiment, 21 female big brown bats were collected in a house attic in Prince George's County, Maryland. Sixteen were fed mealworms that contained 9.4 ppm Aroclor 1254 (PCB). After 37 days, two bats had died, four dosed bats were frozen, and the remaining 15 were starved to death. Starvation caused mobilization of stored residues. After the feeding periods, average weights of all four groups (DDE-dosed, DDE control, PCB-dosed, PCB control) had increased. However, weights of DDE-dosed bats had increased significantly more than those of their controls, whereas weights of PCB-dosed bats had increased significantly less than those of their controls. During starvation, PCB-dosed bats lost weight significantly more slowly than controls. Because PCB levels in dosed bats resembled levels found in some free-living big brown bats, PCBs may be slowing metabolic rates of some free-living bats. It is not known how various common organochlorine residues may affect metabolism in hibernating bats. DDE and PCB increased in the brains of starving bats as carcass fat was metabolized. Because the tremors and/or convulsions characteristic of neurotoxicity were not observed, we think even the maximum brain levels attained (132 ppm DDE, 20 ppm PCB) were sublethal. However, extrapolation of our DDE data predicted lethal brain levels when fat reserves declined sufficiently. PCB-dosed bats were probably in no danger of neurotoxic poisoning. However, PCB can kill by a nonneurotoxic mode, and this could explain the deaths of two bats on PCB dosage.

  12. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  13. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of record-keeping, and IT-competent employees and customers that make a leading position possible, but only if companies get ready for the next big data wave.

  14. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data, the idea that an ever-larger volume of information is being constantly recorded, suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of conclusions obtained by statistical methods increases when they are used on big data, either because of a systematic error (bias) or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but how to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
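The second pitfall, weak but pervasive dependence, can be seen in a small simulation: observations sharing even a faint common component inflate the variance of the sample mean well beyond the i.i.d. rate. The equicorrelated-noise model, correlation value, and sample sizes below are illustrative choices, not taken from the paper.

```python
import random
import statistics

random.seed(1)
n, trials, rho = 1000, 400, 0.05  # sample size, repetitions, weak correlation

def sample_mean(dependent):
    common = random.gauss(0, 1)  # component shared by all n draws (if dependent)
    vals = []
    for _ in range(n):
        eps = random.gauss(0, 1)
        vals.append(rho**0.5 * common + (1 - rho)**0.5 * eps if dependent else eps)
    return sum(vals) / n

sd_indep = statistics.stdev(sample_mean(False) for _ in range(trials))
sd_dep = statistics.stdev(sample_mean(True) for _ in range(trials))
print(f"sd of the mean, independent draws: {sd_indep:.4f}")  # ~ 1/sqrt(n)
print(f"sd of the mean, dependent draws:   {sd_dep:.4f}")    # dominated by rho
```

With rho = 0.05 the variance of the mean is roughly (1-rho)/n + rho, so growing n past a few hundred barely helps: the "big" sample behaves like a small one.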

  15. Small-worldness and gender differences of large scale brain metabolic covariance networks in young adults: a FDG PET study of 400 subjects.

    Science.gov (United States)

    Hu, Yuxiao; Xu, Qiang; Shen, Junkang; Li, Kai; Zhu, Hong; Zhang, Zhiqiang; Lu, Guangming

    2015-02-01

    Many studies have demonstrated the small-worldness of the human brain and have revealed a sexual dimorphism in brain network properties. However, little is known about gender effects on the topological organization of brain metabolic covariance networks. To investigate small-worldness and gender differences in the topological architectures of human brain metabolic networks, FDG-PET data of 400 healthy right-handed subjects (200 women and 200 age-matched men) were included in the present study. Metabolic networks for each gender were constructed by calculating the covariance of regional cerebral glucose metabolism (rCMglc) across subjects on the basis of the AAL parcellation. Gender differences in network and nodal properties were investigated using graph-theoretical approaches. Moreover, the gender-related difference in rCMglc in each brain region was tested to investigate the relationships between hub regions and brain regions showing significant gender-related differences in rCMglc. We found prominent small-world properties in the metabolic networks of each gender. No significant gender difference in global characteristics was found. Gender differences in nodal characteristics were observed in a few brain regions. We also found bilateral and lateralized distributions of network hubs in the females and males, respectively. Furthermore, we report for the first time that some hubs in one gender were located in brain regions showing weaker rCMglc in that gender than in the other. The present study demonstrated that small-worldness exists in metabolic networks and revealed gender differences in the organizational patterns of metabolic networks. These results may provide insights into the metabolic substrates underlying individual differences in cognition and behavior. © The Foundation Acta Radiologica 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
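The "small-world" indices behind such analyses come down to two graph statistics: a high clustering coefficient C (as in a lattice) together with a short characteristic path length L (as in a random graph). The pure-Python sketch below computes both on a Watts-Strogatz-style ring lattice before and after adding random shortcuts; it illustrates the graph measures only, not the paper's AAL-based covariance pipeline, and all sizes and probabilities are arbitrary choices.

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Each node connected to its k nearest neighbours on a ring."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k // 2 + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    return adj

def rewire(adj, p, seed=0):
    """Rewire each edge with probability p to a random node (Watts-Strogatz style)."""
    rng = random.Random(seed)
    n = len(adj)
    for i in list(adj):
        for j in list(adj[i]):
            if j > i and rng.random() < p:
                new = rng.randrange(n)
                if new != i and new not in adj[i]:
                    adj[i].discard(j); adj[j].discard(i)
                    adj[i].add(new); adj[new].add(i)
    return adj

def clustering(adj):
    """Mean local clustering coefficient C."""
    total = 0.0
    for i, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def path_length(adj):
    """Characteristic path length L via BFS from every node."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

lattice = ring_lattice(60, 6)
small_world = rewire(ring_lattice(60, 6), p=0.1)
print(f"lattice:     C={clustering(lattice):.2f}  L={path_length(lattice):.2f}")
print(f"small-world: C={clustering(small_world):.2f}  L={path_length(small_world):.2f}")
```

A few shortcuts shrink L dramatically while C stays comparatively high; small-world indices used in the imaging literature quantify exactly this contrast against random and lattice references.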

  16. Think big, start small, move fast a blueprint for transformation from the Mayo Clinic Center for Innovation

    CERN Document Server

    LaRusso, Nicholas; Farrugia, Gianrico

    2015-01-01

    The Only Innovation Guide You Will Ever Need--from the Award-Winning Minds at Mayo Clinic. A lot of businesspeople talk about innovation, but few companies have achieved the level of truly transformative innovation as brilliantly--or as famously--as the legendary Mayo Clinic. Introducing Think Big, Start Small, Move Fast, the first innovation guide based on the proven, decade-long program that’s made Mayo Clinic one of the most respected and successful organizations in the world. This essential guide shows you how to: Inspire and ignite trailblazing innovation in your workplace; Design a new business model that’s creative, collaborative, and sustainable; Apply the traditional scientific method to the latest innovations in "design thinking"; Build a customized toolkit of the best practices, project portfolios, and strategies; Increase your innovation capacity--and watch how quickly you succeed. These field-tested techniques grew out of the health care industry but are designed ...

  17. Accelerated Fractionation In The Treatment of Brain Metastasis From Non-Small Cell Carcinoma of The Lung

    International Nuclear Information System (INIS)

    Hong, Seong Eon

    1994-01-01

    Purpose: Metastatic cancer to the brain is a major problem for patients with bronchogenic carcinoma, and most of these patients have a limited survival expectancy. To increase tumor control and/or decrease late morbidity, with a possible shortening of the overall treatment period, a multiple-daily-fraction technique for brain metastasis was performed. The author presents the results of accelerated fractionation radiotherapy in patients with brain metastases from non-small cell lung cancer. Materials and Methods: Twenty-six patients with brain metastases from non-small cell lung cancer between 1991 and 1993 received brain radiotherapy with a total dose of 48 Gy, at 2 Gy per fraction, twice a day with an interfraction interval of 6 hours, delivered 5 days a week. The whole brain was treated to 40 Gy, with the boost dose escalated to 8 Gy through a reduced field for single metastatic lesions. Twenty-four of the 26 patients completed the radiotherapy. Radiotherapy was interrupted in two patients with suspected progressive intracerebral disease. Results: This radiotherapy regimen appears to be comparable to the conventional schema in relief of symptoms. Three of the 24 patients experienced nausea and/or vomiting during the course of treatment because of acute irradiation toxicity. The author observed no excessive toxicity with the escalated dose of irradiation. An increment in median survival, although not statistically significant (p>0.05), was noted with the escalated dose (48 Gy) of accelerated fractionation (7 months) compared to conventional treatment (4.5 months). Median survival also increased in patients with solitary brain metastasis (9 months) compared to multiple extrathoracic sites (4 months), and in patients with good performance status (9 months versus 3.5 months); these differences were statistically significant (p<0.01). Conclusion: The increment in survival in patients with good prognostic factors such as controlled primary lesion, metastasis in brain only, and good performance status

  18. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap in real time the full potential of all the internal and external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  19. Survival prognostic factors for patients with synchronous brain oligometastatic non-small-cell lung carcinoma receiving local therapy

    Directory of Open Access Journals (Sweden)

    Bai H

    2016-07-01

    Full Text Available Hao Bai,1,* Jianlin Xu,1,* Haitang Yang,2,* Bo Jin,1 Yuqing Lou,1 Dan Wu,3 Baohui Han1 1Department of Pulmonary, 2Department of Pathology, 3Central Laboratory, Shanghai Chest Hospital, Shanghai Jiao Tong University, Shanghai, People’s Republic of China *These authors contributed equally to this work Introduction: Clinical evidence for patients with synchronous brain oligometastatic non-small-cell lung carcinoma is limited. We aimed to summarize the clinical data of these patients to explore the survival prognostic factors for this population. Methods: From September 1995 to July 2011, patients with 1–3 synchronous brain oligometastases, who were treated with stereotactic radiosurgery (SRS) or surgical resection as the primary treatment, were identified at Shanghai Chest Hospital. Results: A total of 76 patients (22 who underwent brain surgery as the primary treatment and 54 who received SRS) were available for survival analysis. The overall survival (OS) for patients treated with SRS and brain surgery as the primary treatment was 12.6 months (95% confidence interval [CI] 10.3–14.9) and 16.4 months (95% CI 8.8–24.1), respectively (adjusted hazard ratio = 0.59, 95% CI 0.33–1.07, P=0.08). Among the 76 patients treated with SRS or brain surgery, the 21 patients who underwent primary tumor resection did not experience a significantly improved OS (16.4 months, 95% CI 9.6–23.2) compared with those who did not undergo resection (11.9 months, 95% CI 9.7–14.0; adjusted hazard ratio = 0.81, 95% CI 0.46–1.44, P=0.46). Factors associated with survival benefits included stage I–II primary lung tumors and solitary brain metastasis. Conclusion: There was no significant difference in OS between patients with synchronous brain oligometastasis receiving SRS and those receiving surgical resection. Among this population, the number of brain metastases and the stage of the primary lung disease were the factors associated with a survival benefit. Keywords: non-small-cell lung carcinoma
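A median overall survival figure such as "12.6 months" is read off a Kaplan-Meier curve: the first follow-up time at which the estimated survival probability drops to 50% or below. The minimal estimator below uses invented follow-up times and event flags, not the study's data, and for simplicity avoids tied event times (which the full estimator groups together).

```python
def km_median(times, events):
    """times: follow-up in months; events: 1 = death observed, 0 = censored."""
    at_risk = len(times)
    surv = 1.0
    for t, e in sorted(zip(times, events)):
        if e:
            surv *= (at_risk - 1) / at_risk  # Kaplan-Meier product step
            if surv <= 0.5:
                return t  # first time the curve reaches 50% or below
        at_risk -= 1  # censored subjects leave the risk set without an event
    return None  # median not reached within follow-up

times = [3, 5, 7, 9, 10, 12, 13, 15, 18, 24]
events = [1, 1, 1, 0, 1, 1, 1, 0, 1, 1]
print("median OS:", km_median(times, events), "months")  # -> 12
```

Censoring is why the median cannot be taken as the plain median of the observed times: patients still alive at last contact contribute information without an event.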

  20. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  1. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from the fundamentals and builds up from there. It is intended to serve as a review of the state of the practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  2. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  3. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  4. A simple non-invasive method for measuring gross brain size in small live fish with semi-transparent heads

    Directory of Open Access Journals (Sweden)

    Joacim Näslund

    2014-09-01

    Full Text Available This paper describes a non-invasive method for estimating gross brain size in small fish with semi-transparent heads, using system camera equipment. Macro-photographs were taken from above of backlit, free-swimming fish under light anaesthesia. From the photographs, the width of the optic tectum was measured. This measure (TeO-measure) correlates well with the width of the optic tectum as measured from dissected-out brains in both brown trout fry and zebrafish (Pearson r > 0.90). The TeO-measure also correlates well with overall brain wet weight in brown trout fry (r = 0.90), but less well for zebrafish (r = 0.79). A non-invasive measure makes it possible to quickly assess brain size in a large number of individuals, as well as to repeatedly measure the brain size of live individuals, allowing calculation of brain growth.
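The validation step reduces to Pearson's correlation between paired measurements: the photo-based TeO width versus the width measured on dissected brains. The millimetre values below are invented to illustrate the computation; the paper reports r > 0.90 for both species.

```python
import statistics

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

teo_photo = [2.10, 2.25, 2.40, 2.05, 2.60, 2.35, 2.50, 2.20]      # mm, from photos
teo_dissected = [2.05, 2.30, 2.38, 2.10, 2.55, 2.30, 2.52, 2.18]  # mm, dissected
r = pearson_r(teo_photo, teo_dissected)
print(f"Pearson r = {r:.3f}")
```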

  5. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary of the ideas in Mayer-Schönberger's and Cukier's book explains that big data is where we use huge quantities of data to make better predictions by identifying patterns in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  6. From Big Data to Big Displays High-Performance Visualization at Blue Brain

    KAUST Repository

    Eilemann, Stefan

    2017-10-19

    Blue Brain has pushed high-performance visualization (HPV) to complement its HPC strategy since its inception in 2007. In 2011, this strategy has been accelerated to develop innovative visualization solutions through increased funding and strategic partnerships with other research institutions. We present the key elements of this HPV ecosystem, which integrates C++ visualization applications with novel collaborative display systems. We motivate how our strategy of transforming visualization engines into services enables a variety of use cases, not only for the integration with high-fidelity displays, but also to build service oriented architectures, to link into web applications and to provide remote services to Python applications.

  7. The concentration of erlotinib in the cerebrospinal fluid of patients with brain metastasis from non-small-cell lung cancer

    Science.gov (United States)

    DENG, YANMING; FENG, WEINENG; WU, JING; CHEN, ZECHENG; TANG, YICONG; ZHANG, HUA; LIANG, JIANMIAO; XIAN, HAIBING; ZHANG, SHUNDA

    2014-01-01

    It has been demonstrated that erlotinib is effective in treating patients with brain metastasis from non-small-cell lung cancer. However, the number of studies determining the erlotinib concentration in these patients is limited. The purpose of this study was to measure the concentration of erlotinib in the cerebrospinal fluid of patients with brain metastasis from non-small-cell lung carcinoma. Six patients were treated with the standard recommended daily dose of erlotinib (150 mg) for 4 weeks. All the patients had previously received chemotherapy, but no brain radiotherapy. At the end of the treatment period, blood plasma and cerebrospinal fluid samples were collected and the erlotinib concentration was determined by high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS). The average erlotinib concentration in the blood plasma and the cerebrospinal fluid was 717.7±459.7 and 23.7±13.4 ng/ml, respectively. The blood-brain barrier permeation rate of erlotinib was found to be 4.4±3.2%. In patients with partial response (PR), stable disease (SD) and progressive disease (PD), the average concentrations of erlotinib in the cerebrospinal fluid were 35.5±19.0, 19.1±8.7 and 16.4±5.9 ng/ml, respectively. In addition, the efficacy rate of erlotinib for metastatic brain lesions was 33.3%, increasing to 50% in patients with EGFR mutations. However, erlotinib appeared to be ineffective in cases with wild-type EGFR. In conclusion, a relatively high concentration of erlotinib was detected in the cerebrospinal fluid of patients with brain metastases from non-small-cell lung cancer. Thus, erlotinib may be considered as a treatment option for this patient population. PMID:24649318
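The blood-brain barrier permeation rate in this record is a CSF-to-plasma concentration ratio. A quick check with the reported group means gives about 3.3%; the paper's 4.4 ± 3.2% is the mean of per-patient ratios, which differs from the ratio of means when individual values are skewed.

```python
# Group means reported in the abstract (ng/ml).
csf_ng_ml = 23.7      # mean erlotinib concentration in cerebrospinal fluid
plasma_ng_ml = 717.7  # mean erlotinib concentration in blood plasma

permeation_pct = 100.0 * csf_ng_ml / plasma_ng_ml
print(f"permeation rate from group means: {permeation_pct:.1f}%")  # -> 3.3%
```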

  8. Waste management in small hospitals: trouble for environment.

    Science.gov (United States)

    Pant, Deepak

    2012-07-01

    Small hospitals are the grassroots of big hospital structures, so proper waste management practices need to be initiated there. Small hospitals contribute a lot to health care facilities, but due to their poor waste management practices, they pose a serious biomedical waste pollution problem. A survey was conducted with 13 focus questions among 100 hospitals in Dehradun. A greater amount of waste per bed per day was found among the small hospitals (178 g, compared with 114 g in big hospitals), indicating unskilled waste management practices. Small hospitals do not properly segregate the waste generated in the hospital, and most biomedical waste was collected without segregation into infectious and noninfectious categories.
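The per-bed rates are simple arithmetic: total daily waste divided by bed count. The daily totals and bed counts below are invented so the rates come out at the reported 178 g and 114 g; the survey does not give the underlying totals.

```python
def waste_per_bed(total_g_per_day, beds):
    """Daily biomedical waste normalized per bed (grams/bed/day)."""
    return total_g_per_day / beds

small = waste_per_bed(1780, 10)   # e.g. a 10-bed clinic producing 1.78 kg/day
big = waste_per_bed(11400, 100)   # e.g. a 100-bed hospital at 11.4 kg/day
print(f"small: {small:.0f} g/bed/day, big: {big:.0f} g/bed/day")
```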

  9. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions related to big data. In this first part I will try to set out Big Data theory and

  10. String theory and pre-big bang cosmology

    Science.gov (United States)

    Gasperini, M.; Veneziano, G.

    2016-09-01

    In string theory, the traditional picture of a Universe that emerges from the inflation of a very small and highly curved space-time patch is a possibility, not a necessity: quite different initial conditions are possible, and not necessarily unlikely. In particular, the duality symmetries of string theory suggest scenarios in which the Universe starts inflating from an initial state characterized by very small curvature and interactions. Such a state, being gravitationally unstable, will evolve towards higher curvature and coupling, until string-size effects and loop corrections make the Universe "bounce" into a standard, decreasing-curvature regime. In such a context, the hot big bang of conventional cosmology is replaced by a "hot big bounce" in which the bouncing and heating mechanisms originate from the quantum production of particles in the high-curvature, large-coupling pre-bounce phase. Here we briefly summarize the main features of this inflationary scenario, proposed a quarter century ago. In its simplest version (where it represents an alternative and not a complement to standard slow-roll inflation) it can produce a viable spectrum of density perturbations, together with a tensor component characterized by a "blue" spectral index with a peak in the GHz frequency range. That means, phenomenologically, a very small contribution to a primordial B-mode in the CMB polarization, and the possibility of a large enough stochastic background of gravitational waves to be measurable by present or future gravitational wave detectors.

  11. Selection of optimal treatment scheme for brain metastases of non-small cell lung cancer

    International Nuclear Information System (INIS)

    Dong Mingxin; Zhao Tong; Huang Jingzi; Yu Shukun; Ma Yan; Tian Zhongcheng; Jin Xiangshun; Quan Jizhong; Liu Jin; Wang Dongxu

    2006-01-01

    Objective: To select the optimal treatment scheme for brain metastases of non-small cell lung cancer (NSCLC). Methods: Seventy-two NSCLC cases with brain metastases, diagnosed by pathology, were randomly classified into three groups: Group I, 24 cases with whole-brain conventional external fractionated irradiation of DT 36-41 Gy/4-5 w; Group II, 22 cases with γ-knife treatment plus whole-brain conventional external fractionated irradiation; and Group III, 26 cases with γ-knife plus whole-brain conventional external fractionated irradiation in combination with Vm-26 chemotherapy. The surrounding area of the tumor was strictly covered by the 50% para-central dose curve in γ-knife treatment (DT 16-25 Gy with a mean of 16 Gy). The multileaf collimator was selected according to the volume of the tumors. Chemotherapy with Vm-26 (60 mg/m² d1-3) was applied during the treatment with whole-brain conventional external fractionated irradiation (DT 19-29 Gy/2-3 w), 21 days per cycle, 2 cycles in total. Results: The median survival time was estimated to be 6.0 months (range 1.2 to 19.0 months) in Group I, 9.2 months (4.4-30 months) in Group II, and 10.8 months (5.2-42.2 months) in Group III. The 1-year and 2-year survival rates were 34.6% and 12.6%, 62.2% and 30.2%, and 70.8% and 35.6% in Group I, Group II, and Group III, respectively. Conclusion: For brain metastases of NSCLC, γ-knife plus whole-brain conventional external fractionated irradiation combined with Vm-26 treatment had a significantly beneficial influence on local control and on the 1-year and 2-year survival rates. There were no complaints about the side effects of the treatment. (authors)

  12. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  13. Subthalamic deep brain stimulation modulates small fiber-dependent sensory thresholds in Parkinson's disease.

    Science.gov (United States)

    Ciampi de Andrade, Daniel; Lefaucheur, Jean-Pascal; Galhardoni, Ricardo; Ferreira, Karine S L; Brandão Paiva, Anderson Rodrigues; Bor-Seng-Shu, Edson; Alvarenga, Luciana; Myczkowski, Martin L; Marcolin, Marco Antonio; de Siqueira, Silvia R D T; Fonoff, Erich; Barbosa, Egberto Reis; Teixeira, Manoel Jacobsen

    2012-05-01

    The effects of deep brain stimulation of the subthalamic nucleus on nonmotor symptoms of Parkinson's disease (PD) rarely have been investigated. Among these, sensory disturbances, including chronic pain (CP), are frequent in these patients. The aim of this study was to evaluate the changes induced by deep brain stimulation in the perception of sensory stimuli, either noxious or innocuous, mediated by small or large nerve fibers. Sensory detection and pain thresholds were assessed in 25 PD patients all in the off-medication condition with the stimulator turned on or off (on- and off-stimulation conditions, respectively). The relationship between the changes induced by surgery on quantitative sensory testing, spontaneous CP, and motor abilities were studied. Quantitative sensory test results obtained in PD patients were compared with those of age-matched healthy subjects. Chronic pain was present in 72% of patients before vs 36% after surgery (P=.019). Compared with healthy subjects, PD patients had an increased sensitivity to innocuous thermal stimuli and mechanical pain, but a reduced sensitivity to innocuous mechanical stimuli. In addition, they had an increased pain rating when painful thermal stimuli were applied, particularly in the off-stimulation condition. In the on-stimulation condition, there was an increased sensitivity to innocuous thermal stimuli but a reduced sensitivity to mechanical or thermal pain. Pain provoked by thermal stimuli was reduced when the stimulator was turned on. Motor improvement positively correlated with changes in warm detection and heat pain thresholds. Subthalamic nucleus deep brain stimulation contributes to relieve pain associated with PD and specifically modulates small fiber-mediated sensations. Copyright © 2012 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  14. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  15. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  16. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things, and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move, and how is it analyzed? All these questions will be answered during the talk.

  17. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence for an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second later), and the fate of the Universe are all discussed. (U.K.)

  18. Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives. Contribution of the IMIA Social Media Working Group.

    Science.gov (United States)

    Hansen, M M; Miron-Shatz, T; Lau, A Y S; Paton, C

    2014-08-15

    As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives on the evolving use of Big Data in science and healthcare, and to examine some of the opportunities and challenges. A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and a small scale. Scientists and healthcare providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals in recognizing and synthesizing how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding thoughts: the concept of Big Data and the associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and healthcare. Future exploration of issues surrounding data privacy, confidentiality, and education is needed. A greater focus on data from social media, the quantified-self movement, and the application of analytics to "small data" would also be useful.

  19. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  20. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  1. Pengembangan Aplikasi Antarmuka Layanan Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Gede Karya

    2017-11-01

    Full Text Available In the 2016 Higher Competitive Grants Research (Hibah Bersaing Dikti), we successfully developed models, infrastructure, and modules for a Hadoop-based big data analysis application. We also developed a virtual private network (VPN) that allows integration with, and access to, this infrastructure from outside the FTIS Computer Laboratory. The infrastructure and analysis modules are now to be offered as services to small and medium enterprises (SMEs) in Indonesia. This research aims to develop an application interface for big data analysis services integrated with the Hadoop cluster. The research began with finding appropriate methods and techniques for scheduling jobs, calling ready-made Java Map-Reduce (MR) application modules, tunneling input/output, and constructing the meta-data of service requests (input) and service outputs. These methods and techniques were then developed into a web-based service application, as well as an executable module that runs in a Java/J2EE-based programming environment and can access the Hadoop cluster in the FTIS Computer Lab. The resulting application can be accessed by the public through the site http://bigdata.unpar.ac.id. Based on the test results, the application functions well in accordance with the specifications and can be used to perform big data analysis. Keywords: web-based service, big data analysis, Hadoop, J2EE
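
    The pattern described above — a service layer that schedules jobs and invokes prepackaged Java Map-Reduce modules on a Hadoop cluster — can be sketched with the standard `hadoop jar` command line. A minimal, hedged illustration (the jar name, class name, and HDFS paths are placeholders, not those of the UNPAR system):

```python
import subprocess

def build_cmd(jar_path, main_class, hdfs_in, hdfs_out):
    """Assemble the standard Hadoop job-submission command line,
    e.g. `hadoop jar wordcount.jar WordCount /data/in /data/out`."""
    return ["hadoop", "jar", jar_path, main_class, hdfs_in, hdfs_out]

def run_mapreduce(jar_path, main_class, hdfs_in, hdfs_out):
    """Submit a packaged Map-Reduce job and block until it finishes.
    A web service would typically wrap this call in a job queue."""
    result = subprocess.run(
        build_cmd(jar_path, main_class, hdfs_in, hdfs_out),
        capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError("job failed: " + result.stderr[-500:])
    return hdfs_out  # location of the Map-Reduce output on HDFS
```

    A production service would add authentication and queueing around `run_mapreduce`; here only the command construction is shown.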

  2. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were unreached before. Big data is generally characterized by three factors: volume, velocity, and variety. These factors distinguish it from traditional data use. The possibilities for utilizing this technology are vast. Big data technology has touch points in differ...

  3. Multielectrode recordings from auditory neurons in the brain of a small grasshopper.

    Science.gov (United States)

    Bhavsar, Mit Balvantray; Heinrich, Ralf; Stumpner, Andreas

    2015-12-30

    Grasshoppers have been used as a model system to study the neuronal basis of insect acoustic behavior. Auditory neurons have been described from intracellular recordings. The growing interest in studying the population activity of neurons has so far been satisfied by artificially combining data from different individuals. For the first time, we made multielectrode recordings from a small grasshopper brain. We used three 12 μm tungsten wires (combined in a multielectrode) to record from local brain neurons and from a population of auditory neurons entering the brain from the thorax. Spikes of the recorded units were separated by sorting algorithms and spike-collision analysis. The tungsten wires enabled stable recordings with a high signal-to-noise ratio. Due to the tight temporal coupling of auditory activity to the stimulus, spike collisions were frequent, and collision analysis retrieved 10-15% of additional spikes. Marking the electrode position was possible using a fluorescent dye or electrocoagulation with high current. Physiological identification of units described from intracellular recordings was hard to achieve. The 12 μm tungsten wires gave a better signal-to-noise ratio than the 15 μm copper wires previously used in recordings from bees' brains. Recording the population activity of auditory neurons in one individual avoids the interindividual and trial-to-trial variability that otherwise reduce the validity of the analysis. Double intracellular recordings have a quite low success rate and are therefore rarely achieved, and their stability is much lower than that of multielectrode recordings, which allow sampling of data for 30 min or more. Copyright © 2015 Elsevier B.V. All rights reserved.
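
    The first step of a spike-sorting pipeline like the one described above — pulling candidate spikes out of a continuous multielectrode trace — is often a simple upward threshold crossing with a refractory window. A schematic sketch only (the threshold, sampling rate, and toy trace are invented for illustration, not taken from the paper):

```python
def detect_spikes(trace, fs_hz, threshold, refractory_ms=1.0):
    """Return sample indices where the trace crosses `threshold` upward.
    A refractory window keeps one spike waveform from being counted twice."""
    refractory = max(1, int(fs_hz * refractory_ms / 1000.0))
    spikes, last = [], -refractory
    for i in range(1, len(trace)):
        if trace[i - 1] < threshold <= trace[i] and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes

# Toy trace: flat baseline with two brief "spikes" planted at known samples
trace = [0.0] * 500
trace[100], trace[300] = 1.0, 1.0
found = detect_spikes(trace, fs_hz=30000, threshold=0.5)
```

    On the toy trace the detector returns exactly the two planted spike positions; real traces would first be band-pass filtered, and overlapping waveforms would go on to the collision analysis the authors describe.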

  4. The Big Bang and the Search for a Theory of Everything

    Science.gov (United States)

    Kogut, Alan

    2010-01-01

    How did the universe begin? Is the gravitational physics that governs the shape and evolution of the cosmos connected in a fundamental way to the sub-atomic physics of particle colliders? Light from the Big Bang still permeates the universe and carries within it faint clues to the physics at the start of space and time. I will describe how current and planned measurements of the cosmic microwave background will observe the Big Bang to provide new insight into a "Theory of Everything" uniting the physics of the very large with the physics of the very small.

  5. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  6. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Authors: Ariel Hamlin, Nabil... (contact: arkady@ll.mit.edu). Distribution A: Public Release.

  7. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the very attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  8. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. This growth creates a situation in which classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT: cameras, satellites, cars, GPS navigation, etc.). The challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years; it is also recognized in the business world and, increasingly, in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  9. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data is data that exceeds the storage capacity and processing power of conventional systems. The term is used for data sets so large or complex that traditional tools cannot handle them. The size of big data is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes, as the amount of data produced by people, for example on social networking sites, grows rapidly every year. Big data is not only data; it has become a complete subject that includes various tools, techniques, and frameworks, covering the spread and evolution of both structured and unstructured data. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from large datasets that are diverse, complex, and of massive scale. Such data are difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. A big data environment is used to acquire, organize, and analyze the various types of data. In this paper we describe applications, problems, and tools of big data and give an overview of the field.

  10. Hemisphere- and gender-related differences in small-world brain networks: a resting-state functional MRI study.

    Science.gov (United States)

    Tian, Lixia; Wang, Jinhui; Yan, Chaogan; He, Yong

    2011-01-01

    We employed resting-state functional MRI (R-fMRI) to investigate hemisphere- and gender-related differences in the topological organization of human brain functional networks. Brain networks were first constructed by measuring inter-regional temporal correlations of R-fMRI data within each hemisphere in 86 young, healthy, right-handed adults (38 males and 48 females) followed by a graph-theory analysis. The hemispheric networks exhibit small-world attributes (high clustering and short paths) that are compatible with previous results in the whole-brain functional networks. Furthermore, we found that compared with females, males have a higher normalized clustering coefficient in the right hemispheric network but a lower clustering coefficient in the left hemispheric network, suggesting a gender-hemisphere interaction. Moreover, we observed significant hemisphere-related differences in the regional nodal characteristics in various brain regions, such as the frontal and occipital regions (leftward asymmetry) and the temporal regions (rightward asymmetry), findings that are consistent with previous studies of brain structural and functional asymmetries. Together, our results suggest that the topological organization of human brain functional networks is associated with gender and hemispheres, and they provide insights into the understanding of functional substrates underlying individual differences in behaviors and cognition. Copyright © 2010 Elsevier Inc. All rights reserved.
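
    The small-world attributes measured above (high clustering together with short characteristic path lengths) can be illustrated on a toy graph. The sketch below, in plain Python, uses a ring lattice with random shortcuts as a stand-in for a brain network; it is not the authors' R-fMRI pipeline:

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Undirected ring of n nodes, each linked to its k nearest neighbours."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for step in range(1, k // 2 + 1):
            for j in ((i + step) % n, (i - step) % n):
                adj[i].add(j)
                adj[j].add(i)
    return adj

def clustering(adj):
    """Average clustering coefficient: how often a node's neighbours
    are themselves directly connected."""
    total = 0.0
    for v, nbrs in adj.items():
        nb = list(nbrs)
        d = len(nb)
        if d < 2:
            continue
        links = sum(1 for a in range(d) for b in range(a + 1, d)
                    if nb[b] in adj[nb[a]])
        total += 2.0 * links / (d * (d - 1))
    return total / len(adj)

def mean_path_length(adj):
    """Characteristic path length via breadth-first search from every node."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

random.seed(1)
net = ring_lattice(60, 4)
L_lattice, C_lattice = mean_path_length(net), clustering(net)
for _ in range(12):  # add a few random shortcuts
    a, b = random.sample(range(60), 2)
    net[a].add(b)
    net[b].add(a)
L_sw, C_sw = mean_path_length(net), clustering(net)
```

    With the seed above, the shortcuts cut the characteristic path length sharply while the clustering coefficient stays near the lattice value of 0.5; that combination of high clustering and short paths is the small-world signature the authors assess in the hemispheric networks.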

  11. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  12. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  13. Sense Things in the Big Deep Water Bring the Big Deep Water to Computers so People can understand the Deep Water all the Time without getting wet

    Science.gov (United States)

    Pelz, M.; Heesemann, M.; Scherwath, M.; Owens, D.; Hoeberechts, M.; Moran, K.

    2015-12-01

    Senses help us learn stuff about the world. We put sense things in, over, and under the water to help people understand water, ice, rocks, life and changes over time out there in the big water. Sense things are like our eyes and ears. We can use them to look up and down, right and left all of the time. We can also use them on top of or near the water to see wind and waves. As the water gets deep, we can use our sense things to see many a layer of different water that make up the big water. On the big water we watch ice grow and then go away again. We think our sense things will help us know if this is different from normal, because it could be bad for people soon if it is not normal. Our sense things let us hear big water animals talking low (but sometimes high). We can also see animals that live at the bottom of the big water and we take lots of pictures of them. Lots of the animals we see are soft and small or hard and small, but sometimes the really big ones are seen too. We also use our sense things on the bottom and sometimes feel the ground shaking. Sometimes, we get little pockets of bad smelling air going up, too. In other areas of the bottom, we feel hot hot water coming out of the rock making new rocks and we watch some animals even make houses and food out of the hot hot water that turns to rock as it cools. To take care of the sense things we use and control water cars and smaller water cars that can dive deep in the water away from the bigger water car. We like to put new things in the water and take things out of the water that need to be fixed at least once a year. Sense things are very cool because you can use the sense things with your computer too. We share everything for free on our computers, which your computer talks to and gets pictures and sounds for you. Sharing the facts from the sense things is the best part about having the sense things because we can get many new ideas about understanding the big water from anyone with a computer!

  14. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  15. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies across the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedbacks of the terrestrial carbon cycle to global change. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from the sensors of those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe the major processes underlying complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex enough to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big-data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  16. EFFECT OF MARKET ORIENTATION ON SMALL BUSINESS PERFORMANCE IN SMALL TOWN IN MALAYSIA: AN EMPIRICAL STUDY ON MALAYSIAN SMALL FIRMS

    Directory of Open Access Journals (Sweden)

    Muhammad Masroor ALAM

    2010-01-01

    Full Text Available Most research on market orientation and performance has concerned big firms. In this study, based on a theoretical framework, a model was developed to investigate the effect of market orientation on business performance in small firms. To test the relationships among the variables, data from 53 small firms in the small town of Chunglun at Sintok, Kedah were used. The findings show that the three components of market orientation are positively related to the business performance of small firms. Further analysis also confirmed that customer orientation and competitor orientation are strong predictors of small-firm performance. The findings of this study confirm that market orientation behavior also applies to small firms.

  17. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative...

  18. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  19. Motexafin Gadolinium Combined With Prompt Whole Brain Radiotherapy Prolongs Time to Neurologic Progression in Non-Small-Cell Lung Cancer Patients With Brain Metastases: Results of a Phase III Trial

    International Nuclear Information System (INIS)

    Mehta, Minesh P.; Shapiro, William R.; Phan, See C.; Gervais, Radj; Carrie, Christian; Chabot, Pierre; Patchell, Roy A.; Glantz, Michael J.; Recht, Lawrence; Langer, Corey; Sur, Ranjan K.; Roa, Wilson H.; Mahe, Marc A.; Fortin, Andre; Nieder, Carsten; Meyers, Christina A.; Smith, Jennifer A.; Miller, Richard A.; Renschler, Markus F.

    2009-01-01

    Purpose: To determine the efficacy of motexafin gadolinium (MGd) in combination with whole brain radiotherapy (WBRT) for the treatment of brain metastases from non-small-cell lung cancer. Methods and Materials: In an international, randomized, Phase III study, patients with brain metastases from non-small-cell lung cancer were randomized to WBRT with or without MGd. The primary endpoint was the interval to neurologic progression, determined by a centralized Events Review Committee that was unaware of the treatment the patients had received. Results: Of 554 patients, 275 were randomized to WBRT and 279 to WBRT+MGd. Treatment with MGd was well tolerated, and 92% of the intended doses were administered. The most common MGd-related Grade 3+ adverse events included liver function abnormalities (5.5%), asthenia (4.0%), and hypertension (4.0%). MGd improved the interval to neurologic progression compared with WBRT alone (15 vs. 10 months; p = 0.12, hazard ratio [HR] = 0.78) and the interval to neurocognitive progression (p = 0.057, HR = 0.78). The WBRT patients required more salvage brain surgery or radiosurgery than did the WBRT+MGd patients (54 vs. 25 salvage procedures, p < 0.001). A statistically significant interaction between geographic region and MGd treatment effect (which was in the prespecified analysis plan), and between treatment delay and MGd treatment effect, was found. In North American patients, where treatment was more prompt, a statistically significant prolongation of the interval to neurologic progression, from 8.8 months for WBRT to 24.2 months for WBRT+MGd (p = 0.004, HR = 0.53), and of the interval to neurocognitive progression (p = 0.06, HR = 0.73) were observed. Conclusion: In the intent-to-treat analysis, MGd exhibited a favorable trend in neurologic outcomes. MGd significantly prolonged the interval to neurologic progression in non-small-cell lung cancer patients with brain metastases receiving prompt WBRT, and the toxicity was acceptable.

  20. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  1. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  2. Pinhole SPECT: high resolution imaging of brain tumours in small laboratory animals

    International Nuclear Information System (INIS)

    Franceschim, M.; Bokulic, T.; Kusic, Z.; Strand, S.E.; Erlandsson, K.

    1994-01-01

    The performance properties of pinhole SPECT and the application of this technology to evaluating radionuclide uptake in the brain of small laboratory animals were investigated. System sensitivity and spatial resolution measurements of a rotating scintillation camera system were made for a low-energy pinhole collimator equipped with a 2.0 mm aperture pinhole insert. Projection data were acquired at 4-degree increments over 360 degrees in step-and-shoot mode using a 4.5 cm radius of rotation. Pinhole planar and SPECT images were obtained to evaluate the regional uptake of Tl-201, Tc-99m-MIBI, Tc-99m-HMPAO and Tc-99m-DTPA in tumor and control regions of the brain in a primary brain tumor model in Fischer 344 rats. Pinhole SPECT images were reconstructed using a modified cone-beam algorithm developed from a two-dimensional fan-beam filtered backprojection algorithm. A reconstructed transaxial resolution of 2.8 mm FWHM and a system sensitivity of 0.086 c/s/kBq were measured with the 2.0 mm pinhole collimator aperture. Tumor to non-tumor uptake ratios at 19-28 days post tumor cell inoculation varied by a factor > 20:1 on SPECT images. Pinhole SPECT provides an important new approach to high resolution imaging: the resolution properties of pinhole SPECT are superior to those achieved with conventional SPECT or PET imaging technologies. (author)
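    For orientation, the trade-off that governs pinhole aperture choice can be written with the standard textbook approximations for a pinhole collimator (general background relations, not parameters reported in this record), where d is the aperture diameter, a the source-to-aperture distance, b the aperture-to-detector distance, and θ the incidence angle:

```latex
R_{\text{geom}} \approx d\,\frac{a+b}{b},
\qquad
g \approx \frac{d^{2}\cos^{3}\theta}{16\,a^{2}}
```

    Resolution improves linearly as the aperture d shrinks while geometric sensitivity falls as d², which is why a small (here 2.0 mm) pinhole can outresolve conventional collimation at the cost of count efficiency.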

  3. Brain Functional Connectivity in Small Cell Lung Cancer Population after Chemotherapy Treatment: an ICA fMRI Study

    Science.gov (United States)

    Bromis, K.; Kakkos, I.; Gkiatis, K.; Karanasiou, I. S.; Matsopoulos, G. K.

    2017-11-01

    Previous neurocognitive assessments in the Small Cell Lung Cancer (SCLC) population highlight the presence of neurocognitive impairments (mainly in attention processing and executive functioning) in this type of cancer. The majority of these studies associate these deficits with the Prophylactic Cranial Irradiation (PCI) that patients undergo in order to avoid brain metastasis. However, there is not much evidence exploring cognitive impairments induced by chemotherapy in SCLC patients. For this reason, we aimed to investigate the underlying processes that may potentially affect cognition by examining brain functional connectivity in nineteen SCLC patients after chemotherapy treatment, with fourteen healthy participants included as a control group. Independent Component Analysis (ICA) is a functional connectivity method that aims to unravel temporally correlated brain regions, known as brain networks. We focused on two brain networks related to the aforementioned cognitive functions, the Default Mode Network (DMN) and the Task-Positive Network (TPN). Permutation tests were performed between the two groups to assess differences and control for familywise errors in the statistical parametric maps. The ICA analysis showed functional connectivity disruptions within both of the investigated networks. These results suggest a detrimental effect of chemotherapy on brain functioning in the SCLC population.
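    As a minimal illustration of the ICA step described above, the following is a generic unmixing demo on synthetic signals, not the authors' fMRI pipeline; scikit-learn's FastICA is used here as a stand-in for whatever ICA implementation the study employed:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Two synthetic independent "network" time courses
s1 = np.sin(2 * t)              # smooth oscillation
s2 = np.sign(np.sin(3 * t))     # square wave
S = np.c_[s1, s2]

A = np.array([[1.0, 0.5], [0.5, 1.0]])  # mixing matrix (e.g., voxel loadings)
X = S @ A.T                              # observed mixed signals

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)             # estimated independent components

# Each recovered component should correlate strongly with one true source
corr = np.abs(np.corrcoef(S.T, S_hat.T))[:2, 2:]
print(corr.max(axis=1))  # both entries close to 1
```

    In fMRI applications the "observed signals" are voxel time series and the recovered components correspond to spatial networks such as the DMN and TPN.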

  4. Evaluation of anesthesia effects on [18F]FDG uptake in mouse brain and heart using small animal PET

    International Nuclear Information System (INIS)

    Toyama, Hiroshi; Ichise, Masanori; Liow, Jeih-San; Vines, Douglass C.; Seneca, Nicholas M.; Modell, Kendra J.; Seidel, Jurgen; Green, Michael V.; Innis, Robert B.

    2004-01-01

    This study evaluates the effects of anesthesia on ¹⁸F-FDG (FDG) uptake in mouse brain and heart to establish the basic conditions for small animal PET imaging. Prior to FDG injection, 12 mice were anesthetized with isoflurane gas; 11 mice were anesthetized with an intraperitoneal injection of a ketamine/xylazine mixture; and 11 mice were awake. Under both isoflurane and ketamine/xylazine conditions, FDG brain uptake (%ID/g) was significantly lower than in controls. Conversely, in the isoflurane condition, %ID/g in the heart was significantly higher than in controls, whereas heart uptake in ketamine/xylazine mice was significantly lower. The results suggest that anesthesia impedes FDG uptake in mouse brain and affects FDG uptake in the heart; however, the effects in the brain and heart differ depending on the type of anesthesia used.
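    The uptake measure quoted above, %ID/g, is the standard percent-injected-dose-per-gram normalization. A minimal sketch of the arithmetic (the function name and example numbers are illustrative, not values from the study):

```python
def percent_id_per_gram(tissue_activity_bq, injected_dose_bq, tissue_mass_g):
    """Percent injected dose per gram of tissue (%ID/g):
    decay-corrected tissue activity as a fraction of the injected
    dose, normalized by tissue mass."""
    if injected_dose_bq <= 0 or tissue_mass_g <= 0:
        raise ValueError("dose and mass must be positive")
    return 100.0 * tissue_activity_bq / injected_dose_bq / tissue_mass_g

# Example: 50 kBq measured in a 0.4 g brain after injecting 5 MBq
print(percent_id_per_gram(50e3, 5e6, 0.4))  # 2.5 %ID/g
```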

  5. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study.

  6. Big Bang nucleosynthesis: Accelerator tests and can Ω_B really be large?

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1987-10-01

    The first collider tests of cosmological theory are now underway. The number of neutrino families in nature, N_ν, plays a key role in elementary particle physics as well as in the synthesis of the light elements during the early evolution of the Universe. Standard Big Bang Nucleosynthesis argues for N_ν = 3 ± 1. Current limits on N_ν from the CERN p̄p collider and e⁺e⁻ colliders are presented and compared to the cosmological bound. Supernova SN 1987A is also shown to give a limit on N_ν comparable to current accelerator bounds. All numbers are found to be small, thus verifying the Big Bang model at an earlier epoch than is possible by traditional astronomical observations. Future measurements at SLC and LEP will further tighten this argument. Another key prediction of standard Big Bang Nucleosynthesis is that the baryon density must be small (Ω_B ≤ 0.1). Recent attempts to subvert this argument using inhomogeneities of various types are shown to run afoul of the ⁷Li abundance, which has now become a rather firm constraint. 18 refs., 2 figs.
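    The reason the helium abundance counts neutrino families is a standard BBN rule of thumb (a background estimate, not a figure from this record): each extra light family speeds the early expansion, freezing out a higher neutron-to-proton ratio and raising the primordial ⁴He mass fraction roughly linearly,

```latex
\Delta Y_{p} \approx 0.013\,\Delta N_{\nu}
```

    so a measured Y_p constrains N_ν independently of the collider measurements mentioned above.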

  7. Big bang nucleosynthesis: The strong nuclear force meets the weak anthropic principle

    International Nuclear Information System (INIS)

    MacDonald, J.; Mullan, D. J.

    2009-01-01

    Contrary to a common argument that a small increase in the strength of the strong force would lead to the destruction of all hydrogen in the big bang, due to binding of the diproton and the dineutron, with a catastrophic impact on life as we know it, we show that, provided the increase in the strong force coupling constant is less than about 50%, substantial amounts of hydrogen remain. The reason is that an increase in strong force strength leads to tighter binding of the deuteron, permitting nucleosynthesis to occur earlier in the big bang, at higher temperature than in the standard big bang. Photodestruction of the less tightly bound diproton and dineutron delays their production until after the bulk of nucleosynthesis is complete. The decay of the diproton can, however, lead to relatively large abundances of deuterium.

  8. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare.This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  9. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impacts. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data, and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  10. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  11. Brain Aneurysm

    Science.gov (United States)

    A brain aneurysm is an abnormal bulge or "ballooning" in the wall of an artery in the brain. They are sometimes called berry aneurysms because they ... often the size of a small berry. Most brain aneurysms produce no symptoms until they become large, ...

  12. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with the original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after the data were collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.
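    The core comparison logic, adding a sixth factor and asking whether outcome prediction improves, can be sketched as a nested regression on synthetic data (entirely made-up scores and outcome; this illustrates the method, not the study's results):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 227  # sample size matching the study above

# Hypothetical standardized trait scores: a 5-factor vs. a 6-factor set
big5 = rng.normal(size=(n, 5))
honesty = rng.normal(size=(n, 1))      # the added sixth factor
big6 = np.hstack([big5, honesty])

# Synthetic outcome that loads partly on the sixth factor
outcome = (big5 @ np.array([0.3, 0.2, 0.1, 0.0, 0.1])
           + 0.4 * honesty[:, 0]
           + rng.normal(scale=1.0, size=n))

# In-sample variance explained by each predictor set
r2_5 = LinearRegression().fit(big5, outcome).score(big5, outcome)
r2_6 = LinearRegression().fit(big6, outcome).score(big6, outcome)
print(r2_5, r2_6)  # the 6-factor set explains more variance here
```

    Because the models are nested, in-sample R² can only grow with the sixth predictor; the study's actual question is whether the gain holds up for real outcomes, which is why cross-validated or out-of-sample comparison matters in practice.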

  13. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlight its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  14. Risk of intracranial hemorrhage and cerebrovascular accidents in non-small cell lung cancer brain metastasis patients.

    Science.gov (United States)

    Srivastava, Geetika; Rana, Vishal; Wallace, Suzy; Taylor, Sarah; Debnam, Matthew; Feng, Lei; Suki, Dima; Karp, Daniel; Stewart, David; Oh, Yun

    2009-03-01

    Brain metastases confer significant morbidity and poorer survival in non-small cell lung cancer (NSCLC). Vascular endothelial growth factor-targeted antiangiogenic therapies (AAT) have demonstrated benefit for patients with metastatic NSCLC and would be expected to directly inhibit the pathophysiology and morbidity of brain metastases, yet patients with brain metastases have been excluded from most clinical trials of AAT for fear of intracranial hemorrhage (ICH). The underlying risk of ICH from NSCLC brain metastases is low, but it needs to be quantified to plan clinical trials of AAT for NSCLC brain metastases. Data from the MD Anderson Cancer Center Tumor Registry and electronic medical records from January 1998 to March 2006 were interrogated. Two thousand one hundred forty-three patients with metastatic NSCLC registering from January 1998 to September 2005 were followed until March 2006. Seven hundred seventy-six patients with and 1,367 patients without brain metastases were followed until death, the date of ICH, or the last date of the study, whichever occurred first. The incidence of ICH appeared to be higher in those with brain metastases than in those without, in whom ICH occurred as a result of cerebrovascular accidents. However, the rates of symptomatic ICH were not significantly different. All ICH patients with brain metastases had received radiation therapy for them and had been free of anticoagulation. Most of the brain metastasis-associated ICHs were asymptomatic, detected during increased radiologic surveillance. The rates of symptomatic ICH, or other cerebrovascular accidents in general, were similar and not significantly different between the two groups. In metastatic NSCLC patients, the incidence of spontaneous ICH appeared to be higher in those with brain metastases than in those without, but was very low in both groups, without a statistically significant difference. These data suggest a minimal risk of clinically significant ICH for NSCLC

  15. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  16. The Importance of Hunting and Hunting Areas for Big and Small Game (Food) for the Tourism Development in the Crna River Basin in the Republic of Macedonia

    OpenAIRE

    Koteski, Cane; Josheski, Dushko; Jakovlev, Zlatko; Bardarova, Snezana; Serafimova, Mimoza

    2014-01-01

    The Crna River is a river in the Republic of Macedonia and a right tributary of the Vardar. Its source is in the mountains of Western Macedonia, west of Krusevo. It flows through the village of Sopotnica and southwards through the plains east of Bitola. The name means "black river" in Macedonian, a translation of its former Thracian name. The purpose of this paper is to show the hunting and hunting areas for big and small game (food), the structure of the areas of certain hunting, fi...

  17. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  18. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  19. An original emission tomograph for in vivo brain imaging of small animals

    International Nuclear Information System (INIS)

    Ochoa, A.V.; Ploux, L.; Mastrippolito, R.

    1996-01-01

    The principle of a new tomograph, TOHR, dedicated to very-high-resolution analysis of small volumes, is presented in this paper. We use radioisotopes emitting uncorrelated multiple photons (X or gamma rays) and a large-solid-angle focusing collimator to perform tomographic imaging without a reconstruction algorithm. With this original device, detection efficiency and resolution are independent, and submillimetric resolution can be achieved. A feasibility study indicates that the predicted performance of TOHR can be achieved. We discuss its potential in rat brain tomography by simulating a realistic neuropharmacological experiment using a 1.4 mm resolution prototype of TOHR under development

  20. Immunocytochemical localization of luteinizing hormone-releasing hormone (LHRH) in the nervus terminalis and brain of the big brown bat, Eptesicus fuscus.

    Science.gov (United States)

    Oelschläger, H A; Northcutt, R G

    1992-01-15

    Little is known about the immunohistochemistry of the nervous system in bats. This is particularly true of the nervus terminalis, which exerts strong influence on the reproductive system during ontogeny and in the adult. Luteinizing hormone-releasing hormone (LHRH) was visualized immunocytochemically in the nervus terminalis and brain of juvenile and adult big brown bats (Eptesicus fuscus). The peripheral LHRH-immunoreactive (ir) cells and fibers (nervus terminalis) are dispersed along the basal surface of the forebrain from the olfactory bulbs to the prepiriform cortex and the interpeduncular fossa. A concentration of peripheral LHRH-ir perikarya and fibers was found at the caudalmost part of the olfactory bulbs, near the medioventral forebrain sulcus; obviously these cells mediate between the bulbs and the remaining forebrain. Within the central nervous system (CNS), LHRH-ir perikarya and fibers were distributed throughout the olfactory tubercle, diagonal band, preoptic area, suprachiasmatic and supraoptic nuclei, the bed nuclei of stria terminalis and stria medullaris, the anterior lateral and posterior hypothalamus, and the tuber cinereum. The highest concentration of cells was found within the arcuate nucleus. Fibers were most concentrated within the median eminence, infundibular stalk, and the medial habenula. The data obtained suggest that this distribution of LHRH immunoreactivity may be characteristic for microchiropteran (insectivorous) bats. The strong projections of LHRH-containing nuclei in the basal forebrain (including the arcuate nucleus) to the habenula, may indicate close functional contact between these brain areas via feedback loops, which could be important for the processing of thermal and other environmental stimuli correlated with hibernation.

  1. Decoder calibration with ultra small current sample set for intracortical brain-machine interface

    Science.gov (United States)

    Zhang, Peng; Ma, Xuan; Chen, Luyao; Zhou, Jin; Wang, Changyong; Li, Wei; He, Jiping

    2018-04-01

    Objective. Intracortical brain-machine interfaces (iBMIs) aim to restore efficient communication and movement ability for paralyzed patients. However, frequent recalibration is required for consistency and reliability, and every recalibration requires a relatively large current sample set. The aim of this study is to develop an effective decoder calibration method that achieves good performance while minimizing recalibration time. Approach. Two rhesus macaques implanted with intracortical microelectrode arrays were trained separately on a movement paradigm and a sensory paradigm. Neural signals were recorded to decode reaching positions or grasping postures. A novel principal component analysis-based domain adaptation (PDA) method was proposed to recalibrate the decoder with only an ultra-small current sample set by taking advantage of large historical data, and the decoding performance was compared with three other calibration methods for evaluation. Main results. The PDA method closed the gap between historical and current data effectively, making it possible to exploit large historical data when decoding current data. Using only an ultra-small current sample set (five trials of each category), the decoder calibrated with the PDA method achieved much better and more robust performance in all sessions than with the other three calibration methods in both monkeys. Significance. (1) This study brings transfer learning theory into iBMI decoder calibration for the first time. (2) Unlike most transfer learning studies, the target data in this study were an ultra-small sample set and were transferred to the source data. (3) By taking advantage of historical data, the PDA method was demonstrated to be effective in reducing recalibration time for both the movement paradigm and the sensory paradigm, indicating viable generalization. By reducing the demand for large current training data, this new method may facilitate the application
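    One generic flavor of PCA-based subspace alignment between a large historical ("source") session and an ultra-small current ("target") session can be sketched as below. This is an illustration of the general idea on synthetic data, not a reproduction of the paper's exact PDA algorithm; all labels, dimensions, and the drift model are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic "firing rates": class signal on dimension 0
y_src = np.repeat([0, 1], 100)          # large historical session
y_tgt = np.array([0, 1] * 5)            # ultra-small current session
sep = 5.0 * np.eye(10)[0]
X_src = rng.normal(size=(200, 10)) + y_src[:, None] * sep
X_tgt = rng.normal(size=(10, 10)) + y_tgt[:, None] * sep + 1.5  # drifted

# Center each session, take k principal axes, align source to target
k = 3
Xs = X_src - X_src.mean(0)
Xt = X_tgt - X_tgt.mean(0)
P_s = PCA(k).fit(Xs).components_.T      # (10, k) source basis
P_t = PCA(k).fit(Xt).components_.T      # (10, k) target basis
Z_src = Xs @ P_s @ (P_s.T @ P_t)        # source projected and aligned
Z_tgt = Xt @ P_t                        # target in its own subspace

# Train on abundant historical trials, decode the few current trials
clf = KNeighborsClassifier(3).fit(Z_src, y_src)
acc = clf.score(Z_tgt, y_tgt)
print(acc)
```

    The key point matches the abstract: the heavy lifting (training data) comes from the historical session, and the current session contributes only enough trials to estimate the alignment.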

  2. BIG1, a brefeldin A-inhibited guanine nucleotide-exchange protein regulates neurite development via PI3K-AKT and ERK signaling pathways.

    Science.gov (United States)

    Zhou, C; Li, C; Li, D; Wang, Y; Shao, W; You, Y; Peng, J; Zhang, X; Lu, L; Shen, X

    2013-12-19

    The elongation of neurons is highly dependent on membrane trafficking. Brefeldin A (BFA)-inhibited guanine nucleotide-exchange protein 1 (BIG1) functions in membrane trafficking between the Golgi apparatus and the plasma membrane. BFA, an uncompetitive inhibitor of BIG1, can inhibit neurite outgrowth and polarity development. In this study, we aimed to define the possible role of BIG1 in neurite development and to further investigate the potential mechanism. By immunostaining, we found that BIG1 was extensively colocalized with synaptophysin, a marker for synaptic vesicles, in soma and partly in neurites. The amounts of both BIG1 protein and mRNA were up-regulated during rat brain development. BIG1 depletion significantly decreased neurite length and inhibited the phosphorylation of phosphatidylinositide 3-kinase (PI3K) and protein kinase B (AKT). Inhibition of BIG1 guanine nucleotide-exchange factor (GEF) activity by BFA, or overexpression of dominant-negative BIG1, reduced PI3K and AKT phosphorylation, indicating that the regulatory effect of BIG1 on the PI3K-AKT signaling pathway is dependent on its GEF activity. BIG1 siRNA or BFA treatment also significantly reduced extracellular signal-regulated kinase (ERK) phosphorylation. Overexpression of wild-type BIG1 significantly increased ERK phosphorylation, but dominant-negative BIG1 had no effect on ERK phosphorylation, indicating that the involvement of BIG1 in ERK signaling regulation may not depend on its GEF activity. Our results identify a novel function of BIG1 in neurite development. This newly recognized function integrates the role of BIG1 in membrane trafficking with the activation of the PI3K-AKT and ERK signaling pathways, which are critical in neurite development. Copyright © 2013 IBRO. Published by Elsevier Ltd. All rights reserved.

  3. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  4. A Quantum Universe Before the Big Bang(s)?

    Science.gov (United States)

    Veneziano, Gabriele

    2017-08-01

    The predictions of general relativity have by now been verified in a variety of different situations, setting strong constraints on any alternative theory of gravity. Nonetheless, there are strong indications that general relativity has to be regarded as an approximation to a more complete theory. Indeed, theorists have long been looking for ways to connect general relativity, which describes the cosmos and the infinitely large, to quantum physics, which has been remarkably successful in explaining the infinitely small world of elementary particles. These two worlds, however, come closer and closer to each other as we go back in time, all the way to the big bang. Indeed, modern cosmology has completely changed the old big bang paradigm: we now have to talk about (at least) two (big?) bangs. While we know quite a lot about the one closer to us, at the end of inflation, we are much more ignorant about the one that may have preceded inflation and possibly marked the beginning of time. No one doubts that quantum mechanics plays an essential role in answering these questions; unfortunately, a unified theory of gravity and quantum mechanics is still under construction. Finding such a synthesis and confirming it experimentally will no doubt be one of the biggest challenges of this century's physics.

  5. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data and analyzes the peculiarities of applying the proposed model's components. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  6. BigWig and BigBed: enabling browsing of large distributed datasets.

    Science.gov (United States)

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols, Linux and UNIX operating system files, R-trees, and various indexing and compression tricks. As a result, only the data needed to support the current browser view are transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
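    The "transmit only what the view needs" idea can be illustrated with a toy indexed-block store (a deliberately simplified sketch, not the actual BigWig format or its R-tree index): values are kept in fixed-size blocks, and a range query touches only the blocks overlapping the requested region.

```python
BLOCK_SIZE = 1000  # bases per stored block (arbitrary for this sketch)

class IndexedTrack:
    def __init__(self, values):
        # Store signal values in blocks; here the "index" is simply
        # the block number, standing in for a real interval index.
        self.blocks = [values[i:i + BLOCK_SIZE]
                       for i in range(0, len(values), BLOCK_SIZE)]
        self.reads = 0  # count how many blocks a query actually touches

    def query(self, start, end):
        out = []
        for b in range(start // BLOCK_SIZE, (end - 1) // BLOCK_SIZE + 1):
            self.reads += 1            # simulate one remote block fetch
            lo = max(start - b * BLOCK_SIZE, 0)
            hi = min(end - b * BLOCK_SIZE, BLOCK_SIZE)
            out.extend(self.blocks[b][lo:hi])
        return out

track = IndexedTrack(list(range(100_000)))   # 100 blocks in total
region = track.query(49_500, 50_500)         # spans exactly 2 blocks
print(len(region), track.reads)              # 1000 values, 2 block reads
```

    A browser view over 1,000 bases thus costs two block fetches instead of reading all 100 blocks, which is the essence of why remote access to large files stays fast.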

  7. Is the "too big to fail" doctrine about to fall?

    DEFF Research Database (Denmark)

    Grosen, Anders

    2010-01-01

    If President Barack Obama gets his way, the classic "too big to fail" banking doctrine will be replaced by a "small enough to fail" doctrine. This follows from the president's plans to break up the big banks into smaller units and to ban the banks' proprietary trading. If Barack Obama gets...

  8. Long-duration transcutaneous electric acupoint stimulation alters small-world brain functional networks.

    Science.gov (United States)

    Zhang, Yue; Jiang, Yin; Glielmi, Christopher B; Li, Longchuan; Hu, Xiaoping; Wang, Xiaoying; Han, Jisheng; Zhang, Jue; Cui, Cailian; Fang, Jing

    2013-09-01

    Acupuncture, which is recognized as an alternative and complementary treatment in Western medicine, has long shown efficacy in chronic pain relief, drug addiction treatment, stroke rehabilitation and other clinical practices. The neural mechanism underlying acupuncture, however, is still unclear. Many studies have focused on the sustained effects of acupuncture on healthy subjects, yet there are very few on the topological organization of functional networks in the whole brain in response to long-duration acupuncture (longer than 20 min). This paper presents a novel study on the effects of long-duration transcutaneous electric acupoint stimulation (TEAS) on the small-world properties of brain functional networks. Functional magnetic resonance imaging was used to construct brain functional networks of 18 healthy subjects (9 males and 9 females) during the resting state. All subjects received both TEAS and minimal TEAS (MTEAS) and were scanned before and after each stimulation. An altered functional network was found with lower local efficiency and no significant change in global efficiency for healthy subjects after TEAS, while no significant difference was observed after MTEAS. The experiments also showed that the nodal efficiencies in several paralimbic/limbic regions were altered by TEAS, and those in middle frontal gyrus and other regions by MTEAS. To remove the psychological effects and the baseline, we compared the difference between diffTEAS (difference between after and before TEAS) and diffMTEAS (difference between after and before MTEAS). The results showed that the local efficiency was decreased and that the nodal efficiencies in frontal gyrus, orbitofrontal cortex, anterior cingulate gyrus and hippocampal gyrus were changed. Based on those observations, we conclude that long-duration TEAS may modulate the short-range connections of brain functional networks and also the limbic system. Copyright © 2013 Elsevier Inc. All rights reserved.
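    The global and local efficiency measures compared in this study are standard graph metrics derived from shortest-path lengths. A self-contained sketch on a toy undirected graph (pure Python; illustrative only, not the study's analysis pipeline):

```python
from collections import deque

def shortest_paths(adj, src):
    """BFS hop distances from src in an unweighted graph (adjacency dict)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Average inverse shortest-path length over all ordered node pairs."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for u in nodes:
        d = shortest_paths(adj, u)
        total += sum(1.0 / d[v] for v in nodes if v != u and v in d)
    return total / (n * (n - 1))

def local_efficiency(adj):
    """Mean global efficiency of each node's neighborhood subgraph."""
    effs = []
    for u, nbrs in adj.items():
        if len(nbrs) < 2:
            effs.append(0.0)
            continue
        sub = {v: [w for w in adj[v] if w in nbrs] for v in nbrs}
        effs.append(global_efficiency(sub))
    return sum(effs) / len(adj)

# A 4-cycle with one chord: short paths globally, some clustering locally.
g = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
print(global_efficiency(g), local_efficiency(g))  # both 11/12 for this graph
```

Global efficiency captures long-range integration, while local efficiency applies the same measure to each node's neighborhood subgraph, which is the short-range connectivity the study reports as altered after TEAS.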

  9. Interventions for treating osteoarthritis of the big toe joint.

    Science.gov (United States)

    Zammit, Gerard V; Menz, Hylton B; Munteanu, Shannon E; Landorf, Karl B; Gilheany, Mark F

    2010-09-08

    Osteoarthritis of the big toe joint of the foot (hallux limitus or rigidus) is a common and painful condition. Although several treatments have been proposed, few have been adequately evaluated. To identify controlled trials evaluating interventions for osteoarthritis of the big toe joint and to determine the optimum intervention(s), literature searches were conducted across the following electronic databases: CENTRAL; MEDLINE; EMBASE; CINAHL; and PEDro (to 14th January 2010). No language restrictions were applied. Randomised controlled trials, quasi-randomised trials, or controlled clinical trials that assessed treatment outcomes for osteoarthritis of the big toe joint were eligible. Participants of any age or gender with osteoarthritis of the big toe joint (defined either radiographically or clinically) were included. Two authors examined the list of titles and abstracts identified by the literature searches. One content area expert and one methodologist independently applied the pre-determined inclusion and exclusion criteria to the full text of identified trials. To minimise error and reduce potential bias, data were extracted independently by two content experts. Only one trial satisfactorily fulfilled the inclusion criteria and was included in this review. This trial evaluated the effectiveness of two physical therapy programs in 20 individuals with osteoarthritis of the big toe joint. Assessment outcomes included pain levels, big toe joint range of motion and plantar flexion strength of the hallux. Mean differences at four weeks' follow-up were 3.80 points (95% CI 2.74 to 4.86) for self-reported pain, 28.30 degrees (95% CI 21.37 to 35.23) for big toe joint range of motion, and 2.80 kg (95% CI 2.13 to 3.47) for muscle strength. Although differences in outcomes between treatment and control groups were reported, the risk of bias was high. The trial failed to employ appropriate randomisation or adequate allocation concealment, used a relatively small sample and

  10. [Big data, Roemer's law and avoidable hospital admissions].

    Science.gov (United States)

    van der Horst, H E

    2016-01-01

    From an analysis of data from 23 European countries to determine the impact of primary care on avoidable hospital admissions for uncontrolled diabetes, it appeared that, contrary to expectation, countries with strong primary care did not have a lower rate of avoidable hospital admissions. It is clear that Roemer's law, 'a bed built is a bed filled,' still applies. However, the validity of this sort of analysis can be questioned, as these data are highly aggregated and registration quality differs between countries. It is also questionable whether these datasets can be considered 'big data', as there are relatively small numbers per country. Big data analyses are useful for discerning patterns and formulating hypotheses, but not for proving causality. An unwanted side effect of this kind of analysis might be that policymakers use these not-so-valid results to underpin their policy to their advantage.

  11. Mountain big sagebrush age distribution and relationships on the northern Yellowstone Winter Range

    Science.gov (United States)

    Carl L. Wambolt; Trista L. Hoffman

    2001-01-01

    This study was conducted within the Gardiner Basin, an especially critical wintering area for native ungulates utilizing the Northern Yellowstone Winter Range. Mountain big sagebrush plants on 33 sites were classified as large (≥22 cm canopy cover), small (

  12. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  13. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  14. Classification of brain MRI with big data and deep 3D convolutional neural networks

    Science.gov (United States)

    Wegmayr, Viktor; Aitharaju, Sai; Buhmann, Joachim

    2018-02-01

    Our ever-aging society faces the growing problem of neurodegenerative diseases, in particular dementia. Magnetic Resonance Imaging provides a unique tool for non-invasive investigation of these brain diseases. However, it is extremely difficult for neurologists to identify complex disease patterns from large amounts of three-dimensional images. In contrast, machine learning excels at automatic pattern recognition from large amounts of data. In particular, deep learning has achieved impressive results in image classification. Unfortunately, its application to medical image classification remains difficult. We consider two reasons for this difficulty: First, volumetric medical image data is considerably scarcer than natural images. Second, the complexity of 3D medical images is much higher compared to common 2D images. To address the problem of small data set size, we assemble the largest dataset ever used for training a deep 3D convolutional neural network to classify brain images as healthy (HC), mild cognitive impairment (MCI) or Alzheimer's disease (AD). We use more than 20,000 images from subjects of these three classes, which is almost 9x the size of the previously largest data set. The problem of high dimensionality is addressed by using a deep 3D convolutional neural network, which is state-of-the-art in large-scale image classification. We exploit its ability to process the images directly, only with standard preprocessing, but without the need for elaborate feature engineering. Compared to other work, our workflow is considerably simpler, which increases clinical applicability. Accuracy is measured on the ADNI+AIBL data sets, and the independent CADDementia benchmark.
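    The core operation of such a volumetric network, sliding a 3D kernel over a 3D scan, can be illustrated with a naive convolution in plain Python. This is only a sketch of the layer arithmetic, not the authors' network or preprocessing; the sizes are toy values:

```python
import random

def conv3d_valid(volume, kernel):
    """Naive 'valid' 3D cross-correlation over nested lists: the basic
    operation of a volumetric CNN layer applied to MRI-like data."""
    D, H, W = len(volume), len(volume[0]), len(volume[0][0])
    d, h, w = len(kernel), len(kernel[0]), len(kernel[0][0])
    out = []
    for z in range(D - d + 1):
        plane = []
        for y in range(H - h + 1):
            row = []
            for x in range(W - w + 1):
                s = sum(volume[z + i][y + j][x + k] * kernel[i][j][k]
                        for i in range(d) for j in range(h) for k in range(w))
                row.append(s)
            plane.append(row)
        out.append(plane)
    return out

# A toy 8x8x8 "scan" and a 3x3x3 averaging kernel give a 6x6x6 feature map.
vol = [[[random.random() for _ in range(8)] for _ in range(8)] for _ in range(8)]
kern = [[[1 / 27.0] * 3 for _ in range(3)] for _ in range(3)]
fmap = conv3d_valid(vol, kern)
print(len(fmap), len(fmap[0]), len(fmap[0][0]))  # 6 6 6
```

A real network stacks many such layers (with learned kernels, nonlinearities and pooling) and ends in a classifier over the HC/MCI/AD labels; deep-learning frameworks implement the same arithmetic far more efficiently.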

  15. Fixing the Big Bang Theory's Lithium Problem

    Science.gov (United States)

    Kohler, Susanna

    2017-02-01

    How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the cosmological lithium problem. Have scientists now found a solution? Too much lithium: In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble spot remains: the abundance of lithium. [Figure caption: The arrows show the primary reactions involved in Big Bang nucleosynthesis, and their flux ratios, as predicted by the authors' model, are given on the right. Synthesizing primordial elements is complicated! (Hou et al. 2017)] According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe's existence. This produced most of the universe's helium and small amounts of other light nuclides, including deuterium and lithium. But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the cosmological lithium problem, and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful. In a recent publication led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences) and advisor Jianjun He (Institute of Modern Physics / National Astronomical Observatories, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem. [Figure caption: Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe. The authors' model (dotted lines

  16. Evaluation of anesthesia effects on [{sup 18}F]FDG uptake in mouse brain and heart using small animal PET

    Energy Technology Data Exchange (ETDEWEB)

    Toyama, Hiroshi E-mail: htoyama@fujita-hu.ac.jp; Ichise, Masanori; Liow, Jeih-San; Vines, Douglass C.; Seneca, Nicholas M.; Modell, Kendra J.; Seidel, Jurgen; Green, Michael V.; Innis, Robert B

    2004-02-01

    This study evaluates effects of anesthesia on {sup 18}F-FDG (FDG) uptake in mouse brain and heart to establish the basic conditions of small animal PET imaging. Prior to FDG injection, 12 mice were anesthetized with isoflurane gas; 11 mice were anesthetized with an intraperitoneal injection of a ketamine/xylazine mixture; and 11 mice were awake. In isoflurane and ketamine/xylazine conditions, FDG brain uptake (%ID/g) was significantly lower than in controls. Conversely, in the isoflurane condition, %ID/g in heart was significantly higher than in controls, whereas heart uptake in ketamine/xylazine mice was significantly lower. Results suggest that anesthesia impedes FDG uptake in mouse brain and affects FDG uptake in heart; however, the effects in the brain and heart differ depending on the type of anesthesia used.

  17. Brain metabolite differences in one-year-old infants born small at term and association with neurodevelopmental outcome.

    Science.gov (United States)

    Simões, Rui V; Cruz-Lemini, Mónica; Bargalló, Núria; Gratacós, Eduard; Sanz-Cortés, Magdalena

    2015-08-01

    We assessed brain metabolite levels by magnetic resonance spectroscopy (MRS) in 1-year-old infants born small at term, as compared with infants born appropriate for gestational age (AGA), and their association with neurodevelopment at 2 years of age. A total of 40 infants born small (birthweight growth restriction or as small for gestational age, based on the presence or absence of prenatal Doppler and birthweight predictors of an adverse perinatal outcome, respectively. Single-voxel proton magnetic resonance spectroscopy ((1)H-MRS) data were acquired from the frontal lobe at short echo time. Neurodevelopment was evaluated at 2 years of age using the Bayley Scales of Infant and Toddler Development, Third Edition, assessing cognitive, language, motor, social-emotional, and adaptive behavior scales. As compared with AGA controls, infants born small showed significantly higher levels of glutamate and total N-acetylaspartate (NAAt) to creatine (Cr) ratio at age 1 year, and lower Bayley Scales of Infant and Toddler Development, Third Edition scores at 2 years. The subgroup with late intrauterine growth restriction further showed lower estimated glutathione levels at age 1 year. Significant correlations were observed for estimated glutathione levels with adaptive scores, and for myo-inositol with language scores. Significant associations were also noticed for NAA/Cr with cognitive scores, and for glutamate/Cr with motor scores. Infants born small show brain metabolite differences at 1 year of age, which are correlated with later neurodevelopment. These results support further research on MRS to develop imaging biomarkers of abnormal neurodevelopment. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  19. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  20. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only ... framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  1. Big Data and HPC collocation: Using HPC idle resources for Big Data Analytics

    OpenAIRE

    MERCIER , Michael; Glesser , David; Georgiou , Yiannis; Richard , Olivier

    2017-01-01

    International audience; Executing Big Data workloads upon High Performance Computing (HPC) infrastructures has become an attractive way to improve their performance. However, the collocation of HPC and Big Data workloads is not an easy task, mainly because of differences in their core concepts. This paper focuses on the challenges related to the scheduling of both Big Data and HPC workloads on the same computing platform. In classic HPC workloads, the rigidity of jobs tends to create holes in ...

  2. Ocean Networks Canada's "Big Data" Initiative

    Science.gov (United States)

    Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.

    2013-12-01

    Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system that provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: study of the impact of climate change on the ocean; exploration and understanding of the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution where we deliver simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing. Test results guide future tool design and development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and

  3. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects ... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  4. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Today big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. That value is defined not only by extracting value from huge data sets as quickly and optimally as possible, but also by extracting value from uncertain and inaccurate data in an innovative manner, using big data analytics. At this point, the main challenge for businesses that use big data tools is to clearly define the scope and the necessary output so that real value can be gained. This article aims to explain the big data concept, its various classification criteria and architecture, as well as its impact on processes worldwide.

  5. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    Science.gov (United States)

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is no longer "lots of data" as a phenomenon; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem definition to address science challenges.

  6. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone "stars", or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  7. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  8. Role of prophylactic brain irradiation in limited stage small cell lung cancer: clinical, neuropsychologic, and CT sequelae

    International Nuclear Information System (INIS)

    Laukkanen, E.; Klonoff, H.; Allan, B.; Graeb, D.; Murray, N.

    1988-01-01

    Ninety-four patients with limited stage small cell lung cancer treated between 1981 and 1985 with a regimen including prophylactic brain irradiation (PBI) after combination chemotherapy were assessed for compliance with PBI, brain relapse, and neurologic morbidity. Seventy-seven percent of patients had PBI and of these, 22% developed brain metastases after a median time of 11 months post treatment. The brain was the apparent unique initial site of relapse in 10% of PBI cases but more commonly brain relapse was preceded or accompanied by failure at other sites, especially the chest. Brain metastases were the greatest cause of morbidity in 50% of PBI failures. Twelve of 14 PBI patients alive 2 years after treatment had oncologic, neurologic, and neuropsychological evaluation, and brain CT. All long-term survivors were capable of self care and none fulfilled diagnostic criteria for dementia, with three borderline cases. One third had pretreatment neurologic dysfunction and two thirds post treatment neurologic symptoms, most commonly recent memory loss. Fifty percent had subtle motor findings. Intellectual functioning was at the 38th percentile with most patients having an unskilled occupational history. Neuropsychologic impairment ratings were borderline in three cases and definitely impaired in seven cases. CT scans showed brain atrophy in all cases with mild progression in those having a pre-treatment baseline. Periventricular and subcortical low density lesions identical to the CT appearance of subcortical arteriosclerotic encephalopathy were seen in 82% of posttreatment CT studies, and lacunar infarcts in 54%. Neuropsychologic impairment scores and the extent of CT periventricular low density lesions were strongly associated

  9. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  10. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    ... modern astronomy requires big data know-how; in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing ..., and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  11. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  12. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  13. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  14. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in last decade. The concept of big data includes analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting, analysing consumer behaviour, preference and emotions. The purpose of this paper is to introduce the term fashion data and why it can be considered as big data. It also gives a broad classification of the types of fashion data and briefly defines them. Also, the methodology and working of a system that will use this data is briefly described.

  15. The Sounds of the Little and Big Bangs

    Science.gov (United States)

    Shuryak, Edward

    2017-11-01

    Studies of heavy ion collisions have discovered that tiny fireballs of a new phase of matter -- quark-gluon plasma (QGP) -- undergo an explosion, called the Little Bang. In spite of its small size, it is not only well described by hydrodynamics, but even small perturbations on top of the explosion turned out to be well described by hydrodynamical sound modes. The cosmological Big Bang also went through phase transitions, the QCD and electroweak ones, which are expected to produce sounds as well. We discuss their subsequent evolution and a hypothetical inverse acoustic cascade, amplifying the amplitude. Ultimately, the collision of two sound waves leads to formation of gravity waves with the smallest wavelength. We briefly discuss how those can be detected.

  16. A study and analysis of recommendation systems for location-based social networks (LBSN) with big data

    Directory of Open Access Journals (Sweden)

    Murale Narayanan

    2016-03-01

    Recommender systems play an important role in our day-to-day life. A recommender system automatically suggests items to a user that he or she might be interested in. Small-scale datasets can be used to provide recommendations based on location, but in real time the volume of data is large. We selected the Foursquare dataset to study the need for big data in recommendation systems for location-based social networks (LBSN). A few quality parameters, such as parallel processing and multimodal interface, were selected to study the need for big data in recommender systems. This paper provides a study and analysis of the quality parameters of recommendation systems for LBSN with big data.
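    One common way such location-based recommenders work, a user-based collaborative filter over check-in histories, can be sketched in a few lines. The check-in data and venue names below are hypothetical, and this is not the system analyzed in the paper:

```python
def jaccard(a, b):
    """Similarity of two users' check-in sets (0.0 to 1.0)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(target, others):
    """Score venues the target hasn't visited by the similarity of the
    users who did visit them: a minimal user-based collaborative filter."""
    scores = {}
    for venues in others:
        sim = jaccard(target, venues)
        for v in venues:
            if v not in target:
                scores[v] = scores.get(v, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical check-in histories (venue IDs).
me = ["cafe", "museum"]
crowd = [["cafe", "museum", "park"], ["gym", "mall"], ["cafe", "park"]]
print(recommend(me, crowd))  # 'park' ranked first
```

At Foursquare scale the same computation runs over millions of users and venues, which is exactly where the parallel-processing quality parameter discussed in the paper becomes relevant.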

  17. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "supervised" and "unsupervised" machine learning algorithms, with examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
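    The supervised/unsupervised distinction drawn in this review can be made concrete with a toy example in plain Python (rather than the R packages the review surveys): a nearest-centroid classifier learns from labeled points, while a simple k-means loop groups unlabeled ones. All data values are made up for illustration:

```python
def nearest_centroid_fit(points, labels):
    """Supervised: average the points of each labeled class into a centroid."""
    sums, counts = {}, {}
    for p, y in zip(points, labels):
        sums[y] = sums.get(y, 0.0) + p
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def nearest_centroid_predict(centroids, p):
    """Predict the class whose centroid is closest to p."""
    return min(centroids, key=lambda y: abs(p - centroids[y]))

def kmeans_1d(points, c0, c1, iters=10):
    """Unsupervised: alternate assignment and centroid update (k=2, 1D)."""
    for _ in range(iters):
        a = [p for p in points if abs(p - c0) <= abs(p - c1)]
        b = [p for p in points if abs(p - c0) > abs(p - c1)]
        if a: c0 = sum(a) / len(a)
        if b: c1 = sum(b) / len(b)
    return c0, c1

# Two well-separated groups of measurements.
pts = [1.0, 1.2, 0.8, 5.0, 5.3, 4.9]
cents = nearest_centroid_fit(pts, ["low", "low", "low", "high", "high", "high"])
print(nearest_centroid_predict(cents, 1.1))   # low
c0, c1 = kmeans_1d(pts, 0.0, 10.0)
print(round(c0, 2), round(c1, 2))             # 1.0 5.07
```

The classifier needs the labels up front; k-means recovers essentially the same two group centers without ever seeing them, which is the practical difference between the two families of methods.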

  18. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    Science.gov (United States)

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived; notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  19. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; et al.

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  20. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  1. A small step for mankind

    NARCIS (Netherlands)

    Huizing, C.; Koymans, R.L.C.; Kuiper, R.; Dams, D.; Hannemann, U.; Steffen, M.

    2010-01-01

    For many programming languages, the only formal semantics published is an SOS big-step semantics. Such a semantics is not suited for investigations that observe intermediate states, such as invariant techniques. In this paper, a construction is proposed that automatically generates a small-step SOS

  2. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  3. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  4. Evolution of brain region volumes during artificial selection for relative brain size.

    Science.gov (United States)

    Kotrschal, Alexander; Zeng, Hong-Li; van der Bijl, Wouter; Öhman-Mägi, Caroline; Kotrschal, Kurt; Pelckmans, Kristiaan; Kolm, Niclas

    2017-12-01

    The vertebrate brain shows an extremely conserved layout across taxa. Still, the relative sizes of separate brain regions vary markedly between species. One interesting pattern is that larger brains seem associated with increased relative sizes only of certain brain regions, for instance telencephalon and cerebellum. Until now, the evolutionary association between separate brain regions and overall brain size has been based on comparative evidence and remains experimentally untested. Here, we test the evolutionary response of brain regions to directional selection on brain size in guppies (Poecilia reticulata) selected for large and small relative brain size. In these animals, artificial selection led to a fast response in relative brain size, while body size remained unchanged. We use micro-computed tomography to investigate how the volumes of 11 main brain regions respond to selection for larger versus smaller brains. We found no differences in relative brain region volumes between large- and small-brained animals and only minor sex-specific variation. Also, selection did not change allometric scaling between brain and brain region sizes. Our results suggest that brain regions respond similarly to strong directional selection on relative brain size, which indicates that brain anatomy variation in contemporary species most likely stems from direct selection on key regions. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.

  5. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges, which ultimately seems to be of greater importance than the sheer data volume.

  6. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general-relativity aspects such as the change of the gravitational constant with time, and quantum theory considerations. It is argued that the big-bang picture is not as soundly established, either theoretically or observationally, as is usually claimed, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  7. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  8. Pathways for Small Molecule Delivery to the Central Nervous System Across the Blood-Brain Barrier

    OpenAIRE

    Mikitsh, John L; Chacko, Ann-Marie

    2014-01-01

    The treatment of central nervous system (CNS) disease has long been difficult due to the ineffectiveness of drug delivery across the blood-brain barrier (BBB). This review summarizes important concepts of the BBB in normal versus pathophysiology and how this physical, enzymatic, and efflux barrier provides necessary protection to the CNS during drug delivery, consequently making treatment challenging. Small molecules account for the vast majority of available CNS drugs primarily due to their abi...

  9. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity, and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data, and electronic health record data. We underline the challenging issues of big data privacy and security. Regarding big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  10. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  11. Detection of small traumatic hemorrhages using a computer-generated average human brain CT.

    Science.gov (United States)

    Afzali-Hashemi, Liza; Hazewinkel, Marieke; Tjepkema-Cloostermans, Marleen C; van Putten, Michel J A M; Slump, Cornelis H

    2018-04-01

    Computed tomography is a standard diagnostic imaging technique for patients with traumatic brain injury (TBI). A limitation is the poor-to-moderate sensitivity for small traumatic hemorrhages. A pilot study using an automatic method to detect hemorrhages [Formula: see text] in diameter in patients with TBI is presented. We have created an average image from 30 normal noncontrast CT scans that were automatically aligned using deformable image registration as implemented in Elastix software. Subsequently, the average image was aligned to the scans of TBI patients, and the hemorrhages were detected by a voxelwise subtraction of the average image from the CT scans of nine TBI patients. An experienced neuroradiologist and a radiologist in training assessed the presence of hemorrhages in the final images and determined the false positives and false negatives. The 9 CT scans contained 67 small hemorrhages, of which 97% were correctly detected by our system. The neuroradiologist detected three false positives, and the radiologist in training found two false positives. For one patient, our method showed a hemorrhagic contusion that was originally missed. Comparing individual CT scans with a computed average may assist the physicians in detecting small traumatic hemorrhages in patients with TBI.
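The voxelwise subtraction step described above can be sketched as follows. The registration itself (done with Elastix in the study) is assumed to have already happened, and the function name, synthetic volumes, and `threshold_hu` value are illustrative, not taken from the paper:

```python
import numpy as np

def detect_hyperdensities(patient_hu, average_hu, threshold_hu=20):
    """Voxelwise subtraction of an average brain CT from a patient scan.

    Assumes both volumes are already spatially aligned and expressed in
    Hounsfield units; voxels where the patient exceeds the average by more
    than `threshold_hu` are flagged as candidate hemorrhages.
    """
    diff = patient_hu.astype(float) - average_hu.astype(float)
    return diff > threshold_hu

# synthetic example: a flat average volume and one small hyperdense lesion
average = np.zeros((32, 32, 32))
patient = average.copy()
patient[10:13, 10:13, 10:13] += 60   # 3x3x3 bright blob standing in for a bleed
mask = detect_hyperdensities(patient, average)
```

In practice the candidate mask would still be reviewed by a radiologist, as in the study's evaluation.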

  12. Survival prognostic factors for patients with synchronous brain oligometastatic non-small-cell lung carcinoma receiving local therapy

    Science.gov (United States)

    Bai, Hao; Xu, Jianlin; Yang, Haitang; Jin, Bo; Lou, Yuqing; Wu, Dan; Han, Baohui

    2016-01-01

    Introduction Clinical evidence for patients with synchronous brain oligometastatic non-small-cell lung carcinoma is limited. We aimed to summarize the clinical data of these patients to explore the survival prognostic factors for this population. Methods From September 1995 to July 2011, patients with 1–3 synchronous brain oligometastases, who were treated with stereotactic radiosurgery (SRS) or surgical resection as the primary treatment, were identified at Shanghai Chest Hospital. Results A total of 76 patients (22 patients underwent brain surgery as primary treatment and 54 patients received SRS) were available for survival analysis. The overall survival (OS) for patients treated with SRS and brain surgery as the primary treatment were 12.6 months (95% confidence interval [CI] 10.3–14.9) and 16.4 months (95% CI 8.8–24.1), respectively (adjusted hazard ratio =0.59, 95% CI 0.33–1.07, P=0.08). Among 76 patients treated with SRS or brain surgery, 21 patients who underwent primary tumor resection did not experience a significantly improved OS (16.4 months, 95% CI 9.6–23.2), compared with those who did not undergo resection (11.9 months, 95% CI 9.7–14.0; adjusted hazard ratio =0.81, 95% CI 0.46–1.44, P=0.46). Factors associated with survival benefits included stage I–II of primary lung tumor and solitary brain metastasis. Conclusion There was no significant difference in OS for patients with synchronous brain oligometastasis receiving SRS or surgical resection. Among this population, the number of brain metastases and stage of primary lung disease were the factors associated with a survival benefit. PMID:27471395

  13. From the "little brain" gastrointestinal infection to the "big brain" neuroinflammation: a proposed fast axonal transport pathway involved in multiple sclerosis.

    Science.gov (United States)

    Deretzi, Georgia; Kountouras, Jannis; Grigoriadis, Nikolaos; Zavos, Christos; Chatzigeorgiou, Stavros; Koutlas, Evangelos; Tsiptsios, Iakovos

    2009-11-01

    The human central nervous system (CNS) is targeted by different pathogens which, apart from intranasal inoculation or trafficking into the brain through infected blood cells, may use a distinct pathway to bypass the blood-brain barrier: retrograde axonal transport from the gastrointestinal tract (GIT) through sensory or motor fibres. This article mainly reviews recent findings on the similarities between the enteric nervous system (often called the "little brain") and the CNS, and on GIT axonal transport of infections resulting in CNS neuroinflammation. We herein propose that the GIT is the vulnerable area through which pathogens (such as Helicobacter pylori) may influence the brain and induce multiple sclerosis pathologies, mainly via fast axonal transport along the afferent neurones connecting the GIT to the brain.

  14. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore, we compare and contrast the two geometries throughout.

  15. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined 'open data' and 'big data', these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of 'order' and 'relationality'. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two

  16. Constructing fine-granularity functional brain network atlases via deep convolutional autoencoder.

    Science.gov (United States)

    Zhao, Yu; Dong, Qinglin; Chen, Hanbo; Iraji, Armin; Li, Yujie; Makkie, Milad; Kou, Zhifeng; Liu, Tianming

    2017-12-01

    State-of-the-art functional brain network reconstruction methods such as independent component analysis (ICA) or sparse coding of whole-brain fMRI data can effectively infer many thousands of volumetric brain network maps from a large number of human brains. However, due to the variability of individual brain networks and the large scale of such networks needed for statistically meaningful group-level analysis, it is still a challenging and open problem to derive group-wise common networks as network atlases. Inspired by the superior spatial pattern description ability of deep convolutional neural networks (CNNs), a novel deep 3D convolutional autoencoder (CAE) network is designed here to extract spatial brain network features effectively, based on which an Apache Spark-enabled computational framework is developed for fast clustering of a large number of network maps into fine-granularity atlases. To evaluate this framework, 10 resting state networks (RSNs) were manually labeled from the sparsely decomposed networks of Human Connectome Project (HCP) fMRI data and 5275 network training samples were obtained in total. The deep CAE models are then trained on these functional networks' spatial maps, and the learned features are used to refine the original 10 RSNs into 17 network atlases that possess fine-granularity functional network patterns. Interestingly, it turned out that some manually mislabeled outliers in the training networks could be corrected by the deep CAE-derived features. More importantly, fine granularities of networks can be identified, and they reveal unique network patterns specific to different brain task states. By further applying this method to a dataset from a mild traumatic brain injury study, we show that the technique can effectively identify abnormal small networks in brain injury patients in comparison with controls. In general, our work presents a promising deep learning and big data analysis solution for modeling functional connectomes, with
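The clustering stage of the pipeline above can be illustrated with a toy sketch. For brevity it uses scikit-learn's KMeans on raw map vectors in place of the paper's CAE-derived features and Spark-based clustering, and all sizes and variable names are made up:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# stand-in for spatial maps of individual functional networks (n_maps x n_voxels);
# the paper would cluster features from a trained 3D convolutional autoencoder instead
prototypes = rng.normal(size=(17, 500))
maps = np.vstack([p + 0.1 * rng.normal(size=(300, 500)) for p in prototypes])

# cluster the maps; each cluster centroid plays the role of one network atlas
km = KMeans(n_clusters=17, n_init=10, random_state=0).fit(maps)
atlases = km.cluster_centers_
```

In the real framework, the maps come from sparse decomposition of HCP fMRI data and the feature vectors from the trained CAE, but the group-to-atlas reduction follows this shape.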

  17. Validation of the RTOG recursive partitioning analysis (RPA) classification for small-cell lung cancer-only brain metastases

    International Nuclear Information System (INIS)

    Videtic, Gregory M.M.; Adelstein, David J.; Mekhail, Tarek M.; Rice, Thomas W.; Stevens, Glen H.J.; Lee, S.-Y.; Suh, John H.

    2007-01-01

    Purpose: The Radiation Therapy Oncology Group (RTOG) developed a prognostic classification based on a recursive partitioning analysis (RPA) of patient pretreatment characteristics from three completed brain metastases randomized trials. Clinical trials for patients with brain metastases generally exclude small-cell lung cancer (SCLC) cases. We hypothesize that the RPA classes are valid in the setting of SCLC brain metastases. Methods and Materials: A retrospective review of 154 SCLC patients with brain metastases treated between April 1983 and May 2005 was performed. RPA criteria used for class assignment were Karnofsky performance status (KPS), primary tumor status (PT), presence of extracranial metastases (ED), and age. Results: Median survival was 4.9 months, with 4 patients (2.6%) alive at analysis. Median follow-up was 4.7 months (range, 0.3-40.3 months). Median age was 65 (range, 42-85 years). Median KPS was 70 (range, 40-100). The number of patients with controlled PT and no ED was 20 (13%); with controlled PT and ED, 27 (18%); with uncontrolled PT and no ED, 34 (22%); and with uncontrolled PT and ED, 73 (47%). RPA class distribution was: Class I: 8 (5%); Class II: 96 (62%); Class III: 51 (33%). Median survivals (in months) by RPA class were: Class I: 8.6; Class II: 4.2; Class III: 2.3 (p = 0.0023). Conclusions: Survivals for SCLC-only brain metastases replicate the results from the RTOG RPA classification. These classes are therefore valid for brain metastases from SCLC, support the inclusion of SCLC patients in future brain metastases trials, and may also serve as a basis for historical comparisons.
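The class assignment rule used above is simple enough to state as code. This is a paraphrase of the published RTOG RPA criteria (Gaspar et al., 1997), not code from the study:

```python
def rtog_rpa_class(kps, age, primary_controlled, extracranial_mets):
    """RTOG recursive partitioning analysis (RPA) class for brain metastases.

    Class I:   KPS >= 70, age < 65, controlled primary, no extracranial disease.
    Class III: KPS < 70.
    Class II:  everyone else.
    """
    if kps < 70:
        return 3
    if age < 65 and primary_controlled and not extracranial_mets:
        return 1
    return 2
```

For example, a 60-year-old with KPS 90, a controlled primary tumor, and no extracranial metastases falls in Class I, the most favorable group.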

  18. Big Data Science: Opportunities and Challenges to Address Minority Health and Health Disparities in the 21st Century

    Science.gov (United States)

    Zhang, Xinzhi; Pérez-Stable, Eliseo J.; Bourne, Philip E.; Peprah, Emmanuel; Duru, O. Kenrik; Breen, Nancy; Berrigan, David; Wood, Fred; Jackson, James S.; Wong, David W.S.; Denny, Joshua

    2017-01-01

    Addressing minority health and health disparities has been a missing piece of the puzzle in Big Data science. This article focuses on three priority opportunities that Big Data science may offer to the reduction of health and health care disparities. One opportunity is to incorporate standardized information on demographic and social determinants in electronic health records in order to target ways to improve quality of care for the most disadvantaged populations over time. A second opportunity is to enhance public health surveillance by linking geographical variables and social determinants of health for geographically defined populations to clinical data and health outcomes. Third and most importantly, Big Data science may lead to a better understanding of the etiology of health disparities and understanding of minority health in order to guide intervention development. However, the promise of Big Data needs to be considered in light of significant challenges that threaten to widen health disparities. Care must be taken to incorporate diverse populations to realize the potential benefits. Specific recommendations include investing in data collection on small sample populations, building a diverse workforce pipeline for data science, actively seeking to reduce digital divides, developing novel ways to assure digital data privacy for small populations, and promoting widespread data sharing to benefit under-resourced minority-serving institutions and minority researchers. With deliberate efforts, Big Data presents a dramatic opportunity for reducing health disparities but without active engagement, it risks further widening them. PMID:28439179

  19. Big Data Science: Opportunities and Challenges to Address Minority Health and Health Disparities in the 21st Century.

    Science.gov (United States)

    Zhang, Xinzhi; Pérez-Stable, Eliseo J; Bourne, Philip E; Peprah, Emmanuel; Duru, O Kenrik; Breen, Nancy; Berrigan, David; Wood, Fred; Jackson, James S; Wong, David W S; Denny, Joshua

    2017-01-01

    Addressing minority health and health disparities has been a missing piece of the puzzle in Big Data science. This article focuses on three priority opportunities that Big Data science may offer to the reduction of health and health care disparities. One opportunity is to incorporate standardized information on demographic and social determinants in electronic health records in order to target ways to improve quality of care for the most disadvantaged populations over time. A second opportunity is to enhance public health surveillance by linking geographical variables and social determinants of health for geographically defined populations to clinical data and health outcomes. Third and most importantly, Big Data science may lead to a better understanding of the etiology of health disparities and understanding of minority health in order to guide intervention development. However, the promise of Big Data needs to be considered in light of significant challenges that threaten to widen health disparities. Care must be taken to incorporate diverse populations to realize the potential benefits. Specific recommendations include investing in data collection on small sample populations, building a diverse workforce pipeline for data science, actively seeking to reduce digital divides, developing novel ways to assure digital data privacy for small populations, and promoting widespread data sharing to benefit under-resourced minority-serving institutions and minority researchers. With deliberate efforts, Big Data presents a dramatic opportunity for reducing health disparities but without active engagement, it risks further widening them.

  20. Predicting functional impairment in brain tumor surgery: the Big Five and the Milan Complexity Scale.

    Science.gov (United States)

    Ferroli, Paolo; Broggi, Morgan; Schiavolin, Silvia; Acerbi, Francesco; Bettamio, Valentina; Caldiroli, Dario; Cusin, Alberto; La Corte, Emanuele; Leonardi, Matilde; Raggi, Alberto; Schiariti, Marco; Visintini, Sergio; Franzini, Angelo; Broggi, Giovanni

    2015-12-01

    (Nagelkerke R(2) = 0.286). A grading scale was obtained with scores ranging between 0 and 8. Worsened patients showed mean total scores that were significantly higher than the improved/unchanged scores (3.24 ± 1.55 vs 1.47 ± 1.58; p < 0.001). Finally, a grid was developed to show the risk of worsening after surgery for each total score: scores higher than 3 are suggestive of a worse clinical outcome. CONCLUSIONS Through the evaluation of the 5 aforementioned parameters, the Big Five, the Milan Complexity Scale enables neurosurgeons to estimate the risk of a negative clinical course after brain tumor surgery and share these data with the patient. Furthermore, the Milan Complexity Scale could be used for research and educational purposes and better health system management.

  1. Constraining antimatter domains in the early universe with big bang nucleosynthesis.

    Science.gov (United States)

    Kurki-Suonio, H; Sihvola, E

    2000-04-24

    We consider the effect of a small-scale matter-antimatter domain structure on big bang nucleosynthesis and place upper limits on the amount of antimatter in the early universe. For small domains, which annihilate before nucleosynthesis, this limit comes from the underproduction of 4He. For larger domains, the limit comes from 3He overproduction. Since most of the 3He from p̄-4He annihilation is itself annihilated, the main source of primordial 3He is the photodisintegration of 4He by the electromagnetic cascades initiated by the annihilation.

  2. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data analysis focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are distinct not only from the big data of other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
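Propensity score analysis, mentioned above as a way to mitigate confounding, can be sketched on synthetic data. The data-generating process and the nearest-neighbor matching rule here are illustrative choices, not from the review:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                                     # confounder
t = (rng.random(n) < 1 / (1 + np.exp(-x))).astype(int)     # treatment depends on x
y = 2 * t + x + rng.normal(size=n)                         # true treatment effect is 2

# the naive difference in means is biased because x drives both t and y
naive = y[t == 1].mean() - y[t == 0].mean()

# estimate propensity scores and match each treated unit to the nearest control
ps = LogisticRegression().fit(x.reshape(-1, 1), t).predict_proba(x.reshape(-1, 1))[:, 1]
controls = np.flatnonzero(t == 0)
matched = [controls[np.abs(ps[controls] - ps[i]).argmin()]
           for i in np.flatnonzero(t == 1)]
att = (y[t == 1] - y[matched]).mean()   # matched estimate, closer to the true 2
```

The matched estimate recovers the simulated effect more closely than the naive contrast, which is the core promise of the method; in real medical data, of course, unmeasured confounding remains a limitation.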

  3. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data analysis focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are distinct not only from the big data of other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  4. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.
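    The minimax screening described above (minimum multiple correlation with Big Five markers, maximum reliability) can be sketched as a simple filter. The thresholds and cluster values below are hypothetical, not those used by Saucier and Goldberg:

    ```python
    # Hypothetical sketch of minimax screening of candidate clusters:
    # keep clusters with low multiple correlation (R) with Big Five
    # markers and high internal reliability (alpha).

    def screen(clusters, max_r=0.30, min_alpha=0.70):
        """Return names of clusters that look Big Five-independent."""
        return [name for name, r, alpha in clusters
                if r <= max_r and alpha >= min_alpha]

    candidates = [
        ("Religiousness", 0.20, 0.85),  # (name, multiple R, reliability)
        ("Humor",         0.42, 0.78),
        ("Height",        0.10, 0.90),
    ]
    print(screen(candidates))  # ['Religiousness', 'Height']
    ```
    
    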

  5. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

    Full Text Available The term 'big data' was coined to refer to the extensive volume of data that cannot be managed by traditional data-handling methods or techniques. Big data plays an indispensable role in various fields, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, healthcare, and stocks. Big data analytics is the process of examining big data to reveal hidden patterns, unknown correlations, and other useful information that can be used to make better decisions. Interest in big data has grown continually because of its rapid development and its broad range of applications. The open-source Apache Hadoop framework, written in Java and running on the Linux operating system, was used. The primary contribution of this work is to present an effective and free solution for big data applications in a distributed environment, describing its advantages and demonstrating its ease of use. An analytical review of new developments in big data technology is also emerging as a need. Healthcare is one of the world's foremost concerns. Big data in healthcare refers to electronic health data sets related to patient health and well-being. Data in the healthcare domain is growing beyond the management capacity of healthcare organizations and is expected to increase substantially in the coming years.
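    The MapReduce pattern that Hadoop executes at scale can be sketched in a few lines of plain Python. This toy word count is illustrative only; a real Hadoop job would distribute the map, shuffle, and reduce phases across a cluster:

    ```python
    # Toy MapReduce word count: map -> shuffle (group by key) -> reduce.
    from collections import defaultdict

    def map_phase(docs):
        """Emit (word, 1) pairs, the 'map' step."""
        for doc in docs:
            for word in doc.split():
                yield word.lower(), 1

    def shuffle(pairs):
        """Group values by key, the 'shuffle' step."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Sum the counts per word, the 'reduce' step."""
        return {key: sum(values) for key, values in groups.items()}

    docs = ["big data", "big insights from big data"]
    print(reduce_phase(shuffle(map_phase(docs))))
    # {'big': 3, 'data': 2, 'insights': 1, 'from': 1}
    ```
    
    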

  6. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  7. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  8. Changes of Brain Glucose Metabolism in the Pretreatment Patients with Non-Small Cell Lung Cancer: A Retrospective PET/CT Study.

    Science.gov (United States)

    Zhang, Weishan; Ning, Ning; Li, Xianjun; Niu, Gang; Bai, Lijun; Guo, Youmin; Yang, Jian

    2016-01-01

    Tumor-to-brain communication has been emphasized by recent converging evidence. This study aimed to compare brain glucose metabolism between patients with non-small cell lung cancer (NSCLC) and control subjects. NSCLC patients prior to oncotherapy, and control subjects without malignancy confirmed by 6 months of follow-up, underwent resting-state 18F-fluoro-D-glucose (FDG) PET/CT. Normalized FDG metabolism was calculated as the signal intensity ratio of each brain region to the whole brain. Brain glucose metabolism was compared between NSCLC patients and the control group using two-sample t-tests and multivariate tests in statistical parametric mapping (SPM) software. Compared with the control subjects (n = 76), NSCLC patients showed brain regions of both significant glucose hyper- and hypometabolism: the hypermetabolism regions lay in brain signal transduction pathways, and the hypometabolism regions (the left superior parietal lobule, bilateral inferior parietal lobule and left fusiform gyrus) lay in the dorsal attention network and visuospatial function areas. Changes of brain glucose metabolism exist in NSCLC patients prior to oncotherapy, which might be attributed to lung cancer-related visceral sympathetic activation and decreased dorsal attention network function.
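    The normalization step described above (each region's signal divided by the whole-brain value) can be sketched as follows; the region names and intensities are hypothetical, not values from the study:

    ```python
    # Sketch of region-to-whole-brain signal normalization, using the
    # mean of the regional signals as a stand-in for the whole-brain value.
    # Intensities are invented for illustration.

    def normalize(regional_signal):
        """Return each region's signal as a ratio to the whole-brain mean."""
        whole_brain_mean = sum(regional_signal.values()) / len(regional_signal)
        return {region: s / whole_brain_mean
                for region, s in regional_signal.items()}

    signal = {"posterior_cingulate": 1.2,
              "superior_parietal": 0.8,
              "fusiform": 1.0}
    ratios = normalize(signal)
    print(round(ratios["posterior_cingulate"], 2))  # 1.2
    ```

    Ratios above 1.0 would then be candidates for relative hypermetabolism and ratios below 1.0 for hypometabolism, subject to the statistical testing the study describes.
    
    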

  9. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  10. Forget the hype or reality. Big data presents new opportunities in Earth Science.

    Science.gov (United States)

    Lee, T. J.

    2015-12-01

    Earth science is arguably one of the most mature scientific disciplines, constantly acquiring, curating, and utilizing a large volume of data of diverse variety. We dealt with big data before "big data" was a term. For example, while developing the EOS program in the 1980s, the EOS Data and Information System (EOSDIS) was developed to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has remained a shining example of modern science data systems over the past two decades. With the explosion of the internet, the use of social media, and the spread of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system. The open-source communities quickly followed up and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day, and there is a significant shortage of algorithms and knowledgeable manpower to mine the data. In response, the federal government created big data programs that fund research, development, and training to tackle these new challenges. Meanwhile, compared with the internet data explosion, the Earth science big data problem has become quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline and solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiry.

  11. The Sounds of the Little and Big Bangs

    Directory of Open Access Journals (Sweden)

    Edward Shuryak

    2017-11-01

    Full Text Available Studies on heavy ion collisions have discovered that tiny fireballs of a new phase of matter—quark gluon plasma (QGP)—undergo an explosion, called the Little Bang. In spite of its small size, not only is it well described by hydrodynamics, but even small perturbations on top of the explosion turned out to be well described by hydrodynamical sound modes. The cosmological Big Bang also went through phase transitions, related to Quantum Chromodynamics (QCD) and electroweak/Higgs symmetry breaking, which are also expected to produce sounds. We discuss their subsequent evolution and a hypothetical inverse acoustic cascade amplifying their amplitude. Ultimately, the collision of two sound waves leads to the formation of gravitational waves. We briefly discuss how these gravitational waves can be detected.

  12. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research, and advance education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education, as well as critically exploring the perils of applying a data-driven approach to education. Despite the claimed value of the...

  13. Complementary social science? Quali-quantitative experiments in a Big Data world

    Directory of Open Access Journals (Sweden)

    Anders Blok

    2014-08-01

    Full Text Available The rise of Big Data in the social realm poses significant questions at the intersection of science, technology, and society, including in terms of how new large-scale social databases are currently changing the methods, epistemologies, and politics of social science. In this commentary, we address such epochal (“large-scale”) questions by way of a (situated) experiment: at the Danish Technical University in Copenhagen, an interdisciplinary group of computer scientists, physicists, economists, sociologists, and anthropologists (including the authors) is setting up a large-scale data infrastructure, meant to continually record the digital traces of social relations among an entire freshman class of students (N > 1000). At the same time, fieldwork is carried out on friendship (and other) relations amongst the same group of students. On this basis, the question we pose is the following: what kind of knowledge is obtained on this social micro-cosmos via the Big (computational, quantitative) and Small (embodied, qualitative) Data, respectively? How do the two relate? Invoking Bohr’s principle of complementarity as analogy, we hypothesize that social relations, as objects of knowledge, depend crucially on the type of measurement device deployed. At the same time, however, we also expect new interferences and polyphonies to arise at the intersection of Big and Small Data, provided that these are, so to speak, mixed with care. These questions, we stress, are important not only for the future of social science methods but also for the type of societal (self-)knowledge that may be expected from new large-scale social databases.

  14. Water Loss in Small Settlements

    OpenAIRE

    Mindaugas Rimeika; Anželika Jurkienė

    2014-01-01

    The main performance indicators of a water supply system include water quality and safety, continuity of service, adequate pressure, and low water loss. Most foreign and local projects on reducing water loss have been carried out in the water supply systems of large cities; however, the specifics of small settlements differ from those of big cities. Differences can be observed not only in infrastructure development and technical indicators but also in the features of wa...

  15. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  16. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  17. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is, and will increasingly be, used as a tool for everything that happens both online and offline. Online is, of course, its natural habitat: big data pervades this medium, offering many advantages and real help to all consumers. In this paper we discuss big data as an asset in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architectural principles of this technology. The most important benefit discussed in this paper is presented in the cloud section.

  18. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  19. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  20. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  1. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use it. Big data is the subject of many publications. Some pay attention to the many possibilities of big data; others warn us of its consequences. This special

  2. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  3. The Human Genome Project: big science transforms biology and medicine.

    Science.gov (United States)

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  4. Generating a hot big bang via a change in topology

    International Nuclear Information System (INIS)

    Kandrup, H.E.

    1990-01-01

    This paper uses ideas developed recently in semiclassical quantum gravity to argue that many qualitative features of the hot big bang generally assumed in cosmology may be explained by the hypothesis that, interpreted semiclassically, the universe tunnelled into being via a quantum fluctuation from a small (Planck-sized), topologically complex entity to a topologically trivial entity (like a Friedmann universe) that rapidly grew to a more macroscopic size.

  5. Generating a hot big bang via a change in topology

    Energy Technology Data Exchange (ETDEWEB)

    Kandrup, H.E. (Florida Univ., Gainesville, FL (USA). Space Astronomy Lab.); Masur, P.O. (Institute for Fundamental Theory, Univ. of Florida, Gainesville, FL (US))

    1990-08-01

    This paper uses ideas developed recently in semiclassical quantum gravity to argue that many qualitative features of the hot big bang generally assumed in cosmology may be explained by the hypothesis that, interpreted semiclassically, the universe tunnelled into being via a quantum fluctuation from a small (Planck-sized), topologically complex entity to a topologically trivial entity (like a Friedmann universe) that rapidly grew to a more macroscopic size.

  6. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  7. Use of aspiration method for collecting brain samples for rabies diagnosis in small wild animals.

    Science.gov (United States)

    Iamamoto, K; Quadros, J; Queiroz, L H

    2011-02-01

    In developing countries such as Brazil, where canine rabies is still a considerable problem, samples from wildlife species are infrequently collected and submitted for rabies screening. A collaborative study involving environmental biologists and veterinarians was established for rabies epidemiological research in a specific ecological area of Sao Paulo State, Brazil. The brains of wild animals must be collected without damaging the skull, because skull measurements are important for identifying the captured species. For this purpose, samples from bats and small mammals were collected by aspiration, inserting a plastic pipette into the brain through the foramen magnum. Beyond its progressive adoption in various studies, this method could foster collaborative research between wildlife scientists and rabies epidemiologists, thus improving rabies surveillance. © 2009 Blackwell Verlag GmbH.

  8. Perspectives on making big data analytics work for oncology.

    Science.gov (United States)

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the five-V hallmarks of big data. This data comprises a heterogeneous mixture of patient demographics, radiation/chemotherapy dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization, and varied analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitutes a mix of structured (tabulated) and unstructured (electronic documents) formats that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications, where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, the Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter these effects ranging from
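    The Yule-Simpson reversal mentioned above can be shown with a short numeric example. The counts below are the classic textbook (kidney-stone) numbers, used here purely for illustration, not data from the article:

    ```python
    # Numeric illustration of the Yule-Simpson reversal: treatment A has
    # the higher success rate in every subgroup, yet the lower rate after
    # pooling, because the subgroup sizes are unbalanced.

    def rate(success, total):
        return success / total

    # (successes, total) for treatments A and B, split by disease stage
    early = {"A": (81, 87),   "B": (234, 270)}
    late  = {"A": (192, 263), "B": (55, 80)}

    for stage, arms in (("early", early), ("late", late)):
        print(stage, rate(*arms["A"]) > rate(*arms["B"]))  # A better in both

    pooled_a = rate(81 + 192, 87 + 263)
    pooled_b = rate(234 + 55, 270 + 80)
    print(pooled_a > pooled_b)  # False: the comparison reverses after pooling
    ```

    This is exactly the trap that uncontrolled aggregation over large observational datasets invites, and why stratification or adjustment for the lurking variable matters.
    
    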

  9. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; bench-marking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  10. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big-bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: The big bang is replaced by a quantum bounce; the `horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in presence of a standard inflation potential, initial conditions are naturally set for a long, slow roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  11. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  12. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, has opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  13. Big five general contractors dominate civil construction market of nuclear power plants

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The Japanese construction industry is a key industry accounting for about 20% of GNP, with investment in construction amounting to 51,200 billion yen in fiscal 1984. The industry comprises 515,000 firms employing about 5.5 million workers. 99.4% of these firms are enterprises capitalized at less than 100 million yen, and most of them are small self-employed businesses. The Construction Business Law provides that those who wish to engage in construction must obtain a permit from the Construction Ministry or from a local prefectural governor. There are five major ("big five") and seven sub-major construction companies in Japan, and the big five have formed tie-up relations with three nuclear reactor manufacturers. In 1983, 76 civil engineering and construction companies recorded nuclear-related sales amounting to 330.9 billion yen, equivalent to 21% of total nuclear-related sales. The construction of nuclear power plants, the characteristics of that construction, and the activities of the big five in the field of nuclear industry are reported. (Kako, I.)

  14. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    Del Barrio et al. (2004) attempted more than 10 years ago to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of the Neuroticism factor. They found associations between empathy and the domains Openness, Agreeableness, Conscientiousness, and Extraversion. In our data, women likewise score significantly higher on both empathy and the Big Five ...

  15. Oscillatory Activity in the Infant Brain and the Representation of Small Numbers.

    Science.gov (United States)

    Leung, Sumie; Mareschal, Denis; Rowsell, Renee; Simpson, David; Iaria, Leon; Grbic, Amanda; Kaufman, Jordy

    2016-01-01

    Gamma-band oscillatory activity (GBA) is an established neural signature of sustained occluded object representation in infants and adults. However, it is not yet known whether the magnitude of GBA in the infant brain reflects the quantity of occluded items held in memory. To examine this, we compared the GBA of 6- to 8-month-old infants during occlusion periods after the representation of two objects vs. that of one object. We found that maintaining a representation of two objects during occlusion resulted in significantly greater GBA relative to maintaining a single object. Further, this enhancement was located in the right occipital region, which is consistent with previous object representation research in adults and infants. We conclude that enhanced GBA reflects neural processes underlying infants' representation of small numbers.

  16. Brain penetrant small molecule 18F-GnRH receptor (GnRH-R) antagonists: Synthesis and preliminary positron emission tomography imaging in rats

    International Nuclear Information System (INIS)

    Olberg, Dag E.; Bauer, Nadine; Andressen, Kjetil W.; Hjørnevik, Trine; Cumming, Paul; Levy, Finn O.; Klaveness, Jo; Haraldsen, Ira; Sutcliffe, Julie L.

    2016-01-01

    Introduction: The gonadotropin releasing hormone receptor (GnRH-R) has a well-described neuroendocrine function in the anterior pituitary. However, little is known about its function in the central nervous system (CNS), where it is most abundantly expressed in hippocampus and amygdala. Since peptide ligands based upon the endogenous decapeptide GnRH do not pass the blood–brain barrier, we are seeking a high-affinity small molecule GnRH-R ligand suitable for brain imaging by positron emission tomography. We have previously reported the radiosynthesis and in vitro evaluation of two novel [18F]fluorinated GnRH-R ligands belonging to the furamide class of antagonists, with molecular weight less than 500 Da. We now extend this work using palladium coupling for the synthesis of four novel radioligands, with putatively reduced polar surface area and hydrophilicity relative to the two previously described compounds, and report the uptake of these 18F-labeled compounds in the brain of living rats. Methods: We synthesized reference standards of the small molecule GnRH-R antagonists as well as mesylate precursors for 18F-labeling. The antagonists were tested for binding affinity to both human and rat GnRH-R. Serum and blood stability in vitro and in vivo were studied. Biodistribution and PET imaging studies were performed in male rats in order to assess brain penetration in vivo. Results: A palladium coupling methodology served for the synthesis of four novel fluorinated furamide GnRH receptor antagonists with reduced heteroatom count. Radioligand binding assays in vitro revealed subnanomolar affinity of the new fluorinated compounds for both human and rat GnRH-R. The 18F-GnRH antagonists were synthesized from the corresponding mesylate precursors in 5–15% overall radiochemical yield. The radiolabeled compounds demonstrated good in vivo stability. PET imaging with the 18F-radiotracers in naive rats showed good permeability into brain and rapid washout, but absence of

  17. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Science.gov (United States)

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing bacterial immunoglobulin-like (Big) domains. The function of proteins containing the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to a Big domain, which would provide a novel functional role for proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved regions covering three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four of the selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting the similarity of their folds. We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is part of many proteins in the bacterial kingdom, we suggest a possible function of these proteins via Ca²+ binding.
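    The micromolar dissociation constants reported above translate directly into fractional site occupancy via the standard single-site equilibrium binding model. A minimal sketch follows; only the 2-4 µM Kd range comes from the abstract, and the free-Ca²⁺ concentrations are made-up example values:

```python
# Single-site equilibrium binding: fraction bound = [Ca2+] / (Kd + [Ca2+]).
# Illustrative sketch only; the Kd range (2-4 uM) is taken from the abstract,
# while the free-Ca2+ concentrations below are made-up example values.

def fraction_bound(ca_free_m: float, kd_m: float) -> float:
    """Fractional occupancy of a single Ca2+ site at equilibrium (concentrations in M)."""
    return ca_free_m / (kd_m + ca_free_m)

if __name__ == "__main__":
    for kd in (2e-6, 4e-6):            # the reported 2-4 uM range
        for ca in (1e-6, 1e-5, 1e-4):  # example free-Ca2+ levels
            print(f"Kd={kd:.0e} M, [Ca2+]={ca:.0e} M -> "
                  f"bound fraction {fraction_bound(ca, kd):.2f}")
```

    At [Ca²⁺] equal to Kd the site is half occupied, which is why a 2-4 µM Kd implies substantial occupancy at typical extracellular calcium levels.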

  18. Act-Frequency Signatures of the Big Five.

    Science.gov (United States)

    Chapman, Benjamin P; Goldberg, Lewis R

    2017-10-01

    The traditional focus of work on personality and behavior has tended toward "major outcomes" such as health or antisocial behavior, or small sets of behaviors observable over short periods in laboratories or in convenience samples. In a community sample, we examined a wide set (400) of mundane, incidental or "everyday" behavioral acts, the frequencies of which were reported over the past year. An exploratory methodology similar to genomic approaches (relying on the False Discovery Rate) revealed 26 prototypical acts for Intellect, 24 acts for Extraversion, 13 for Emotional Stability, nine for Conscientiousness, and six for Agreeableness. Many links were consistent with general intuition; for instance, low Conscientiousness with work and procrastination. Some of the most robust associations, however, were for acts too specific for a priori hypotheses. For instance, Extraversion was strongly associated with telling dirty jokes, Intellect with "loung[ing] around [the] house without clothes on", and Agreeableness with singing in the shower. Frequency categories for these acts changed with marked non-linearity across Big Five Z-scores. Findings may help ground trait scores in emblematic acts, and enrich understanding of mundane or common behavioral signatures of the Big Five.

  19. The big data potential of epidemiological studies for criminology and forensics.

    Science.gov (United States)

    DeLisi, Matt

    2018-07-01

    Big data, the analysis of original datasets with large samples ranging from ∼30,000 to one million participants to mine unexplored data, has been under-utilized in criminology. However, there have been recent calls for greater synthesis between epidemiology and criminology, and a small number of scholars have utilized epidemiological studies that were designed to measure alcohol and substance use to harvest behavioral and psychiatric measures that relate to the study of crime. These studies have been helpful in producing knowledge about the most serious, violent, and chronic offenders, but applications to more pathological forensic populations are lagging. Unfortunately, big data relating to crime and justice are restricted and limited to criminal justice purposes and not easily available to the research community. Thus, the study of criminal and forensic populations is limited in terms of data volume, velocity, and variety. Additional forays into epidemiology, increased use of available online judicial and correctional data, and unknown new frontiers are needed to bring criminology up to speed in the big data arena. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  20. A small, portable, battery-powered brain-computer interface system for motor rehabilitation.

    Science.gov (United States)

    McCrimmon, Colin M; Ming Wang; Silva Lopes, Lucas; Wang, Po T; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An H

    2016-08-01

    Motor rehabilitation using brain-computer interface (BCI) systems may facilitate functional recovery in individuals after stroke or spinal cord injury. Nevertheless, these systems are typically ill-suited for widespread adoption due to their size, cost, and complexity. In this paper, a small, portable, and extremely cost-efficient BCI system built around a custom electroencephalogram (EEG) amplifier array, a microcontroller, and a touchscreen is presented. The system's performance was tested using a movement-related BCI task in 3 able-bodied subjects with minimal previous BCI experience. Specifically, subjects were instructed to alternate between relaxing and dorsiflexing their right foot, while their EEG was acquired and analyzed in real-time by the BCI system to decode their underlying movement state. The EEG signals acquired by the custom amplifier array were similar to those acquired by a commercial amplifier (maximum correlation coefficient ρ=0.85). During real-time BCI operation, the average correlation between instructional cues and decoded BCI states across all subjects (ρ=0.70) was comparable to that of full-size BCI systems. Small, portable, and inexpensive BCI systems such as the one reported here may promote widespread adoption of BCI-based movement rehabilitation devices in stroke and spinal cord injury populations.
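    The ρ values quoted above are plain Pearson correlations between the cue sequence and the decoded state sequence. A minimal sketch of that metric follows; the cue and decoder vectors below are made-up examples, not the study's data:

```python
# Pearson correlation between an instructional cue sequence and a decoded
# BCI state sequence. The two binary vectors are made-up illustrative data.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

cues    = [0, 0, 1, 1, 0, 0, 1, 1]  # 0 = relax, 1 = dorsiflex
decoded = [0, 0, 1, 0, 0, 1, 1, 1]  # hypothetical decoder output
rho = pearson(cues, decoded)
```

    A perfect decoder would give ρ = 1; decoding errors pull ρ toward zero, which is how figures like the ρ=0.70 above arise.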

  1. Semantic Web Technologies and Big Data Infrastructures: SPARQL Federated Querying of Heterogeneous Big Data Stores

    OpenAIRE

    Konstantopoulos, Stasinos; Charalambidis, Angelos; Mouchakis, Giannis; Troumpoukis, Antonis; Jakobitsch, Jürgen; Karkaletsis, Vangelis

    2016-01-01

    The ability to cross-link large scale data with each other and with structured Semantic Web data, and the ability to uniformly process Semantic Web and other data adds value to both the Semantic Web and to the Big Data community. This paper presents work in progress towards integrating Big Data infrastructures with Semantic Web technologies, allowing for the cross-linking and uniform retrieval of data stored in both Big Data infrastructures and Semantic Web data. The technical challenges invo...

  2. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  3. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete, with equally spaced levels defining a quantum of the volume. This discreteness may imply a foamy structure of spacetime at the semiclassical level, which may be detectable in astro-cosmological observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  4. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines by up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
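    The sorted-array-plus-bit-array idea behind inequality joins can be caricatured in a few lines. This is a deliberately simplified sketch of the core trick, not the published IEJoin algorithm; the predicate (a < a' and b > b') and the example rows are illustrative choices:

```python
# Toy sketch of the sorted-array + bit-array idea behind inequality joins
# (a simplification for illustration, not the IEJoin algorithm itself).
# Finds all pairs (i, j) with rows[i].a < rows[j].a and rows[i].b > rows[j].b.

def ineq_join(rows):
    n = len(rows)
    order_a = sorted(range(n), key=lambda k: rows[k][0])   # ascending by a
    b_order = sorted(range(n), key=lambda k: rows[k][1])   # ascending by b
    rank_b = {k: r for r, k in enumerate(b_order)}
    seen = [False] * n   # bit-array over b-ranks of tuples with smaller a
    pairs = []
    i = 0
    while i < n:
        j = i            # group tuples sharing one a-value: predicate is strict
        while j < n and rows[order_a[j]][0] == rows[order_a[i]][0]:
            j += 1
        group = order_a[i:j]
        for t in group:  # partners: already-seen (smaller a), strictly larger b
            for r in range(rank_b[t] + 1, n):
                if seen[r] and rows[b_order[r]][1] > rows[t][1]:
                    pairs.append((b_order[r], t))
        for t in group:  # only now mark this a-group as seen
            seen[rank_b[t]] = True
        i = j
    return pairs
```

    A nested-loop join over the same rows yields identical pairs; the two sorted orders reduce each tuple's candidate scan to a single suffix of the bit-array, which is the search-space reduction the abstract alludes to.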

  5. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  6. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  7. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  8. Hepatic encephalopathy: Ever closer to its big bang.

    Science.gov (United States)

    Souto, Pablo A; Marcotegui, Ariel R; Orbea, Lisandro; Skerl, Juan; Perazzo, Juan Carlos

    2016-11-14

    Hepatic encephalopathy (HE) is a neuropsychiatric disorder that commonly complicates the course of patients with liver disease. Despite the fact that the syndrome was probably first recognized hundreds of years ago, the exact pathogenesis still remains unclear. Minimal hepatic encephalopathy (MHE) is the earliest form of HE and is estimated to affect more than 75% of patients with liver cirrhosis. It is characterized by cognitive impairment, predominantly of attention, reactivity and integrative function, with very subtle clinical manifestations. The development of MHE is associated with worsening driving skills, impaired daily activities and increased overall mortality. Skeletal muscle has the ability to shift from an ammonia-producing to an ammonia-detoxifying organ; owing to its large size, it becomes the main ammonia-detoxifying organ in chronic liver failure, and muscular glutamine synthetase becomes important as the metabolic activity of the failing liver and brain declines. The gut is the major glutamine-consuming and ammonia-producing organ in the body. Hepatocellular dysfunction due to liver disease results in impaired clearance of ammonia and altered inter-organ ammonia trafficking. Intestinal bacteria can also represent an extra source of ammonia production, and in cirrhosis, small intestinal bacterial overgrowth and dysbiosis can be observed. In the study of HE, to get close to MHE is to get closer to its big bang; and from here, to travel less-transited roads such as skeletal muscle and intestine is to get closer still. The aim of this editorial is to expose this road for further and deeper work.

  9. Intelligent Test Mechanism Design of Worn Big Gear

    Directory of Open Access Journals (Sweden)

    Hong-Yu LIU

    2014-10-01

    Full Text Available With the continuous development of the national economy, big gears have been widely applied in the metallurgy and mining industries, where they play an important role. In practical production, abrasion and breakage of big gears occur frequently, disrupting normal production and causing unnecessary economic loss. An intelligent inspection method for worn big gears is proposed, aimed mainly at the constraints of high production cost, long production cycle and labour-intensive manual repair welding. The measurement equations of the involute spur gear were transformed from their original polar-coordinate form into rectangular coordinates. The measurement principle for big-gear abrasion is introduced, a detection schematic diagram is given, and the method for realizing the detection path is described. An OADM12 laser sensor was selected, and detection of the worn area of the big gear was realized by the detection mechanism. Measured data from unworn and worn gears were fed into a calculation program written in Visual Basic, from which the abrasion quantity of the big gear is obtained. This provides a feasible method for intelligent inspection and intelligent repair welding of worn big gears.
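    The polar-to-rectangular transformation of the involute equations mentioned above can be sketched for a standard involute profile; the base radius and roll angle in the example are made-up values, not taken from the paper:

```python
# Polar vs. rectangular parametrization of an involute tooth profile
# (illustrative sketch; the base radius below is a made-up example value).
import math

def involute_xy(rb, t):
    """Rectangular coordinates of the involute of a base circle of radius rb,
    parametrized by the roll angle t (radians)."""
    x = rb * (math.cos(t) + t * math.sin(t))
    y = rb * (math.sin(t) - t * math.cos(t))
    return x, y

def involute_polar(rb, t):
    """Equivalent polar form: radius and polar angle of the same point."""
    rho = rb * math.sqrt(1.0 + t * t)  # distance from the gear centre
    phi = t - math.atan(t)             # the involute function inv(t)
    return rho, phi
```

    Both parametrizations describe the same point: sqrt(x² + y²) equals ρ, and atan2(y, x) equals the involute function t − arctan(t), which is the identity the coordinate transformation rests on.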

  10. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today are often too large and heterogeneous, and change too quickly, to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves (e.g. in social networks), and digitalization is ever increasing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  11. Evaluation of the P-glycoprotein- and breast cancer resistance protein-mediated brain penetration of 11C-labeled topotecan using small-animal positron emission tomography

    International Nuclear Information System (INIS)

    Yamasaki, Tomoteru; Fujinaga, Masayuki; Kawamura, Kazunori; Hatori, Akiko; Yui, Joji; Nengaki, Nobuki; Ogawa, Masanao; Yoshida, Yuichiro; Wakizaka, Hidekatsu; Yanamoto, Kazuhiko; Fukumura, Toshimitsu; Zhang Mingrong

    2011-01-01

    Introduction: Topotecan (TPT) is a camptothecin derivative and an anticancer drug acting as a topoisomerase-I-specific inhibitor. However, TPT cannot penetrate the blood-brain barrier. In this study, we synthesized a new positron emission tomography (PET) probe, [11C]TPT, to evaluate the P-glycoprotein (Pgp)- and breast cancer resistance protein (BCRP)-mediated brain penetration of [11C]TPT using small-animal PET. Methods: [11C]TPT was synthesized by the reaction of a desmethyl precursor with [11C]CH3I. An in vitro study using [11C]TPT was carried out in MES-SA and doxorubicin-resistant MES-SA/Dx5 cells in the presence or absence of elacridar, a specific inhibitor of Pgp and BCRP. The biodistribution of [11C]TPT was determined using small-animal PET and the dissection method in mice. Results: Transport of [11C]TPT to the extracellular side was observed in MES-SA/Dx5 cells, which express Pgp and BCRP at high levels. This transport was inhibited by coincubation with elacridar. In Mdr1a/b(-/-)Bcrp1(-/-) mice, PET results indicated that the brain uptake of [11C]TPT was about two times higher than that in wild-type mice. Similarly, the brain penetration of [11C]TPT in wild-type mice was increased by treatment with elacridar. The radioactivity in the brain of elacridar-treated mice was maintained at a certain level after the injection of [11C]TPT, although the radioactivity in the blood decreased with time. Conclusions: We demonstrated the increased brain penetration of [11C]TPT upon genetic deficiency or pharmacological inhibition of Pgp and BCRP function using small-animal PET in mice.

  12. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  13. Making big sense from big data in toxicology by read-across.

    Science.gov (United States)

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in the safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches, and some other high-content technologies, leave us with big data; the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances, and ensure that the available data are fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among others.
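    Read-across as described, borrowing a property value from structurally similar substances, can be caricatured as a similarity-weighted average over nearest analogues. The feature sets, property values, and the Tanimoto/weighted-mean choices below are illustrative assumptions, not the method of the REACH-across tool:

```python
# Caricature of similarity-based read-across: predict a property of a target
# substance as the similarity-weighted mean over its most similar analogues.
# Feature sets and values are made-up; this is NOT the REACH-across method.

def tanimoto(a: frozenset, b: frozenset) -> float:
    """Similarity of two structural-feature sets (Jaccard/Tanimoto index)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def read_across(target, analogues, k=3):
    """analogues: list of (feature_set, property_value) pairs.
    Returns the similarity-weighted mean value of the k nearest analogues."""
    scored = sorted(((tanimoto(target, f), v) for f, v in analogues),
                    reverse=True)[:k]
    total = sum(s for s, _ in scored)
    return sum(s * v for s, v in scored) / total

target = frozenset({"A", "B", "C"})                 # hypothetical feature set
analogues = [(frozenset({"A", "B", "C"}), 1.0),     # identical analogue
             (frozenset({"A", "D"}), 3.0)]          # weak analogue
pred = read_across(target, analogues)
```

    The identical analogue dominates the prediction, which mirrors the point in the abstract: structural similarity is what creates confidence in the intrapolated value.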

  14. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The task of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  15. Correlation between the progressive cytoplasmic expression of a novel small heat shock protein (Hsp16.2) and malignancy in brain tumors

    International Nuclear Information System (INIS)

    Pozsgai, Eva; Gomori, Eva; Szigeti, Andras; Boronkai, Arpad; Gallyas, Ferenc Jr; Sumegi, Balazs; Bellyei, Szabolcs

    2007-01-01

    Small heat shock proteins are molecular chaperones that protect proteins against stress-induced aggregation. They have also been found to have anti-apoptotic activity and to play a part in the development of tumors. Recently, we identified a new small heat shock protein, Hsp16.2, which displayed increased expression in neuroectodermal tumors. Our aim was to investigate the expression of Hsp16.2 in different types of brain tumors and to correlate its expression with the histological grade of the tumor. Immunohistochemistry with a polyclonal antibody to Hsp16.2 was carried out on formalin-fixed, paraffin-wax-embedded sections using the streptavidin-biotin method. 91 samples were examined and their histological grade was defined. According to the intensity of Hsp16.2 immunoreactivity, scores of none (-), low (+), moderate (++) or high (+++) were assigned. Immunoblotting was carried out on 30 brain tumor samples using SDS-polyacrylamide gel electrophoresis and Western blotting. Low-grade (grades 1-2) brain tumors displayed low cytoplasmic Hsp16.2 immunoreactivity, grade 3 tumors showed moderate cytoplasmic staining, while high-grade (grade 4) tumors exhibited intensive cytoplasmic Hsp16.2 staining. Immunoblotting supported these results. Normal brain tissue acted as a negative control, since its cytoplasm did not stain for Hsp16.2. There was a positive correlation between the level of Hsp16.2 expression and the level of anaplasia in the different malignant tissue samples. Hsp16.2 expression was directly correlated with the histological grade of brain tumors; Hsp16.2 may therefore have potential as a tumor marker.

  16. Brain regions involved in subprocesses of small-space episodic object-location memory: a systematic review of lesion and functional neuroimaging studies.

    Science.gov (United States)

    Zimmermann, Kathrin; Eschen, Anne

    2017-04-01

    Object-location memory (OLM) enables us to keep track of the locations of objects in our environment. The neurocognitive model of OLM (Postma, A., Kessels, R. P. C., & Van Asselen, M. (2004). The neuropsychology of object-location memory. In G. L. Allen (Ed.), Human spatial memory: Remembering where (pp. 143-160). Mahwah, NJ: Lawrence Erlbaum, Postma, A., Kessels, R. P. C., & Van Asselen, M. (2008). How the brain remembers and forgets where things are: The neurocognition of object-location memory. Neuroscience & Biobehavioral Reviews, 32, 1339-1345. doi: 10.1016/j.neubiorev.2008.05.001 ) proposes that distinct brain regions are specialised for different subprocesses of OLM (object processing, location processing, and object-location binding; categorical and coordinate OLM; egocentric and allocentric OLM). It was based mainly on findings from lesion studies. However, recent episodic memory studies point to a contribution of additional or different brain regions to object and location processing within episodic OLM. To evaluate and update the neurocognitive model of OLM, we therefore conducted a systematic literature search for lesion as well as functional neuroimaging studies contrasting small-space episodic OLM with object memory or location memory. We identified 10 relevant lesion studies and 8 relevant functional neuroimaging studies. We could confirm some of the proposals of the neurocognitive model of OLM, but also differing hypotheses from episodic memory research, about which brain regions are involved in the different subprocesses of small-space episodic OLM. In addition, we were able to identify new brain regions as well as important research gaps.

  17. The paradoxical genesis of too-big-to-fail

    Directory of Open Access Journals (Sweden)

    Thomas S. Umlauft

    2014-03-01

    Full Text Available At least since the Global Financial Crisis of 2007-2009, the problem of too-big-to-fail (TBTF) has received widespread attention. The research conducted in this context has, however, generally focused on the econometric aspect and on the contribution of the TBTF doctrine to the financial crisis of 2007-2009, while the economic-historical approach has been confined to tracing the doctrine to its first appearance. This paper attempts to fill this gap in the academic literature by offering an explanation for why, as opposed to how, the TBTF doctrine developed. It identifies the US population's distrust of, and at times hostility toward, the concentration of power in large financial institutions as the causal factor leading to the TBTF phenomenon. The resulting socially non-optimal regulation favoured a fragmented and fragile banking system based on small unit banks at the cost of more diversified branch banks. The Great Depression starkly highlighted the deep structural flaws of the US banking system. At the same time, however, it caused a shift in public opinion, which had generally been opposed to deposit insurance, and thereby aligned the public interest with that of small banks, which would profit most from deposit insurance. The newly acquired public and political support enabled weak unit banks to lobby successfully against reforming the banking structure and instead for the adoption of federal deposit insurance. However, the Federal Deposit Insurance Corporation (FDIC) only addressed the symptoms of the weak banking industry, not its causes. Moreover, the strongly biased FDIC policies have generally favoured creditors at large banks, which ultimately led to the TBTF doctrine which, in turn, provided banks with a non-technical incentive to grow in size in order to gain TBTF protection. Initially aimed at preserving the US financial landscape based on small unit banks, the FDIC as the main conduit for TBTF rescues

  18. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  19. Estimation of extremely small field radiation dose for brain stereotactic radiotherapy using the Vero4DRT system.

    Science.gov (United States)

    Nakayama, Shinichi; Monzen, Hajime; Onishi, Yuichi; Kaneshige, Soichiro; Kanno, Ikuo

    2018-06-01

    The purpose of this study was a dosimetric validation of the Vero4DRT for brain stereotactic radiotherapy (SRT) with extremely small fields calculated by the treatment planning system (TPS) iPlan (Ver. 4.5.1; algorithm XVMC). Measured and calculated data (e.g. percentage depth dose [PDD], dose profile, and point dose) were compared for small square fields of 30 × 30, 20 × 20, 10 × 10 and 5 × 5 mm² using ionization chambers of 0.01 or 0.04 cm³ and a diamond detector. Dose verifications were performed using an ionization chamber and radiochromic film (EBT3; the equivalent field sizes used were 8.2, 8.7, 8.9, 9.5, and 12.9 mm²) for five brain SRT cases irradiated with dynamic conformal arcs. The PDDs and dose profiles for the measured and calculated data were in good agreement for fields larger than or equal to 10 × 10 mm² when an appropriate detector was chosen. The dose differences for point doses in fields of 30 × 30, 20 × 20, 10 × 10 and 5 × 5 mm² were +0.48%, +0.56%, -0.52%, and +11.2%, respectively. In the dose verifications for the brain SRT plans, the mean dose difference between the calculated and measured doses was -0.35% (range, -0.94% to +0.47%), with the average pass rates for the gamma index under the 3%/2 mm criterion being 96.71%, 93.37%, and 97.58% for the coronal, sagittal, and axial planes, respectively. The Vero4DRT system provides accurate delivery of radiation dose for small fields larger than or equal to 10 × 10 mm². Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
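    The point-dose differences and the 3%/2 mm gamma analysis reported above can be illustrated with a simplified calculation. The sketch below is a generic 1-D global gamma evaluation, not the film-analysis software used in the study; the profiles and grid spacing are hypothetical:

```python
def dose_diff_percent(measured, calculated):
    """Point-dose difference relative to the measured value, in percent."""
    return 100.0 * (calculated - measured) / measured

def gamma_pass_rate(ref, eval_, spacing_mm, dose_crit=0.03, dta_mm=2.0):
    """1-D global gamma analysis (default 3%/2 mm).
    ref, eval_: dose profiles sampled on the same grid; spacing_mm: grid step.
    The dose criterion is a fraction of the reference maximum (global norm)."""
    d_crit = dose_crit * max(ref)
    passed = 0
    for i, dr in enumerate(ref):
        gamma_sq = min(
            ((j - i) * spacing_mm / dta_mm) ** 2 + ((de - dr) / d_crit) ** 2
            for j, de in enumerate(eval_)
        )
        if gamma_sq <= 1.0:
            passed += 1
    return 100.0 * passed / len(ref)
```

For identical profiles every point trivially passes, giving a 100% pass rate; real film-vs-TPS comparisons like those above land a few percent below that.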

  20. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article analyses 'Big Data', which has been discussed over the last 10 years. The reasons for and factors behind the issue are revealed. It is shown that the factors creating the 'Big Data' issue have existed for quite a long time and have, from time to time, caused informational barriers. Such barriers were successfully overcome through science and technology. The conducted analysis classifies the 'Big Data' issue as a form of information barrier. Framed this way, the issue may be addressed correctly, and it encourages the development of scientific and computational methods.

  1. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  2. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  3. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  4. Small Quaternary Inhibitors K298 and K524: Cholinesterases Inhibition, Absorption, Brain Distribution, and Toxicity.

    Science.gov (United States)

    Karasova, Jana Zdarova; Hroch, Milos; Musilek, Kamil; Kuca, Kamil

    2016-02-01

    Inhibitors of acetylcholinesterase (AChE) may be used in the treatment of various cholinergic deficits, among them myasthenia gravis (MG). This paper describes the first in vivo data for the promising small quaternary inhibitors K298 and K524: an acute toxicity study, cholinesterase inhibition, absorption, and blood-brain barrier penetration. The newly prepared AChE inhibitors (bis-quinolinium and quinolinium compounds) possess a positive charge in the molecule, which should restrict their anti-AChE action to a peripheral effect. HPLC-MS was used for determination of real plasma and brain concentrations in the pharmacokinetic part of the study, and standard non-compartmental analysis was performed. The maximum plasma concentrations were attained at 30 min (K298; 928.76 ± 115.20 ng/ml) and 39 min (K524; 812.40 ± 54.96 ng/ml) after i.m. administration. Both compounds are in fact able to reach the central nervous system. It seems that the difference in the CNS distribution profile depends on an active efflux system: the K524 brain concentration was actively decreased to below an effective level, whereas K298 progressively accumulated in brain tissue. Peripheral AChE inhibitors are still the first-line treatment in mild forms of MG. Commonly prescribed carbamates have many severe side effects related to AChE carbamylation, so the search for new treatment strategies remains important. Unlike carbamates, these new compounds target AChE via apparent π-π or π-cation interactions beside the AChE catalytic site.
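    The standard non-compartmental analysis mentioned above reduces a concentration-time profile to summary parameters such as Cmax, Tmax and AUC (linear trapezoidal rule). A minimal sketch with a hypothetical i.m. profile (not the study's raw data):

```python
def nca_summary(times_min, conc_ng_ml):
    """Basic non-compartmental summary: Cmax, Tmax and AUC(0-t)
    computed with the linear trapezoidal rule."""
    cmax = max(conc_ng_ml)
    tmax = times_min[conc_ng_ml.index(cmax)]
    auc = sum(
        (t2 - t1) * (c1 + c2) / 2.0
        for (t1, c1), (t2, c2) in zip(
            zip(times_min, conc_ng_ml), zip(times_min[1:], conc_ng_ml[1:])
        )
    )
    return cmax, tmax, auc

# Hypothetical plasma profile after i.m. dosing (min, ng/ml)
t = [0, 15, 30, 60, 120, 240]
c = [0.0, 500.0, 930.0, 700.0, 350.0, 90.0]
cmax, tmax, auc = nca_summary(t, c)  # Cmax 930 ng/ml at Tmax 30 min
```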

  5. Harnessing the Power of Big Data to Improve Graduate Medical Education: Big Idea or Bust?

    Science.gov (United States)

    Arora, Vineet M

    2018-06-01

    With the advent of electronic medical records (EMRs) fueling the rise of big data, the use of predictive analytics, machine learning, and artificial intelligence is touted as a transformational tool to improve clinical care. While major investments are being made in using big data to transform health care delivery, little effort has been directed toward exploiting big data to improve graduate medical education (GME). Because our current system relies on faculty observations of competence, it is not unreasonable to ask whether big data, in the form of clinical EMRs and other novel data sources, can answer questions of importance in GME, such as when a resident is ready for independent practice. The timing is ripe for such a transformation. A recent National Academy of Medicine report called for reforms to how GME is delivered and financed. While many agree on the need to ensure that GME meets our nation's health needs, there is little consensus on how to measure the performance of GME in meeting this goal. During a workshop at the National Academy of Medicine on GME outcomes and metrics in October 2017, a key theme emerged: big data holds great promise to inform GME performance at individual, institutional, and national levels. In this Invited Commentary, several examples are presented, such as using big data to inform clinical experience and provide clinically meaningful data to trainees, and using novel data sources, including ambient data, to better measure the quality of GME training.

  6. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  7. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  8. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches

  9. The Big Five of Personality and structural imaging revisited: a VBM - DARTEL study.

    Science.gov (United States)

    Liu, Wei-Yin; Weber, Bernd; Reuter, Martin; Markett, Sebastian; Chu, Woei-Chyn; Montag, Christian

    2013-05-08

    The present study focuses on the neurostructural foundations of human personality. In a large sample of 227 healthy individuals (168 women and 59 men), we used MRI to examine the relationship between personality traits and both regional gray and white matter volume, while controlling for age and sex. Personality was assessed using the German version of the NEO Five-Factor Inventory, which measures individual differences in the 'Big Five' of personality: extraversion, neuroticism, agreeableness, conscientiousness, and openness to experience. In contrast to most previous studies on neural correlates of the Big Five, we used improved processing strategies: white and gray matter were independently assessed by segmentation steps before data analysis, and customized sex-specific templates were built using diffeomorphic anatomical registration through exponentiated Lie algebra (DARTEL). Our results did not show significant correlations between any dimension of the Big Five and regional gray matter volume. However, among others, higher conscientiousness scores correlated significantly with reductions in regional white matter volume in different brain areas, including the right insula, putamen, caudate, and left fusiform gyrus. These correlations were driven by the female subsample. The present study suggests that many results from the literature on the neurostructural basis of personality should be reviewed carefully, considering what the results show when the sample size is larger, imaging methods are rigorously applied, and sex- and age-related effects are controlled.

  10. A study on optimal scan conditions of big bore multi-slice computed tomography based on radiation dose and image noise

    International Nuclear Information System (INIS)

    Lee, J. S.; Ye, S. J.; Kim, E. H.

    2011-01-01

    The newly introduced Big Bore computed tomography (CT) scanner makes it possible to increase the tube current-time product (mAs) to compensate for the image degradation caused by its larger gantry opening, but no sound guideline exists for doing so. The objective of this paper is to derive optimal scan conditions for the Big Bore CT scanner, mainly relating to the dose of diagnostic CT. The weighted CT dose index (CTDIw) was estimated for five typical protocols: head and neck, brain, paediatric, chest and abdomen. Noise was analysed in a circle of 1 or 2 cm diameter in each CT image slice. The results showed that the measured CTDIw values generally follow the theoretical rule at all scanning conditions of every protocol. Although image noise decreases as mAs increases, the analysed image noise follows the theoretical rule only in specific protocols. This phenomenon is presumed to result from the photon energy spectra arriving at the detection system of the Big Bore scanner. (authors)
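    The weighted CT dose index estimated above combines the centre and periphery measurements made in a standard CTDI phantom. A minimal sketch of the conventional formula (the example values in the comments are illustrative, not the study's measurements):

```python
def ctdi_w(ctdi100_center_mgy, ctdi100_periphery_mgy):
    """Weighted CT dose index: one-third of the centre measurement plus
    two-thirds of the (averaged) peripheral measurement, in mGy."""
    return ctdi100_center_mgy / 3.0 + 2.0 * ctdi100_periphery_mgy / 3.0

def ctdi_vol(ctdi_w_mgy, pitch):
    """Volume CTDI for helical scanning: CTDIw divided by the pitch."""
    return ctdi_w_mgy / pitch

# e.g. centre 30 mGy, periphery 36 mGy -> CTDIw = 34 mGy;
# at pitch 0.5 the volume CTDI doubles to 68 mGy.
```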

  11. smallWig: parallel compression of RNA-seq WIG files.

    Science.gov (United States)

    Wang, Zhiying; Weissman, Tsachy; Milenkovic, Olgica

    2016-01-15

    We developed a new lossless compression method for WIG data, named smallWig, offering the best known compression rates for RNA-seq data and featuring random access functionalities that enable visualization, summary statistics analysis and fast queries from the compressed files. Our approach results in order of magnitude improvements compared with bigWig and ensures compression rates only a fraction of those produced by cWig. The key features of the smallWig algorithm are statistical data analysis and a combination of source coding methods that ensure high flexibility and make the algorithm suitable for different applications. Furthermore, for general-purpose file compression, the compression rate of smallWig approaches the empirical entropy of the tested WIG data. For compression with random query features, smallWig uses a simple block-based compression scheme that introduces only a minor overhead in the compression rate. For archival or storage space-sensitive applications, the method relies on context mixing techniques that lead to further improvements of the compression rate. Implementations of smallWig can be executed in parallel on different sets of chromosomes using multiple processors, thereby enabling desirable scaling for future transcriptome Big Data platforms. The development of next-generation sequencing technologies has led to a dramatic decrease in the cost of DNA/RNA sequencing and expression profiling. RNA-seq has emerged as an important and inexpensive technology that provides information about whole transcriptomes of various species and organisms, as well as different organs and cellular communities. The vast volume of data generated by RNA-seq experiments has significantly increased data storage costs and communication bandwidth requirements. Current compression tools for RNA-seq data such as bigWig and cWig either use general-purpose compressors (gzip) or suboptimal compression schemes that leave significant room for improvement. 
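    The block-based compression with random access described for smallWig can be illustrated with a toy scheme: compress fixed-size blocks independently and keep an index of compressed offsets, so a query decompresses only the block it needs. This sketch uses zlib and is not the smallWig implementation:

```python
import zlib

def compress_blocks(values, block_size=1000):
    """Compress a numeric track in fixed-size blocks and build an index
    of (track start, byte offset, compressed length) per block."""
    blocks, index, offset = [], [], 0
    for start in range(0, len(values), block_size):
        raw = ",".join(repr(v) for v in values[start:start + block_size]).encode()
        comp = zlib.compress(raw, 9)
        index.append((start, offset, len(comp)))
        blocks.append(comp)
        offset += len(comp)
    return b"".join(blocks), index

def query(blob, index, block_size, position):
    """Random access: decompress only the block containing `position`."""
    start, offset, length = index[position // block_size]
    raw = zlib.decompress(blob[offset:offset + length]).decode()
    block = [float(x) for x in raw.split(",")]
    return block[position - start]
```

Per-block compression sacrifices a little ratio (each block resets the compressor's state) in exchange for O(block) rather than O(file) query cost, the same trade-off the abstract describes.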

  12. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Science.gov (United States)

    2013-01-17

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N259; FXRS1265030000-134-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive... significant impact (FONSI) for the environmental assessment (EA) for Big Stone National Wildlife Refuge...

  13. Big domains are novel Ca²+-binding modules: evidence from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Directory of Open Access Journals (Sweden)

    Rajeev Raman

    Full Text Available BACKGROUND: Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing bacterial immunoglobulin-like (Big) domains. The function of proteins containing the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. PRINCIPAL FINDINGS: We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted LigA3, LigA4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (LigA9) and 10th (LigA10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The LigA9 and LigA10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to a Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting similarity in their fold. CONCLUSIONS: We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that, despite differences in sequence, a Big motif binds Ca²+. This work thus raises a strong possibility of classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.
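    With dissociation constants of 2-4 µM as reported, the occupancy of a single Ca²+-binding site follows the standard one-site binding isotherm, fraction bound = [Ca]/(Kd + [Ca]). A minimal sketch (illustrative, assuming simple single-site binding with no cooperativity):

```python
def fraction_bound(ca_conc_um, kd_um):
    """Fraction of a single-site domain occupied at a given free Ca2+
    concentration, from the binding isotherm [Ca] / (Kd + [Ca])."""
    return ca_conc_um / (kd_um + ca_conc_um)

# A domain with Kd = 3 uM is half-occupied at 3 uM free Ca2+ and
# ~90% occupied at tenfold excess (30 uM).
half = fraction_bound(3.0, 3.0)
high = fraction_bound(30.0, 3.0)
```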

  14. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future are connected in a non-paradoxical manner in the universes described by the new symmetric solutions.

  15. Oscillatory activity in the infant brain and the representation of small numbers

    Directory of Open Access Journals (Sweden)

    Sumie eLeung

    2016-02-01

    Full Text Available Gamma-band oscillatory activity (GBA) is an established neural signature of sustained occluded-object representation in infants and adults. However, it is not yet known whether the magnitude of GBA in the infant brain reflects the quantity of occluded items held in memory. To examine this, we compared the GBA of 6- to 8-month-old infants during occlusion periods after the representation of two objects versus that of one object. We found that maintaining a representation of two objects during occlusion resulted in significantly greater GBA relative to maintaining a single object. Further, this enhancement was located in the right occipital region, which is consistent with previous object representation research in adults and infants. We conclude that enhanced GBA reflects neural processes underlying infants' representation of small numbers.

  16. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  17. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    Full Text Available As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) have recently become an indispensable part of 'Big Data', the collection, storage, transmission and analysis of big sensor data attract considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, sensor data is modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs is efficiently aggregated to reduce network resource consumption and that sensor data privacy is effectively protected to meet ever-growing application requirements.
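    The idea of nodes modifying their readings so that the sink can still recover the aggregate can be illustrated with a toy additive-masking scheme, in which pairwise random masks cancel in the sum. This is a generic sketch of the privacy-preserving aggregation principle, not the Sca-PBDA protocol itself:

```python
import random

def masked_reports(readings, modulus=10**6):
    """Each node perturbs its reading with pairwise random masks: for each
    pair (i, j), node i adds r and node j subtracts the same r, so the
    masks sum to zero over the whole cluster."""
    n = len(readings)
    masks = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            r = random.randrange(modulus)
            masks[i][j] = r   # node i adds r
            masks[j][i] = -r  # node j subtracts r
    return [(readings[i] + sum(masks[i])) % modulus for i in range(n)]

def sink_aggregate(reports, modulus=10**6):
    """The masks cancel pairwise, leaving the true sum (mod modulus)."""
    return sum(reports) % modulus
```

Individual reports look uniformly random to an eavesdropper, yet their sum is exact; the cost is the pairwise mask agreement, which clustering (as in the paper) keeps local.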

  18. Combined baseline and one-month changes in big endothelin-1 and brain natriuretic peptide plasma concentrations predict clinical outcomes in patients with left ventricular dysfunction after acute myocardial infarction: Insights from the Eplerenone Post-Acute Myocardial Infarction Heart Failure Efficacy and Survival Study (EPHESUS) study.

    Science.gov (United States)

    Olivier, A; Girerd, N; Michel, J B; Ketelslegers, J M; Fay, R; Vincent, J; Bramlage, P; Pitt, B; Zannad, F; Rossignol, P

    2017-08-15

    Increased levels of neuro-hormonal biomarkers predict poor prognosis in patients with acute myocardial infarction (AMI) complicated by left ventricular systolic dysfunction (LVSD). The predictive value of repeated (one-month interval) measurements of brain natriuretic peptide (BNP) and big endothelin-1 (BigET-1) was investigated in patients with LVSD after AMI. In a sub-study of the Eplerenone Post-Acute Myocardial Infarction Heart Failure Efficacy and Survival Study (EPHESUS) trial, BNP and BigET-1 were measured at baseline and at 1 month in 476 patients. When included in the same Cox regression model, baseline BNP (p=0.0003) and BigET-1 (p=0.026), as well as the relative changes (after 1 month) from baseline in BNP (p=0.049) and BigET-1 (p=0.045), were predictive of the composite of cardiovascular death or hospitalization for worsening heart failure. Adding baseline and changes in BigET-1 to baseline and changes in BNP led to a significant increase in prognostic reclassification, as assessed by the integrated discrimination improvement index (5.0%, p=0.01 for the primary endpoint). Both increased baseline values and one-month changes in BigET-1 concentrations were shown to be associated with adverse clinical outcomes, independently of baseline BNP levels and their one-month changes, in patients after recent AMI complicated by LVSD. This novel result may be of clinical interest, since such combined biomarker assessment could improve risk stratification and open new avenues for biomarker-guided targeted therapies. In the present study, we report for the first time, in a population of patients with reduced LVEF after AMI and signs or symptoms of congestive HF, that increased baseline values of BNP and BigET-1, as well as a further rise of these markers over the first month after AMI, were independently predictive of future cardiovascular events.
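    The integrated discrimination improvement (IDI) used above measures how much adding a marker raises predicted risk among patients who had events relative to those who did not. A minimal sketch with hypothetical predicted risks (not the EPHESUS data):

```python
def idi(p_old_events, p_new_events, p_old_nonevents, p_new_nonevents):
    """Integrated discrimination improvement: the gain in mean predicted
    risk among events minus the gain among non-events."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(p_new_events) - mean(p_old_events)) - (
        mean(p_new_nonevents) - mean(p_old_nonevents)
    )

# Hypothetical: adding the marker raises mean predicted risk in events
# from 0.30 to 0.36 but in non-events only from 0.10 to 0.11 -> IDI 0.05.
example_idi = idi([0.30, 0.30], [0.36, 0.36], [0.10, 0.10], [0.11, 0.11])
```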

  19. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it also comes with all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  20. Epidemiology in wonderland: Big Data and precision medicine.

    Science.gov (United States)

    Saracci, Rodolfo

    2018-03-01

    Big Data and precision medicine, two major contemporary challenges for epidemiology, are critically examined from two different angles. In Part 1, Big Data collected for research purposes (Big research Data) and Big Data used for research although collected for other primary purposes (Big secondary Data) are discussed in the light of the fundamental common requirement of data validity, which prevails over "bigness". Precision medicine is treated by developing the key point that high relative risks are as a rule required to make a variable, or combination of variables, suitable for predicting disease occurrence, outcome or response to treatment; the commercial proliferation of allegedly predictive tests of unknown or poor validity is commented upon. Part 2 proposes a "wise epidemiology" approach to: (a) choosing, in a context imprinted by Big Data and precision medicine, epidemiological research projects actually relevant to population health; (b) training epidemiologists; (c) investigating the impact on clinical practice and the doctor-patient relationship of the influx of Big Data and computerized medicine; and (d) clarifying whether today "health" may be redefined, as some maintain, in purely technological terms.

  1. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  2. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  3. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  4. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  5. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data, relating to its size, heterogeneity, complexity, and the unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  6. BIG DATA IN TAMIL: OPPORTUNITIES, BENEFITS AND CHALLENGES

    OpenAIRE

    R.S. Vignesh Raj; Babak Khazaei; Ashik Ali

    2015-01-01

    This paper gives an overall introduction to big data and attempts to introduce big data in Tamil. It discusses the potential opportunities, benefits and likely challenges from a specifically Tamil and Tamil Nadu perspective. The paper also makes an original contribution by proposing terminology for 'big data' in Tamil. It further suggests a few areas to explore using big data in Tamil, along the lines of the Tamil Nadu Government's 'Vision 2023'. Whilst big data has something to offer everyone, it ...

  7. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Big Data’s Role in Precision Public Health

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts. PMID:29594091

  9. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)]

    1979-06-28

    The recently published report 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with special reference to any future inquiry that may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors, and an attempt is made to tackle the problem that participation in a public inquiry comes far too late to be objective. The author feels that the CSS/OCPU report is a constructive contribution to the debate about big-technology inquiries, but that it fails to understand the deeper currents in the economic and political structure of technology which so strongly influence the consequences of whatever formal procedures are evolved.

  10. Differential transgene expression in brain cells in vivo and in vitro from AAV-2 vectors with small transcriptional control units

    International Nuclear Information System (INIS)

    Kuegler, S.; Lingor, P.; Schoell, U.; Zolotukhin, S.; Baehr, M.

    2003-01-01

    Adeno-associated virus (AAV)-based vectors are promising tools for gene therapy applications in several organs, including the brain, but are limited by their small genome size. Two short promoters, the human synapsin 1 gene promoter (hSYN) and the murine cytomegalovirus immediate early promoter (mCMV), were evaluated in bicistronic AAV-2 vectors for their expression profiles in cultured primary brain cells and in the rat brain. Whereas transgene expression from the hSYN promoter was exclusively neuronal, the murine CMV promoter targeted expression mainly to astrocytes in vitro, and in vivo showed weak transgene expression in retinal and cortical neurons but strong expression in thalamic neurons. We propose that neuron-specific transgene expression, in combination with enhanced transgene capacity, will further substantially improve AAV-based vector technology.

  11. Benefit of brain prophylactic irradiation in patients suffering from a small-cell bronchial cancer: retrospective study on 289 cases

    International Nuclear Information System (INIS)

    Assouline, A.; Tai, P.; Jancewicz, M.; Joseph, K.; Krzisch, C.; Yu, E.

    2011-01-01

    The authors report a study aimed at determining the benefit of prophylactic brain irradiation for patients with localized small-cell bronchial cancer who were in partial response after loco-regional treatment of their disease. A retrospective analysis was performed on 289 patients treated by chemo-radiotherapy, with or without prophylactic brain irradiation, between 1981 and 2007. Data are discussed in terms of remission level and of survival with respect to the level of response to the loco-regional treatment. Depending on this response level, the irradiation does or does not increase the probability of overall survival or of specific survival. Short communication

  12. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. Basic knowledge of R is helpful.

  13. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have got their own tribune on the topic. Perspectives and debates are flourishing, while a consensus definition of big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On the one hand, techniques usually presented as specific to big data, such as machine learning, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data, and inherited from other scientific fields such as artificial intelligence, are often underestimated if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers, as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary, evidence-based approach in forensic science and medicine. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  14. Evaluation of the P-glycoprotein- and breast cancer resistance protein-mediated brain penetration of {sup 11}C-labeled topotecan using small-animal positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yamasaki, Tomoteru; Fujinaga, Masayuki; Kawamura, Kazunori; Hatori, Akiko; Yui, Joji [Department of Molecular Probes, Molecular Imaging Center, National Institute of Radiological Sciences, Chiba 263-8555 (Japan); Nengaki, Nobuki; Ogawa, Masanao; Yoshida, Yuichiro [Department of Molecular Probes, Molecular Imaging Center, National Institute of Radiological Sciences, Chiba 263-8555 (Japan); SHI Accelerator Service, Ltd., Tokyo 141-8686 (Japan); Wakizaka, Hidekatsu [Department of Biophysics, Molecular Imaging Center, National Institute of Radiological Sciences, Chiba 263-8555 (Japan); Yanamoto, Kazuhiko [Division of Health Sciences, Graduate School of Medicine, Osaka University, Osaka 565-0871 (Japan); Fukumura, Toshimitsu [Department of Molecular Probes, Molecular Imaging Center, National Institute of Radiological Sciences, Chiba 263-8555 (Japan); Zhang Mingrong, E-mail: zhang@nirs.go.jp [Department of Molecular Probes, Molecular Imaging Center, National Institute of Radiological Sciences, Chiba 263-8555 (Japan)

    2011-07-15

    Introduction: Topotecan (TPT), a camptothecin derivative, is an anticancer drug that acts as a topoisomerase-I-specific inhibitor; however, it cannot penetrate the blood-brain barrier. In this study, we synthesized a new positron emission tomography (PET) probe, [{sup 11}C]TPT, to evaluate the P-glycoprotein (Pgp)- and breast cancer resistance protein (BCRP)-mediated brain penetration of [{sup 11}C]TPT using small-animal PET. Methods: [{sup 11}C]TPT was synthesized by the reaction of a desmethyl precursor with [{sup 11}C]CH{sub 3}I. An in vitro study using [{sup 11}C]TPT was carried out in MES-SA and doxorubicin-resistant MES-SA/Dx5 cells in the presence or absence of elacridar, a specific inhibitor of Pgp and BCRP. The biodistribution of [{sup 11}C]TPT was determined using small-animal PET and the dissection method in mice. Results: Transport of [{sup 11}C]TPT to the extracellular side was observed in MES-SA/Dx5 cells, which express Pgp and BCRP at high levels. This transport was inhibited by coincubation with elacridar. In Mdr1a/b{sup -/-}Bcrp1{sup -/-} mice, PET results indicated that the brain uptake of [{sup 11}C]TPT was about two times higher than that in wild-type mice. Similarly, the brain penetration of [{sup 11}C]TPT in wild-type mice was increased by treatment with elacridar. The radioactivity in the brain of elacridar-treated mice was maintained at a certain level after the injection of [{sup 11}C]TPT, although the radioactivity in the blood decreased with time. Conclusions: Using small-animal PET in mice, we demonstrated the increased brain penetration of [{sup 11}C]TPT upon deficiency or inhibition of Pgp and BCRP function.

  15. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group with the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the TF including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  16. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  17. Correlation between the progressive cytoplasmic expression of a novel small heat shock protein (Hsp16.2) and malignancy in brain tumors

    Directory of Open Access Journals (Sweden)

    Gallyas Ferenc

    2007-12-01

    Background: Small heat shock proteins are molecular chaperones that protect proteins against stress-induced aggregation. They have also been found to have anti-apoptotic activity and to play a part in the development of tumors. Recently, we identified a new small heat shock protein, Hsp16.2, which displayed increased expression in neuroectodermal tumors. Our aim was to investigate the expression of Hsp16.2 in different types of brain tumors and to correlate its expression with the histological grade of the tumor. Methods: Immunohistochemistry with a polyclonal antibody to Hsp16.2 was carried out on formalin-fixed, paraffin-wax-embedded sections using the streptavidin-biotin method. 91 samples were examined and their histological grade was defined. According to the intensity of Hsp16.2 immunoreactivity, scores of low (+), moderate (++), high (+++) or none (-) were given. Immunoblotting was carried out on 30 samples of brain tumors using SDS-polyacrylamide gel electrophoresis and Western blotting. Results: Low-grade (grades 1–2) brain tumors displayed low cytoplasmic Hsp16.2 immunoreactivity, grade 3 tumors showed moderate cytoplasmic staining, while high-grade (grade 4) tumors exhibited intensive cytoplasmic Hsp16.2 staining. Immunoblotting supported these results. Normal brain tissue acted as a negative control for the experiment, since its cytoplasm did not stain for Hsp16.2. There was a positive correlation between the level of Hsp16.2 expression and the level of anaplasia in the different malignant tissue samples. Conclusion: Hsp16.2 expression was directly correlated with the histological grade of brain tumors; therefore, Hsp16.2 may have relevance as a possible tumor marker.

  18. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    Science.gov (United States)

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than life time prevalence. This study was designed to assess the usability of the DSM-IV criteria based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.
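
    The reported accuracy figures follow directly from the confusion-matrix definitions of sensitivity and specificity. The sketch below uses illustrative cell counts (the abstract reports only the subsample sizes of n=300 patients and n=132 comparison subjects, not the individual counts) chosen to be consistent with the stated 99.7% and 96.2%:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity: share of true cases the screen flags.
    Specificity: share of non-cases it correctly clears."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts: 299 of 300 cases flagged, 127 of 132 controls cleared.
sens, spec = sensitivity_specificity(tp=299, fn=1, tn=127, fp=5)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```

    These illustrative counts reproduce the reported 99.7% sensitivity and 96.2% specificity; the actual cell counts are not given in the abstract.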

  19. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying, E-mail: ztduan@chd.edu.cn; Zheng, Xibin, E-mail: ztduan@chd.edu.cn; Liu, Yan, E-mail: ztduan@chd.edu.cn; Dai, Jiting, E-mail: ztduan@chd.edu.cn; Kang, Jun, E-mail: ztduan@chd.edu.cn [Chang' an University School of Information Engineering, Xi' an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi' an (China)

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  20. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users

  1. Blood-brain barrier transport of drugs for the treatment of brain diseases.

    Science.gov (United States)

    Gabathuler, Reinhard

    2009-06-01

    The central nervous system is a sanctuary protected by barriers that regulate brain homeostasis and control the transport of endogenous compounds into the brain. The blood-brain barrier, formed by endothelial cells of the brain capillaries, restricts access to brain cells allowing entry only to amino acids, glucose and hormones needed for normal brain cell function and metabolism. This very tight regulation of brain cell access is essential for the survival of neurons which do not have a significant capacity to regenerate, but also prevents therapeutic compounds, small and large, from reaching the brain. As a result, various strategies are being developed to enhance access of drugs to the brain parenchyma at therapeutically meaningful concentrations to effectively manage disease.

  2. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    The conditions surrounding agriculture and its competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  3. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  4. Brain metastasis from prostate small cell carcinoma: not to be neglected.

    NARCIS (Netherlands)

    Erasmus, C.E.; Verhagen, W.I.M.; Wauters, C.A.P.; Lindert, E.J. van

    2002-01-01

    BACKGROUND: Symptomatic brain metastases from prostatic carcinoma are rare (0.05% to 0.5%). CASE REPORT: A 70-year-old man presented with homonymous hemianopsia due to a brain metastasis from prostatic carcinoma, shortly before becoming symptomatic of the prostatic disease itself. CT and MRI of the brain showed a

  5. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  6. Do the big-five personality traits predict empathic listening and assertive communication?

    OpenAIRE

    Sims, Ceri M.

    2016-01-01

    As personality traits can influence important social outcomes, the current research investigated whether the Big Five had predictive influences on the communication competences of active-empathic listening (AEL) and assertiveness. A sample of 245 adults of various ages completed the self-report scales. Both Agreeableness and Openness uniquely predicted AEL. Extraversion had the biggest influence on assertiveness but did not uniquely explain AEL variance. Conscientiousness and Neuroticism had small...

  7. Big data processing in the cloud - Challenges and platforms

    Science.gov (United States)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

    Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed, the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and the most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
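
    As a rough illustration of the Lambda architecture mentioned above (a hypothetical sketch, not code from the paper): a batch layer periodically recomputes views over the full historical data set, a speed layer maintains incremental views over events seen since the last batch run, and queries merge the two views.

```python
from collections import Counter

def batch_view(historical_events):
    """Batch layer: recomputed from scratch over all historical events."""
    return Counter(historical_events)

def speed_view(recent_events):
    """Speed layer: incremental counts for events since the last batch run."""
    return Counter(recent_events)

def query(key, batch, speed):
    """Serving layer: a query merges the batch and real-time views."""
    return batch[key] + speed[key]

batch = batch_view(["click", "click", "view"])
speed = speed_view(["click"])
print(query("click", batch, speed))  # prints 3
```

    The Kappa architecture simplifies this by dropping the batch layer entirely and recomputing, when needed, by replaying the event log through the same stream-processing code.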

  8. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  9. Victoria Stodden: Scholarly Communication in the Era of Big Data and Big Computation

    OpenAIRE

    Stodden, Victoria

    2015-01-01

    Victoria Stodden gave the keynote address for Open Access Week 2015. "Scholarly communication in the era of big data and big computation" was sponsored by the University Libraries, Computational Modeling and Data Analytics, the Department of Computer Science, the Department of Statistics, the Laboratory for Interdisciplinary Statistical Analysis (LISA), and the Virginia Bioinformatics Institute. Victoria Stodden is an associate professor in the Graduate School of Library and Information Scien...

  10. Big Data: Concept, Potentialities and Vulnerabilities

    Directory of Open Access Journals (Sweden)

    Fernando Almeida

    2018-03-01

    The evolution of information systems and the growth in the use of the Internet and social networks have caused an explosion in the amount of available data relevant to the activities of companies. Therefore, the treatment of these available data is vital to support operational, tactical and strategic decisions. This paper aims to present the concept of big data and the main technologies that support the analysis of large data volumes. The potential of big data is explored across nine sectors of activity: financial, retail, healthcare, transport, agriculture, energy, manufacturing, public, and media and entertainment. In addition, the main current opportunities, vulnerabilities and privacy challenges of big data are discussed. It was possible to conclude that, despite the potential for big data to grow in the previously identified areas, there are still some challenges that need to be considered and mitigated, namely the privacy of information, the existence of qualified human resources to work with big data, and the promotion of a data-driven organizational culture.

  11. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing need among practitioners to understand the phenomenon better, starting from a clearly stated definition. This book aims to be an initial reading for executives who want (and need) to keep pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data will be explained, and a series of different strategic approaches will be provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data will be discussed, where some of them are more general - such as ethics, privacy, and ownership - while others concern more specific business situations (e.g., initial public offering, growth st...

  12. Review of Small Commercial Sensors for Indicative Monitoring of Ambient Gas

    OpenAIRE

    ALEIXANDRE Manuel; GERBOLES Michel

    2012-01-01

    Traditional ambient gas monitoring stations are expensive, big and complex to operate, so they are not suitable for a sensor network covering large areas. To cover large areas, these traditional systems usually interpolate the measurements to estimate gas concentrations at points far away from the physical sensors. Small commercial sensors represent a big opportunity for building sensor networks that monitor ambient gases within large areas w...

  13. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information and used to make 'smart' decisions.

  14. Slaves to Big Data. Or Are We?

    Directory of Open Access Journals (Sweden)

    Mireille Hildebrandt

    2013-10-01

    Full Text Available

    In this contribution, the notion of Big Data is discussed in relation to the monetisation of personal data. The claim of some proponents, as well as adversaries, that Big Data implies that ‘n = all’, meaning that we no longer need to rely on samples because we have all the data, is scrutinised and found to be both overly optimistic and unnecessarily pessimistic. A set of epistemological and ethical issues is presented, focusing on the implications of Big Data for our perception, cognition, fairness, privacy and due process. The article then looks into the idea of user-centric personal data management to investigate to what extent it provides solutions for some of the problems triggered by the Big Data conundrum. Special attention is paid to the core principle of data protection legislation, namely purpose binding. Finally, this contribution seeks to inquire into the influence of Big Data politics on self, mind and society, and asks how we can prevent ourselves from becoming slaves to Big Data.

  15. Will Organization Design Be Affected By Big Data?

    Directory of Open Access Journals (Sweden)

    Giles Slinger

    2014-12-01

    Full Text Available Computing power and analytical methods allow us to create, collate, and analyze more data than ever before. When datasets are unusually large in volume, velocity, and variety, they are referred to as “big data.” Some observers have suggested that in order to cope with big data (a) organizational structures will need to change and (b) the processes used to design organizations will be different. In this article, we differentiate big data from relatively slow-moving, linked people data. We argue that big data will change organizational structures as organizations pursue the opportunities presented by big data. The processes by which organizations are designed, however, will be relatively unaffected by big data. Instead, organization design processes will be more affected by the complex links found in people data.

  16. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  17. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of this bachelor thesis is to describe the Big Data problem area and the OLAP aggregation operations for decision support that are applied to it using Apache Hadoop technology. The bulk of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and with the issues involved in implementing them, followed by an overall evaluation of the work and the possibilities for future use of the resulting system.

  18. Brain metastasis of small cell lung carcinoma. Comparison of Gd-DTPA enhanced magnetic resonance imaging and enhanced computerized tomography

    International Nuclear Information System (INIS)

    Nomoto, Yasushi; Yamaguchi, Yutaka; Miyamoto, Tadaaki.

    1994-01-01

    Small cell carcinoma of the lung (SCLC) frequently metastasizes to the brain, with serious consequences for prognosis. Delayed brain damage caused by prophylactic cranial irradiation (PCI) is also problematic. Gadolinium diethylene triamine pentaacetic acid (Gd-DTPA) enhanced magnetic resonance imaging (MRI) was performed to detect early brain metastasis from SCLC, and its usefulness was compared with contrast computerized tomography (CT). Among 25 SCLC patients, brain metastasis was detected in 11 by MRI and in 10 by CT, although six of them were completely asymptomatic. In the 11 patients, 6.3 and 2.4 lesions were detected on average by MRI and CT, respectively. The ability of MRI to detect metastatic lesions of ≥15 mm diameter did not differ from that of CT, but diverged as lesions became smaller (P<0.002), and MRI had a decided advantage over CT because as many as 30 lesions of ≤5 mm diameter were detected by MRI, whereas only one such lesion was visualized on CT (P<0.0001). MRI was incomparably superior to CT (P<0.0004) for subtentorial lesions, since 18 lesions were detected on MRI but only three, measuring ≥25 mm in diameter, were demonstrated on CT. Gd-DTPA enhanced MRI was determined to be extremely useful in the early diagnosis of SCLC brain metastasis. MRI was thought to reduce delayed brain damage caused by PCI if performed according to an adequate schedule. (author)

  19. Association of Ki-67, p53, and bcl-2 expression of the primary non-small-cell lung cancer lesion with brain metastatic lesion

    International Nuclear Information System (INIS)

    Bubb, Robbin S.; Komaki, Ritsuko; Hachiya, Tsutomu; Milas, Ivan; Ro, Jae Y.; Langford, Lauren; Sawaya, Raymond; Putnam, Joe B.; Allen, Pamela; Cox, James D.; McDonnell, Timothy J.; Brock, William; Hong, Waun K.; Roth, Jack A.; Milas, Luka

    2002-01-01

    Purpose: The study was conducted to determine whether immunohistochemical analysis of Ki-67, p53, and bcl-2 in patients with non-small-cell lung cancer is associated with a higher rate of brain metastases and whether the intrapatient expression of these biomarkers (in the primary tumors vs. brain lesions) is similar. Methods and Materials: At the M. D. Anderson Cancer Center, tumors from 29 case patients with primary lung tumor and brain metastasis and 29 control patients with primary lung tumor but no brain metastasis were resected and examined for immunohistochemical expression. Ki-67, p53, and bcl-2 were analyzed in resected primary lung, lymph node, and metastatic brain tumors. Each control patient was matched by age, gender, and histology to a patient with brain metastasis. Results: No significant differences in patient survival characteristics were detected between the case group and control group. Also, difference in patient outcome between the two groups was not generally predicted by biomarker analysis. However, when the groups were combined, the biomarker analysis was predictive for certain patient outcome end points. Using median values as cutoff points between low and high expression of biomarkers, it was observed that high expression of Ki-67 (>40%) in lung primaries was associated with poorer disease-free survival (p=0.04), whereas low expression of p53 in lung primaries was associated with poorer overall survival (p=0.04), and these patients had a higher rate of nonbrain distant metastases (p=0.02). The patients with brain metastases were particularly prone to developing nonbrain distant metastases if the percentage of p53-positive cells in brain metastases was low (p=0.01). There was a positive correlation in the expression of Ki-67 (p=0.02, r²=0.1608), as well as of p53 (r²=0.7380), between lung primaries and brain metastases. Compared to Ki-67 and p53, bcl-2 was the least predictive. Conclusion: Differences in biomarker expression between the

  20. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  1. A Single Session of rTMS Enhances Small-Worldness in Writer’s Cramp: Evidence from Simultaneous EEG-fMRI Multi-Modal Brain Graph

    Directory of Open Access Journals (Sweden)

    Rose D. Bharath

    2017-09-01

    Full Text Available Background and Purpose: Repetitive transcranial magnetic stimulation (rTMS) induces widespread changes in brain connectivity. As the network topology differences induced by a single session of rTMS are less well known, we undertook this study to ascertain whether the network alterations had a small-world morphology, using multi-modal graph theory analysis of simultaneous EEG-fMRI. Method: Simultaneous EEG-fMRI was acquired in duplicate before (R1) and after (R2) a single session of rTMS in 14 patients with Writer's Cramp (WC). Whole-brain neuronal and hemodynamic network connectivity were explored using graph theory measures, and the clustering coefficient, path length and small-world index were calculated for EEG and resting state fMRI (rsfMRI). Multi-modal graph theory analysis was used to evaluate the correlation of EEG and fMRI clustering coefficients. Result: A single session of rTMS was found to increase the clustering coefficient and small-worldness significantly in both EEG and fMRI (p < 0.05). Multi-modal graph theory analysis revealed significant modulations in the fronto-parietal regions immediately after rTMS. The rsfMRI revealed additional modulations in several deep brain regions including the cerebellum, insula and medial frontal lobe. Conclusion: Multi-modal graph theory analysis of simultaneous EEG-fMRI can supplement motor physiology methods in understanding the neurobiology of rTMS in vivo. Coinciding evidence from EEG and rsfMRI reports small-world morphology for the acute-phase network hyper-connectivity, indicating that the changes ensuing from low-frequency rTMS are probably not “noise”.
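    The record above derives clustering coefficient, path length and a small-world index from brain connectivity graphs. A minimal sketch of one common form of the small-world index (sigma), using the `networkx` library; the graph size, seeds and random-reference procedure here are illustrative assumptions, not the authors' pipeline:

    ```python
    import networkx as nx

    def small_world_index(G, n_rand=5, seed=0):
        """Small-world index: sigma = (C / C_rand) / (L / L_rand).

        C is the average clustering coefficient and L the characteristic
        path length; reference values come from size- and density-matched
        random graphs. sigma > 1 suggests small-world topology.
        """
        C = nx.average_clustering(G)
        L = nx.average_shortest_path_length(G)
        n, m = G.number_of_nodes(), G.number_of_edges()
        C_rand, L_rand = [], []
        for i in range(n_rand):
            R = nx.gnm_random_graph(n, m, seed=seed + i)
            if not nx.is_connected(R):
                # path length is only defined on a connected graph, so
                # fall back to the largest connected component
                R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
            C_rand.append(nx.average_clustering(R))
            L_rand.append(nx.average_shortest_path_length(R))
        return (C / (sum(C_rand) / n_rand)) / (L / (sum(L_rand) / n_rand))

    # A Watts-Strogatz graph with mild rewiring is the canonical
    # small-world network (a stand-in here for a real brain graph).
    G = nx.connected_watts_strogatz_graph(60, 6, 0.1, seed=1)
    sigma = small_world_index(G)
    print(sigma > 1)  # True for a small-world graph
    ```

    An increase in sigma between the pre- and post-rTMS runs (R1 vs. R2) is what the record reports as enhanced small-worldness.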

  2. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  3. When Small is Big: Microcredit and Economic Development

    Directory of Open Access Journals (Sweden)

    George Brown

    2010-10-01

    Full Text Available Microcredit - the extension of small loans - gives people who would otherwise not have access to credit the opportunity to begin or expand businesses or to pursue job-specific training. These borrowers lack the income, credit history, assets, or security to borrow from other sources. Although the popularity and success of microcredit in developing countries has been trumpeted in the media, microcredit is established and growing in the United States and Canada as well. Its appeal comes from its capacity to provide the means for those who have the ability, drive, and commitment to overcome the hurdles to self-sufficiency. In this article, the role of microcredit as a stimulant for economic development is examined. First, its importance for the establishment of small businesses is described. Second, the article provides an overview of the general microcredit climate in the United States and the local situation in the Ottawa area. Third, brief stories about individuals who have received this type of loan reveal the human impact behind the economic benefits. Finally, the role of microcredit in funding startups is analyzed in comparison to other sources of available funding. The article concludes with a summary of the benefits of microcredit as a win-win proposition for economic development.

  4. Establishing a process of irradiating small animal brain using a CyberKnife and a microCT scanner

    International Nuclear Information System (INIS)

    Kim, Haksoo; Welford, Scott; Fabien, Jeffrey; Zheng, Yiran; Yuan, Jake; Brindle, James; Yao, Min; Lo, Simon; Wessels, Barry; Machtay, Mitchell; Sohn, Jason W.; Sloan, Andrew

    2014-01-01

    Purpose: Establish and validate a process of accurately irradiating small animals using the CyberKnife G4 System (version 8.5) with treatment plans designed to irradiate a hemisphere of a mouse brain based on microCT scanner images. Methods: These experiments consisted of four parts: (1) building a mouse phantom for intensity modulated radiotherapy (IMRT) quality assurance (QA), (2) proving the usability of a microCT for treatment planning, (3) fabricating a small animal positioning system for use with the CyberKnife's image guided radiotherapy (IGRT) system, and (4) in vivo verification of targeting accuracy. A set of solid water mouse phantoms was designed and fabricated, with radiochromic films (RCF) positioned in selected planes to measure delivered doses. After down-sampling for treatment planning compatibility, a CT image set of a phantom was imported into the CyberKnife treatment planning system—MultiPlan (ver. 3.5.2). A 0.5 cm diameter sphere was contoured within the phantom to represent a hemispherical section of a mouse brain. A nude mouse was scanned in an alpha cradle using a microCT scanner (cone-beam, 157 × 149 pixel slices, 0.2 mm longitudinal slice thickness). Based on the results of our positional accuracy study, a planning treatment volume (PTV) was created. A stereotactic body mold of the mouse was “printed” using a 3D printer laying down UV-curable acrylic plastic. Printer instructions were based on exported contours of the mouse's skin. Positional reproducibility in the mold was checked by measuring ten CT scans. To verify accurate dose delivery in vivo, six mice were irradiated in the mold with a 4 mm target contour and a 2 mm PTV margin to 3 Gy and sacrificed within 20 min to avoid DNA repair. The brain was sliced and stained for analysis. Results: For the IMRT QA using a set of phantoms, the planned dose (6 Gy to the calculation point) was compared to the delivered dose measured via film and analyzed using Gamma analysis (3% and 3 mm). A

  5. Establishing a process of irradiating small animal brain using a CyberKnife and a microCT scanner

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Haksoo; Welford, Scott [Department of Radiation Oncology, School of Medicine, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, Ohio 44106 (United States); Fabien, Jeffrey; Zheng, Yiran; Yuan, Jake; Brindle, James; Yao, Min; Lo, Simon; Wessels, Barry; Machtay, Mitchell; Sohn, Jason W., E-mail: jason.sohn@case.edu [Department of Radiation Oncology, School of Medicine, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, Ohio 44106 and University Hospitals of Cleveland, 11100 Euclid Avenue, Cleveland, Ohio 44106 (United States); Sloan, Andrew [Department of Neurosurgery, School of Medicine, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, Ohio 44106 (United States)

    2014-02-15

    Purpose: Establish and validate a process of accurately irradiating small animals using the CyberKnife G4 System (version 8.5) with treatment plans designed to irradiate a hemisphere of a mouse brain based on microCT scanner images. Methods: These experiments consisted of four parts: (1) building a mouse phantom for intensity modulated radiotherapy (IMRT) quality assurance (QA), (2) proving the usability of a microCT for treatment planning, (3) fabricating a small animal positioning system for use with the CyberKnife's image guided radiotherapy (IGRT) system, and (4) in vivo verification of targeting accuracy. A set of solid water mouse phantoms was designed and fabricated, with radiochromic films (RCF) positioned in selected planes to measure delivered doses. After down-sampling for treatment planning compatibility, a CT image set of a phantom was imported into the CyberKnife treatment planning system—MultiPlan (ver. 3.5.2). A 0.5 cm diameter sphere was contoured within the phantom to represent a hemispherical section of a mouse brain. A nude mouse was scanned in an alpha cradle using a microCT scanner (cone-beam, 157 × 149 pixel slices, 0.2 mm longitudinal slice thickness). Based on the results of our positional accuracy study, a planning treatment volume (PTV) was created. A stereotactic body mold of the mouse was “printed” using a 3D printer laying down UV-curable acrylic plastic. Printer instructions were based on exported contours of the mouse's skin. Positional reproducibility in the mold was checked by measuring ten CT scans. To verify accurate dose delivery in vivo, six mice were irradiated in the mold with a 4 mm target contour and a 2 mm PTV margin to 3 Gy and sacrificed within 20 min to avoid DNA repair. The brain was sliced and stained for analysis. Results: For the IMRT QA using a set of phantoms, the planned dose (6 Gy to the calculation point) was compared to the delivered dose measured via film and analyzed using Gamma analysis (3% and 3 mm
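    Both records above validate the delivered dose against the plan using Gamma analysis with 3%/3 mm criteria. A hedged one-dimensional sketch of the gamma-index idea, with a hypothetical 6 Gy Gaussian profile standing in for the measured film data (the real analysis is 2-D and tool-specific):

    ```python
    import numpy as np

    def gamma_index(ref, evl, positions, dose_tol=0.03, dist_tol=3.0):
        """1-D gamma index for dose profiles on a shared grid.

        ref, evl: planned and delivered dose arrays (Gy)
        positions: coordinates in mm
        dose_tol: dose-difference criterion, fraction of max reference dose
        dist_tol: distance-to-agreement criterion in mm
        Returns one gamma value per reference point; gamma <= 1 passes.
        """
        dd = dose_tol * ref.max()  # absolute dose tolerance in Gy
        gammas = np.empty_like(ref)
        for i, (x_r, d_r) in enumerate(zip(positions, ref)):
            # minimum over evaluated points of the combined metric
            dist2 = ((positions - x_r) / dist_tol) ** 2
            dose2 = ((evl - d_r) / dd) ** 2
            gammas[i] = np.sqrt(np.min(dist2 + dose2))
        return gammas

    x = np.linspace(-10, 10, 41)             # mm, 0.5 mm spacing
    planned = 6.0 * np.exp(-(x / 8.0) ** 2)  # hypothetical 6 Gy profile
    # identical profiles pass everywhere (gamma == 0)
    gam = gamma_index(planned, planned.copy(), x)
    pass_rate = np.mean(gam <= 1.0)
    print(gam.max())    # 0.0
    print(pass_rate)    # 1.0
    ```

    A QA pass is typically declared when the fraction of points with gamma ≤ 1 exceeds a chosen threshold.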

  6. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws...... on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports...... is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports...

  7. Big data and biomedical informatics: a challenging opportunity.

    Science.gov (United States)

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasms, to fully exploit the available technologies, and to improve data processing and data management regulations.
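    The abstract lists map-reduce among the tools enabled by the big data scenario. A toy, single-machine sketch of the map-shuffle-reduce pattern (word counting; the function names are illustrative, not from any specific framework):

    ```python
    from collections import defaultdict

    # Map phase: emit (key, value) pairs from each input record.
    def mapper(record):
        for word in record.split():
            yield word.lower(), 1

    # Shuffle phase: group all emitted values by key.
    def shuffle(pairs):
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    # Reduce phase: combine the values collected for each key.
    def reducer(key, values):
        return key, sum(values)

    records = ["big data", "Big Data analytics", "data velocity"]
    mapped = (pair for rec in records for pair in mapper(rec))
    result = dict(reducer(k, vs) for k, vs in shuffle(mapped).items())
    print(result)  # {'big': 2, 'data': 3, 'analytics': 1, 'velocity': 1}
    ```

    Real map-reduce frameworks distribute these three phases across a cluster, which is what makes the pattern suitable for the data volumes discussed above.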

  8. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and to search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  9. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and Programme.

  10. A randomised comparison of radical radiotherapy with or without chemotherapy for patients with non-small cell lung cancer: Results from the Big Lung Trial

    International Nuclear Information System (INIS)

    Fairlamb, David; Milroy, Robert; Gower, Nicole; Parmar, Mahesh; Peake, Michael; Rudd, Robin; Souhami, Robert; Spiro, Stephen; Stephens, Richard; Waller, David

    2005-01-01

    Background: A meta-analysis of trials comparing primary treatment with or without chemotherapy for patients with non-small cell lung cancer published in 1995 suggested a survival benefit for cisplatin-based chemotherapy in each of the primary treatment settings studied, but it included many small trials, and trials with differing eligibility criteria and chemotherapy regimens. Methods: The Big Lung Trial was a large pragmatic trial designed to confirm the survival benefits seen in the meta-analysis, and this paper reports the findings in the radical radiotherapy setting. The trial closed before the required sample size was achieved due to slow accrual, with a total of 288 patients randomised to receive radical radiotherapy alone (146 patients) or sequential radical radiotherapy and cisplatin-based chemotherapy (142 patients). Results: There was no evidence that patients allocated sequential chemotherapy and radical radiotherapy had a better survival than those allocated radical radiotherapy alone, HR 1.07 (95% CI 0.84-1.38, P=0.57), median survival 13.0 months for the sequential group and 13.2 for the radical radiotherapy alone group. In addition, exploratory analyses could not identify any subgroup that might benefit more or less from chemotherapy. Conclusions: Despite not suggesting a survival benefit for the sequential addition of chemotherapy to radical radiotherapy, possibly because of the relatively small sample size and consequently wide confidence intervals, the results can still be regarded as consistent with the meta-analysis, and other similarly designed recently published large trials. Combining all these results suggests there may be a small median survival benefit with chemotherapy of between 2 and 8 weeks.

  11. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  12. Integrating R and Hadoop for Big Data Analysis

    OpenAIRE

    Bogdan Oancea; Raluca Mariana Dragoescu

    2014-01-01

    Analyzing and working with big data could be very difficult using classical means like relational database management systems or desktop software packages for statistics and visualization. Instead, big data requires large clusters with hundreds or even thousands of computing nodes. Official statistics is increasingly considering big data for deriving new statistics because big data sources could produce more relevant and timely statistics than traditional sources. One of the software tools ...

  13. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  14. Non-small cell lung cancer brain metastasis screening in the era of positron emission tomography-CT staging: Current practice and outcomes.

    Science.gov (United States)

    Diaz, Mauricio E; Debowski, Maciej; Hukins, Craig; Fielding, David; Fong, Kwun M; Bettington, Catherine S

    2018-05-10

    Several clinical guidelines indicate that brain metastasis screening (BMS) should be guided by disease stage in non-small cell lung cancer (NSCLC). We estimate that screening is performed more broadly in practice, and patients undergo brain imaging at considerable cost with questionable benefit. Our aim was to quantify the use and detection rate of BMS in a contemporary cohort staged with 18F-fluorodeoxyglucose positron emission tomography/computed tomography (PET-CT). We conducted a retrospective review of prospectively collected data from three major lung cancer referral centres in Brisbane between January 2011 and December 2015. Patients included had a new diagnosis of NSCLC and had undergone a PET-CT to stage extra-cranial disease. BMS was defined as dedicated brain imaging with contrast-enhanced computed tomography (CE-CT) or magnetic resonance (MR), in the absence of clinically apparent neurological deficits. A total of 1751 eligible cases were identified and of these 718 (41%) underwent BMS. The majority had CE-CT imaging (n = 703). Asymptomatic brain metastases (BM) were detected in 18 patients (2.5%). Of these patients, 12 had concurrent non-brain metastases. Only six patients (0.8%) had BM alone. The rate of detection increased with N-stage (P = 0.02) and overall stage (P < 0.001). It was 0.5%, 1%, 1.6% and 7.3% for stage I, II, III and IV respectively. The overall screening rate increased with T-stage (P = 0.001), N-stage (P < 0.001) and overall stage (P < 0.001). Non-small cell lung cancer BMS practices remain at odds with published guidelines. The low number of occult BMs detected supports the existing international recommendations. Rationalising BMS would minimise the burden on patients and the health care system. © 2018 The Royal Australian and New Zealand College of Radiologists.

  15. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction into the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model is provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses, as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. For the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  16. Multilayer Brain Networks

    Science.gov (United States)

    Vaiana, Michael; Muldoon, Sarah Feldt

    2018-01-01

    The field of neuroscience is facing an unprecedented expanse in the volume and diversity of available data. Traditionally, network models have provided key insights into the structure and function of the brain. With the advent of big data in neuroscience, both more sophisticated models capable of characterizing the increasing complexity of the data and novel methods of quantitative analysis are needed. Recently, multilayer networks, a mathematical extension of traditional networks, have gained increasing popularity in neuroscience due to their ability to capture the full information of multi-modal, multi-scale, spatiotemporal data sets. Here, we review multilayer networks and their applications in neuroscience, showing how incorporating the multilayer framework into network neuroscience analysis has uncovered previously hidden features of brain networks. We specifically highlight the use of multilayer networks to model disease, structure-function relationships, network evolution, and link multi-scale data. Finally, we close with a discussion of promising new directions of multilayer network neuroscience research and propose a modified definition of multilayer networks designed to unite and clarify the use of the multilayer formalism in describing real-world systems.
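    One common way to capture the full information of a multilayer network, as described above, is a supra-adjacency matrix: each layer's adjacency matrix sits on a diagonal block, with inter-layer coupling on the off-diagonal blocks. A small numpy sketch with two hypothetical layers (say, EEG- and fMRI-derived connectivity over the same three nodes) and an assumed coupling strength:

    ```python
    import numpy as np

    # Two hypothetical 3-node layers; entries are illustrative, not data.
    A_eeg = np.array([[0, 1, 0],
                      [1, 0, 1],
                      [0, 1, 0]], dtype=float)
    A_fmri = np.array([[0, 1, 1],
                       [1, 0, 0],
                       [1, 0, 0]], dtype=float)
    omega = 0.5  # assumed inter-layer coupling strength

    n = A_eeg.shape[0]
    # Supra-adjacency: layers on the diagonal blocks, identity coupling
    # (each node linked to its own copy in the other layer) off-diagonal.
    supra = np.block([
        [A_eeg,             omega * np.eye(n)],
        [omega * np.eye(n), A_fmri           ],
    ])
    print(supra.shape)  # (6, 6)
    ```

    Standard single-layer tools (spectral analysis, community detection) can then be applied to the supra-adjacency matrix, which is one reason the representation is popular.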

  17. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Cees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  18. Physics with Big Karl Brainstorming. Abstracts

    International Nuclear Information System (INIS)

    Machner, H.; Lieb, J.

    2000-08-01

    Before summarizing details of the meeting, a short description of the spectrometer facility Big Karl is given. The facility is essentially a new instrument using refurbished dipole magnets from its predecessor. The large acceptance quadrupole magnets and the beam optics are new. Big Karl has a design very similar to the focussing spectrometers at MAMI (Mainz), AGOR (Groningen), and the high resolution spectrometer (HRS) in Hall A at Jefferson Laboratory, with ΔE/E = 10⁻⁴ but at a somewhat lower maximum momentum. The focal plane detectors, consisting of multiwire drift chambers and scintillating hodoscopes, are similar. Unlike HRS, Big Karl still needs Cerenkov counters and polarimeters in its focal plane; detectors which are necessary to perform some of the experiments proposed during the brainstorming. In addition, Big Karl allows emission angle reconstruction via track measurements in its focal plane with high resolution. In the following, the physics highlights and the proposed and potential experiments are summarized. During the meeting it became obvious that the physics to be explored at Big Karl can be grouped into five distinct categories, and this summary is organized accordingly. (orig.)

  19. Small millets, big potential: diverse, nutritious, and climate smart ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-29

    Apr 29, 2016 ... Integrated and focused public support is now needed for ... Include small millets in the Indian public distribution system (PDS) to 10 kg per ... but post-harvest losses, due to factors such as poor handling, transport, and.

  20. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    Science.gov (United States)

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Lauenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  1. Application and Prospect of Big Data in Water Resources

    Science.gov (United States)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have been created and commonly applied in many fields. However, academic studies have only recently paid attention to Big Data applications in water resources; as a result, water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying Big Data in water resources and summarizes prior research by others. Most studies in this field have only set up a theoretical frame, but we define "Water Big Data" and explain its three-dimensional properties: the time dimension, the spatial dimension, and the intelligent dimension. Based on HBase, a classification system for Water Big Data is introduced: hydrology data, ecology data, and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, are proposed. Finally, the prospect of applying Big Data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data-Driven Decision) will be utilized more in water resources management in the future.
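
    The MapReduce model mentioned above can be sketched in plain Python (a toy illustration, not from the paper; station names and readings are invented): mappers emit partial sums per monitoring station, a shuffle groups them by key, and reducers compute the mean water level per station.

    ```python
    from collections import defaultdict
    from itertools import chain

    # Toy hydrology records: (station_id, water_level_m) -- invented values
    records = [("yangtze_01", 4.2), ("yangtze_01", 4.8),
               ("yellow_03", 2.1), ("yellow_03", 2.5), ("yellow_03", 2.3)]

    def mapper(record):
        """Map phase: emit (key, partial value) pairs."""
        station, level = record
        yield station, (level, 1)  # partial sum and count

    def reducer(station, values):
        """Reduce phase: combine all partial values for one key."""
        total = sum(level for level, _ in values)
        count = sum(n for _, n in values)
        return station, total / count  # mean water level for this station

    # Shuffle phase: group mapped pairs by key, as the framework would
    groups = defaultdict(list)
    for key, value in chain.from_iterable(mapper(r) for r in records):
        groups[key].append(value)

    means = dict(reducer(k, v) for k, v in groups.items())
    print(means)
    ```

    In a real Hadoop deployment the shuffle is performed by the framework across machines; the point here is only the division of the computation into map and reduce steps.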

  2. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. "app"? Are there privacy implications in the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies, and it allows for a focus on the material consequences of Big Data in society.

  3. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data, by introducing some novel optimization algorithms and codes capable of working in the big data setting as well as some applications of big data optimization, for interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  4. Una aproximación a Big Data = An approach to Big Data

    OpenAIRE

    Puyol Moreno, Javier

    2014-01-01

    Big Data can be considered a trend in the advance of technology that has opened the door to a new approach to understanding and decision-making, used to describe the enormous quantities of data (structured, unstructured, and semi-structured) that it would take too long and cost too much to load into a relational database for analysis. Thus, the concept of Big Data applies to all information that cannot be processed or analyzed using tools...

  5. Toward a Literature-Driven Definition of Big Data in Healthcare.

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    The aim of this study was to provide a definition of big data in healthcare. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
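
    The volume criterion proposed above, Log(n∗p) ≥ 7 (base-10, i.e. at least 10 million data points), is simple enough to check directly; a quick sketch with invented cohort sizes:

    ```python
    import math

    def is_big_data(n: int, p: int) -> bool:
        """Baro et al.'s criterion: a dataset of n statistical individuals and
        p variables is 'big' when log10(n * p) >= 7, i.e. n * p >= 10**7."""
        return math.log10(n * p) >= 7

    # Hypothetical datasets for illustration:
    print(is_big_data(200_000, 60))  # 1.2e7 data points -> True
    print(is_big_data(5_000, 100))   # 5e5 data points -> False
    ```

    Under this definition a modest registry with many variables can qualify while a large cohort with few variables may not, which is the authors' point that volume is measured in data points, not rows.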

  6. Big Data Analytic, Big Step for Patient Management and Care in Puerto Rico.

    Science.gov (United States)

    Borrero, Ernesto E

    2018-01-01

    This letter provides an overview of the application of big data in the health-care system to improve quality of care, including predictive modelling of risk and resource use, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications, among others. The author delineates the tremendous potential of big data analytics and discusses how it can be successfully implemented in clinical practice, as an important component of a learning health-care system.

  7. Big Data and Biomedical Informatics: A Challenging Opportunity

    Science.gov (United States)

    2014-01-01

    Summary Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations. PMID:24853034

  8. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  9. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  10. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  11. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of the main works of Minsky, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  12. Classical propagation of strings across a big crunch/big bang singularity

    International Nuclear Information System (INIS)

    Niz, Gustavo; Turok, Neil

    2007-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z₂, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001).][P. J. Steinhardt and N. Turok, Science 296, 1436 (2002).][N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004).]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example

  13. Objective Ventricle Segmentation in Brain CT with Ischemic Stroke Based on Anatomical Knowledge

    Directory of Open Access Journals (Sweden)

    Xiaohua Qian

    2017-01-01

    Full Text Available Ventricle segmentation is a challenging technique for the development of detection systems for ischemic stroke in computed tomography (CT), as ischemic stroke regions are adjacent to the brain ventricle and have similar intensity. To address this problem, we developed an objective segmentation system for the brain ventricle in CT. The intensity distribution of the ventricle was estimated based on clustering techniques, connectivity, and domain knowledge, and the initial ventricle segmentation results were then obtained. To exclude the stroke regions from the initial segmentation, a combined segmentation strategy was proposed, composed of three different schemes: (1) the largest three-dimensional (3D) connected component was considered as the ventricular region; (2) the big stroke areas were removed by image difference methods based on searching for optimal threshold values; (3) the small stroke regions were excluded by the adaptive template algorithm. The proposed method was evaluated on 50 cases of patients with ischemic stroke. The mean Dice, sensitivity, specificity, and root mean squared error were 0.9447, 0.969, 0.998, and 0.219 mm, respectively. This system offers desirable performance. Therefore, the proposed system is expected to bring insights into clinical research and the development of detection systems for ischemic stroke in CT.

  14. The Information Panopticon in the Big Data Era

    Directory of Open Access Journals (Sweden)

    Martin Berner

    2014-04-01

    Full Text Available Taking advantage of big data opportunities is challenging for traditional organizations. In this article, we take a panoptic view of big data – obtaining information from more sources and making it visible to all organizational levels. We suggest that big data requires the transformation from command and control hierarchies to post-bureaucratic organizational structures wherein employees at all levels can be empowered while simultaneously being controlled. We derive propositions that show how to best exploit big data technologies in organizations.

  15. Outcome of small cell lung cancer (SCLC) patients with brain metastases in a routine clinical setting

    International Nuclear Information System (INIS)

    Lekic, Mirko; Kovac, Viljem; Triller, Nadja; Knez, Lea; Sadikov, Aleksander; Cufer, Tanja

    2012-01-01

    Small cell lung cancer (SCLC) represents approximately 13 to 18% of all lung cancers. It is the most aggressive among lung cancers, mostly presenting at an advanced stage, with median survival of 10 to 12 months in patients treated with standard chemotherapy and radiotherapy. In approximately 15-20% of patients, brain metastases are already present at the time of primary diagnosis; however, it is unclear how much they influence the outcome of the disease compared with other metastatic localisations. The objective of this analysis was to evaluate the median survival of SCLC patients treated with specific therapy (chemotherapy and/or radiotherapy) with regard to the presence or absence of brain metastases at the time of diagnosis. All SCLC patients were treated in routine clinical practice and followed up at the University Clinic Golnik in Slovenia. In this retrospective study, medical files from 2002 to 2007 were reviewed. All patients with cytologically or histologically confirmed disease who were eligible for specific oncological treatment were included in the study. They were treated according to the guidelines valid at the time. Chemotherapy and regular follow-up were carried out at the University Clinic Golnik and radiotherapy at the Institute of Oncology Ljubljana. We found 251 patients eligible for the study. Their median age was 65 years; the majority were male (67%), smokers or ex-smokers (98%), with performance status 0 to 1 (83%). At the time of diagnosis, no metastases were found in 64 patients (25.5%) and metastases outside the brain were present in 153 (61.0%). Brain metastases, confirmed by a CT scan, were present in 34 patients (13.5%), most of whom also had metastases at other localisations. All patients received chemotherapy, and all patients with confirmed brain metastases received whole brain irradiation (WBRT). Radiotherapy with a radical dose to the primary tumour was delivered to 27 patients with limited disease, who received 4–6 cycles of

  16. WE-H-BRB-00: Big Data in Radiation Oncology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop; (2) Where do we stand in the applications of big data in radiation oncology?; and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio panel of presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient quality care and to enhance the potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: To discuss current and future sources of big data for use in radiation oncology research; to optimize our current data collection by adopting new strategies from outside radiation oncology; to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI Google Inc.

  17. WE-H-BRB-00: Big Data in Radiation Oncology

    International Nuclear Information System (INIS)

    2016-01-01

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop; (2) Where do we stand in the applications of big data in radiation oncology?; and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio panel of presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient quality care and to enhance the potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: To discuss current and future sources of big data for use in radiation oncology research; to optimize our current data collection by adopting new strategies from outside radiation oncology; to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI Google Inc.

  18. De impact van Big Data op Internationale Betrekkingen

    NARCIS (Netherlands)

    Zwitter, Andrej

    Big Data changes our daily lives, but does it also change international politics? In this contribution, Andrej Zwitter (NGIZ chair at Groningen University) argues that Big Data impacts on international relations in ways that we only now start to understand. To comprehend how Big Data influences

  19. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  20. Big data and analytics strategic and organizational impacts

    CERN Document Server

    Morabito, Vincenzo

    2015-01-01

    This book presents and discusses the main strategic and organizational challenges posed by Big Data and analytics in a manner relevant to both practitioners and scholars. The first part of the book analyzes strategic issues relating to the growing relevance of Big Data and analytics for competitive advantage, which is also attributable to empowerment of activities such as consumer profiling, market segmentation, and development of new products or services. Detailed consideration is also given to the strategic impact of Big Data and analytics on innovation in domains such as government and education and to Big Data-driven business models. The second part of the book addresses the impact of Big Data and analytics on management and organizations, focusing on challenges for governance, evaluation, and change management, while the concluding part reviews real examples of Big Data and analytics innovation at the global level. The text is supported by informative illustrations and case studies, so that practitioners...

  1. The hot big bang and beyond

    Energy Technology Data Exchange (ETDEWEB)

    Turner, M.S. [Departments of Physics and of Astronomy & Astrophysics, Enrico Fermi Institute, The University of Chicago, Chicago, Illinois 60637-1433 (United States); NASA/Fermilab Astrophysics Center, Fermi National Accelerator Laboratory, Batavia, Illinois 60510-0500 (United States)]

    1995-08-01

    The hot big-bang cosmology provides a reliable accounting of the Universe from about 10⁻² sec after the bang until the present, as well as a robust framework for speculating back to times as early as 10⁻⁴³ sec. Cosmology faces a number of important challenges; foremost among them are determining the quantity and composition of matter in the Universe and developing a detailed and coherent picture of how structure (galaxies, clusters of galaxies, superclusters, voids, great walls, and so on) developed. At present there is a working hypothesis, cold dark matter, which is based upon inflation and which, if correct, would extend the big bang model back to 10⁻³² sec and cast important light on the unification of the forces. Many experiments and observations, from CBR anisotropy experiments to Hubble Space Telescope observations to experiments at Fermilab and CERN, are now putting the cold dark matter theory to the test. At present it appears that the theory is viable only if the Hubble constant is smaller than current measurements indicate (around 30 km s⁻¹ Mpc⁻¹), or if the theory is modified slightly, e.g., by the addition of a cosmological constant, a small admixture of hot dark matter (5 eV "worth of neutrinos"), more relativistic particles, or a tilted spectrum of density perturbations.

  2. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the Atlas detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  3. The Brain Rotation and Brain Diffusion Strategies of Small Islanders: Considering "Movement" in Lieu of "Place"

    Science.gov (United States)

    Baldacchino, Godfrey

    2006-01-01

    The "brain drain" phenomenon is typically seen as a zero-sum game, where one party's gain is presumed to be another's drain. This corresponds to deep-seated assumptions about what is "home" and what is "away". This article challenges the view, driven by much "brain drain" literature, that the dynamic is an…

  4. Toward a Literature-Driven Definition of Big Data in Healthcare

    Directory of Open Access Journals (Sweden)

    Emilie Baro

    2015-01-01

    Full Text Available Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.

  5. Toward a Literature-Driven Definition of Big Data in Healthcare

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log⁡(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488

  6. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  7. CANONIC RELATION BASICO-MOTORICAL ABILITIES ON SITUATION SUCCESSFUL CHILDREN IN SMALL FOOTBALL

    Directory of Open Access Journals (Sweden)

    Izudin Tanović

    2011-03-01

    Full Text Available Football belongs to the group of polystructural sports, characterized by a very large number of non-standardized motor abilities and techniques that players execute in variable situations, either deliberately or spontaneously during the game (Elsner, B. 1985). Unlike full-sized football, small football (futsal) is a game played under considerable space and time constraints, which manifest in very fast transitions from the attacking phase to the defensive phase. The dynamics of the game, the variety of non-standardized movements, and the situational application of individual technique demand a very high level of psychophysical readiness from the player. The aim of this study was to determine the level of influence of basic motor abilities on situational success in the small football (futsal) game. The study was conducted on children aged 12-14 years at the small-football school of SFC "OT of MOSTAR" (Old Town of Mostar) in Mostar. Taking into account the structural characteristics and size of the chosen sample of children and the aim of the study, the results were processed using canonical correlation analysis. The final results of this study are characteristic of the space explored and confirm that a statistically significant relationship exists between the examined sets of variables

  8. Big Data - What is it and why it matters.

    Science.gov (United States)

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers. © 2016 Health Libraries Group.

  9. Brain glycogen

    DEFF Research Database (Denmark)

    Obel, Linea Lykke Frimodt; Müller, Margit S; Walls, Anne B

    2012-01-01

    Glycogen is a complex glucose polymer found in a variety of tissues, including brain, where it is localized primarily in astrocytes. The small quantity found in brain compared to e.g., liver has led to the understanding that brain glycogen is merely used during hypoglycemia or ischemia....... In this review evidence is brought forward highlighting what has been an emerging understanding in brain energy metabolism: that glycogen is more than just a convenient way to store energy for use in emergencies-it is a highly dynamic molecule with versatile implications in brain function, i.e., synaptic...... activity and memory formation. In line with the great spatiotemporal complexity of the brain and thereof derived focus on the basis for ensuring the availability of the right amount of energy at the right time and place, we here encourage a closer look into the molecular and subcellular mechanisms...

  10. Research on information security in big data era

    Science.gov (United States)

    Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin

    2018-05-01

    Big data is becoming another hotspot in the field of information technology after the cloud computing and the Internet of Things. However, the existing information security methods can no longer meet the information security requirements in the era of big data. This paper analyzes the challenges and a cause of data security brought by big data, discusses the development trend of network attacks under the background of big data, and puts forward my own opinions on the development of security defense in technology, strategy and product.

  11. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with large volumes of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. So, the companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data, and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects regarding the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also refers to the graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  12. Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Baljit Singh Khehra

    2015-03-01

    Full Text Available The fuzzy 2-partition entropy approach has been widely used to select threshold values for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of parameters of the membership functions such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on Big Bang–Big Crunch Optimization (BBBCO) is proposed, called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of parameters of the membership functions that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by a theory of the evolution of the universe, namely the Big Bang and Big Crunch theory. The proposed algorithm is tested on a number of standard test images. For comparison, three other algorithms, Genetic Algorithm (GA)-based, Biogeography-Based Optimization (BBO)-based and recursive approaches, are also implemented. The experimental results show that the proposed algorithm is more effective than the GA-based, BBO-based and recursion-based approaches.
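    The objective function being maximized can be sketched as follows. This is our illustrative reconstruction, using a simple linear-ramp membership function and a brute-force parameter search standing in for BBBCO; it is not the authors' code:

    ```python
    import numpy as np

    def fuzzy_2partition_entropy(hist, a, b):
        """Entropy of the fuzzy 2-partition induced by a linear-ramp 'dark'
        membership function with parameters a < b (grey levels)."""
        levels = np.arange(len(hist))
        p = hist / hist.sum()                                # grey-level probabilities
        mu_dark = np.clip((b - levels) / (b - a), 0.0, 1.0)  # 1 below a, 0 above b
        p_dark = float(np.sum(p * mu_dark))
        p_bright = 1.0 - p_dark
        eps = 1e-12                                          # avoid log(0)
        return -(p_dark * np.log(p_dark + eps) + p_bright * np.log(p_bright + eps))

    def select_threshold(hist):
        """Exhaustive search over (a, b); the cited paper replaces this loop
        with a metaheuristic (BBBCO, or GA/BBO in the comparison)."""
        best, best_ab = -np.inf, (0, 1)
        L = len(hist)
        for a in range(L - 1):
            for b in range(a + 1, L):
                h = fuzzy_2partition_entropy(hist, a, b)
                if h > best:
                    best, best_ab = h, (a, b)
        a, b = best_ab
        return (a + b) / 2.0   # grey level where membership crosses 0.5
    ```

    On a bimodal histogram the maximum-entropy partition places the crossover point between the two modes, which is exactly the segmentation threshold sought.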

  13. Big Sib Students' Perceptions of the Educational Environment at the School of Medical Sciences, Universiti Sains Malaysia, using Dundee Ready Educational Environment Measure (DREEM) Inventory.

    Science.gov (United States)

    Arzuman, Hafiza; Yusoff, Muhamad Saiful Bahri; Chit, Som Phong

    2010-07-01

    A cross-sectional descriptive study was conducted among Big Sib students to explore their perceptions of the educational environment at the School of Medical Sciences, Universiti Sains Malaysia (USM) and its weak areas using the Dundee Ready Educational Environment Measure (DREEM) inventory. The DREEM inventory is a validated global instrument for measuring educational environments in undergraduate medical and health professional education. The English version of the DREEM inventory was administered to all Year 2 Big Sib students (n = 67) at a regular Big Sib session. The purpose of the study as well as confidentiality and ethical issues were explained to the students before the questionnaire was administered. The response rate was 62.7% (42 out of 67 students). The overall DREEM score was 117.9/200 (SD 14.6). The DREEM indicated that the Big Sib students' perception of educational environment of the medical school was more positive than negative. Nevertheless, the study also revealed some problem areas within the educational environment. This pilot study revealed that Big Sib students perceived a positive learning environment at the School of Medical Sciences, USM. It also identified some low-scored areas that require further exploration to pinpoint the exact problems. The relatively small study population selected from a particular group of students was the major limitation of the study. This small sample size also means that the study findings cannot be generalised.

  14. Small brain lesions and incident stroke and mortality: A cohort study

    Science.gov (United States)

    Windham, B Gwen; Deere, Bradley; Griswold, Michael E.; Wang, Wanmei; Bezerra, Daniel C; Shibata, Dean; Butler, Kenneth; Knopman, David; Gottesman, Rebecca F; Heiss, Gerardo; Mosley, Thomas H

    2015-01-01

    Background: Although cerebral lesions ≥3 mm on imaging are associated with incident stroke, lesions <3 mm are typically not reported. We examined stroke risks associated with subclinical brain lesions by size (<3 mm, 3–20 mm, or both) over an average 14.5 years of follow-up. Measurements: MRI lesions: none (n=1611); outcomes: stroke (n=157), overall mortality (n=576), stroke mortality (n=50). Hazard ratios (HR) were estimated with proportional hazards models. Results: Compared to no lesions, stroke risk was tripled with lesions <3 mm. Stroke risk doubled with WMH grade ≥3 (HR=2.14, 95% CI: 1.45–3.16). Stroke mortality risk was tripled with lesions <3 mm. Limitations: few stroke events (n=147), especially hemorrhagic (n=15), and limited numbers of participants with only lesions ≤3 mm (n=50) or with both lesions ≤3 mm and 3–20 mm (n=35). Conclusions: Very small cerebrovascular lesions may be associated with increased risks of stroke and mortality; having both <3 mm and ≥3 mm lesions may represent a particularly striking risk increase. Larger studies are needed to confirm findings and provide more precise estimates. PMID:26148278

  15. Addressing big data issues in Scientific Data Infrastructure

    NARCIS (Netherlands)

    Demchenko, Y.; Membrey, P.; Grosso, P.; de Laat, C.; Smari, W.W.; Fox, G.C.

    2013-01-01

    Big Data are becoming a new technology focus both in science and in industry. This paper discusses the challenges that are imposed by Big Data on the modern and future Scientific Data Infrastructure (SDI). The paper discusses a nature and definition of Big Data that include such features as Volume,

  16. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    Full Text Available In everyday terms we call the current era the Modern Era, which in the field of Information Technology can also be named the era of Big Data. Our daily lives are advancing rapidly, never quenching one's thirst. The fields of science, engineering and technology are producing data at an exponential rate, leading to exabytes of data every day. Big data helps us to explore and reinvent many areas, not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis in the area of healthcare using big data and analytics. The main purpose is to emphasize that the big data being stored all the time is useful for looking back at history, but that it is now time to emphasize analysis in order to improve medication and services. Although many big data implementations happen to be in-house developments, the implementation proposed here aims at a broader extent using Hadoop, which just happens to be the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also focuses on the strengths and drawbacks compared to the conventional techniques available.

  17. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  18. About Big Data and its Challenges and Benefits in Manufacturing

    OpenAIRE

    Bogdan NEDELCU

    2013-01-01

    The aim of this article is to show the importance of Big Data and its growing influence on companies. It also shows what kind of big data is currently generated and how much big data is estimated to be generated. We can also see how much companies are willing to invest in big data and how much they are currently gaining from it. Also shown are some major influences that big data has on one major industry segment (manufacturing) and the challenges that appear.

  19. Big Data Management in US Hospitals: Benefits and Barriers.

    Science.gov (United States)

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

    Big data has been considered as an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. By using big data, many hospitals have recognized that there have been challenges, including lack of experience and cost of developing the analytics. Many hospitals will need to invest in the acquiring of adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.

  20. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big Data presents new methods to reverse this trend and improve profitability. The benefits of Big Data and ...

  1. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints on the use of big data include the cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  2. [Relevance of big data for molecular diagnostics].

    Science.gov (United States)

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This especially includes omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or development of new ones. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economics or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new management strategies in order to develop data-driven, individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  3. Big data in psychology: A framework for research advancement.

    Science.gov (United States)

    Adjerid, Idris; Kelley, Ken

    2018-02-22

    The potential for big data to provide value for psychology is significant. However, the pursuit of big data remains an uncertain and risky undertaking for the average psychological researcher. In this article, we address some of this uncertainty by discussing the potential impact of big data on the type of data available for psychological research, addressing the benefits and most significant challenges that emerge from these data, and organizing a variety of research opportunities for psychology. Our article yields two central insights. First, we highlight that big data research efforts are more readily accessible than many researchers realize, particularly with the emergence of open-source research tools, digital platforms, and instrumentation. Second, we argue that opportunities for big data research are diverse and differ both in their fit for varying research goals, as well as in the challenges they bring about. Ultimately, our outlook for researchers in psychology using and benefiting from big data is cautiously optimistic. Although not all big data efforts are suited for all researchers or all areas within psychology, big data research prospects are diverse, expanding, and promising for psychology and related disciplines. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. 'Big data' in pharmaceutical science: challenges and opportunities.

    Science.gov (United States)

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  5. Addressing Data Veracity in Big Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Aman, Saima [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Computer Science; Chelmis, Charalampos [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering; Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering

    2014-10-27

    Big data applications such as in smart electric grids, transportation, and remote environment monitoring involve geographically dispersed sensors that periodically send back information to central nodes. In many cases, data from sensors is not available at central nodes at a frequency that is required for real-time modeling and decision-making. This may be due to physical limitations of the transmission networks, or due to consumers limiting frequent transmission of data from sensors located at their premises for security and privacy concerns. Such scenarios lead to partial data problem and raise the issue of data veracity in big data applications. We describe a novel solution to the problem of making short term predictions (up to a few hours ahead) in absence of real-time data from sensors in Smart Grid. A key implication of our work is that by using real-time data from only a small subset of influential sensors, we are able to make predictions for all sensors. We thus reduce the communication complexity involved in transmitting sensory data in Smart Grids. We use real-world electricity consumption data from smart meters to empirically demonstrate the usefulness of our method. Our dataset consists of data collected at 15-min intervals from 170 smart meters in the USC Microgrid for 7 years, totaling 41,697,600 data points.
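    The key idea, predicting readings for all sensors from real-time data of only a small influential subset, can be sketched as below. This is an illustrative reconstruction under simple linear assumptions, not the authors' method, and the function names are ours:

    ```python
    import numpy as np

    def pick_influential(X, k):
        """Rank sensors by total absolute correlation with the other
        sensors and keep the k most connected ones.
        X: historical readings, shape (time, sensors)."""
        corr = np.abs(np.corrcoef(X, rowvar=False))
        np.fill_diagonal(corr, 0.0)
        return np.argsort(corr.sum(axis=0))[-k:]

    def fit_predictor(X, subset):
        """Least-squares map from the subset's readings to all readings,
        fitted on the historical matrix X."""
        W, *_ = np.linalg.lstsq(X[:, subset], X, rcond=None)
        return W

    def predict_all(x_subset, W):
        """Predict every sensor from real-time readings of the subset only."""
        return x_subset @ W
    ```

    When the network's readings are driven by a few common factors (e.g. aggregate load patterns in a microgrid), a small subset generically spans that factor space, which is why subset-only transmission can suffice for short-term prediction.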

  6. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, meaning the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data ranges across data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing-system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data-management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering a big size of data, variety of data and frequent chan...

  7. Cosmic inflation and big bang interpreted as explosions

    Science.gov (United States)

    Rebhan, E.

    2012-12-01

    It has become common understanding that the recession of galaxies and the corresponding redshift of light received from them can only be explained by an expansion of the space between them and us. In this paper, for the presently favored case of a universe without spatial curvature, it is shown that this interpretation is restricted to comoving coordinates. It is proven by construction that within the framework of general relativity other coordinates exist in relation to which these phenomena can be explained by a motion of the cosmic substrate across space, caused by an explosion-like big bang or by inflation preceding an almost big bang. At the place of an observer, this motion occurs without any spatial expansion. It is shown that in these “explosion coordinates” the usual redshift comes about by a Doppler shift and a subsequent gravitational shift. Making use of this interpretation, it can easily be understood why in comoving coordinates light rays of short spatial extension expand and thus constitute an exception from the rule that small objects up to the size of the solar system or even galaxies do not participate in the expansion of the universe. It is also discussed how the two interpretations can be reconciled with each other.

  8. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)². At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  9. [Big data and their perspectives in radiation therapy].

    Science.gov (United States)

    Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste

    2017-02-01

    The concept of big data indicates a change of scale in the use of data and data aggregation into large databases through improved computer technology. One of the current challenges in the creation of big data in the context of radiation therapy is the transformation of routine care items into dark data, i.e. data not yet collected, and the fusion of databases collecting different types of information (dose-volume histograms and toxicity data for example). Processes and infrastructures devoted to big data collection should not impact negatively on the doctor-patient relationship, the general process of care or the quality of the data collected. The use of big data requires a collective effort of physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond, oncology. Big data involve a new culture to build an appropriate infrastructure legally and ethically. Processes and issues are discussed in this article. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  10. Current applications of big data in obstetric anesthesiology.

    Science.gov (United States)

    Klumpner, Thomas T; Bauer, Melissa E; Kheterpal, Sachin

    2017-06-01

    The narrative review aims to highlight several recently published 'big data' studies pertinent to the field of obstetric anesthesiology. Big data has been used to study rare outcomes, to identify trends within the healthcare system, to identify variations in practice patterns, and to highlight potential inequalities in obstetric anesthesia care. Big data studies have helped define the risk of rare complications of obstetric anesthesia, such as the risk of neuraxial hematoma in thrombocytopenic parturients. Also, large national databases have been used to better understand trends in anesthesia-related adverse events during cesarean delivery as well as outline potential racial/ethnic disparities in obstetric anesthesia care. Finally, real-time analysis of patient data across a number of disparate health information systems through the use of sophisticated clinical decision support and surveillance systems is one promising application of big data technology on the labor and delivery unit. 'Big data' research has important implications for obstetric anesthesia care and warrants continued study. Real-time electronic surveillance is a potentially useful application of big data technology on the labor and delivery unit.

  11. Coronal in vivo forward-imaging of rat brain morphology with an ultra-small optical coherence tomography fiber probe

    Science.gov (United States)

    Xie, Yijing; Bonin, Tim; Löffler, Susanne; Hüttmann, Gereon; Tronnier, Volker; Hofmann, Ulrich G.

    2013-02-01

    A well-established navigation method is one of the key conditions for successful brain surgery: it should be accurate, safe and online operable. Recent research shows that optical coherence tomography (OCT) is a potential solution for this application by providing a high resolution and small probe dimension. In this study a fiber-based spectral-domain OCT system utilizing a super-luminescent-diode with the center wavelength of 840 nm providing 14.5 μm axial resolution was used. A composite 125 μm diameter detecting probe with a gradient index (GRIN) fiber fused to a single mode fiber was employed. Signals were reconstructed into grayscale images by horizontally aligning A-scans from the same trajectory with different depths. The reconstructed images can display brain morphology along the entire trajectory. For scans of typical white matter, the signals showed a higher reflection of light intensity with lower penetration depth as well as a steeper attenuation rate compared to the scans typical for gray matter. Micro-structures such as axon bundles (70 μm) in the caudate nucleus are visible in the reconstructed images. This study explores the potential of OCT to be a navigation modality in brain surgery.

  12. Volume and Value of Big Healthcare Data.

    Science.gov (United States)

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  13. Phosphorus mass balance in a highly eutrophic semi-enclosed inlet near a big metropolis: a small inlet can contribute towards particulate organic matter production.

    Science.gov (United States)

    Asaoka, Satoshi; Yamamoto, Tamiji

    2011-01-01

    Terrigenous loading into enclosed water bodies has been blamed for eutrophic conditions marked by massive algal growth and subsequent hypoxia due to decomposition of dead algal cells. This study aims to describe the eutrophication and hypoxia processes in a semi-enclosed water body lying near a big metropolis. The phosphorus mass balance in a small inlet, Ohko Inlet, located at the head of Hiroshima Bay, Japan, was quantified using a numerical model. Dissolved inorganic phosphorus inflow from Kaita Bay, next to the inlet, was five times higher than that from the terrigenous load, which may enhance primary production. It was therefore concluded that reducing the terrigenous load and suppressing the benthic flux are not sufficient on their own; reducing the inflow of phosphorus-rich, oxygen-depleted water from Kaita Bay is also needed as part of a collective measure to remediate the environmental condition of the inlet. Copyright © 2011 Elsevier Ltd. All rights reserved.
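    A mass balance of the kind the abstract describes can be illustrated with a single-box model: phosphorus stock changes according to inputs (land load, inflow from the adjacent bay, benthic flux) minus first-order losses (outflow exchange, settling). All loads and rate constants below are hypothetical placeholders, not the paper's calibrated values.

```python
def simulate_p(days, p0=100.0, load_land=2.0, load_adjacent=10.0,
               benthic_flux=1.5, k_out=0.05, k_settle=0.02, dt=1.0):
    """Euler integration of dP/dt = inputs - (k_out + k_settle) * P.
    P in kg; loads and flux in kg/day; k_* in 1/day."""
    p = p0
    for _ in range(int(days / dt)):
        inputs = load_land + load_adjacent + benthic_flux
        losses = (k_out + k_settle) * p
        p += dt * (inputs - losses)
    return p

# steady state is inputs / (k_out + k_settle) = 13.5 / 0.07 ≈ 192.9 kg
print(round(simulate_p(365), 1))  # → 192.9
```

With these placeholder numbers the adjacent-bay inflow dominates the input side, mirroring the paper's finding that the inflow from Kaita Bay, rather than the terrigenous load, controls the inlet's phosphorus budget.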

  14. Using Big Book to Teach Things in My House

    OpenAIRE

    Effrien, Intan; Lailatus, Sa’diyah; Nuruliftitah Maja, Neneng

    2017-01-01

    The purpose of this study is to determine students' interest in learning using big book media. A big book is an enlarged version of an ordinary book, containing simple words and images that match the content and spelling of the sentences. From this, researchers can gauge students' interest and the development of their knowledge, and it also trains researchers to remain creative in developing learning media for students.

  15. Finding Big Gay Church: An Academic Congregation Exploring LGBTQ Intersections with Religion, Art, and Education

    Directory of Open Access Journals (Sweden)

    Mindi Rhoades

    2016-07-01

    Full Text Available Using mapping as an overarching metaphor, this article presents an amalgamated version of the first five years of Big Gay Church, an annual session at the National Art Education Association’s convention since 2009. Big Gay Church is a collaborative small group of queer art educators and allies coming together to explore the intersections of religion, education, the arts, culture, and LGBTQ identities. By using tools and constructs from dramatic inquiry and other performance pedagogies, as well as inviting attendees to fully participate as members of the congregation, we transform this conference session into an opportunity for scholarship, action, connection, and fellowship. Such arts-based academic interventions can provoke a re-imagining of ways forward, together, in education and research.

  16. A Review on Applicability of Big Data Technology in Nuclear Power Plant : Focused on O and M Phases

    Energy Technology Data Exchange (ETDEWEB)

    Cha, Jae-Min; Shin, Junguk Shin; Yeom, Choong-Sub [Institute for Advanced Engineering, Yongin (Korea, Republic of)

    2015-05-15

    With the rapid growth of information and communication technology (ICT), data has been increasing explosively, and deriving value from these data is the core of the big data concept. Recently, big data technology has been applied not only to traditional industries such as communication, manufacturing, distribution, and banking, but also to plant industries such as oil and gas plants, steel and iron plants, and power plants. This means big data technology offers a strong opportunity to enhance operational performance using the tremendous data collected from the numerous sensors generally attached to Structures, Systems, and Components (SSCs). Gartner reported that 'the big data has high potential opportunities in Manufacturing and Natural Resource industry sector'. In this paper, we analyze the applicability of big data technology to NPPs, focusing on the O and M phase, through the following sequence: operational concept definition, problem analysis, and needs derivation. This research has some limitations. 1) Of all plant lifecycle activities, only the monitoring and diagnosis part of the operational phase is considered; the necessity of big data should be derived from more comprehensive and diverse viewpoints. 2) The number of interviewees was too small; more interviewees should be included to increase the credibility of the research results. In further study, to overcome these limitations, we plan to validate the necessity via quantitative survey methods with more experts across the plant lifecycle. We also intend to show the practical impact of big data through practical application in an NPP.

  17. A Review on Applicability of Big Data Technology in Nuclear Power Plant : Focused on O and M Phases

    International Nuclear Information System (INIS)

    Cha, Jae-Min; Shin, Junguk Shin; Yeom, Choong-Sub

    2015-01-01

    With the rapid growth of information and communication technology (ICT), data has been increasing explosively, and deriving value from these data is the core of the big data concept. Recently, big data technology has been applied not only to traditional industries such as communication, manufacturing, distribution, and banking, but also to plant industries such as oil and gas plants, steel and iron plants, and power plants. This means big data technology offers a strong opportunity to enhance operational performance using the tremendous data collected from the numerous sensors generally attached to Structures, Systems, and Components (SSCs). Gartner reported that 'the big data has high potential opportunities in Manufacturing and Natural Resource industry sector'. In this paper, we analyze the applicability of big data technology to NPPs, focusing on the O and M phase, through the following sequence: operational concept definition, problem analysis, and needs derivation. This research has some limitations. 1) Of all plant lifecycle activities, only the monitoring and diagnosis part of the operational phase is considered; the necessity of big data should be derived from more comprehensive and diverse viewpoints. 2) The number of interviewees was too small; more interviewees should be included to increase the credibility of the research results. In further study, to overcome these limitations, we plan to validate the necessity via quantitative survey methods with more experts across the plant lifecycle. We also intend to show the practical impact of big data through practical application in an NPP.

  18. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  19. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in the Big Data domain. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  20. Big Data: Survey, Technologies, Opportunities, and Challenges

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in the Big Data domain. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682