WorldWideScience

Sample records for intravenous human big-ivbaby

  1. The human experience with intravenous levodopa

    Directory of Open Access Journals (Sweden)

    Shan H Siddiqi

    2016-01-01

    Full Text Available Objective: To compile a comprehensive summary of published human experience with levodopa given intravenously, with a focus on information required by regulatory agencies. Background: While safe intravenous (IV) use of levodopa has been documented for over 50 years, regulatory supervision for pharmaceuticals given by a route other than that approved by the U.S. Food and Drug Administration (FDA) has become increasingly cautious. If delivering a drug by an alternate route raises the risk of adverse events, an investigational new drug (IND) application is required, including a comprehensive review of toxicity data. Methods: Over 200 articles referring to IV levodopa were examined for details of administration, pharmacokinetics, benefit and side effects. Results: We identified 142 original reports describing IV levodopa (IVLD) use in humans, beginning with psychiatric research in 1959-1960 before the development of peripheral decarboxylase inhibitors. Over 2750 subjects have received IV levodopa, and reported outcomes include parkinsonian signs, sleep variables, hormone levels, hemodynamics, CSF amino acid composition, regional cerebral blood flow, cognition, perception and complex behavior. Mean pharmacokinetic variables were summarized for 49 healthy subjects and 190 with Parkinson’s disease. Side effects were those expected from clinical experience with oral levodopa and dopamine agonists. No articles reported deaths or induction of psychosis. Conclusion: Over 2750 patients have received IV levodopa with a safety profile comparable to that seen with oral administration.

  2. The Human Genome Project: big science transforms biology and medicine

    OpenAIRE

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called ‘big science’ - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. ...

  3. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-25

    Since the draft human genome sequence was first made public in 2000, genomic analyses have been intensively extended to the population level. The following three international projects are good examples of large-scale studies of human genome variations: 1) HapMap Data (1,417 individuals) (http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/2010-08_phaseII+III/forward/), 2) HGDP (Human Genome Diversity Project) Data (940 individuals) (http://www.hagsc.org/hgdp/files.html), 3) 1000 Genomes Data (2,504 individuals) (http://ftp.1000genomes.ebi.ac.uk/vol1/ftp/release/20130502/). If we can integrate all three into a single volume of data, we should be able to conduct a more detailed analysis of human genome variations for a total of 4,861 individuals (= 1,417 + 940 + 2,504). In fact, we successfully integrated these three data sets by using information on the reference human genome sequence, and we conducted a big data analysis. In particular, we constructed a phylogenetic tree of about 5,000 human individuals at the genome level. As a result, we were able to identify clusters of ethnic groups, with detectable admixture, that could not be identified by analysing each of the three data sets separately. Here, we report the outcome of this kind of big data analysis and discuss the evolutionary significance of human genomic variations. Note that the present study was conducted in collaboration with Katsuhiko Mineta and Kosuke Goto at KAUST.
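
    A minimal sketch of the integration step described above: restrict each panel to the variants shared by all panels (identified against the same reference genome) and stack the samples into one table. The tiny panels below are invented stand-ins for the actual HapMap, HGDP and 1000 Genomes files.

```python
# Toy sketch of merging genotype panels on the variants they share,
# keyed to positions on the same reference genome build.
# The panels below are invented stand-ins for HapMap, HGDP and 1000 Genomes.

def merge_panels(panels):
    """Restrict every panel to the variants present in all of them,
    then stack samples into one combined genotype table."""
    shared = set.intersection(*(set(p["genotypes"]) for p in panels))
    combined = {}
    for panel in panels:
        for sample in panel["samples"]:
            combined[sample] = {v: panel["genotypes"][v][sample] for v in shared}
    return shared, combined

hapmap = {"samples": ["H1", "H2"],
          "genotypes": {"rs1": {"H1": 0, "H2": 1}, "rs2": {"H1": 2, "H2": 0}}}
hgdp = {"samples": ["G1"],
        "genotypes": {"rs1": {"G1": 1}, "rs3": {"G1": 2}}}
kgp = {"samples": ["K1"],
       "genotypes": {"rs1": {"K1": 0}, "rs2": {"K1": 1}}}

shared, combined = merge_panels([hapmap, hgdp, kgp])
print(sorted(shared))  # only rs1 is typed in all three panels
print(len(combined))   # 4 individuals total (2 + 1 + 1)
```

    The same idea, scaled up to millions of variants and thousands of samples, yields the single combined matrix from which a genome-level phylogenetic tree can be built.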

  4. Statistical Challenges in "Big Data" Human Neuroimaging.

    Science.gov (United States)

    Smith, Stephen M; Nichols, Thomas E

    2018-01-17

    Smith and Nichols discuss "big data" human neuroimaging studies, with very large subject numbers and amounts of data. These studies provide great opportunities for making new discoveries about the brain but raise many new analytical challenges and interpretational risks. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. The Human Genome Project: big science transforms biology and medicine.

    Science.gov (United States)

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  6. Exploring Data in Human Resources Big Data

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2016-01-01

    Full Text Available Nowadays, social networks and informatics technologies and infrastructures are constantly developing and affect each other. In this context, the HR recruitment process has become complex and many multinational organizations have encountered selection issues. The objective of the paper is to develop a prototype system for assisting the selection of candidates for an intelligent management of human resources. Such a system can be a starting point for the efficient organization of semi-structured and unstructured data on recruitment activities. The article extends the research presented at the 14th International Conference on Informatics in Economy (IE 2015) in the scientific paper "Big Data challenges for human resources management".

  7. The dynamics of big data and human rights: the case of scientific research.

    Science.gov (United States)

    Vayena, Effy; Tasioulas, John

    2016-12-28

    In this paper, we address the complex relationship between big data and human rights. Because this is a vast terrain, we restrict our focus in two main ways. First, we concentrate on big data applications in scientific research, mostly health-related research. And, second, we concentrate on two human rights: the familiar right to privacy and the less well-known right to science. Our contention is that human rights interact in potentially complex ways with big data, not only constraining it, but also enabling it in various ways; and that such rights are dynamic in character, rather than fixed once and for all, changing in their implications over time in line with changes in the context we inhabit, and also as they interact among themselves in jointly responding to the opportunities and risks thrown up by a changing world. Understanding this dynamic interaction of human rights is crucial for formulating an ethic tailored to the realities, the new capabilities and risks, of the rapidly evolving digital environment. This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).

  8. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

    Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information, and used to make 'smart' decisions.

  9. Deserts in the Deluge: TerraPopulus and Big Human-Environment Data.

    Science.gov (United States)

    Manson, S M; Kugler, T A; Haynes, D

    2016-01-01

    Terra Populus, or TerraPop, is a cyberinfrastructure project that integrates, preserves, and disseminates massive data collections describing characteristics of the human population and environment over the last six decades. TerraPop has made a number of GIScience advances in the handling of big spatial data to make information interoperable between formats and across scientific communities. In this paper, we describe challenges of these data, or 'deserts in the deluge' of data, that are common to spatial big data more broadly, and explore computational solutions specific to microdata, raster, and vector data models.

  10. Sandwich-type enzyme immunoassay for big endothelin-I in plasma: concentrations in healthy human subjects unaffected by sex or posture.

    Science.gov (United States)

    Aubin, P; Le Brun, G; Moldovan, F; Villette, J M; Créminon, C; Dumas, J; Homyrda, L; Soliman, H; Azizi, M; Fiet, J

    1997-01-01

    A sandwich-type enzyme immunoassay has been developed for measuring human big endothelin-1 (big ET-1) in human plasma and supernatant fluids from human cell cultures. Big ET-1 is the precursor of endothelin 1 (ET-1), the most potent vasoconstrictor known. A rabbit antibody raised against the big ET-1 COOH-terminus fragment was used as an immobilized antibody (anti-P16). The Fab' fragment of a monoclonal antibody (1B3) raised against the ET-1 loop fragment was used as the enzyme-labeled antibody, after being coupled to acetylcholinesterase. The lowest detectable value in the assay was 1.2 pg/mL (0.12 pg/well). The assay was highly specific for big ET-1, demonstrating no cross-reactivity with ET-1, big endothelin-2 (big ET-2), and big endothelin-3 (big ET-3). We used this assay to evaluate the effect of two different postural positions (supine and standing) on plasma big ET-1 concentrations in 11 male and 11 female healthy subjects. Data analysis revealed that neither sex nor body position influenced plasma big ET-1 concentrations. This assay should thus permit the detection of possible variations in plasma concentrations of big ET-1 in certain pathologies and, in association with ET-1 assay, make possible in vitro study of endothelin-converting enzyme activity in cell models. Such studies could clarify the physiological and clinical roles of this family of peptides.
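
    The detection limit above is quoted two ways, 1.2 pg/mL and 0.12 pg/well. The two figures are mutually consistent if 100 µL of sample is assayed per well; that volume is an inference from the ratio, not a protocol detail stated in the abstract.

```python
# The assay's limit of detection is quoted as both a concentration
# (1.2 pg/mL) and a mass per well (0.12 pg/well). Dividing one by the
# other back-calculates the implied sample volume per well.
lod_conc_pg_per_ml = 1.2
lod_mass_pg_per_well = 0.12

well_volume_ml = lod_mass_pg_per_well / lod_conc_pg_per_ml
print(round(well_volume_ml * 1000))  # 100 microlitres per well
```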

  11. How Big Data Fast Tracked Human Mobility Research and the Lessons for Animal Movement Ecology

    KAUST Repository

    Thums, Michele; Fernández-Gracia, Juan; Sequeira, Ana M. M.; Eguíluz, Víctor M.; Duarte, Carlos M.; Meekan, Mark G.

    2018-01-01

    The rise of the internet coupled with technological innovations such as smartphones has generated massive volumes of geo-referenced data (big data) on human mobility. This has allowed the number of studies of human mobility to rapidly overtake those of animal movement. Today, telemetry studies of animals are also approaching big data status. Here, we review recent advances in studies of human mobility and identify the opportunities they present for advancing our understanding of animal movement. We describe key analytical techniques, potential bottlenecks and a roadmap for progress toward a synthesis of movement patterns of wild animals.

  12. How Big Data Fast Tracked Human Mobility Research and the Lessons for Animal Movement Ecology

    Directory of Open Access Journals (Sweden)

    Michele Thums

    2018-02-01

    Full Text Available The rise of the internet coupled with technological innovations such as smartphones has generated massive volumes of geo-referenced data (big data) on human mobility. This has allowed the number of studies of human mobility to rapidly overtake those of animal movement. Today, telemetry studies of animals are also approaching big data status. Here, we review recent advances in studies of human mobility and identify the opportunities they present for advancing our understanding of animal movement. We describe key analytical techniques, potential bottlenecks and a roadmap for progress toward a synthesis of movement patterns of wild animals.

  14. Big Data and Intelligence: Applications, Human Capital, and Education

    Directory of Open Access Journals (Sweden)

    Michael Landon-Murray

    2016-06-01

    Full Text Available The potential for big data to contribute to the US intelligence mission goes beyond bulk collection, social media and counterterrorism. Applications will speak to a range of issues of major concern to intelligence agencies, from military operations to climate change to cyber security. There are challenges too: procurement lags, data stovepiping, separating signal from noise, sources and methods, a range of normative issues, and central to managing these challenges, human capital. These potential applications and challenges are discussed and a closer look at what data scientists do in the Intelligence Community (IC) is offered. Effectively filling the ranks of the IC’s data science workforce will depend on the provision of well-trained data scientists from the higher education system. Program offerings at America’s top fifty universities will thus be surveyed (just a few years ago there were reportedly no degrees in data science). One Master’s program that has melded data science with intelligence is examined as well as a university big data research center focused on security and intelligence. This discussion goes a long way to clarify the prospective uses of data science in intelligence while probing perhaps the key challenge to optimal application of big data in the IC.

  15. Using Big Data to Understand the Human Condition: The Kavli HUMAN Project.

    Science.gov (United States)

    Azmak, Okan; Bayer, Hannah; Caplin, Andrew; Chun, Miyoung; Glimcher, Paul; Koonin, Steven; Patrinos, Aristides

    2015-09-01

    Until now, most large-scale studies of humans have either focused on very specific domains of inquiry or have relied on between-subjects approaches. While these previous studies have been invaluable for revealing important biological factors in cardiac health or social factors in retirement choices, no single repository contains anything like a complete record of the health, education, genetics, environmental, and lifestyle profiles of a large group of individuals at the within-subject level. This seems critical today because emerging evidence about the dynamic interplay between biology, behavior, and the environment points to a pressing need for just the kind of large-scale, long-term synoptic dataset that does not yet exist at the within-subject level. At the same time that the need for such a dataset is becoming clear, there is also growing evidence that just such a synoptic dataset may now be obtainable, at least at moderate scale, using contemporary big data approaches. To this end, we introduce the Kavli HUMAN Project (KHP), an effort to aggregate data from 2,500 New York City households in all five boroughs (roughly 10,000 individuals) whose biology and behavior will be measured using an unprecedented array of modalities over 20 years. It will also richly measure the environmental conditions and events that KHP members experience, using a geographic information system database of unparalleled scale currently under construction in New York. In this manner, KHP will offer both synoptic and granular views of how human health and behavior coevolve over the life cycle and why they evolve differently for different people. In turn, we argue that this will allow for new discovery-based scientific approaches, rooted in big data analytics, to improving the health and quality of human life, particularly in urban contexts.

  16. Human kinetics of orally and intravenously administered low-dose 1,2-¹³C-dichloroacetate.

    Science.gov (United States)

    Jia, Minghong; Coats, Bonnie; Chadha, Monisha; Frentzen, Barbara; Perez-Rodriguez, Javier; Chadik, Paul A; Yost, Richard A; Henderson, George N; Stacpoole, Peter W

    2006-12-01

    Dichloroacetate (DCA) is a putative environmental hazard, owing to its ubiquitous presence in the biosphere and its association with animal and human toxicity. We sought to determine the kinetics of environmentally relevant concentrations of 1,2-¹³C-DCA administered to healthy adults. Subjects received an oral or intravenous dose of 2.5 microg/kg of 1,2-¹³C-DCA. Plasma and urine concentrations of 1,2-¹³C-DCA were measured by a modified gas chromatography-tandem mass spectrometry method. 1,2-¹³C-DCA kinetics were determined by modeling using WinNonlin 4.1 software. Plasma concentrations of 1,2-¹³C-DCA peaked 10 minutes and 30 minutes after intravenous or oral administration, respectively. Plasma kinetic parameters varied as a function of dose and duration. Very little unchanged 1,2-¹³C-DCA was excreted in urine. Trace amounts of DCA alter its own kinetics after short-term exposure. These findings have important implications for interpreting the impact of this xenobiotic on human health.
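
    The delayed oral peak reported above (30 min oral vs. 10 min IV) is the behavior a one-compartment model with first-order absorption predicts. A minimal sketch, with rate constants that are illustrative assumptions rather than values fitted to this study's data:

```python
import math

# One-compartment model with first-order absorption (Bateman function):
#   C(t) = (F*D*ka / (V*(ka - ke))) * (exp(-ke*t) - exp(-ka*t))
# Parameters below (ka, ke, V) are invented for illustration.

def bateman(t, dose=2.5, f=1.0, v=15.0, ka=0.12, ke=0.02):  # t in minutes
    return f * dose * ka / (v * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Analytic time of the oral peak: tmax = ln(ka/ke) / (ka - ke)
tmax = math.log(0.12 / 0.02) / (0.12 - 0.02)
print(round(tmax, 1))  # ≈ 17.9 min: absorption delays the peak relative to IV
```

    An IV bolus, by contrast, starts at its maximum and declines monoexponentially, which is why the oral route always peaks later.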

  17. How They Move Reveals What Is Happening: Understanding the Dynamics of Big Events from Human Mobility Pattern

    Directory of Open Access Journals (Sweden)

    Jean Damascène Mazimpaka

    2017-01-01

    Full Text Available The context in which a moving object moves contributes to the movement pattern observed. Likewise, the movement pattern reflects the properties of the movement context. In particular, big events influence human mobility depending on the dynamics of the events. However, this influence has not been explored to understand big events. In this paper, we propose a methodology for learning about big events from human mobility patterns. The methodology involves extracting and analysing the stopping, approaching, and moving-away interactions between public transportation vehicles and the geographic context. The analysis is carried out at two different temporal granularity levels to discover global and local patterns. The results of evaluating this methodology on bus trajectories demonstrate that it can discover occurrences of big events from mobility patterns, roughly estimate the event start and end times, and reveal the temporal patterns of arrival and departure of event attendees. This knowledge can be usefully applied in transportation and event planning and management.
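
    The stopping/approaching/moving-away interactions described above can be sketched with simple speed and distance rules. The thresholds, GPS fixes and point of interest below are invented for illustration and are not taken from the paper's methodology:

```python
# Sketch of classifying a vehicle's interaction with a point of interest
# (e.g. an event venue) from successive GPS fixes: "stopping" when speed
# falls below a threshold, otherwise "approaching"/"moving away" from the
# change in distance to the POI.

def classify(fixes, poi, stop_speed=0.5):
    """fixes: list of (t, x, y); poi: (x, y). Returns one label per segment."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    labels = []
    for (t0, x0, y0), (t1, x1, y1) in zip(fixes, fixes[1:]):
        speed = dist((x0, y0), (x1, y1)) / (t1 - t0)
        if speed < stop_speed:
            labels.append("stopping")
        elif dist((x1, y1), poi) < dist((x0, y0), poi):
            labels.append("approaching")
        else:
            labels.append("moving away")
    return labels

track = [(0, 0, 0), (10, 30, 0), (20, 58, 0), (30, 59, 0), (40, 30, 0)]
print(classify(track, poi=(60, 0)))
# ['approaching', 'approaching', 'stopping', 'moving away']
```

    Aggregating such labels over many vehicles and two time granularities is what lets unusual clusters of stops near a venue stand out as a candidate big event.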

  18. Effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function in healthy humans

    DEFF Research Database (Denmark)

    Madsen, Jan Lysgård; Fuglsang, Stefan; Graff, J

    2006-01-01

    To examine the effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function after a meal in healthy humans. METHODS: Nine healthy volunteers participated in a placebo-controlled, double-blind, crossover study. Each volunteer was examined during intravenous infusion of glyceryl trinitrate 1 microg/kg x min or saline. A gamma camera technique was used to measure gastric emptying and small intestinal transit after a 1600-kJ mixed liquid and solid meal. Furthermore, duodenal motility was assessed by manometry. RESULTS: Glyceryl trinitrate did not change gastric mean emptying time, gastric half emptying time, gastric retention at 15 min or small intestinal mean transit time. Glyceryl trinitrate did not influence the frequency of duodenal contractions, the amplitude of duodenal contractions or the duodenal motility index. CONCLUSIONS: Intravenous infusion of glyceryl trinitrate ...

  19. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

    Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses these dimensions: the technological dimension, related to the storage, analytics and visualization of big data; the human dimension, covering the human aspects of big data; and the process management dimension, which addresses big data management from both technological and business perspectives.

  20. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

    Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply did not exist. Large volumes of data allow organizations to tap the full potential of all the internal or external information they possess in real time. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

  1. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arise secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data are leveraged to improve patient care. Big data encompass data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives, and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  2. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

    For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative ...

  3. Hepatic glycogen in humans. II. Gluconeogenetic formation after oral and intravenous glucose

    International Nuclear Information System (INIS)

    Radziuk, J.

    1989-01-01

    The amount of glycogen that is formed by gluconeogenetic pathways during glucose loading was quantitated in human subjects. Oral glucose loading was compared with its intravenous administration. Overnight-fasted subjects received a constant infusion of [3-³H]glucose and a marker for gluconeogenesis, [U-¹⁴C]lactate or sodium [¹⁴C]bicarbonate. An unlabeled glucose load was then administered. Postabsorptively, or after glucose infusion was terminated, a third tracer ([6-³H]glucose) infusion was initiated along with a three-step glucagon infusion. Without correcting for background stimulation of [¹⁴C]glucose production or for dilution of ¹⁴C with citric acid cycle carbon in the oxaloacetate pool, the amount of glycogen mobilized by the glucagon infusion that was produced by gluconeogenesis during oral glucose loading was 2.9 +/- 0.7 g calculated from [U-¹⁴C]lactate incorporation and 7.4 +/- 1.3 g calculated using [¹⁴C]bicarbonate as a gluconeogenetic marker. During intravenous glucose administration the latter measurement also yielded 7.2 +/- 1.1 g. When the two corrections above are applied, the respective quantities became 5.3 +/- 1.7 g for [U-¹⁴C]lactate as tracer and 14.7 +/- 4.3 and 13.9 +/- 3.6 g for oral and intravenous glucose with [¹⁴C]bicarbonate as tracer (P less than 0.05, vs. [U-¹⁴C]lactate as tracer). When [2-¹⁴C]acetate was infused, the same amount of label was incorporated into mobilized glycogen regardless of which route of glucose administration was used. Comparison with previous data also suggests that ¹⁴CO₂ is a potentially useful marker for the gluconeogenetic process in vivo.

  4. Pharmacokinetics of high-dose intravenous melatonin in humans

    DEFF Research Database (Denmark)

    Andersen, Lars P H; Werner, Mads U; Rosenkilde, Mette Marie

    2016-01-01

    This crossover study investigated the pharmacokinetics and adverse effects of high-dose intravenous melatonin. Volunteers participated in 3 identical study sessions, receiving an intravenous bolus of 10 mg melatonin, 100 mg melatonin, and placebo. Blood samples were collected at baseline and 0, 60, 120, 180, 240, 300, 360, and 420 minutes after the bolus. Quantitative determination of plasma melatonin concentrations was performed using a radioimmunoassay technique. Pharmacokinetic parameters were estimated by a compartmental pharmacokinetic analysis. Adverse effects included assessments of sedation and registration of other symptoms. Sedation, evaluated as simple reaction times, was measured at baseline and 120, 180, 300, and 420 minutes after the bolus. Twelve male volunteers completed the study. Median (IQR) Cmax after the bolus injections of 10 mg and 100 mg of melatonin were 221 ...
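
    The compartmental analysis mentioned above can be illustrated with a minimal sketch: after an IV bolus, a one-compartment model predicts monoexponential decline, C(t) = C0·exp(-ke·t), so ln C is linear in t and a least-squares line recovers the elimination rate constant. All numbers below are synthetic, not the study's melatonin concentrations.

```python
import math

# Log-linear fit of a one-compartment IV bolus model to synthetic data.

def fit_monoexponential(times, concs):
    """Least-squares line through (t, ln C); returns (C0, ke, half-life)."""
    n = len(times)
    logs = [math.log(c) for c in concs]
    tbar = sum(times) / n
    lbar = sum(logs) / n
    slope = (sum((t - tbar) * (l - lbar) for t, l in zip(times, logs))
             / sum((t - tbar) ** 2 for t in times))
    c0 = math.exp(lbar - slope * tbar)
    ke = -slope
    return c0, ke, math.log(2) / ke

# Synthetic data generated from C0 = 200, ke = 0.0139 /min (t1/2 ~ 50 min)
times = [0, 60, 120, 180, 240]
concs = [200 * math.exp(-0.0139 * t) for t in times]
c0, ke, t_half = fit_monoexponential(times, concs)
print(round(c0), round(ke, 4), round(t_half, 1))  # recovers 200, 0.0139, ~49.9
```

    Real plasma data are noisy and often need more than one compartment, which is what dedicated PK software handles; the sketch only shows the core idea.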

  5. Dose-Dependent Effect of Intravenous Administration of Human Umbilical Cord-Derived Mesenchymal Stem Cells in Neonatal Stroke Mice

    Science.gov (United States)

    Tanaka, Emi; Ogawa, Yuko; Mukai, Takeo; Sato, Yoshiaki; Hamazaki, Takashi; Nagamura-Inoue, Tokiko; Harada-Shiba, Mariko; Shintaku, Haruo; Tsuji, Masahiro

    2018-01-01

    Neonatal brain injury induced by stroke causes significant disability, including cerebral palsy, and there is no effective therapy for stroke. Recently, mesenchymal stem cells (MSCs) have emerged as a promising tool for stem cell-based therapies. In this study, we examined the safety and efficacy of intravenously administered human umbilical cord-derived MSCs (UC-MSCs) in neonatal stroke mice. Pups underwent permanent middle cerebral artery occlusion at postnatal day 12 (P12), and low-dose (1 × 10⁴) or high-dose (1 × 10⁵) UC-MSCs were administered intravenously 48 h after the insult (P14). To evaluate the effect of the UC-MSC treatment, neurological behavior and cerebral blood flow were measured, and neuroanatomical analysis was performed at P28. To investigate the mechanisms of intravenously injected UC-MSCs, systemic blood flowmetry, in vivo imaging and human brain-derived neurotrophic factor (BDNF) measurements were performed. Functional disability was significantly improved in the high-dose UC-MSC group when compared with the vehicle group, but cerebral blood flow and cerebral hemispheric volume were not restored by UC-MSC therapy. The level of exogenous human BDNF was elevated only in the cerebrospinal fluid of one pup 24 h after UC-MSC injection, and in vivo imaging revealed that most UC-MSCs were trapped in the lungs and disappeared in a week without migration toward the brain or other organs. We found that systemic blood flow was stable over the 10 min after cell administration and that there were no differences in mortality among the groups. Immunohistopathological assessment showed that the percent area of Iba1-positive staining in the peri-infarct cortex was significantly reduced with the high-dose UC-MSC treatment compared with the vehicle treatment. These results suggest that intravenous administration of UC-MSCs is safe for a mouse model of neonatal stroke and improves dysfunction after middle cerebral artery occlusion by modulating ...

  6. Dose-Dependent Effect of Intravenous Administration of Human Umbilical Cord-Derived Mesenchymal Stem Cells in Neonatal Stroke Mice

    Directory of Open Access Journals (Sweden)

    Emi Tanaka

    2018-03-01

    Full Text Available Neonatal brain injury induced by stroke causes significant disability, including cerebral palsy, and there is no effective therapy for stroke. Recently, mesenchymal stem cells (MSCs) have emerged as a promising tool for stem cell-based therapies. In this study, we examined the safety and efficacy of intravenously administered human umbilical cord-derived MSCs (UC-MSCs) in neonatal stroke mice. Pups underwent permanent middle cerebral artery occlusion at postnatal day 12 (P12), and low-dose (1 × 10⁴) or high-dose (1 × 10⁵) UC-MSCs were administered intravenously 48 h after the insult (P14). To evaluate the effect of the UC-MSC treatment, neurological behavior and cerebral blood flow were measured, and neuroanatomical analysis was performed at P28. To investigate the mechanisms of intravenously injected UC-MSCs, systemic blood flowmetry, in vivo imaging and human brain-derived neurotrophic factor (BDNF) measurements were performed. Functional disability was significantly improved in the high-dose UC-MSC group when compared with the vehicle group, but cerebral blood flow and cerebral hemispheric volume were not restored by UC-MSC therapy. The level of exogenous human BDNF was elevated only in the cerebrospinal fluid of one pup 24 h after UC-MSC injection, and in vivo imaging revealed that most UC-MSCs were trapped in the lungs and disappeared in a week without migration toward the brain or other organs. We found that systemic blood flow was stable over the 10 min after cell administration and that there were no differences in mortality among the groups. Immunohistopathological assessment showed that the percent area of Iba1-positive staining in the peri-infarct cortex was significantly reduced with the high-dose UC-MSC treatment compared with the vehicle treatment. These results suggest that intravenous administration of UC-MSCs is safe for a mouse model of neonatal stroke and improves dysfunction after middle cerebral artery occlusion by ...

  7. What do Big Data do in Global Governance?

    DEFF Research Database (Denmark)

    Krause Hansen, Hans; Porter, Tony

    2017-01-01

    Two paradoxes associated with big data are relevant to global governance. First, while promising to increase the capacities of humans in governance, big data also involve an increasingly independent role for algorithms, technical artifacts, the Internet of things, and other objects, which can...... reduce the control of human actors. Second, big data involve new boundary transgressions as data are brought together from multiple sources while also creating new boundary conflicts as powerful actors seek to gain advantage by controlling big data and excluding competitors. These changes are not just...... about new data sources for global decision-makers, but instead signal more profound changes in the character of global governance....

  8. Systemic administration of antiretrovirals prior to exposure prevents rectal and intravenous HIV-1 transmission in humanized BLT mice.

    Directory of Open Access Journals (Sweden)

    Paul W Denton

    2010-01-01

    Full Text Available Successful antiretroviral pre-exposure prophylaxis (PrEP) for mucosal and intravenous HIV-1 transmission could reduce new infections among targeted high-risk populations including discordant couples, injection drug users, high-risk women and men who have sex with men. Targeted antiretroviral PrEP could be particularly effective at slowing the spread of HIV-1 if a single antiretroviral combination were found to be broadly protective across multiple routes of transmission. Therefore, we designed our in vivo preclinical study to systematically investigate whether rectal and intravenous HIV-1 transmission can be blocked by antiretrovirals administered systemically prior to HIV-1 exposure. We performed these studies using a highly relevant in vivo model of mucosal HIV-1 transmission, humanized Bone marrow/Liver/Thymus (BLT) mice. BLT mice are susceptible to HIV-1 infection via three major physiological routes of viral transmission: vaginal, rectal and intravenous. Our results show that BLT mice given systemic antiretroviral PrEP are efficiently protected from HIV-1 infection regardless of the route of exposure. Specifically, systemic antiretroviral PrEP with emtricitabine and tenofovir disoproxil fumarate prevented both rectal (Chi square = 8.6, df = 1, p = 0.003) and intravenous (Chi square = 13, df = 1, p = 0.0003) HIV-1 transmission. Our results indicate that antiretroviral PrEP has the potential to be broadly effective at preventing new rectal or intravenous HIV transmissions in targeted high risk individuals. These in vivo preclinical findings provide strong experimental evidence supporting the potential clinical implementation of antiretroviral based pre-exposure prophylactic measures to prevent the spread of HIV/AIDS.
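For readers unfamiliar with the statistic reported in this record, a Pearson chi-square test on a 2×2 protection table can be sketched as follows. The counts below are hypothetical stand-ins, not the study's actual group sizes; with 1 degree of freedom, the p-value can be obtained from the standard normal distribution, since a chi-square(1) variate is the square of a standard normal.

```python
import math
from statistics import NormalDist

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (1 df, no continuity correction) for the 2x2 table
    [[a, b], [c, d]]; returns (statistic, two-sided p-value)."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # With 1 df, chi-square is the square of a standard normal deviate,
    # so p = 2 * (1 - Phi(sqrt(stat))).
    p = 2 * (1 - NormalDist().cdf(math.sqrt(stat)))
    return stat, p

# Hypothetical counts: 0/9 PrEP-treated mice infected vs. 7/9 untreated infected.
stat, p = chi2_2x2(0, 9, 7, 2)
print(f"chi square = {stat:.2f}, df = 1, p = {p:.4f}")
```

The same computation with the study's (unpublished here) group counts would reproduce the chi-square values quoted in the abstract.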

  9. First-pass metabolism of ethanol in human beings: effect of intravenous infusion of fructose

    DEFF Research Database (Denmark)

    Parlesak, Alexandr; Billinger, MH; Schäfer, C.

    2004-01-01

    Intravenous infusion of fructose has been shown to enhance reoxidation of the reduced form of nicotinamide adenine dinucleotide (NADH) and, thereby, to enhance the metabolism of ethanol. In the current study, the effect of fructose infusion on first-pass metabolism of ethanol was studied in human volunteers....... A significantly higher first-pass metabolism of ethanol was obtained after administration of fructose in comparison with findings for control experiments with an equimolar dose of glucose. Because fructose is metabolized predominantly in the liver and can be presumed to have virtually no effects in the stomach...

  10. Characterization of ornidazole metabolites in human bile after intravenous doses by ultraperformance liquid chromatography/quadrupole time-of-flight mass spectrometry

    Directory of Open Access Journals (Sweden)

    Jiangbo Du

    2012-04-01

    Full Text Available Ultraperformance liquid chromatography/quadrupole time-of-flight mass spectrometry (UPLC/Q-TOF MS) was used to characterize ornidazole metabolites in human bile after intravenous doses. A liquid chromatography tandem mass spectrometry (LC–MS/MS) assay was developed for the determination of the bile level of ornidazole. Bile samples, collected from four patients with T-tube drainage after biliary tract surgery, were prepared by protein precipitation with acetonitrile before analysis. A total of 12 metabolites, including 10 novel metabolites, were detected and characterized. The metabolites of ornidazole in human bile were the products of hydrochloride (HCl) elimination, oxidative dechlorination, hydroxylation, sulfation, diastereoisomeric glucuronidation, substitution of the NO2 group or Cl atom by cysteine or N-acetylcysteine, and oxidative dechlorination followed by further carboxylation. The bile levels of ornidazole at 12 h after multiple intravenous infusions were well above its minimal inhibitory concentration for common strains of anaerobic bacteria.

  11. Effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function in healthy humans

    DEFF Research Database (Denmark)

    Madsen, Jan Lysgård; Fuglsang, Stefan; Graff, J

    2006-01-01

    BACKGROUND: Glyceryl trinitrate is a donor of nitric oxide that relaxes smooth muscle cells of the gastrointestinal tract. Little is known about the effect of glyceryl trinitrate on gastric emptying, and no data exist on the possible effect of glyceryl trinitrate on small intestinal transit. AIM: To examine the effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function after a meal in healthy humans. METHODS: Nine healthy volunteers participated in a placebo-controlled, double-blind, crossover study. Each volunteer was examined during intravenous infusion of glyceryl trinitrate 1 microg/kg x min or saline. A gamma camera technique was used to measure gastric emptying and small intestinal transit after a 1600-kJ mixed liquid and solid meal. Furthermore, duodenal motility was assessed by manometry. RESULTS: Glyceryl trinitrate did not change gastric mean......

  12. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  13. Big History or the 13800 million years from the Big Bang to the Human Brain

    Science.gov (United States)

    Gústafsson, Ludvik E.

    2017-04-01

    Big History is the integrated history of the Cosmos, Earth, Life, and Humanity. It is an attempt to understand our existence as a continuous unfolding of processes leading to ever more complex structures. Three major steps in the development of the Universe can be distinguished, the first being the creation of matter/energy and forces in the context of an expanding universe, while the second and third steps were reached when completely new qualities of matter came into existence. 1. Matter comes out of nothing. Quantum fluctuations and the inflation event are thought to be responsible for the creation of stable matter particles in what is called the Big Bang. Along with these simple particles, the universe itself was formed. Later, larger particles like atoms and the simplest chemical elements, hydrogen and helium, evolved. Gravitational contraction of hydrogen and helium formed the first stars and later the first galaxies. Massive stars ended their lives in violent explosions, releasing heavier elements like carbon, oxygen, nitrogen, sulfur and iron into the universe. Subsequent star formation led to star systems with bodies containing these heavier elements. 2. Matter starts to live. About 9200 million years after the Big Bang, a rather inconspicuous middle-sized star formed in one of a billion galaxies. The leftovers of the star formation clumped into bodies rotating around the central star. In some of them, elements like silicon, oxygen, iron and many others became the dominant matter. On the third of these bodies from the central star, much of the surface was covered with an already very common chemical compound in the universe: water. Liquid water and plenty of various elements, especially carbon, were the ingredients of very complex chemical compounds that made up even more complex structures. These were able to replicate themselves. Life had appeared, the only occasion that we human beings know of. 
Life evolved subsequently leading eventually to the formation of multicellular

  14. Methods of preparing and using intravenous nutrient compositions

    International Nuclear Information System (INIS)

    Beigler, M.A.; Koury, A.J.

    1983-01-01

    A method for preparing a stable, dry-packaged, sterile, nutrient composition which upon addition of sterile, pyrogen-free water is suitable for intravenous administration to a mammal, including a human, is described. The method comprises providing the nutrients in a specific dry form and state of physical purity acceptable for intravenous administration, sealing the nutrients in a particular type of container adapted to receive and dispense sterile fluids and subjecting the container and its sealed contents to a sterilizing, nondestructive dose of ionizing radiation. The method results in a packaged, sterile nutrient composition which may be dissolved by the addition of sterile pyrogen-free water. The resulting aqueous intravenous solution may be safely administered to a mammal in need of nutrient therapy. The packaged nutrient compositions of the invention exhibit greatly extended storage life and provide an economical method of providing intravenous solutions which are safe and efficacious for use. (author)

  15. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  16. Pharmacokinetics and pharmacodynamics of eltanolone (pregnanolone), a new steroid intravenous anaesthetic, in humans

    DEFF Research Database (Denmark)

    Carl, Peder; Høgskilde, S; Lang-Jensen, T

    1994-01-01

    Eltanolone, a new intravenous steroid anaesthetic agent was administered intravenously in a dose of 0.6 mg.kg-1 over 45 s to eight healthy male volunteers to evaluate some of its pharmacokinetic and pharmacodynamic effects. Drug concentration-time data were analysed by PCNONLIN, a non...

  17. Clinical Evaluation of Ciprofloxacin Intravenous Preparation ...

    African Journals Online (AJOL)

    The most common site of bacterial infection in humans is the urinary tract. For nosocomial infections it is the catheterized urinary tract. Compromised immune responses in hospitalized patients contribute to the difficulties encountered in treating their infections. In these patients, administration of intravenous antibiotic is ...

  18. Crisis analytics : big data-driven crisis response

    NARCIS (Netherlands)

    Qadir, Junaid; ur Rasool, Raihan; Zwitter, Andrej; Sathiaseelan, Arjuna; Crowcroft, Jon

    2016-01-01

    Disasters have long been a scourge for humanity. With the advances in technology (in terms of computing, communications, and the ability to process, and analyze big data), our ability to respond to disasters is at an inflection point. There is great optimism that big data tools can be leveraged to

  19. Human factors/ergonomics implications of big data analytics: Chartered Institute of Ergonomics and Human Factors annual lecture.

    Science.gov (United States)

    Drury, Colin G

    2015-01-01

    In recent years, advances in sensor technology, connectedness and computational power have come together to produce huge data-sets. The treatment and analysis of these data-sets is known as big data analytics (BDA), and the somewhat related term data mining. Fields allied to human factors/ergonomics (HFE), e.g. statistics, have developed computational methods to derive meaningful, actionable conclusions from these databases. This paper examines BDA, often characterised by volume, velocity and variety, giving examples of successful BDA use. This examination provides context by considering examples of using BDA on human data, using BDA in HFE studies, and studies of how people perform BDA. Significant issues for HFE are the reliance of BDA on correlation rather than hypotheses and theory, the ethics of BDA and the use of HFE in data visualisation.

  20. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  1. Implicit transitive inference and the human hippocampus: does intravenous midazolam function as a reversible hippocampal lesion?

    Directory of Open Access Journals (Sweden)

    Greene Anthony J

    2007-09-01

    Full Text Available Abstract Recent advances have led to an understanding that the hippocampus is involved more broadly than explicit or declarative memory alone. Tasks which involve the acquisition of complex associations involve the hippocampus whether the learning is explicit or implicit. One hippocampal-dependent implicit task is transitive inference (TI). Recently it was suggested that implicit transitive inference does not depend upon the hippocampus (Frank, M. J., O'Reilly, R. C., & Curran, T. (2006). When memory fails, intuition reigns: midazolam enhances implicit inference in humans. Psychological Science, 17, 700–707). The authors demonstrated that intravenous midazolam, which is thought to inactivate the hippocampus, may enhance TI performance. Three critical assumptions are required but not met: (1) that deactivations of other regions could not account for the effect, (2) that intravenous midazolam does indeed deactivate the hippocampus, and (3) that midazolam influences explicit but not implicit memory. Each of these assumptions is seriously flawed. Consequently, the suggestion that implicit TI does not depend upon the hippocampus is unfounded.

  2. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  3. Digital humanitarians how big data is changing the face of humanitarian response

    CERN Document Server

    Meier, Patrick

    2015-01-01

    The Rise of Digital HumanitariansMapping Haiti LiveSupporting Search And Rescue EffortsPreparing For The Long Haul Launching An SMS Life Line Sending In The Choppers Openstreetmap To The Rescue Post-Disaster Phase The Human Story Doing Battle With Big Data Rise Of Digital Humanitarians This Book And YouThe Rise of Big (Crisis) DataBig (Size) Data Finding Needles In Big (Size) Data Policy, Not Simply Technology Big (False) Data Unpacking Big (False) Data Calling 991 And 999 Big (

  4. A Big Bang model of human colorectal tumor growth.

    Science.gov (United States)

    Sottoriva, Andrea; Kang, Haeyoun; Ma, Zhicheng; Graham, Trevor A; Salomon, Matthew P; Zhao, Junsong; Marjoram, Paul; Siegmund, Kimberly; Press, Michael F; Shibata, Darryl; Curtis, Christina

    2015-03-01

    What happens in early, still undetectable human malignancies is unknown because direct observations are impractical. Here we present and validate a 'Big Bang' model, whereby tumors grow predominantly as a single expansion producing numerous intermixed subclones that are not subject to stringent selection and where both public (clonal) and most detectable private (subclonal) alterations arise early during growth. Genomic profiling of 349 individual glands from 15 colorectal tumors showed an absence of selective sweeps, uniformly high intratumoral heterogeneity (ITH) and subclone mixing in distant regions, as postulated by our model. We also verified the prediction that most detectable ITH originates from early private alterations and not from later clonal expansions, thus exposing the profile of the primordial tumor. Moreover, some tumors appear 'born to be bad', with subclone mixing indicative of early malignant potential. This new model provides a quantitative framework to interpret tumor growth dynamics and the origins of ITH, with important clinical implications.

  5. Incarcerated intravenous heroin users: predictors of post-release utilization of methadone maintenance treatment.

    Science.gov (United States)

    Lin, Huang-Chi; Wang, Peng-Wei; Yang, Yi-Hsin; Tsai, Jih-Jin; Yen, Cheng-Fang

    2016-01-01

    Incarcerated intravenous heroin users have more problematic patterns of heroin use, but are less likely to access methadone maintenance treatment on their own initiative than heroin users in the community. The present study examined predictors for receiving methadone maintenance treatment post-release among incarcerated intravenous heroin users within a 24-month period. This cohort study recruited 315 incarcerated intravenous heroin users detained in 4 prisons in southern Taiwan and followed them up over the 24-month period post-release. Cox proportional hazards regression analysis was applied to determine the predictive effects of sociodemographic and drug-use characteristics, attitude toward methadone maintenance treatment, human immunodeficiency virus serostatus, perceived family support, and depression on access to methadone maintenance treatment after release. There were 295 (93.7%) incarcerated intravenous heroin users released who entered the follow-up phase of the study. During the 24-month follow-up period, 50.8% of them received methadone maintenance treatment. After controlling for the effects of the detainment period before and after recruitment by Cox proportional hazards regression analysis, incarcerated intravenous heroin users who had positive human immunodeficiency virus serostatus (HR = 2.85, 95% CI = 1.80-4.52) or who had received methadone maintenance treatment before committal (HR = 1.94, 95% CI = 1.23-3.05) were more likely to receive methadone maintenance treatment within the 24-month follow-up period. Positive human immunodeficiency virus serostatus with fully subsidized treatment and previous methadone maintenance treatment experiences predicted access of methadone maintenance treatment post-release. Strategies for getting familiar with methadone maintenance treatment during detainment, including providing methadone maintenance treatment prior to release and lowering the economic burden of receiving treatment, may facilitate entry of methadone maintenance treatment for incarcerated intravenous heroin
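As a worked illustration of how hazard ratios like those in this record relate to their confidence intervals: under the usual Wald approximation used in Cox regression, the standard error of log(HR) can be recovered from a reported 95% CI, and from it an approximate z statistic and p-value. This is a sketch of the standard approximation, not the study's own computation:

```python
import math
from statistics import NormalDist

def wald_stats(hr, ci_low, ci_high, level=0.95):
    """Recover SE of log(HR), the z statistic and an approximate two-sided
    p-value from a hazard ratio and its reported confidence interval."""
    z_crit = NormalDist().inv_cdf(0.5 + level / 2)  # ~1.96 for a 95% CI
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * z_crit)
    z = math.log(hr) / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return se, z, p

# HIV-positive serostatus predictor: HR = 2.85, 95% CI 1.80-4.52 (from the abstract)
se, z, p = wald_stats(2.85, 1.80, 4.52)
print(f"SE(log HR) = {se:.3f}, z = {z:.2f}, p ~ {p:.1e}")
```

Since both reported intervals exclude 1, both predictors are significant at the 5% level; the recovered z statistic quantifies by how much.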

  6. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells.

    Science.gov (United States)

    Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics, and apply it for analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled at eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study, including data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic- and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction were observed around day five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. 
The proposed analysis framework can be used to structure
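The clustering stage of such a framework can be illustrated with a minimal k-means pass over synthetic expression-time profiles (NumPy only). The three profile shapes below — early-transient, downregulated, late-induced — are hypothetical stand-ins for the classes described in the abstract, not the study's data:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain Lloyd k-means with farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):  # spread the initial centers far apart
        d = ((X[:, None, :] - np.array(centers)[None]) ** 2).sum(-1).min(1)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        centers = np.array([X[labels == i].mean(0) if np.any(labels == i)
                            else centers[i] for i in range(k)])
    return labels, centers

rng = np.random.default_rng(0)
t = np.arange(11)  # eleven consecutive sampling days
profiles = {
    "early transient": np.exp(-((t - 2) ** 2) / 2),  # CER1/NODAL-like shape
    "downregulated": 1 / (1 + np.exp(t - 3)),        # NANOG/SOX2-like shape
    "late induced": 1 / (1 + np.exp(-(t - 5))),      # cardiac/muscle-like shape
}
X = np.vstack([p + rng.normal(0, 0.1, (20, 11)) for p in profiles.values()])
labels, _ = kmeans(X, k=3)
print(np.bincount(labels))  # three clusters of ~20 genes each
```

On real microarray data the same loop would run over normalized log-expression profiles, with k chosen by a validity index rather than fixed in advance.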

  7. Big Data: Concept, Potentialities and Vulnerabilities

    Directory of Open Access Journals (Sweden)

    Fernando Almeida

    2018-03-01

    Full Text Available The evolution of information systems and the growth in the use of the Internet and social networks has caused an explosion in the amount of available data relevant to the activities of the companies. Therefore, the treatment of these available data is vital to support operational, tactical and strategic decisions. This paper aims to present the concept of big data and the main technologies that support the analysis of large data volumes. The potential of big data is explored considering nine sectors of activity, such as financial, retail, healthcare, transports, agriculture, energy, manufacturing, public, and media and entertainment. In addition, the main current opportunities, vulnerabilities and privacy challenges of big data are discussed. It was possible to conclude that despite the potential for using the big data to grow in the previously identified areas, there are still some challenges that need to be considered and mitigated, namely the privacy of information, the existence of qualified human resources to work with Big Data and the promotion of a data-driven organizational culture.

  8. Acute Toxicity of Intravenously Administered Titanium Dioxide Nanoparticles in Mice

    OpenAIRE

    Xu, Jiaying; Shi, Hongbo; Ruth, Magaye; Yu, Hongsheng; Lazar, Lissy; Zou, Baobo; Yang, Cui; Wu, Aiguo; Zhao, Jinshun

    2013-01-01

    BACKGROUND: With a wide range of applications, titanium dioxide (TiO₂) nanoparticles (NPs) are manufactured worldwide in large quantities. Recently, in the field of nanomedicine, intravenous injection of TiO₂ nanoparticulate carriers directly into the bloodstream has raised public concerns on their toxicity to humans. METHODS: In this study, mice were injected intravenously with a single dose of TiO₂ NPs at varying dose levels (0, 140, 300, 645, or 1387 mg/kg). Animal mortality, blood biochem...

  9. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  10. Phase 1A safety assessment of intravenous amitriptyline

    NARCIS (Netherlands)

    Fridrich, Peter; Colvin, Hans Peter; Zizza, Anthony; Wasan, Ajay D.; Lukanich, Jean; Lirk, Philipp; Saria, Alois; Zernig, Gerald; Hamp, Thomas; Gerner, Peter

    2007-01-01

    The antidepressant amitriptyline is used as an adjuvant in the treatment of chronic pain. Among its many actions, amitriptyline blocks Na+ channels and nerves in several animal and human models. As perioperative intravenous lidocaine has been suggested to decrease postoperative pain, amitriptyline,

  11. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Summary. Objectives: To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods: Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results: The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions: The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  12. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  13. Intravenous/oral ciprofloxacin therapy versus intravenous ceftazidime therapy for selected bacterial infections.

    Science.gov (United States)

    Gaut, P L; Carron, W C; Ching, W T; Meyer, R D

    1989-11-30

    The efficacy and toxicity of sequential intravenous and oral ciprofloxacin therapy was compared with intravenously administered ceftazidime in a prospective, randomized, controlled, non-blinded trial. Thirty-two patients (16 patients receiving ciprofloxacin and 16 patients receiving ceftazidime) with 38 infections caused by susceptible Pseudomonas aeruginosa, enteric gram-negative rods, Salmonella group B, Serratia marcescens, Pseudomonas cepacia, and Xanthomonas maltophilia at various sites were evaluable for determination of efficacy. Length of therapy varied from seven to 25 days. Concomitant antimicrobials included intravenously administered beta-lactams for gram-positive organisms, intravenous/oral metronidazole and clindamycin for anaerobes, and intravenous/local amphotericin B for Candida albicans. Intravenous administration of 200 mg ciprofloxacin every 12 hours to 11 patients produced peak serum levels between 1.15 and 3.12 micrograms/ml; trough levels ranged between 0.08 and 0.86 micrograms/ml. Overall response rates were similar for patients receiving ciprofloxacin and ceftazidime. Emergence of resistance was similar in both groups--one Enterobacter cloacae and two P. aeruginosa became resistant after ciprofloxacin therapy and two P. aeruginosa became resistant after ceftazidime therapy. The frequency of superinfection with a variety of organisms was also similar in both groups. Adverse events related to ciprofloxacin included transient pruritus at the infusion site and generalized rash leading to drug discontinuation (one patient each), and with ceftazidime adverse effects included pain at the site of infusion and the development of allergic interstitial nephritis (one patient each). Overall, intravenous/oral ciprofloxacin therapy appears to be as safe and effective as intravenous ceftazidime therapy in the treatment of a variety of infections due to susceptible aerobic gram-negative organisms.
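The peak/trough pattern reported for the intermittent ciprofloxacin infusions can be reproduced with a standard one-compartment, intermittent-infusion model. The parameter values below (half-life ~4 h, volume of distribution ~2.5 L/kg for a 70-kg adult, 30-min infusions) are textbook-style assumptions chosen for illustration, not values measured in this trial:

```python
import math

def steady_state_levels(dose_mg, t_inf_h, tau_h, half_life_h, v_L):
    """Steady-state peak and trough concentrations (mg/L, i.e. ug/mL) for
    repeated zero-order IV infusions in a one-compartment model."""
    k = math.log(2) / half_life_h   # elimination rate constant (1/h)
    cl = k * v_L                    # clearance (L/h)
    rate = dose_mg / t_inf_h        # infusion rate (mg/h)
    c_max = (rate / cl) * (1 - math.exp(-k * t_inf_h)) / (1 - math.exp(-k * tau_h))
    c_min = c_max * math.exp(-k * (tau_h - t_inf_h))
    return c_max, c_min

# 200 mg infused over 0.5 h every 12 h; assumed t1/2 = 4 h, V = 175 L
c_max, c_min = steady_state_levels(200, 0.5, 12, 4, 175)
print(f"peak ~ {c_max:.2f} ug/mL, trough ~ {c_min:.2f} ug/mL")
```

With these assumed parameters the model predicts a peak near 1.25 µg/mL and a trough near 0.17 µg/mL, consistent with the measured ranges quoted above (1.15–3.12 and 0.08–0.86 µg/mL).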

  14. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and, specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  15. SETI as a part of Big History

    Science.gov (United States)

    Maccone, Claudio

    2014-08-01

Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then we extrapolated even that into the future up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we show that the Statistical Drake Equation (namely the statistical extension of the classical Drake Equation typical of SETI) can be regarded as the “frozen in time” part of GBM. This makes SETI a subset of our Big History Theory based on GBMs: just as the GBM is the “movie” unfolding in time, so the Statistical Drake Equation is its “still picture”, static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may be easily described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50 million living species or more exist, each
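The abstract leans heavily on the Geometric Brownian Motion. For readers unfamiliar with it, the standard textbook definition is given below; this is general background, not a reproduction of the paper's own equations:

```latex
% Geometric Brownian Motion (GBM), the stochastic process named in the abstract.
% N(t) = number of living species at time t; mu = drift, sigma = volatility,
% B(t) = standard Brownian motion.
\[
  dN(t) = \mu\, N(t)\, dt + \sigma\, N(t)\, dB(t)
\]
% Its solution is lognormally distributed at every fixed time t:
\[
  N(t) = N_0 \exp\!\left[\left(\mu - \frac{\sigma^2}{2}\right) t + \sigma B(t)\right],
  \qquad
  \mathbb{E}[N(t)] = N_0\, e^{\mu t}.
\]
```

The "frozen in time" reading in the abstract corresponds to fixing t and looking at the lognormal distribution of N(t) at that instant, rather than at the whole process.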

  16. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys, using a Schmidt telescope with an objective prism, produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional, challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data, the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation, and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  17. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

Big data is a phenomenon that can no longer be ignored in our society. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, so often mentioned in relation to big data, stand for? As an introduction to

  18. Privacy Challenges of Genomic Big Data.

    Science.gov (United States)

    Shen, Hong; Ma, Jian

    2017-01-01

    With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently with low cost. However, such massive amount of personal genomic data creates tremendous challenge for privacy, especially given the emergence of direct-to-consumer (DTC) industry that provides genetic testing services. Here we review the recent development in genomic big data and its implications on privacy. We also discuss the current dilemmas and future challenges of genomic privacy.

  19. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    MASTER THE ABILITY TO APPLY BIG DATA ANALYTICS TO MASSIVE AMOUNTS OF STRUCTURED AND UNSTRUCTURED DATA Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a

  20. Citizens’ Media Meets Big Data: The Emergence of Data Activism

    NARCIS (Netherlands)

    Milan, S.; Gutiérrez, M.

    2015-01-01

    Big data presents citizens with new challenges and opportunities. ‘Data activism’ practices emerge at the intersection of the social and technological dimension of human action, whereby citizens take a critical approach to big data, and appropriate and manipulate data for advocacy and social change.

  1. Intravenous Iron Carboxymaltose as a Potential Therapeutic in Anemia of Inflammation.

    Directory of Open Access Journals (Sweden)

    Niklas Lofruthe

Full Text Available Intravenous iron supplementation is an effective therapy in iron deficiency anemia (IDA), but controversial in anemia of inflammation (AI). Unbound iron can be used by bacteria and viruses for their replication and can enhance the inflammatory response. The high-molecular-weight iron complexes now available for intravenous iron substitution, such as ferric carboxymaltose, might be useful in AI, as these pharmaceuticals deliver low doses of free iron over a prolonged period of time. We tested the effects of intravenous iron carboxymaltose in murine AI: wild-type mice were exposed to the heat-killed Brucella abortus (BA) model and treated with or without high-molecular-weight intravenous iron. Four hours after BA injection and 2 h after intravenous iron treatment, inflammatory cytokines were upregulated by BA, but not enhanced by iron treatment. In long-term experiments, mice were fed a regular or an iron-deficient diet and then treated with intravenous iron or saline 14 days after BA injection. Iron treatment in mice with BA-induced AI was effective 24 h after iron administration. In contrast, mice with IDA (on an iron-deficient diet prior to BA-induced AI) required 7 days to recover from AI. In these experiments, inflammatory markers were not further induced in iron-treated compared to vehicle-treated BA-injected mice. These results demonstrate that intravenous iron supplementation effectively treated the murine BA-induced AI without further enhancement of the inflammatory response. Studies in humans are needed to reveal treatment options for AI in patients.

  2. Crystallization and preliminary crystallographic analysis of the fourth FAS1 domain of human BigH3

    International Nuclear Information System (INIS)

    Yoo, Ji-Ho; Kim, EungKweon; Kim, Jongsun; Cho, Hyun-Soo

    2007-01-01

The crystallization and X-ray diffraction analysis of the fourth FAS1 domain of human BigH3 are reported. The protein BigH3 is a cell-adhesion molecule induced by transforming growth factor-β (TGF-β). It consists of four homologous repeat domains known as FAS1 domains; mutations in these domains have been linked to corneal dystrophy. The fourth FAS1 domain was expressed in Escherichia coli B834 (DE3) (a methionine auxotroph) and purified by DEAE anion-exchange and gel-filtration chromatography. The FAS1 domain was crystallized using the vapour-diffusion method. A SAD diffraction data set was collected to a resolution of 2.5 Å at 100 K. The crystal belonged to space group P6₁ or P6₅ and had two molecules per asymmetric unit, with unit-cell parameters a = b = 62.93, c = 143.27 Å, α = β = 90.0, γ = 120.0°

  3. Intraarticular and intravenous administration of 99MTc-HMPAO-labeled human mesenchymal stem cells (99MTC-AH-MSCS): In vivo imaging and biodistribution

    International Nuclear Information System (INIS)

    Meseguer-Olmo, Luis; Montellano, Antonio Jesús; Martínez, Teresa; Martínez, Carlos M.; Revilla-Nuin, Beatriz; Roldán, Marta; Mora, Cristina Fuente; López-Lucas, Maria Dolores; Fuente, Teodomiro

    2017-01-01

Introduction: Therapeutic application of intravenously administered (IV) human bone marrow-derived mesenchymal stem cells (ahMSCs) appears to have as its main drawback the massive retention of cells in the lung parenchyma, questioning the suitability of this route of administration. Intraarticular administration (IAR) could be considered as an alternative route for therapy in degenerative and traumatic joint lesions. Our work is outlined as a comparative study of the biodistribution of 99mTc-ahMSCs after IV and IAR administration, via scintigraphic study in an animal model. Methods: An isolated primary culture of adult human mesenchymal stem cells was labeled with 99mTc-HMPAO for scintigraphic study of in vivo distribution after intravenous and intra-articular (knee) administration in rabbits. Results: IV administration of radiolabeled ahMSCs showed the bulk of radioactivity in the lung parenchyma, while IAR images showed activity mainly in the injected cavity and complete absence of uptake in the pulmonary bed. Conclusions: Our study shows that IAR administration overcomes the limitations of IV injection, in particular those related to cell destruction in the lung parenchyma. After IAR administration, cells remain within the joint cavity, as expected given their size and adhesion properties. Advances in knowledge: Intra-articular administration of adult human mesenchymal stem cells could be a suitable route for achieving a therapeutic effect in joint lesions. Implications for patient care: Local administration of adult human mesenchymal stem cells could improve their therapeutic effects, minimizing side effects in patients.

  4. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  5. THE FASTEST OODA LOOP: THE IMPLICATIONS OF BIG DATA FOR AIR POWER

    Science.gov (United States)

    2016-06-01

…need for a human interpreter. Until the rise of Big Data, automated translation only had a “small” library of several million words to pull from and… AIR COMMAND AND STAFF COLLEGE, AIR UNIVERSITY. THE FASTEST OODA LOOP: THE IMPLICATIONS OF BIG DATA FOR AIR POWER, by Aaron J. Dove, Maj, USAF… Previous Academic Study… Why Big Data

  6. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  7. Big Data and Nursing: Implications for the Future.

    Science.gov (United States)

    Topaz, Maxim; Pruinelli, Lisiane

    2017-01-01

Big data is becoming increasingly more prevalent and it affects the way nurses learn, practice, conduct research and develop policy. The discipline of nursing needs to maximize the benefits of big data to advance the vision of promoting human health and wellbeing. However, current practicing nurses, educators and nurse scientists often lack the skills and competencies required for meaningful use of big data. Some of the key skills for further development include the ability to mine narrative and structured data for new care or outcome patterns, effective data visualization techniques, and further integration of nursing-sensitive data into artificial intelligence systems for better clinical decision support. We provide growth-path vision recommendations for big data competencies for practicing nurses, nurse educators, researchers, and policy makers to help prepare the next generation of nurses and improve patient outcomes through better-quality connected health.

  8. Medios ciudadanos y big data: La emergencia del activismo de datos

    NARCIS (Netherlands)

    Milan, S.; Gutiérrez, M.

    2015-01-01

Big data presents citizens with new challenges and opportunities. ‘Data activism’ practices emerge at the intersection of the social and technological dimension of human action, whereby citizens take a critical approach to big data, and appropriate and manipulate data for advocacy and social change.

  9. Use of intravenous immunoglobulin in neonates with haemolytic disease and immune thrombocytopenia

    Directory of Open Access Journals (Sweden)

    Marković-Sovtić Gordana

    2013-01-01

Full Text Available Background/Aim. Intravenous immunoglobulin is a blood product made of human polyclonal immunoglobulin G. The mode of action of intravenous immunoglobulin is very complex. It is indicated in the treatment of neonatal immune thrombocytopenia and haemolytic disease of the newborn. The aim of the study was to present our experience with the use of intravenous immunoglobulin in a group of term neonates. Methods. We analysed all relevant clinical and laboratory data of 23 neonates who received intravenous immunoglobulin during their hospitalization in the Neonatal Intensive Care Unit of the Mother and Child Health Care Institute over a five-year period, from 2006 to 2010. Results. There were 11 patients with haemolytic disease of the newborn and 12 neonates with immune thrombocytopenia. All of them received 1-2 g/kg intravenous immunoglobulin in the course of their treatment. There were no adverse effects of intravenous immunoglobulin use. The use of intravenous immunoglobulin led to an increase in platelet number in thrombocytopenic patients, whereas in those with haemolytic disease the serum bilirubin level decreased significantly, so that some patients whose bilirubin level was very close to the exchange transfusion criterion avoided this procedure. Conclusion. The use of intravenous immunoglobulin was shown to be an effective treatment in reducing the need for exchange transfusion, the duration of phototherapy and the length of hospital stay in neonates with haemolytic disease. When used in the treatment of neonatal immune thrombocytopenia, it leads to an increase in the platelet number, thus decreasing the risk of serious complications of thrombocytopenia.

  10. Game, cloud architecture and outreach for The BIG Bell Test

    Science.gov (United States)

    Abellan, Carlos; Tura, Jordi; Garcia, Marta; Beduini, Federica; Hirschmann, Alina; Pruneri, Valerio; Acin, Antonio; Marti, Maria; Mitchell, Morgan

    The BIG Bell test uses the input from the Bellsters, self-selected human participants introducing zeros and ones through an online videogame, to perform a suite of quantum physics experiments. In this talk, we will explore the videogame, the data infrastructure and the outreach efforts of the BIG Bell test collaboration. First, we will discuss how the game was designed so as to eliminate possible feedback mechanisms that could influence people's behavior. Second, we will discuss the cloud architecture design for scalability as well as explain how we sent each individual bit from the users to the labs. Also, and using all the bits collected via the BIG Bell test interface, we will show a data analysis on human randomness, e.g. are younger Bellsters more random than older Bellsters? Finally, we will talk about the outreach and communication efforts of the BIG Bell test collaboration, exploring both the social media campaigns as well as the close interaction with teachers and educators to bring the project into classrooms.

  11. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  12. Medios ciudadanos y big data: La emergencia del activismo de datos

    OpenAIRE

    Milan, S.; Gutiérrez, M.

    2015-01-01

Big data presents citizens with new challenges and opportunities. ‘Data activism’ practices emerge at the intersection of the social and technological dimension of human action, whereby citizens take a critical approach to big data, and appropriate and manipulate data for advocacy and social change. This theoretical article explores the emergence of data activism as an empirical reality and a heuristic tool to study how people engage politically with big data. We ground the concept on a mult...

  13. [Three applications and the challenge of the big data in otology].

    Science.gov (United States)

    Lei, Guanxiong; Li, Jianan; Shen, Weidong; Yang, Shiming

    2016-03-01

With the expansion of human practical activities, more and more areas are confronted with big data problems. The emergence of big data requires people to update their research paradigms and develop new technical methods. This review discusses how big data might bring both opportunities and challenges in the areas of auditory implantation, the deafness genome, and auditory pathophysiology, and points out that we need to find appropriate theories and methods to turn this kind of expectation into reality.

  14. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  15. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  16. Big data analysis new algorithms for a new society

    CERN Document Server

    Stefanowski, Jerzy

    2016-01-01

This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area. It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought about by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued...

  17. Intentional intravenous mercury injection

    African Journals Online (AJOL)

In this case report, intravenous complications, treatment strategies and possible ... Mercury toxicity is commonly associated with vapour inhalation or oral ingestion, for which there exist definite treatment options. Intravenous mercury ... personality, anxiousness, irritability, insomnia, depression and drowsiness.[1] However ...

  18. Global fluctuation spectra in big-crunch-big-bang string vacua

    International Nuclear Information System (INIS)

    Craps, Ben; Ovrut, Burt A.

    2004-01-01

    We study big-crunch-big-bang cosmologies that correspond to exact world-sheet superconformal field theories of type II strings. The string theory spacetime contains a big crunch and a big bang cosmology, as well as additional 'whisker' asymptotic and intermediate regions. Within the context of free string theory, we compute, unambiguously, the scalar fluctuation spectrum in all regions of spacetime. Generically, the big crunch fluctuation spectrum is altered while passing through the bounce singularity. The change in the spectrum is characterized by a function Δ, which is momentum and time dependent. We compute Δ explicitly and demonstrate that it arises from the whisker regions. The whiskers are also shown to lead to 'entanglement' entropy in the big bang region. Finally, in the Milne orbifold limit of our superconformal vacua, we show that Δ→1 and, hence, the fluctuation spectrum is unaltered by the big-crunch-big-bang singularity. We comment on, but do not attempt to resolve, subtleties related to gravitational back reaction and light winding modes when interactions are taken into account

  19. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Full Text Available Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations, in the formation before Big Data it was most recently referred to as the information explosion. In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of a meaningful public deliberation. Understanding, and challenging, Big Data requires an attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  20. Clearance of 131I-labeled murine monoclonal antibody from patients' blood by intravenous human anti-murine immunoglobulin antibody

    International Nuclear Information System (INIS)

    Stewart, J.S.; Sivolapenko, G.B.; Hird, V.; Davies, K.A.; Walport, M.; Ritter, M.A.; Epenetos, A.A.

    1990-01-01

    Five patients treated with intraperitoneal 131I-labeled mouse monoclonal antibody for ovarian cancer also received i.v. exogenous polyclonal human anti-murine immunoglobulin antibody. The pharmacokinetics of 131I-labeled monoclonal antibody in these patients were compared with those of 28 other patients receiving i.p.-radiolabeled monoclonal antibody for the first time without exogenous human anti-murine immunoglobulin, and who had no preexisting endogenous human anti-murine immunoglobulin antibody. Patients receiving i.v. human anti-murine immunoglobulin antibody demonstrated a rapid clearance of 131I-labeled monoclonal antibody from their circulation. The (mean) maximum 131I blood content was 11.4% of the injected activity in patients receiving human anti-murine immunoglobulin antibody compared to 23.3% in patients not given human anti-murine immunoglobulin antibody. Intravenous human anti-murine immunoglobulin antibody decreased the radiation dose to bone marrow (from 131I-labeled monoclonal antibody in the vascular compartment) 4-fold. Following the injection of human anti-murine immunoglobulin antibody, 131I-monoclonal/human anti-murine immunoglobulin antibody immune complexes were rapidly transported to the liver. Antibody dehalogenation in the liver was rapid, with 87% of the injected 131I excreted in 5 days. Despite the efficient hepatic uptake of immune complexes, dehalogenation of monoclonal antibody was so rapid that the radiation dose to liver parenchyma from circulating 131I was decreased 4-fold rather than increased. All patients developed endogenous human anti-murine immunoglobulin antibody 2 to 3 weeks after treatment

  1. Hepatic glycogen in humans. I. Direct formation after oral and intravenous glucose or after a 24-h fast

    International Nuclear Information System (INIS)

    Radziuk, J.

    1989-01-01

The formation of hepatic glycogen by the direct pathway is assessed in humans after a 12-h fast and oral loading (100 g) or intravenous infusion (90 g) and after a 24-h fast and the same oral glucose load. The methodology used is based on the double-tracer method. [3-³H]Glucose is infused at a constant rate for the determination of the metabolic clearance of glucose. [1-¹⁴C]Glucose is administered with the glucose load. One hour after absorption or the intravenous glucose infusion is terminated, a glucagon infusion is initiated to mobilize the glycogen labeled with [1-¹⁴C]glucose and formed during the absorptive period. At this time a third tracer, [6-³H]glucose, is administered to measure glucose clearance. It was found that after the 12-h fast and oral glucose loading 7.2 +/- 1.1 g of hepatic glycogen appears to be formed directly from glucose, compared with 8.4 +/- 1.0 g after the same load and a 24-h fast and 8.5 +/- 0.4 g after a 12-h fast and an equivalent intravenous glucose infusion. When the amount of label ([¹⁴C]glucose) mobilized that was not corrected for metabolic recycling was calculated, the data suggested that the amount of glycogen formed by gluconeogenic pathways was probably at least equal to that formed by direct uptake. It was also approximately 60% greater after a 24-h fast. It can be concluded that the amount of hepatic glycogen formed directly from glucose during glucose loading is not significantly altered by the route of entry or by extension of the fasting period to 24 h. The data suggest, however, that gluconeogenic formation of glycogen increases with fasting

  2. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  3. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  4. Big Data Innovation Challenge : Pioneering Approaches to Data-Driven Development

    OpenAIRE

    World Bank Group

    2016-01-01

    Big data can sound remote and lacking a human dimension, with few obvious links to development and impacting the lives of the poor. Concepts such as anti-poverty targeting, market access or rural electrification seem far more relevant – and easier to grasp. And yet some of today’s most groundbreaking initiatives in these areas rely on big data. This publication profiles these and more, sho...

  5. From big data to deep insight in developmental science.

    Science.gov (United States)

    Gilmore, Rick O

    2016-01-01

    The use of the term 'big data' has grown substantially over the past several decades and is now widespread. In this review, I ask what makes data 'big' and what implications the size, density, or complexity of datasets have for the science of human development. A survey of existing datasets illustrates how existing large, complex, multilevel, and multimeasure data can reveal the complexities of developmental processes. At the same time, significant technical, policy, ethics, transparency, cultural, and conceptual issues associated with the use of big data must be addressed. Most big developmental science data are currently hard to find and cumbersome to access, the field lacks a culture of data sharing, and there is no consensus about who owns or should control research data. But, these barriers are dissolving. Developmental researchers are finding new ways to collect, manage, store, share, and enable others to reuse data. This promises a future in which big data can lead to deeper insights about some of the most profound questions in behavioral science. © 2016 The Authors. WIREs Cognitive Science published by Wiley Periodicals, Inc.

  6. A practical guide to big data research in psychology.

    Science.gov (United States)

    Chen, Eric Evan; Wojcik, Sean P

    2016-12-01

    The massive volume of data that now covers a wide variety of human behaviors offers researchers in psychology an unprecedented opportunity to conduct innovative theory- and data-driven field research. This article is a practical guide to conducting big data research, covering data management, acquisition, processing, and analytics (including key supervised and unsupervised learning data mining methods). It is accompanied by walkthrough tutorials on data acquisition, text analysis with latent Dirichlet allocation topic modeling, and classification with support vector machines. Big data practitioners in academia, industry, and the community have built a comprehensive base of tools and knowledge that makes big data research accessible to researchers in a broad range of fields. However, big data research does require knowledge of software programming and a different analytical mindset. For those willing to acquire the requisite skills, innovative analyses of unexpected or previously untapped data sources can offer fresh ways to develop, test, and extend theories. When conducted with care and respect, big data research can become an essential complement to traditional research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
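    The article's tutorials cover LDA topic modeling and support vector machine classification; as a minimal illustration of the same text-classification workflow, here is a pure-Python sketch with a nearest-centroid classifier standing in for an SVM (the corpus and labels are invented for illustration):

```python
from collections import Counter
import math

STOP = {"the", "a", "in", "of", "and"}

def bow(text):
    # lowercase bag-of-words term frequencies, stopwords removed
    return Counter(w for w in text.lower().split() if w not in STOP)

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# tiny labeled corpus (invented for illustration)
train = [
    ("the senator proposed a new tax bill", "politics"),
    ("parliament debated the election law", "politics"),
    ("the team won the championship game", "sports"),
    ("the striker scored in the final match", "sports"),
]

# nearest-centroid classifier: sum term counts per label
centroids = {}
for text, label in train:
    centroids.setdefault(label, Counter()).update(bow(text))

def classify(text):
    v = bow(text)
    return max(centroids, key=lambda lab: cosine(v, centroids[lab]))

print(classify("voters backed the tax proposal"))  # politics
print(classify("a late goal decided the match"))   # sports
```

    Real big data studies would swap in a trained SVM and a far larger vocabulary, but the pipeline shape (acquire, preprocess, vectorize, classify) is the same.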

  7. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  8. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  9. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  10. Big Five personality group differences across academic majors

    DEFF Research Database (Denmark)

    Vedel, Anna

    2016-01-01

    During the past decades, a number of studies have explored personality group differences in the Big Five personality traits among students in different academic majors. To date, though, this research has not been reviewed systematically. This was the aim of the present review. A systematic literature search identified twelve eligible studies yielding an aggregated sample size of 13,389. Eleven studies reported significant group differences in one or multiple Big Five personality traits. Consistent findings across studies were that students of arts/humanities and psychology scored high ... on Conscientiousness. Effect sizes were calculated to estimate the magnitude of the personality group differences. These effect sizes were consistent across studies comparing similar pairs of academic majors. For all Big Five personality traits medium effect sizes were found frequently, and for Openness even large...
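    The "medium" and "large" labels above follow Cohen's conventional thresholds for the standardized mean difference. A quick sketch of how such an effect size is computed from group summary statistics (the scores and group sizes below are hypothetical, not taken from the review):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    # pooled standard deviation, then standardized mean difference
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled

# hypothetical Openness scores for students of two academic majors
d = cohens_d(3.9, 0.6, 120, 3.6, 0.6, 150)
print(round(d, 2))  # 0.5 -> a medium effect by Cohen's guidelines
```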

  11. Semantic Web technologies for the big data in life sciences.

    Science.gov (United States)

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed in the world. Life sciences data generated by new technologies are continuing to grow in not only size but also variety and complexity, with great speed. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources and even across disciplines is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are two principal issues mainly discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information for not only humans but also computers to semantically process large-scale data. The paper presents a survey of big data in life sciences, big data related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for the big data in life sciences.
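    A triple-based data model is what lets Semantic Web technologies integrate heterogeneous life science sources. A minimal pure-Python sketch of triple-pattern querying, standing in for an RDF graph and SPARQL (entity names are illustrative only):

```python
# minimal in-memory triple store (a stand-in for an RDF graph)
triples = {
    ("BRCA1", "is_a", "Gene"),
    ("BRCA1", "associated_with", "BreastCancer"),
    ("BreastCancer", "is_a", "Disease"),
    ("rs80357906", "variant_of", "BRCA1"),
}

def query(s=None, p=None, o=None):
    # pattern match; None acts as a wildcard, like a SPARQL variable
    return [(a, b, c) for (a, b, c) in sorted(triples)
            if (s is None or a == s) and (p is None or b == p) and (o is None or c == o)]

# which entities are associated with any disease? (a two-step join)
diseases = {s for (s, p, o) in query(p="is_a", o="Disease")}
hits = [s for d in diseases for (s, p, o) in query(p="associated_with", o=d)]
print(hits)  # ['BRCA1']
```

    Because every source is reduced to the same subject-predicate-object shape, joins across otherwise incompatible datasets become uniform pattern matches.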

  12. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness.

    Science.gov (United States)

    Dove, Edward S; Özdemir, Vural

    2015-09-01

    The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, "extreme centrism", and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics-separate and together-have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.

  13. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness

    Science.gov (United States)

    Dove, Edward S.; Özdemir, Vural

    2015-01-01

    The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, “extreme centrism”, and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics—separate and together—have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit. PMID:26345196

  14. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness

    Directory of Open Access Journals (Sweden)

    Edward S. Dove

    2015-08-01

    Full Text Available The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, “extreme centrism”, and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics—separate and together—have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.

  15. Benchmarking Big Data Systems and the BigData Top100 List.

    Science.gov (United States)

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TPC), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.
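    At its core, an end-to-end, application-layer benchmark of the kind described comes down to timing a workload over a dataset and reporting throughput. A toy harness (the workload, data, and repeat count are arbitrary choices for illustration, not the BigData Top100 specification):

```python
import time

def benchmark(workload, records, repeats=3):
    # run the workload several times; report best-case records/second
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        workload(records)
        best = min(best, time.perf_counter() - start)
    return len(records) / best

# toy workload: group-and-count, a common analytics kernel
def group_count(records):
    counts = {}
    for key, _ in records:
        counts[key] = counts.get(key, 0) + 1
    return counts

data = [(i % 100, i) for i in range(100_000)]
rate = benchmark(group_count, data)
print(f"{rate:,.0f} records/s")
```

    The hard part a real benchmark specification must pin down is everything this sketch leaves open: data generation at scale, workload mix, fairness across platforms, and metric definitions.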

  16. Ultrasonography-guided peripheral intravenous access versus traditional approaches in patients with difficult intravenous access.

    Science.gov (United States)

    Costantino, Thomas G; Parikh, Aman K; Satz, Wayne A; Fojtik, John P

    2005-11-01

    We assess the success rate of emergency physicians in placing peripheral intravenous catheters in difficult-access patients who were unsuccessfully cannulated by emergency nurses. A technique using real-time ultrasonographic guidance by 2 physicians was compared with traditional approaches using palpation and landmark guidance. This was a prospective, systematically allocated study of all patients requiring intravenous access who presented to 2 university hospitals between October 2003 and March 2004. Inclusion criterion was the inability of any available nurse to obtain intravenous access after at least 3 attempts on a subgroup of patients who had a history of difficult intravenous access because of obesity, history of intravenous drug abuse, or chronic medical problems. Exclusion criterion was the need for central venous access. Patients presenting on odd days were allocated to the ultrasonographic-guided group, and those presenting on even days were allocated to the traditional-approach group. Endpoints were successful cannulation, number of sticks, time, and patient satisfaction. Sixty patients were enrolled, 39 on odd days and 21 on even days. Success rate was greater for the ultrasonographic group (97%) versus control (33%), difference in proportions of 64% (95% confidence interval [CI] 39% to 71%). The ultrasonographic group required less overall time (13 minutes versus 30 minutes, for a difference of 17 [95% CI 0.8 to 25.6]), less time to successful cannulation from first percutaneous puncture (4 minutes versus 15 minutes, for a difference of 11 [95% CI 8.2 to 19.4]), and fewer percutaneous punctures (1.7 versus 3.7, for a difference of 2.0 [95% CI 1.27 to 2.82]) and had greater patient satisfaction (8.7 versus 5.7, for a difference of 3.0 [95% CI 1.82 to 4.29]) than the traditional landmark approach. Ultrasonographic-guided peripheral intravenous access is more successful than traditional "blind" techniques, requires less time, decreases the number of
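    The reported difference in proportions can be sanity-checked with a textbook Wald interval. The sketch below assumes success counts of 38/39 and 7/21, which the abstract's percentages imply; the paper's own interval may come from a different method, so the bounds here need not match exactly:

```python
import math

# assumed counts implied by the abstract's 97% (ultrasound, n=39)
# and 33% (landmark, n=21) success rates
p1, n1 = 38 / 39, 39
p2, n2 = 7 / 21, 21

diff = p1 - p2                                 # difference in proportions
se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lo, hi = diff - 1.96 * se, diff + 1.96 * se    # Wald 95% CI
print(f"difference {diff:.0%}, 95% CI ({lo:.0%} to {hi:.0%})")
```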

  17. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  18. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  19. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

    Full Text Available Given the importance the term Big Data has acquired, this study set out to examine the state of the art of Big Data exhaustively; in addition, as a second objective, it analyzed the characteristics, tools, technologies, models, and standards related to Big Data; and finally it sought to identify the most relevant characteristics of Big Data management, so as to cover everything concerning the central topic of the research. The methodology included reviewing the state of the art of Big Data and presenting its current situation; describing Big Data technologies; presenting some of the NoSQL databases, which are those that allow processing data in unstructured formats; and showing data models and the technologies for analyzing them, closing with some benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, because this research begins to explore the Big Data environment.

  20. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
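    A declarative data-quality rule of the kind such a system evaluates can be illustrated with a functional dependency (zip determines city). This pure-Python sketch flags rows that conflict with the first value seen per key, which is a simplification of full violation-pair enumeration; the data are invented:

```python
# detect violations of the functional dependency zip -> city
rows = [
    {"id": 1, "zip": "60614", "city": "Chicago"},
    {"id": 2, "zip": "60614", "city": "Chicago"},
    {"id": 3, "zip": "60614", "city": "Evanston"},   # violates the rule
    {"id": 4, "zip": "10001", "city": "New York"},
]

def fd_violations(rows, lhs, rhs):
    seen = {}   # lhs value -> first rhs value observed
    bad = []
    for r in rows:
        key, val = r[lhs], r[rhs]
        if key in seen and seen[key] != val:
            bad.append(r["id"])
        seen.setdefault(key, val)
    return bad

print(fd_violations(rows, "zip", "city"))  # [3]
```

    At scale, the expensive step is exactly the pairwise comparison this single pass avoids, which is why shared scans and specialized join operators matter.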

  1. Intravenous Therapy: Hazards, Complications and Their Prevention ...

    African Journals Online (AJOL)

    Breaks in aseptic techniques, faulty handling of parenteral fluid containers, failure to discard out-dated intravenous solutions and tubings contribute to occurrence of intravenous-associated sepsis. Improper technique and lack of pharmaceutical knowledge when adding drugs into intravenous fluids contribute to ...

  2. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  3. A tomographic approach to intravenous coronary arteriography

    International Nuclear Information System (INIS)

    Ritman, E.L.; Bove, A.A.

    1986-01-01

    Coronary artery anatomy can be visualized using high speed, volume scanning X-ray CT. A single scan during a bolus injection of contrast medium provides image data for display of all angles of view of the opacified coronary arterial tree. Due to the tomographic nature of volume image data the superposition of contrast filled cardiac chambers, such as would occur in the levophase of an intravenous injection of contrast agent, can be eliminated. Data are presented which support these statements. The Dynamic Spatial Reconstructor (DSR) was used to scan a life-like radiologic phantom of an adult human thorax in which the left atrial and ventricular chambers and the major epicardial coronary arteries were opacified so as to simulate the levophase of an intravenous injection of contrast agent. A catheter filled with diluted contrast agent and with regions of luminal narrowing (i.e. 'stenoses') was advanced along a tract equivalent to a right ventricular catheterization. Ease of visualization of the catheter 'stenoses' and the accuracy with which they can be measured are presented. (Auth.)

  4. Where are human subjects in Big Data research? The emerging ethics divide

    Directory of Open Access Journals (Sweden)

    Jacob Metcalf

    2016-06-01

    Full Text Available There are growing discontinuities between the research practices of data science and established tools of research ethics regulation. Some of the core commitments of existing research ethics regulations, such as the distinction between research and practice, cannot be cleanly exported from biomedical research to data science research. Such discontinuities have led some data science practitioners and researchers to move toward rejecting ethics regulations outright. These shifts occur at the same time as a proposal for major revisions to the Common Rule—the primary regulation governing human-subjects research in the USA—is under consideration for the first time in decades. We contextualize these revisions in long-running complaints about regulation of social science research and argue data science should be understood as continuous with social sciences in this regard. The proposed regulations are more flexible and scalable to the methods of non-biomedical research, yet problematically largely exclude data science methods from human-subjects regulation, particularly uses of public datasets. The ethical frameworks for Big Data research are highly contested and in flux, and the potential harms of data science research are unpredictable. We examine several contentious cases of research harms in data science, including the 2014 Facebook emotional contagion study and the 2016 use of geographical data techniques to identify the pseudonymous artist Banksy. To address disputes about application of human-subjects research ethics in data science, critical data studies should offer a historically nuanced theory of “data subjectivity” responsive to the epistemic methods, harms and benefits of data science and commerce.

  5. [Big data, medical language and biomedical terminology systems].

    Science.gov (United States)

    Schulz, Stefan; López-García, Pablo

    2015-08-01

    A variety of rich terminology systems, such as thesauri, classifications, nomenclatures and ontologies support information and knowledge processing in health care and biomedical research. Nevertheless, human language, manifested as individually written texts, persists as the primary carrier of information, in the description of disease courses or treatment episodes in electronic medical records, and in the description of biomedical research in scientific publications. In the context of the discussion about big data in biomedicine, we hypothesize that the abstraction of the individuality of natural language utterances into structured and semantically normalized information facilitates the use of statistical data analytics to distil new knowledge out of textual data from biomedical research and clinical routine. Computerized human language technologies are constantly evolving and are increasingly ready to annotate narratives with codes from biomedical terminology. However, this depends heavily on linguistic and terminological resources. The creation and maintenance of such resources is labor-intensive. Nevertheless, it is sensible to assume that big data methods can be used to support this process. Examples include the learning of hierarchical relationships, the grouping of synonymous terms into concepts and the disambiguation of homonyms. Although clear evidence is still lacking, the combination of natural language technologies, semantic resources, and big data analytics is promising.
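    Annotating narratives with terminology codes, as described above, is at its simplest a dictionary lookup over synonym lists. A sketch in which surface forms map to UMLS-style concept identifiers (the lexicon and clinical note are invented for illustration):

```python
# dictionary-based annotator: map surface forms, including synonyms,
# to concept codes (UMLS-style identifiers, used here illustratively)
LEXICON = {
    "myocardial infarction": "C0027051",
    "heart attack": "C0027051",      # synonym, same concept
    "hypertension": "C0020538",
    "high blood pressure": "C0020538",
}

def annotate(text):
    text = text.lower()
    found = []
    for term, code in LEXICON.items():
        pos = text.find(term)
        if pos >= 0:
            found.append((pos, term, code))
    return [(t, c) for _, t, c in sorted(found)]

note = "Patient with high blood pressure, prior heart attack in 2019."
print(annotate(note))
```

    The labor-intensive part the abstract points to is building and maintaining LEXICON-like resources at full terminology scale, plus the disambiguation this naive lookup skips.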

  6. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  7. Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives

    Science.gov (United States)

    Miron-Shatz, T.; Lau, A. Y. S.; Paton, C.

    2014-01-01

    Summary Objectives As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives of the evolving use of Big Data in science and healthcare and, to examine some of the opportunities and challenges. Methods A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Results Scientists and health-care providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts The concept of Big Data and associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and health-care. Future exploration of issues surrounding data privacy, confidentiality, and education are needed. A greater focus on data from social media, the quantified self-movement, and the application of analytics to “small data” would also be useful. PMID:25123717

  8. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a culture of systematic record-keeping, and IT-competent employees and customers that make a leading position possible, but only if companies ready themselves for the next big data wave.

  9. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
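    Both pitfalls are easy to demonstrate by simulation. The sketch below draws a synthetic population, shows that a large but truncated sample stays biased regardless of its size, and then shows AR(1)-style dependence inflating the variance of the sample mean relative to the i.i.d. formula (all parameters are arbitrary choices):

```python
import math
import random
import statistics

random.seed(0)
population = [random.gauss(50, 10) for _ in range(100_000)]

# pitfall 1: bias -- a large sample that over-represents part of the
# population shifts the estimate no matter how many records it has
biased = [x for x in population if x > 45]
print(round(statistics.mean(population), 1), round(statistics.mean(biased), 1))

# pitfall 2: dependence -- AR(1)-correlated observations make the
# sample mean far noisier than the i.i.d. formula sd/sqrt(n) predicts
def mean_of_correlated(n, rho=0.9):
    x, total = 0.0, 0.0
    for _ in range(n):
        x = rho * x + math.sqrt(1 - rho**2) * random.gauss(0, 1)
        total += x
    return total / n

n = 1_000
means = [mean_of_correlated(n) for _ in range(200)]
ratio = statistics.stdev(means) / (1 / math.sqrt(n))  # variance inflation
print(round(ratio, 1))
```

    For an AR(1) process the inflation factor approaches sqrt((1 + rho) / (1 - rho)), about 4.4 at rho = 0.9, so a naive i.i.d. confidence interval would be several times too narrow.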

  10. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

    Full Text Available Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has come of age, pushing forward the building of the digital Earth and the cyber city. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features of geospatial information science in the big data era (ubiquity; multi-dimensionality and dynamics; internet+networking; full automation and real time; from sensing to recognition; crowdsourcing and VGI; service orientation) and addresses the key technical issues that need to be resolved (a non-linear four-dimensional Earth reference frame system; space-based enhanced GNSS; unified space-air-land network communication techniques; on-board processing techniques for multi-source image data; smart interface service techniques for space-borne information; space-based resource scheduling and network security; design and development of a payload-based multi-functional satellite platform). Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics): Geomatics is a multi-disciplinary science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovery, spatial sensing and recognition, as well as intelligent location-based services for any physical objects and human activities around the earth and its environment. Starting from this new definition, geospatial information science will find many more opportunities and tasks in the big data era for the generation of the smart earth and the smart city. Our profession

  11. Sexual dimorphism in relation to big-game hunting and economy in modern human populations.

    Science.gov (United States)

    Collier, S

    1993-08-01

    Postcranial skeletal data from two recent Eskimo populations are used to test David Frayer's model of sexual dimorphism reduction in Europe between the Upper Paleolithic and Mesolithic. Frayer argued that a change from big-game hunting and adoption of new technology in the Mesolithic reduced selection for large body size in males and led to a reduction in skeletal sexual dimorphism. Though aspects of Frayer's work have been criticized in the literature, the association of big-game hunting and high sexual dimorphism is untested. This study employs univariate and multivariate analysis to test that association by examining sexual dimorphism of cranial and postcranial bones of two recent Alaskan Eskimo populations, one being big-game (whale and other large marine mammal) hunting people, and the second being salmon fishing, riverine people. While big-game hunting influences skeletal robusticity, it cannot be said to lead to greater sexual dimorphism generally. The two populations had different relative sexual dimorphism levels for different parts of the body. Notably, the big-game hunting (whaling) Eskimos had the lower multivariate dimorphism in the humerus, which could be expected to be the structure under greatest exertion by such hunting in males. While the exertions of the whale hunting economic activities led to high skeletal robusticity, as predicted by Frayer's model, this was true of the females as well as the males, resulting in low sexual dimorphism in some features. Females are half the sexual dimorphism equation, and they cannot be seen as constants in any model of economic behavior.

  12. INTRAVENOUS IMMUNOGLOBULIN IN PEDIATRIC RHEUMATOLOGY PRACTICE

    Directory of Open Access Journals (Sweden)

    E. I. Alexeeva

    2015-01-01

    Full Text Available Modern successful treatment of rheumatic diseases is impossible without the use of intravenous immunoglobulin, whose use is based on strict indications developed through long-term multicenter controlled studies. The article highlights issues in the use of immunoglobulin in pediatric rheumatology practice and reviews the literature evaluating the efficacy of intravenous immunoglobulin, which confirms the drug's efficacy only for certain rheumatic diseases.

  13. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  14. Experimental research on preventing mechanical phlebitis arising from indwelling needles in intravenous therapy by external application of mirabilite.

    Science.gov (United States)

    Lu, Yanyan; Hao, Chunyan; He, Wubin; Tang, Can; Shao, Zhenya

    2018-01-01

    Various types of complications arising from intravenous indwelling needles have become a challenge in clinical care, and a simple, cost-effective method for the prevention and treatment of phlebitis is urgently needed. We investigated the role of mirabilite in preventing and treating phlebitis caused by intravenous indwelling needles, to provide guidance for the prevention and treatment of mechanical phlebitis. A total of 57 healthy congeneric big-eared New Zealand rabbits were randomly divided into 3 groups: a blank control group, an indwelling needle group, and a group with external application of mirabilite. The ear vein of each rabbit was punctured with an intravenous indwelling needle, and ear vein specimens were taken at 3, 5, and 7 days after indwelling. Hematoxylin and eosin stained pathological tissue sections of the ear veins in each group were observed, and the expression levels of IL-1, IL-6, and tumour necrosis factor-α (TNF-α) in the vascular tissue were detected with the immunofluorescence method. In the blank control group, there was no inflammatory cellular infiltration and no proliferation of fibrous tissue around the vascular wall. As indwelling time increased, proliferation of fibrous tissue in the vascular wall, increased inflammatory cellular infiltration, and organized thrombus in the vascular tissue occurred in the ear veins of the indwelling needle and mirabilite groups. Compared with the indwelling needle group, the mirabilite group had significantly decreased fibrous tissue in the vascular wall and significantly decreased inflammatory cellular infiltration. At the same point in indwelling time, the expression levels of IL-1, IL-6, and TNF-α in the indwelling needle and mirabilite groups were significantly higher than those in the blank control group.

  15. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. It starts from the fundamentals and builds up from there, and is intended to serve as a review of the state of the practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues, including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, and privacy and security. All of these aspects are touched upon in this book, which also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  16. Small Area Model-Based Estimators Using Big Data Sources

    Directory of Open Access Journals (Sweden)

    Marchetti Stefano

    2015-06-01

    Full Text Available The timely, accurate monitoring of social indicators, such as poverty or inequality, on a fine-grained spatial and temporal scale is a crucial tool for understanding social phenomena and policymaking, but poses a great challenge to official statistics. This article argues that an interdisciplinary approach, combining the body of statistical research in small area estimation with the body of research in social data mining based on Big Data, can provide novel means to tackle this problem successfully. Big Data derived from the digital crumbs that humans leave behind in their daily activities are in fact providing ever more accurate proxies of social life. Social data mining from these data, coupled with advanced model-based techniques for fine-grained estimates, has the potential to provide a novel microscope through which to view and understand social complexity. This article suggests three ways to use Big Data together with small area estimation techniques, and shows how Big Data have the potential to mirror aspects of well-being and other socioeconomic phenomena.

  17. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  18. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The book explains that big data is where we use huge quantities of data to make better predictions based on patterns we identify in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  19. Urban Big Data and the Development of City Intelligence

    Directory of Open Access Journals (Sweden)

    Yunhe Pan

    2016-06-01

    Full Text Available This study provides a definition of urban big data while exploring its features and its applications in China's city intelligence. The differences between city intelligence in China and the "smart city" concept in other countries are compared to highlight the unique definition and model of China's city intelligence in this paper. Furthermore, this paper examines the role of urban big data in city intelligence by showing that it not only serves as the cornerstone of this trend but also plays a core role in the diffusion of city intelligence technology, serving as an inexhaustible resource for the sustained development of city intelligence. This study also points out the challenges in shaping and developing China's urban big data. Considering the supporting and core role that urban big data plays in city intelligence, the study then expounds on its key points, including infrastructure support, urban governance, public services, and economic and industrial development. Finally, this study points to the utility of city intelligence as an ideal policy tool for advancing the goals of China's urban development. In conclusion, it is imperative that China make full use of its unique advantages, including the nation's current state of development and resources, geographical advantages, and good human relations, in subjective and objective conditions to promote the development of city intelligence through the proper application of urban big data.

  20. A pharmacokinetic evaluation of five H1 antagonists after an oral and intravenous microdose to human subjects.

    Science.gov (United States)

    Madan, Ajay; O'Brien, Zhihong; Wen, Jianyun; O'Brien, Chris; Farber, Robert H; Beaton, Graham; Crowe, Paul; Oosterhuis, Berend; Garner, R Colin; Lappin, Graham; Bozigian, Haig P

    2009-03-01

    To evaluate the pharmacokinetics (PK) of five H1 receptor antagonists in human volunteers after a single oral and intravenous (i.v.) microdose (0.1 mg). Five H1 receptor antagonists, namely NBI-1, NBI-2, NBI-3, NBI-4 and diphenhydramine, were administered to human volunteers as a single 0.1-mg oral and i.v. dose. Blood samples were collected up to 48 h, and the parent compound in the plasma extract was quantified by high-performance liquid chromatography and accelerator mass spectrometry. The median clearance (CL), apparent volume of distribution (Vd) and apparent terminal elimination half-life (t1/2) of diphenhydramine after an i.v. microdose were 24.7 l h−1, 302 l and 9.3 h, and the oral Cmax and AUC0–∞ were 0.195 ng ml−1 and 1.52 ng h ml−1, respectively. These data were consistent with previously published diphenhydramine data at 500 times the microdose. The rank order of oral bioavailability of the five compounds was as follows: NBI-2 > NBI-1 > NBI-3 > diphenhydramine > NBI-4, whereas the rank order for CL was NBI-4 > diphenhydramine > NBI-1 > NBI-3 > NBI-2. Human microdosing provided estimates of clinical PK of four structurally related compounds, which were deemed useful for compound selection.

  1. A pharmacokinetic evaluation of five H1 antagonists after an oral and intravenous microdose to human subjects

    Science.gov (United States)

    Madan, Ajay; O'Brien, Zhihong; Wen, Jianyun; O'Brien, Chris; Farber, Robert H; Beaton, Graham; Crowe, Paul; Oosterhuis, Berend; Garner, R Colin; Lappin, Graham; Bozigian, Haig P

    2009-01-01

    AIMS To evaluate the pharmacokinetics (PK) of five H1 receptor antagonists in human volunteers after a single oral and intravenous (i.v.) microdose (0.1 mg). METHODS Five H1 receptor antagonists, namely NBI-1, NBI-2, NBI-3, NBI-4 and diphenhydramine, were administered to human volunteers as a single 0.1-mg oral and i.v. dose. Blood samples were collected up to 48 h, and the parent compound in the plasma extract was quantified by high-performance liquid chromatography and accelerator mass spectrometry. RESULTS The median clearance (CL), apparent volume of distribution (Vd) and apparent terminal elimination half-life (t1/2) of diphenhydramine after an i.v. microdose were 24.7 l h−1, 302 l and 9.3 h, and the oral Cmax and AUC0–∞ were 0.195 ng ml−1 and 1.52 ng h ml−1, respectively. These data were consistent with previously published diphenhydramine data at 500 times the microdose. The rank order of oral bioavailability of the five compounds was as follows: NBI-2 > NBI-1 > NBI-3 > diphenhydramine > NBI-4, whereas the rank order for CL was NBI-4 > diphenhydramine > NBI-1 > NBI-3 > NBI-2. CONCLUSIONS Human microdosing provided estimates of clinical PK of four structurally related compounds, which were deemed useful for compound selection. PMID:19523012
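    As a quick plausibility check on the reported diphenhydramine numbers (a back-of-the-envelope sketch under a one-compartment assumption, not a calculation from the study), clearance and volume of distribution imply a half-life via t1/2 = ln(2) × Vd / CL:

```python
import math

CL = 24.7   # median clearance of diphenhydramine after i.v. microdose, l/h
Vd = 302.0  # median apparent volume of distribution, l

t_half = math.log(2) * Vd / CL  # one-compartment half-life estimate, h
print(round(t_half, 1))  # 8.5 h, near the reported 9.3 h terminal half-life
```

    The implied 8.5 h is broadly consistent with the reported 9.3 h; an exact match is not expected, since the medians of CL, Vd and t1/2 are reported separately and need not satisfy the identity jointly.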

  2. The Scope of Big Data in One Medicine: Unprecedented Opportunities and Challenges

    Directory of Open Access Journals (Sweden)

    Molly E. McCue

    2017-11-01

    Full Text Available Advances in high-throughput molecular biology and electronic health records (EHR), coupled with increasing computer capabilities, have resulted in an increased interest in the use of big data in health care. Big data require collection and analysis of data at an unprecedented scale and represent a paradigm shift in health care, offering (1) the capacity to generate new knowledge more quickly than traditional scientific approaches; (2) unbiased collection and analysis of data; and (3) a holistic understanding of biology and pathophysiology. Big data promises more personalized and precision medicine for patients with improved accuracy and earlier diagnosis, and therapy tailored to an individual’s unique combination of genes, environmental risk, and precise disease phenotype. This promise comes from data collected from numerous sources, ranging from molecules to cells, to tissues, to individuals and populations, and the integration of these data into networks that improve understanding of health and disease. Big data-driven science should play a role in propelling comparative medicine and “one medicine” (i.e., the shared physiology, pathophysiology, and disease risk factors across species) forward. Merging of data from EHR across institutions will give access to patient data on a scale previously unimaginable, allowing for precise phenotype definition and objective evaluation of risk factors and response to therapy. High-throughput molecular data will give insight into previously unexplored molecular pathophysiology and disease etiology. Investigation and integration of big data from a variety of sources will result in stronger parallels drawn at the molecular level between human and animal disease, allow for predictive modeling of infectious disease and identification of key areas of intervention, and facilitate step-changes in our understanding of disease that can make a substantial impact on animal and human health. However, the use of big data

  3. The Scope of Big Data in One Medicine: Unprecedented Opportunities and Challenges.

    Science.gov (United States)

    McCue, Molly E; McCoy, Annette M

    2017-01-01

    Advances in high-throughput molecular biology and electronic health records (EHR), coupled with increasing computer capabilities, have resulted in an increased interest in the use of big data in health care. Big data require collection and analysis of data at an unprecedented scale and represent a paradigm shift in health care, offering (1) the capacity to generate new knowledge more quickly than traditional scientific approaches; (2) unbiased collection and analysis of data; and (3) a holistic understanding of biology and pathophysiology. Big data promises more personalized and precision medicine for patients with improved accuracy and earlier diagnosis, and therapy tailored to an individual's unique combination of genes, environmental risk, and precise disease phenotype. This promise comes from data collected from numerous sources, ranging from molecules to cells, to tissues, to individuals and populations-and the integration of these data into networks that improve understanding of health and disease. Big data-driven science should play a role in propelling comparative medicine and "one medicine" (i.e., the shared physiology, pathophysiology, and disease risk factors across species) forward. Merging of data from EHR across institutions will give access to patient data on a scale previously unimaginable, allowing for precise phenotype definition and objective evaluation of risk factors and response to therapy. High-throughput molecular data will give insight into previously unexplored molecular pathophysiology and disease etiology. Investigation and integration of big data from a variety of sources will result in stronger parallels drawn at the molecular level between human and animal disease, allow for predictive modeling of infectious disease and identification of key areas of intervention, and facilitate step-changes in our understanding of disease that can make a substantial impact on animal and human health. However, the use of big data comes with significant

  4. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than at the data itself, realizing that there may be only a few generic processes involved, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  5. HIV antibodies among intravenous drug users in Bahrain.

    Science.gov (United States)

    al-Haddad, M K; Khashaba, A S; Baig, B Z; Khalfan, S

    1994-09-01

    A 12-month study was conducted to identify risk factors for human immunodeficiency virus (HIV) infection among intravenous drug users (IDUs) attending the drug rehabilitation clinic of the Psychiatric Hospital, Manama, Bahrain. Patients provided demographic and behavioural information based on a questionnaire. Two hundred and forty male IDUs participated in the study on a voluntary basis. The seroprevalence of HIV was 21.1 per cent. The presence of HIV antibody was associated with educational status, frequency of injecting drugs, and needle sharing.

  6. The Shadow of Big Data: Data-Citizenship and Exclusion

    DEFF Research Database (Denmark)

    Rossi, Luca; Hjelholt, Morten; Neumayer, Christina

    2016-01-01

    Big data are understood as being able to provide insights on human behaviour at an individual as well as at an aggregated societal level (Manyika et al. 2011). These insights are expected to be more detailed and precise than anything before, thanks to the large volume of digital data and to the unobtrusive nature of the data collection (Fishleigh 2014). Within this perspective, these two dimensions (volume and unobtrusiveness) define contemporary big data techniques as a socio-technical offering to society, a live representation of itself. … Through this process "data-citizenship" emerges. Data-citizenship assumes that citizens will be visible to the state through the data they produce. On a general level, data-citizenship shifts citizenship from an intrinsic status of a group of people to a status achieved through action. This approach assumes equal

  7. Ontologies, methodologies, and new uses of Big Data in the social and cultural sciences

    Directory of Open Access Journals (Sweden)

    Robin Wagner-Pacifici

    2015-12-01

    Full Text Available In our Introduction to the Conceiving the Social with Big Data special issue of Big Data & Society, we survey the 18 contributions from scholars in the humanities and social sciences, and highlight several questions and themes that emerge within and across them. These emergent issues reflect the challenges, problems, and promises of working with Big Data to access and assess the social. They include puzzles about the locus and nature of human life, the nature of interpretation, the categorical constructions of individual entities and agents, the nature and relevance of contexts and temporalities, and the determinations of causality. As such, the Introduction reflects on the contributions along a series of binaries that capture the dualities and dynamisms of these themes: Life/Data; Mind/Machine; and Induction/Deduction.

  8. Big Data for Global History: The Transformative Promise of Digital Humanities

    Directory of Open Access Journals (Sweden)

    Joris van Eijnatten

    2013-12-01

    Full Text Available This article discusses the promises and challenges of digital humanities methodologies for historical inquiry. In order to address the great outstanding question of whether big data will re-invigorate macro-history, a number of research projects are described that use cultural text mining to explore big data repositories of digitised newspapers. The advantages of quantitative analysis, visualisation and named entity recognition in both exploration and analysis are illustrated in the study of public debates on drugs, drug trafficking, and drug users in the early twentieth century (WAHSP), the comparative study of discourses about heredity, genetics, and eugenics in Dutch and German newspapers, 1863-1940 (BILAND), and the study of trans-Atlantic discourses (Translantis). While many technological and practical obstacles remain, advantages over traditional hermeneutic methodology are found in heuristics, analytics, quantitative trans-disciplinarity, and reproducibility, offering a quantitative and trans-national perspective on the history of mentalities.

  9. Euthanasia of Small Animals with Nitrogen; Comparison with Intravenous Pentobarbital

    OpenAIRE

    Quine, John P.; Buckingham, William; Strunin, Leo

    1988-01-01

    Intravenous pentobarbital (with or without the addition of saturated potassium chloride) was compared with nitrogen gas exposure for euthanasia of small animals (dogs, cats, and rabbits) in a humane society environment. Initially, electrocardiographic and electroencephalographic monitoring were used to establish the time of death in presedated animals given either pentobarbital or exposed to nitrogen; later, nitrogen euthanasia alone was studied. Sedation with acepromazine delayed the effects of...

  10. Pharmacokinetics of Oral and Intravenous Paracetamol (Acetaminophen) When Co-Administered with Intravenous Morphine in Healthy Adult Subjects.

    Science.gov (United States)

    Raffa, Robert B; Pawasauskas, Jayne; Pergolizzi, Joseph V; Lu, Luke; Chen, Yin; Wu, Sutan; Jarrett, Brant; Fain, Randi; Hill, Lawrence; Devarakonda, Krishna

    2018-03-01

    Several features favor paracetamol (acetaminophen) administration by the intravenous rather than the oral route in the postoperative setting. This study compared the pharmacokinetics and bioavailability of oral and intravenous paracetamol when given with or without an opioid, morphine. In this randomized, single-blind, parallel, repeat-dose study in healthy adults, subjects received four repeat doses of oral or intravenous 1000 mg paracetamol at 6-h intervals, and morphine infusions (0.125 mg/kg) at the 2nd and 3rd intervals. Comparisons of plasma pharmacokinetic profiles were conducted before, during, and after opioid co-administrations. Twenty-two subjects were included in the pharmacokinetic analysis. Observed paracetamol peak concentration (Cmax) and area under the plasma concentration-time curve over the dosing interval (AUC0–6) were reduced when oral paracetamol was co-administered with morphine (reduced from 11.6 to 7.25 µg/mL and from 31.00 to 25.51 µg·h/mL, respectively), followed by an abruptly increased Cmax and AUC0–6 upon discontinuation of morphine (to 13.5 µg/mL and 52.38 µg·h/mL, respectively). There was also a significantly prolonged mean time to peak plasma concentration (Tmax) after the 4th dose of oral paracetamol (2.84 h) compared to the 1st dose (1.48 h). However, pharmacokinetic parameters of paracetamol were not impacted when intravenous paracetamol was co-administered with morphine. Morphine co-administration significantly impacted the pharmacokinetics of oral but not intravenous paracetamol. The abrupt release of accumulated paracetamol at the end of morphine-mediated gastrointestinal inhibition following oral but not intravenous administration of paracetamol suggests that intravenous paracetamol provides a better option for the management of postoperative pain. ClinicalTrials.gov: NCT02848729.
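    The reported oral-arm changes are easier to compare as relative effects; the short sketch below simply re-expresses the abstract's own concentration figures as percentages (no additional study data are assumed):

```python
# Oral paracetamol values reported in the abstract
cmax_alone, cmax_with_morphine, cmax_after = 11.6, 7.25, 13.5  # peak concentration, µg/mL
auc_alone, auc_with_morphine = 31.00, 25.51                    # AUC over dosing interval, µg·h/mL

def pct(before, after):
    """Percent change from `before` to `after`."""
    return 100.0 * (after - before) / before

print(f"Cmax change with morphine:    {pct(cmax_alone, cmax_with_morphine):.1f}%")  # -37.5%
print(f"AUC0-6 change with morphine:  {pct(auc_alone, auc_with_morphine):.1f}%")    # -17.7%
print(f"Cmax rebound after morphine:  {pct(cmax_alone, cmax_after):.1f}%")          # +16.4%
```

    In relative terms, morphine co-administration cut oral Cmax by about 38% and AUC0–6 by about 18%, with a rebound above baseline after morphine was discontinued, while intravenous parameters were unchanged.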

  11. Big Data en surveillance, deel 1 : Definities en discussies omtrent Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions related to big data. In this first part I will try to set out some of the theory on Big Data and

  12. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: technology, people and processes.

  13. Evidence of Big Five and Aggressive Personalities in Gait Biomechanics.

    Science.gov (United States)

    Satchell, Liam; Morris, Paul; Mills, Chris; O'Reilly, Liam; Marshman, Paul; Akehurst, Lucy

    2017-01-01

    Behavioral observation techniques which relate action to personality have long been neglected (Furr and Funder in Handbook of research methods in personality psychology, The Guilford Press, New York, 2007) and, when employed, often use human judges to code behavior. In the current study we used an alternative to human coding (biomechanical research techniques) to investigate how personality traits are manifest in gait. We used motion capture technology to record 29 participants walking on a treadmill at their natural speed. We analyzed their thorax and pelvis movements, as well as speed of gait. Participants completed personality questionnaires, including a Big Five measure and a trait aggression questionnaire. We found that gait related to several of our personality measures. The magnitude of upper body movement, lower body movement, and walking speed, were related to Big Five personality traits and aggression. Here, we present evidence that some gait measures can relate to Big Five and aggressive personalities. We know of no other examples of research where gait has been shown to correlate with self-reported measures of personality and suggest that more research should be conducted between largely automatic movement and personality.

  14. Mycotic aneurysms in intravenous drug abusers: the utility of intravenous digital subtraction angiography

    International Nuclear Information System (INIS)

    Shetty, P.C.; Krasicky, G.A.; Sharma, R.P.; Vemuri, B.R.; Burke, M.M.

    1985-01-01

    Two hundred thirteen intravenous digital subtraction angiographic (DSA) examinations were performed on 195 intravenous drug abusers to rule out the possibility of a mycotic aneurysm in a groin, neck, or upper extremity infection. Twenty-three surgically proved cases of mycotic aneurysm were correctly identified, with no false positive results. In addition, six cases of major venous occlusion were documented. The authors present the results of their experience and conclude that DSA is an effective and cost-efficient method of examining this high-risk patient population.

  15. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

    Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  16. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  17. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  18. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. Unified theory of the moment of creation, evidence of an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second after the big bang), and the fate of the Universe are all discussed. (U.K.)

  19. The processing of intravenous coronary angiography angiograms produced by synchrotron radiation. Ch. 20B

    International Nuclear Information System (INIS)

    Zeman, H.D.

    1991-01-01

    Intravenous coronary angiography using synchrotron radiation (SR) has been demonstrated in recent years to hold promise for performing diagnostic examinations of human patients less invasively than the presently required arterially invasive procedures. The high intensity and tunability of SR, the linearity and large dynamic range of multi-channel Si(Li) detectors, and the scatter reducing properties of a fan-beam geometry should eventually lead to intravenous images of the human heart of a quality equal to that already achieved in dogs. However, two major problems with the intravenous angiography technique remain. Contrast material in the cardiac chambers and great vessels obscures the coronary arteries overlying these structures. In addition, the contrast material in the capillary bed of the heart muscle produces a gray haze that limits the extent to which contrast enhancement can be used to bring out details in the coronary arteries without turning this haze into a black cloud. For these reasons, an image processing technique is necessary which can remove large smooth opaque structures from the angiogram, allowing the fine detail overlying them to be made visible, and allowing contrast enhancement of this detail to be performed. This chapter discusses the image processing technique and illustrates this technique by some experimental results. (author). 13 refs.; 15 figs

  20. DISTINGUISHED CHARACTERISTICS OF INFECTIVE ENDOCARDITIS IN HIV/AIDS AMONG INTRAVENOUS DRUGS ABUSED

    Directory of Open Access Journals (Sweden)

    E. Y. Ponomareva

    2011-01-01

    Full Text Available The aim – definition of the distinguishing characteristics of right-sided infective endocarditis (IE) in intravenous drug abusers with human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS). Materials and methods. The study included 10 patients with right-sided IE in conjunction with HIV/AIDS. All patients were male, aged from 28 to 36 years. Results. The course of IE in HIV/AIDS among intravenous drug abusers generally corresponds to the features of IE in intravenous drug users without HIV infection. Distinctive features of IE in these patients are a larger burden of lung disease and its disseminated character, more pronounced tissue oxygenation disorders, marked pulmonary hypertension, haematological disorders (lymphopenia, anemia), and late diagnosis of IE. Conclusion. The features of right-sided IE in intravenous drug abusers with HIV/AIDS are distinguished. Difficulties in the diagnosis of IE in HIV infection are due to the variety of causes of prolonged fever, which should guide doctors to more frequent use of transthoracic echocardiography during prolonged fever in HIV-infected patients.

  1. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  2. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  3. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  4. Orthostatic stability with intravenous levodopa

    Directory of Open Access Journals (Sweden)

    Shan H. Siddiqi

    2015-08-01

    Full Text Available Intravenous levodopa has been used in a multitude of research studies due to its more predictable pharmacokinetics compared to the oral form, which is used frequently as a treatment for Parkinson’s disease (PD. Levodopa is the precursor for dopamine, and intravenous dopamine would strongly affect vascular tone, but peripheral decarboxylase inhibitors are intended to block such effects. Pulse and blood pressure, with orthostatic changes, were recorded before and after intravenous levodopa or placebo—after oral carbidopa—in 13 adults with a chronic tic disorder and 16 tic-free adult control subjects. Levodopa caused no statistically or clinically significant changes in blood pressure or pulse. These data add to previous data that support the safety of i.v. levodopa when given with adequate peripheral inhibition of DOPA decarboxylase.

  5. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since then emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that have been unreached before. Big data is generally characterized by three factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  6. Priming the Pump for Big Data at Sentara Healthcare.

    Science.gov (United States)

    Kern, Howard P; Reagin, Michael J; Reese, Bertram S

    2016-01-01

    Today's healthcare organizations are facing significant demands with respect to managing population health, demonstrating value, and accepting risk for clinical outcomes across the continuum of care. The patient's environment outside the walls of the hospital and physician's office, and outside the electronic health record (EHR), has a substantial impact on clinical care outcomes. The use of big data is key to understanding factors that affect the patient's health status and enhancing clinicians' ability to anticipate how the patient will respond to various therapies. Big data is essential to delivering sustainable, high-quality, value-based healthcare, as well as to the success of new models of care such as clinically integrated networks (CINs) and accountable care organizations. Sentara Healthcare, based in Norfolk, Virginia, has been an early adopter of the technologies that have readied us for our big data journey: EHRs, telehealth-supported electronic intensive care units, and telehealth primary care support through MDLIVE. Although we would not say Sentara is at the cutting edge of the big data trend, it certainly is among the fast followers. Use of big data in healthcare is still at an early stage compared with other industries. Tools for data analytics are maturing, but traditional challenges such as heightened data security and limited human resources remain the primary focus for regional health systems to improve care and reduce costs. Sentara primarily makes actionable use of big data in our CIN, Sentara Quality Care Network, and at our health plan, Optima Health. Big data projects can be expensive, and justifying the expense organizationally has often been easier in times of crisis. We have developed an analytics strategic plan separate from but aligned with corporate system goals to ensure optimal investment and management of this essential asset.

  7. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  8. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Cryptography for Big Data Security: a book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release.

  9. Big data challenges in decoding cortical activity in a human with quadriplegia to inform a brain computer interface.

    Science.gov (United States)

    Friedenberg, David A; Bouton, Chad E; Annetta, Nicholas V; Skomrock, Nicholas; Mingming Zhang; Schwemmer, Michael; Bockbrader, Marcia A; Mysiw, W Jerry; Rezai, Ali R; Bresler, Herbert S; Sharma, Gaurav

    2016-08-01

    Recent advances in Brain Computer Interfaces (BCIs) have created hope that one day paralyzed patients will be able to regain control of their paralyzed limbs. As part of an ongoing clinical study, we have implanted a 96-electrode Utah array in the motor cortex of a paralyzed human. The array generates almost 3 million data points from the brain every second. This presents several big data challenges towards developing algorithms that should not only process the data in real-time (for the BCI to be responsive) but are also robust to temporal variations and non-stationarities in the sensor data. We demonstrate an algorithmic approach to analyze such data and present a novel method to evaluate such algorithms. We present our methodology with examples of decoding human brain data in real-time to inform a BCI.
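
    The quoted figure of "almost 3 million data points" per second is consistent with sampling each of the 96 electrodes at roughly 30 kHz, a typical rate for such arrays (the rate is an assumption here, not stated in the abstract):

```python
# Back-of-envelope check of the reported data rate.
# 30 kHz per channel is an assumed, typical sampling rate, not stated in the abstract.
channels = 96
sample_rate_hz = 30_000
samples_per_second = channels * sample_rate_hz   # 2,880,000, i.e. "almost 3 million"
bytes_per_second = samples_per_second * 2        # assuming 16-bit samples
```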

  10. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  11. Fuzzy VIKOR approach for selection of big data analyst in procurement management

    Directory of Open Access Journals (Sweden)

    Surajit Bag

    2016-07-01

    Full Text Available Background: Big data and predictive analysis have been hailed as the fourth paradigm of science. Big data and analytics are critical to the future of business sustainability. The demand for data scientists is increasing with the dynamic nature of businesses, thus making it indispensable to manage big data, derive meaningful results and interpret management decisions. Objectives: The purpose of this study was to provide a brief conceptual review of big data and analytics and further illustrate the use of a multicriteria decision-making technique in selecting the right skilled candidate for big data and analytics in procurement management. Method: It is important for firms to select and recruit the right data analyst, both in terms of skill sets and scope of analysis. The nature of such a problem is complex multicriteria decision-making, which deals with both qualitative and quantitative factors. In the current study, an application of the Fuzzy VIsekriterijumska optimizacija i KOmpromisno Resenje (VIKOR) method was used to solve the big data analyst selection problem. Results: From this study, it was identified that Technical knowledge (C1), Intellectual curiosity (C4) and Business acumen (C5) are the strongest influential criteria and must be present in the candidate for the big data and analytics job. Conclusion: Fuzzy VIKOR is a fitting technique for this kind of multicriteria decision-making problem. This study will assist human resource managers and procurement managers in selecting the right workforce for big data analytics.
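
    The study applies fuzzy VIKOR; a minimal sketch of the classical (crisp) VIKOR ranking step, using hypothetical candidate scores and weights (the fuzzy variant would first defuzzify fuzzy ratings into crisp numbers), might look like:

```python
def vikor(scores, weights, v=0.5):
    """Classical (crisp) VIKOR; lower Q is better. Benefit criteria assumed.
    scores: one row of criterion scores per candidate."""
    n_crit = len(weights)
    best = [max(row[j] for row in scores) for j in range(n_crit)]
    worst = [min(row[j] for row in scores) for j in range(n_crit)]
    S, R = [], []
    for row in scores:
        d = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
             for j in range(n_crit)]
        S.append(sum(d))   # group utility
        R.append(max(d))   # individual regret
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    return [v * (S[i] - s_min) / (s_max - s_min)
            + (1 - v) * (R[i] - r_min) / (r_max - r_min)
            for i in range(len(scores))]

# Hypothetical candidate scores on three criteria: technical knowledge (C1),
# intellectual curiosity (C4), business acumen (C5) -- illustrative values only.
candidates = [[7, 9, 6], [9, 6, 8], [6, 7, 9]]
Q = vikor(candidates, weights=[0.4, 0.3, 0.3])
ranking = sorted(range(len(Q)), key=lambda i: Q[i])  # best candidate first
```

    Ranking by ascending Q balances overall utility against the worst single-criterion regret, which is what makes VIKOR a compromise-ranking method.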

  12. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

    We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  13. The Review of Visual Analysis Methods of Multi-modal Spatio-temporal Big Data

    Directory of Open Access Journals (Sweden)

    ZHU Qing

    2017-10-01

    Full Text Available The visual analysis of spatio-temporal big data is not only a state-of-the-art research direction of both big data analysis and data visualization, but also a core module of the pan-spatial information system. This paper reviews existing visual analysis methods at three levels: descriptive visual analysis, explanatory visual analysis and exploratory visual analysis, focusing on spatio-temporal big data's characteristics of multi-source, multi-granularity, multi-modality and complex association. The technical difficulties and development tendencies of multi-modal feature selection, innovative human-computer interaction analysis and exploratory visual reasoning in the visual analysis of spatio-temporal big data are discussed. Research shows that the study of descriptive visual analysis for data visualization is relatively mature. Explanatory visual analysis has become the focus of big data analysis; it is mainly based on interactive data mining in a visual environment to diagnose the implicit causes of problems. Exploratory visual analysis methods, by contrast, still need a major breakthrough.

  14. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure such as the exabyte, zettabyte, and yottabyte. The growth of data creates a situation where classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles; it is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  15. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data is data beyond the storage capacity and processing power of conventional systems. The term is used for data sets so large or complex that traditional data tools cannot handle them. Big data size is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; for example, the amount of data produced by people on social networking sites grows rapidly every year. Big data is not only data; it has become a complete subject that includes various tools, techniques and frameworks. It addresses the rapid growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of a massive scale. Such data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to capture, organize and analyze these various types of data. In this paper we describe applications, problems and tools of big data and give an overview of the field.

  16. The use of big data in transfusion medicine.

    Science.gov (United States)

    Pendry, K

    2015-06-01

    'Big data' refers to the huge quantities of digital information now available that describe much of human activity. The science of data management and analysis is rapidly developing to enable organisations to convert data into useful information and knowledge. Electronic health records and new developments in Pathology Informatics now support the collection of 'big laboratory and clinical data', and these digital innovations are now being applied to transfusion medicine. To use big data effectively, we must address concerns about confidentiality and the need for a change in culture and practice, remove barriers to adopting common operating systems and data standards and ensure the safe and secure storage of sensitive personal information. In the UK, the aim is to formulate a single set of data and standards for communicating test results and so enable pathology data to contribute to national datasets. In transfusion, big data has been used for benchmarking, detection of transfusion-related complications, determining patterns of blood use and definition of blood order schedules for surgery. More generally, rapidly available information can monitor compliance with key performance indicators for patient blood management and inventory management leading to better patient care and reduced use of blood. The challenges of enabling reliable systems and analysis of big data and securing funding in the restrictive financial climate are formidable, but not insurmountable. The promise is that digital information will soon improve the implementation of best practice in transfusion medicine and patient blood management globally. © 2015 British Blood Transfusion Society.

  17. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  18. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  19. Comparison of postinfusion phlebitis in intravenous push versus intravenous piggyback cefazolin.

    Science.gov (United States)

    Biggar, Constance; Nichols, Cynthia

    2012-01-01

    Reducing health care costs without adversely affecting patient safety is a constant challenge for health care institutions. Cefazolin prophylaxis via intravenous push (IVP) is more cost-effective than via intravenous piggyback (IVPB). The purpose of this study was to determine whether patient safety would be compromised (ie, an increased rate of phlebitis) with a change to the IVP method. Rates of phlebitis in orthopedic surgical patients receiving cefazolin prophylaxis via IVP versus IVPB were evaluated in a prospective quasi-experimental design of 240 patients. The first 120 subjects received cefazolin via IVPB, and the second 120 subjects received it via IVP. Results indicated no statistically significant difference in phlebitis rates between the IVPB (3.4%) and IVP (3.3%) groups.
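
    As a rough illustration of how such a rate comparison can be tested, here is a two-proportion z-test sketch using hypothetical counts of 4 phlebitis events per 120-patient arm (approximating the reported 3.4% and 3.3% rates; the study's exact counts and statistical test are not stated in the abstract):

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 4 events in each arm of 120 patients
z, p = two_proportion_z(4, 120, 4, 120)
```

    With identical event counts the z statistic is zero and the p-value is 1, consistent with the study's finding of no significant difference.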

  20. A big data approach to the concordance of the toxicity of pharmaceuticals in animals and humans.

    Science.gov (United States)

    Clark, Matthew; Steger-Hartmann, Thomas

    2018-07-01

    Although lack of efficacy is an important cause of late-stage attrition in drug development, the shortcomings in the translation of toxicities observed during preclinical development to observations in clinical trials or post-approval are an ongoing topic of research. The concordance between preclinical and clinical safety observations has been analyzed only on relatively small data sets, mostly over short time periods of drug approvals. We therefore explored the feasibility of a big-data analysis on a set of 3,290 approved drugs and formulations for which 1,637,449 adverse events were reported for both human and animal species in regulatory submissions over a period of more than 70 years. The events reported in five species - rat, dog, mouse, rabbit, and cynomolgus monkey - were treated as diagnostic tests for human events, and the diagnostic power was computed for each event/species pair using likelihood ratios. The animal-human translation of many key observations is confirmed as being predictive, such as QT prolongation and arrhythmias in dog. Our study confirmed the general predictivity of animal safety observations for humans, but also identified issues with such automated analyses, related on the one hand to data curation and controlled vocabularies, and on the other hand to methodological changes over the course of time. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
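
    The likelihood-ratio approach described here treats an animal finding as a screening test for the corresponding human event. A minimal sketch with hypothetical 2x2 counts (the study's actual counts are not given in the abstract) might look like:

```python
def likelihood_ratios(tp, fp, fn, tn):
    """Positive/negative likelihood ratios from a 2x2 table that treats an
    animal-species finding as a 'test' for the matching human adverse event.
    tp: drugs with the event in both species and humans; fp: animal only;
    fn: humans only; tn: neither."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

# Hypothetical counts for one event/species pair, e.g. QT prolongation in dog
lr_pos, lr_neg = likelihood_ratios(tp=40, fp=10, fn=20, tn=130)
# lr_pos > 1 means the animal finding raises the odds of the human finding
```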

  1. [Peripheral intravenous catheter-related phlebitis].

    Science.gov (United States)

    van der Sar-van der Brugge, Simone; Posthuma, E F M Ward

    2011-01-01

    Phlebitis is a very common complication of the use of intravenous catheters. Two patients with an i.v. catheter complicated by thrombophlebitis are described. Patient A was immunocompromised due to chronic lymphocytic leukaemia and developed septic thrombophlebitis with blood cultures positive for S. aureus. Patient B was being treated with flucloxacillin because of an S. aureus infection and developed chemical phlebitis. Septic phlebitis is rare, but potentially serious. Chemical or mechanical types of thrombophlebitis are usually less severe, but happen very frequently. Risk factors include: female sex, previous episode of phlebitis, insertion at the (ventral) forearm, emergency placement and administration of antibiotics. Until recently, routine replacement of peripheral intravenous catheters after 72-96 h was recommended, but randomised controlled trials have not shown any benefit of this routine. A recent Cochrane Review recommends replacement of peripheral intravenous catheters only when clinically indicated.

  2. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  3. The big data-big model (BDBM) challenges in ecological research

    Science.gov (United States)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites, such as Terra and OCO-2 among others, have collected data relevant on global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedback of terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from sensors from those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. 
Those challenges include interoperability of multiple

  4. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  5. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  6. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  7. Optimizing the use of intravenous therapy in internal medicine.

    Science.gov (United States)

    Champion, Karine; Mouly, Stéphane; Lloret-Linares, Celia; Lopes, Amanda; Vicaut, Eric; Bergmann, Jean-François

    2013-10-01

    We aimed to evaluate the impact of physicians' educational programs on the reduction of inappropriate intravenous lines in internal medicine. Fifty-six French internal medicine units were enrolled in a nationwide, prospective, blinded, randomized controlled trial. Forms describing the patients with an intravenous line and internal medicine department characteristics were filled out on 2 separate days in January and April 2007. Following the first visit, all units were randomly assigned to either a specific education program on the appropriate indications for an intravenous line, during February and March 2007, or no training (control group). The Investigators' Committee then blindly evaluated the clinical relevance of the intravenous lines according to pre-established criteria. The primary outcome was the percentage of inappropriate intravenous lines. During January 2007, intravenous lines were used in 475 (24.9%) of the 1910 hospitalized patients. Of these, 80 (16.8%) were considered inappropriate. In April 2007, 416 (22.8%) of the 1823 hospitalized patients received an intravenous line, which was considered inappropriate in 10.2% (21/205) of patients managed by trained physicians, versus 16.6% (35/211) of patients in the control group (relative difference 39%; 95% confidence interval, -0.6-13.3; P = .05). Reduced intravenous administration of fluids, antibiotics, and analgesics accounted for the observed decrease. The use of a simple education program reduced the rate of inappropriate intravenous lines by almost 40% in an internal medicine setting (NCT01633307). Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  9. Complex intravenous anesthesia in interventional procedures

    International Nuclear Information System (INIS)

    Xie Zonggui; Hu Yuanming; Huang Yunlong; You Yong; Wu Juan; Huang Zengping; Li Jian

    2006-01-01

    Objective: To evaluate the value and safety of intravenous Diprivan and Fentanyl analgesia in interventional procedures. Methods: Diprivan with Fentanyl was administered intravenously for analgesia in eighty interventional procedures in sixty-five patients, without tracheal tube insertion. Vital signs including HR, BP, arterial oxygen saturation (SpO2) and the patients' reactions to the operation were recorded. Results: Intravenous anesthesia was carried out successfully in eighty interventional procedures, with the patients asleep during the operation and reporting no pain and no distressing memory of the procedure. The dose of Diprivan was 500±100 mg and that of Fentanyl was 0.2±0.025 mg. Mean arterial pressure and SpO2 were 11.4±2.2 kPa and 98±1.0 before the operation and 10.6±2.1 kPa and 96±1.5 ten minutes after its start, with no significant difference. Conclusions: Intravenous Diprivan with Fentanyl for analgesia in interventional procedures is safe and painless and leaves no distressing memory of the procedure; it can therefore be recommended. (authors)

  10. A chromatographic method for the production of a human immunoglobulin G solution for intravenous use

    Directory of Open Access Journals (Sweden)

    K. Tanaka

    1998-11-01

    Immunoglobulin G (IgG) of excellent quality for intravenous use was obtained from the cryosupernatant of human plasma by a chromatographic method based on ion-exchange chromatography on DEAE-Sepharose FF, affinity chromatography on arginine Sepharose 4B, and a final purification step by Sephacryl S-300 HR gel filtration. The yield of 10 experimental batches produced was 3.5 g IgG per liter of plasma. A solvent/detergent combination of 1% tri(n-butyl) phosphate and 1% Triton X-100 was used to inactivate lipid-coated viruses. Analysis of the final product (5% liquid IgG), based on the mean for 10 batches, showed 94% monomers, 5.5% dimers and 0.5% polymers and aggregates. Anticomplementary activity was 0.3 CH50/mg IgG and prekallikrein activator levels were less than 5 IU/ml. Stability at 37ºC for 30 days in the liquid state was satisfactory. IgG was stored in flasks (2.5 g/flask) at 4 to 8ºC. All the characteristics of the product were consistent with the requirements of the 1997 Pharmacopée Européenne.

  11. Kinetics of intravenous radiographic contrast medium injections as used on CT: simulation with time delay differential equations in a basic human cardiovascular multicompartment model.

    Science.gov (United States)

    Violon, D

    2012-12-01

    To develop a multicompartment model of only essential human body components that predicts the contrast medium concentration vs time curve in a chosen compartment after an intravenous injection, and to show that the model can be used to adequately time contrast-enhanced CT series. A system of linked time delay (rather than ordinary) differential equations described the model and was solved with a Matlab program (Matlab v. 6.5; The Mathworks, Inc., Natick, MA). All the injection and physiological parameters could be modified to cope with normal or pathological situations. In vivo time-concentration curves from the literature were recalculated to validate the model. The recalculated contrast medium time-concentration curves and parameters are given. The results of the statistical analysis of the study findings are expressed as the median prediction error and the median absolute prediction error values for both the time delay and ordinary differential equation systems; these are situated well below the generally accepted maximum 20% limit. The presented program correctly predicts the time-concentration curve of an intravenous contrast medium injection and, consequently, allows an individually tailored approach to CT examinations with optimised use of the injected contrast medium volume, as long as time delay instead of ordinary differential equations are used. The presented program offers good preliminary knowledge of the time-contrast medium concentration curve after any intravenous injection, allowing adequate timing of a CT examination, as required by the short scan time of present-day scanners. The injected volume of contrast medium can be tailored to the individual patient with no more contrast medium than is strictly needed.
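    The abstract's core technique, delay differential equations integrated by keeping enough history to evaluate the delayed terms, can be sketched in a few lines. The two compartments, rate constants, circulation delay and injection parameters below are hypothetical illustrations, not the paper's actual model:

    ```python
    # Minimal sketch (hypothetical parameters): a two-compartment model of
    # contrast-medium kinetics where transport between compartments is delayed
    # by the circulation time tau, giving delay differential equations
    #   c1'(t) = inj(t)/V1 - k12*c1(t) + k21*c2(t - tau)
    #   c2'(t) = k12*c1(t - tau) - k21*c2(t)
    # integrated with a fixed-step Euler scheme that stores the full history
    # so the delayed terms c(t - tau) can be looked up.

    def simulate(duration=120.0, dt=0.01, tau=8.0,
                 k12=0.05, k21=0.03, V1=5.0,
                 inj_rate=4.0, inj_end=20.0):
        n = int(duration / dt)
        lag = int(tau / dt)
        c1 = [0.0] * (n + 1)
        c2 = [0.0] * (n + 1)
        for i in range(n):
            t = i * dt
            inj = inj_rate if t < inj_end else 0.0     # injection stops at inj_end
            c1_del = c1[i - lag] if i >= lag else 0.0  # delayed concentrations
            c2_del = c2[i - lag] if i >= lag else 0.0
            c1[i + 1] = c1[i] + dt * (inj / V1 - k12 * c1[i] + k21 * c2_del)
            c2[i + 1] = c2[i] + dt * (k12 * c1_del - k21 * c2[i])
        return c1, c2

    c1, c2 = simulate()
    peak_t = max(range(len(c1)), key=c1.__getitem__) * 0.01
    print(f"peak concentration in compartment 1 at t = {peak_t:.1f} s")
    ```

    The peak time of the simulated curve is exactly the quantity one would use to time a CT series; a real application would replace the Euler loop with an adaptive DDE solver and fit the parameters to in vivo curves.
    
    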

  12. Exploring the potential of open big data from ticketing websites to characterize travel patterns within the Chinese high-speed rail system.

    Directory of Open Access Journals (Sweden)

    Sheng Wei

    Big data have contributed to deepening our understanding of many human systems, particularly human mobility patterns and the structure and functioning of transportation systems. Resonating with the recent call for 'open big data,' big data from various sources on a range of scales have become increasingly accessible to the public. However, open big data on travelers within public transit systems remain scarce, hindering further in-depth study of human mobility patterns. Here, we explore ticketing-website derived data that are publicly available but have been largely neglected. We demonstrate the power, potential and limitations of this open big data, using the Chinese high-speed rail (HSR) system as an example. Using an application programming interface, we automatically collected the data on the remaining tickets (RTD) for scheduled trains at the last second before departure in order to retrieve information on unused transit capacity, occupancy rate of trains, and passenger flux at stations. We show that this information is highly useful in characterizing the spatiotemporal patterns of traveling behaviors on the Chinese HSR, such as weekend traveling behavior, imbalanced commuting behavior, and station functionality. Our work facilitates the understanding of human traveling patterns along the Chinese HSR, and the functionality of the largest HSR system in the world. We expect our work to attract attention to this unique open big data source for the study of analogous transportation systems.
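    The collection-and-derivation step described above can be sketched as follows. The endpoint URL, train identifiers and seat capacities are hypothetical stand-ins (the abstract does not document the interface used); the point is how remaining tickets sampled just before departure translate into unused capacity and occupancy rate:

    ```python
    # Illustrative sketch only: fetch_remaining_tickets stands in for an HTTP
    # call such as GET https://tickets.example.com/api/remaining/<train_id>
    # (a hypothetical URL), here stubbed with fixed numbers so the derivation
    # runs offline. Train IDs and capacities are likewise made up.

    def fetch_remaining_tickets(train_id):
        sample = {"G1": 85, "G2": 12, "G3": 240}  # remaining seats at departure
        return sample[train_id]

    CAPACITY = {"G1": 556, "G2": 556, "G3": 1005}  # hypothetical seat counts

    def occupancy_rate(train_id):
        """Occupancy = sold seats / capacity, from the last RTD sample."""
        remaining = fetch_remaining_tickets(train_id)
        sold = CAPACITY[train_id] - remaining
        return sold / CAPACITY[train_id]

    for tid in ("G1", "G2", "G3"):
        print(f"{tid}: occupancy {occupancy_rate(tid):.1%}")
    ```

    Aggregating such per-train rates by station and departure time is what yields the weekend-travel and commuting-imbalance patterns the study reports.
    
    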

  13. An overview of big data and data science education at South African universities

    Directory of Open Access Journals (Sweden)

    Eduan Kotzé

    2016-02-01

    Man and machine are generating data electronically at an astronomical speed and in such a way that society is experiencing cognitive challenges to analyse this data meaningfully. Big data firms, such as Google and Facebook, identified this problem several years ago and are continuously developing new technologies or improving existing technologies in order to facilitate the cognitive analysis process of these large data sets. The purpose of this article is to contribute to our theoretical understanding of the role that big data might play in creating new training opportunities for South African universities. The article investigates emerging literature on the characteristics and main components of big data, together with the Hadoop application stack as an example of big data technology. Due to the rapid development of big data technology, a paradigm shift of human resources is required to analyse these data sets; therefore, this study examines the state of big data teaching at South African universities. This article also provides an overview of possible big data sources for South African universities, as well as relevant big data skills that data scientists need. The study also investigates existing academic programs in South Africa, where the focus is on teaching advanced database systems. The study found that big data and data science topics are introduced to students on a postgraduate level, but that the scope is very limited. This article contributes by proposing important theoretical topics that could be introduced as part of the existing academic programs. More research is required, however, to expand these programs in order to meet the growing demand for data scientists with big data skills.

  14. Surface urban heat island across 419 global big cities.

    Science.gov (United States)

    Peng, Shushi; Piao, Shilong; Ciais, Philippe; Friedlingstein, Pierre; Ottle, Catherine; Bréon, François-Marie; Nan, Huijuan; Zhou, Liming; Myneni, Ranga B

    2012-01-17

    Urban heat island is among the most evident aspects of human impacts on the earth system. Here we assess the diurnal and seasonal variation of surface urban heat island intensity (SUHII), defined as the surface temperature difference between urban and suburban areas measured from MODIS. Differences in SUHII are analyzed across 419 global big cities, and we assess several potential biophysical and socio-economic driving factors. Across the big cities, we show that the average annual daytime SUHII (1.5 ± 1.2 °C) is higher than the annual nighttime SUHII (1.1 ± 0.5 °C) (P < 0.001). But no correlation is found between daytime and nighttime SUHII across big cities (P = 0.84), suggesting different driving mechanisms between day and night. The distribution of nighttime SUHII correlates positively with the difference in albedo and nighttime light between urban and suburban areas, while the distribution of daytime SUHII correlates negatively across cities with the difference in vegetation cover and activity between urban and suburban areas. Our results emphasize the key role of vegetation feedbacks in attenuating SUHII of big cities during the day, in particular during the growing season, further highlighting that increasing urban vegetation cover could be one effective way to mitigate the urban heat island effect.

  15. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  16. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  17. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  18. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  19. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    Energy Technology Data Exchange (ETDEWEB)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  20. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  1. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  2. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  3. Today and tomorrow of intravenous coronary angiography programme in Japan

    International Nuclear Information System (INIS)

    Ando, Masami; Hyodo, Kazuyuki

    1994-01-01

    Development of an intravenous coronary angiography system using monochromated synchrotron radiation at the Photon Factory is described. This comprises an asymmetrically cut silicon monochromator crystal to obtain a larger exposure area, a two-dimensional imaging system using an image intensifier coupled to a CCD TV camera, and a fast video data acquisition system. The whole system is under development using live dogs. A future system including a dedicated insertion device applicable to live humans is also proposed. (author)

  4. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  5. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. Existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data. The peculiarities of applying the proposed model components are analyzed. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  6. BigWig and BigBed: enabling browsing of large distributed datasets.

    Science.gov (United States)

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and Linux and UNIX operating systems, R trees, and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
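    The key design point, an index that lets a reader fetch only the blocks covering a requested region rather than the whole file, can be illustrated with a toy indexed binary file. The flat dictionary index and fixed-size record layout below are deliberate simplifications, not the real BigWig format (which uses R trees over compressed, multi-resolution blocks):

    ```python
    # Toy version of indexed random access: write (position, value) records to a
    # binary file, remember each record's byte offset, then serve a region query
    # by seeking directly to the matching records instead of scanning the file.

    import os
    import struct
    import tempfile

    records = [(pos, float(pos % 7)) for pos in range(0, 1000, 10)]

    fd, path = tempfile.mkstemp()
    os.close(fd)
    index = {}  # position -> byte offset (stand-in for an R-tree interval index)
    with open(path, "wb") as f:
        for pos, val in records:
            index[pos] = f.tell()
            f.write(struct.pack("<if", pos, val))  # 8 bytes per record

    def query(path, index, start, end):
        """Read only the records whose position falls in [start, end)."""
        hits = []
        with open(path, "rb") as f:
            for pos in sorted(p for p in index if start <= p < end):
                f.seek(index[pos])                 # jump straight to the record
                p, v = struct.unpack("<if", f.read(8))
                hits.append((p, v))
        return hits

    result = query(path, index, 100, 140)  # reads 4 records, not the whole file
    print(result)
    ```

    A remote reader does the same thing with HTTP range requests: fetch the index once, then request only the byte ranges the current browser view needs.
    
    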

  7. Intravenous cidofovir for resistant cutaneous warts in a patient with psoriasis treated with monoclonal antibodies.

    LENUS (Irish Health Repository)

    McAleer, M A

    2012-02-01

    Human papilloma virus is a common and often distressing cutaneous disease. It can be therapeutically challenging, especially in immunocompromised patients. We report a case of recalcitrant cutaneous warts that resolved with intravenous cidofovir treatment. The patient was immunocompromised secondary to monoclonal antibody therapy for psoriasis.

  8. Computed tomography intravenous cholangiography

    International Nuclear Information System (INIS)

    Nascimento, S.; Murray, W.; Wilson, P.

    1997-01-01

    Indications for direct visualization of the bile ducts include bile duct dilatation demonstrated by ultrasound or computed tomography (CT) scanning, where the cause of the bile duct dilatation is uncertain or where the anatomy of bile duct obstruction needs further clarification. Another indication is right upper quadrant pain, particularly in a post-cholecystectomy patient, where choledocholithiasis is suspected. A possible new indication is pre-operative evaluation prior to laparoscopic cholecystectomy. The bile ducts are usually studied by endoscopic retrograde cholangiopancreatography (ERCP), or, less commonly, trans-hepatic cholangiography. The old technique of intravenous cholangiography has fallen into disrepute because of inconsistent bile-duct opacification. The advent of spiral CT scanning has renewed interest in intravenous cholangiography. The CT technique is very sensitive to the contrast agent in the bile ducts, and angiographic and three-dimensional reconstructions of the biliary tree can readily be obtained using the CT intravenous cholangiogram technique (CT IVC). Seven patients have been studied using this CT IVC technique, between February 1995 and June 1996, and are the subject of the present report. Eight further studies have since been performed. The results suggest that CT IVC could replace ERCP as the primary means of direct cholangiography, where pancreatic duct visualization is not required. (authors)

  9. Big data-driven business how to use big data to win customers, beat competitors, and boost profits

    CERN Document Server

    Glass, Russell

    2014-01-01

    Get the expert perspective and practical advice on big data The Big Data-Driven Business: How to Use Big Data to Win Customers, Beat Competitors, and Boost Profits makes the case that big data is for real, and more than just big hype. The book uses real-life examples-from Nate Silver to Copernicus, and Apple to Blackberry-to demonstrate how the winners of the future will use big data to seek the truth. Written by a marketing journalist and the CEO of a multi-million-dollar B2B marketing platform that reaches more than 90% of the U.S. business population, this book is a comprehens

  10. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  11. SDN Low Latency for Medical Big Data Using Wavelets

    Directory of Open Access Journals (Sweden)

    Fadia Shah

    2017-06-01

    The new era is the age of 5G. The network has moved from simple internet connections toward advanced LTE connections and transmission, and information and communication technology has reshaped telecommunication. Among the many types of big data, medical big data is one of the most sensitive. Wavelet transforms are a technical tool to reduce the size of these data and thereby make them available to users for longer; they also enable low-latency, high-speed data transmission over the network. The key concern is that medical big data should be accurate and reliable enough that the recommended treatment is the appropriate one. This paper proposes a scheme to support data availability without losing crucial information, via wavelet-based compression of the medical data and an SDN-supported architecture that makes the data available over the wireless network. Such a scheme favors the efficient use of technology for the benefit of human beings in support of medical treatment.

  12. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  13. Five Big, Big Five Issues : Rationale, Content, Structure, Status, and Crosscultural Assessment

    NARCIS (Netherlands)

    De Raad, Boele

    1998-01-01

    This article discusses the rationale, content, structure, status, and crosscultural assessment of the Big Five trait factors, focusing on topics of dispute and misunderstanding. Taxonomic restrictions of the original Big Five forerunner, the "Norman Five," are discussed, and criticisms regarding the

  14. Intentional intravenous mercury injection | Yudelowitz | South African ...

    African Journals Online (AJOL)

    Intravenous mercury injection is rarely seen, with few documented cases. Treatment strategies are not clearly defined for such cases, although a few options do show benefit. This case report describes a 29-year-old man suffering from bipolar disorder, who presented following self-inflicted intravenous injection of mercury.

  15. Acute effect of intravenously applied alcohol in the human striatal and extrastriatal D2 /D3 dopamine system.

    Science.gov (United States)

    Pfeifer, Philippe; Tüscher, Oliver; Buchholz, Hans Georg; Gründer, Gerhard; Vernaleken, Ingo; Paulzen, Michael; Zimmermann, Ulrich S; Maus, Stephan; Lieb, Klaus; Eggermann, Thomas; Fehr, Christoph; Schreckenberger, Mathias

    2017-09-01

    Investigations on the acute effects of alcohol in the human mesolimbic dopamine D2/D3 receptor system have yielded conflicting results. With respect to the effects of alcohol on extrastriatal D2/D3 dopamine receptors, no investigations have been reported yet. Therefore we applied PET imaging using the postsynaptic dopamine D2/D3 receptor ligand [18F]fallypride, addressing the question whether intravenously applied alcohol stimulates the extrastriatal and striatal dopamine system. We measured subjective effects of alcohol and performed correlation analyses with the striatal and extrastriatal D2/D3 binding potential. Twenty-four healthy male μ-opioid receptor (OPRM1) 118G allele carriers underwent a standardized intravenous and placebo alcohol administration. The subjective effects of alcohol were measured with a visual analogue scale. For the evaluation of the dopamine response we calculated the binding potential (BPND) using the simplified reference tissue model (SRTM). In addition, we calculated distribution volumes (target and reference regions) in 10 subjects for which metabolite-corrected arterial samples were available. In the alcohol condition no significant dopamine response in terms of a reduction of BPND was observed in striatal and extrastriatal brain regions. We found a positive correlation between 'liking' alcohol and the BPND in extrastriatal brain regions (inferior frontal cortex (IFC) (r = 0.533, p = 0.007), orbitofrontal cortex (OFC) (r = 0.416, p = 0.043) and prefrontal cortex (PFC) (r = 0.625, p = 0.001)). The acute alcohol effects on the D2/D3 dopamine receptor binding potential of the striatal and extrastriatal system in our experiment were insignificant. A positive correlation of the subjective effect of 'liking' alcohol with cortical D2/D3 receptors may hint at an addiction-relevant trait. © 2016 Society for the Study of Addiction.

  16. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

    Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data, and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only … framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  17. Intravenous iron-containing products: EMA procrastination.

    Science.gov (United States)

    2014-07-01

    A European reassessment has led to identical changes in the summaries of product characteristics (SPCs) for all intravenous iron-containing products: the risk of serious adverse effects is now highlighted, underlining the fact that intravenous iron-containing products should only be used when the benefits clearly outweigh the harms. Unfortunately, iron dextran still remains on the market despite a higher risk of hypersensitivity reactions than with iron sucrose.

  18. Big Data and HPC collocation: Using HPC idle resources for Big Data Analytics

    OpenAIRE

    MERCIER , Michael; Glesser , David; Georgiou , Yiannis; Richard , Olivier

    2017-01-01

    International audience; Executing Big Data workloads upon High Performance Computing (HPC) infrastructures has become an attractive way to improve their performance. However, the collocation of HPC and Big Data workloads is not an easy task, mainly because of their core concepts' differences. This paper focuses on the challenges related to the scheduling of both Big Data and HPC workloads on the same computing platform. In classic HPC workloads, the rigidity of jobs tends to create holes in ...

  19. Intravenous lidocaine for postmastectomy pain treatment: randomized, blind, placebo controlled clinical trial

    Directory of Open Access Journals (Sweden)

    Tania Cursino de Menezes Couceiro

    2015-06-01

    Full Text Available BACKGROUND AND OBJECTIVE: Postoperative pain treatment in mastectomy remains a major challenge despite the multimodal approach. The aim of this study was to investigate the analgesic effect of intravenous lidocaine in patients undergoing mastectomy, as well as postoperative opioid consumption. METHODS: After approval by the Human Research Ethics Committee of the Instituto de Medicina Integral Prof. Fernando Figueira in Recife, Pernambuco, a randomized, blind, controlled trial was conducted with intravenous lidocaine at a dose of 3 mg/kg infused over 1 h in 45 women undergoing mastectomy under general anesthesia. One patient from the placebo group was excluded. RESULTS: Groups were similar in age, body mass index, type of surgery, and postoperative need for opioids. Two of 22 patients in the lidocaine group and three of 22 patients in the placebo group requested opioids (p = 0.50). Pain on awakening was identified in 4/22 of the lidocaine group and 5/22 of the placebo group (p = 0.50); in the post-anesthetic recovery room, in 14/22 and 12/22 (p = 0.37) of the lidocaine and placebo groups, respectively. Pain evaluation 24 h after surgery showed that 2/22 and 3/22 patients (p = 0.50) of the lidocaine and placebo groups, respectively, complained of pain. CONCLUSION: Intravenous lidocaine at a dose of 3 mg/kg administered over a period of an hour during mastectomy did not promote additional analgesia compared with placebo in the first 24 h, and did not decrease opioid consumption. However, a beneficial effect of intravenous lidocaine in selected patients and/or with other therapeutic regimens cannot be ruled out.

  20. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big...... data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects...... shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact....

  1. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

    Full Text Available Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. That value is defined not only by extracting value from huge data sets, as fast and optimally as possible, but also by extracting value from uncertain and inaccurate data, in an innovative manner, using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on processes worldwide.

  2. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    Science.gov (United States)

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking to problem definition in order to address science challenges.

  3. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing the structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone 'stars', or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  4. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  5. Intravenous Lipid Emulsion as an Antidote for the Treatment of Acute Poisoning: A Bibliometric Analysis of Human and Animal Studies.

    Science.gov (United States)

    Zyoud, Sa'ed H; Waring, W Stephen; Al-Jabi, Samah W; Sweileh, Waleed M; Rahhal, Belal; Awang, Rahmat

    2016-11-01

    In recent years, there has been increasing interest in the role of intravenous lipid formulations as potential antidotes in patients with severe cardiotoxicity caused by drug toxicity. The aim of this study was to conduct a comprehensive bibliometric analysis of all human and animal studies featuring lipid emulsion as an antidote for the treatment of acute poisoning. The Scopus database was searched on 5 February 2016 to analyse the research output related to intravenous lipid emulsion as an antidote for the treatment of acute poisoning. Research indicators used for analysis included total number of articles, date (year) of publication, total citations, value of the h-index, document types, countries of publication, journal names, collaboration patterns and institutions. A total of 594 articles were retrieved from the Scopus database for the period 1955-2015. The percentage share of global intravenous lipid emulsion research output showed that 85.86% of the output appeared in 2006-2015, with an average growth in this field of 51 articles per year. The USA, United Kingdom (UK), France, Canada, New Zealand, Germany, Australia, China, Turkey and Japan accounted for 449 (75.6%) of all the publications. The total number of citations for all documents was 9,333, with an average of 15.7 citations per document. The h-index of the retrieved documents on lipid emulsion as an antidote for the treatment of acute poisoning was 49. The USA and the UK achieved the highest h-indices, 34 and 14, respectively. New Zealand produced the greatest proportion of documents with international collaboration (51.9%), followed by Australia (50%) and Canada (41.4%), out of the total number of publications for each country. In summary, we found an increase in the number of publications in the field of lipid emulsion after 2006. The results of this study demonstrate that the majority of publications in the field were published by high-income countries. Researchers from
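    The h-index figures above can be reproduced mechanically from per-paper citation counts. A minimal sketch; the citation lists below are invented for illustration, not the study's data:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank       # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```

    The same routine applied to a country's subset of papers yields the per-country h-indices (34 for the USA, 14 for the UK) reported in the record.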

  6. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features impact paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
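    The spurious-correlation point can be seen in a few lines: with far more pure-noise predictors than samples, some predictors inevitably look strongly correlated with a pure-noise response. A minimal sketch with arbitrary dimensions (50 samples, 5,000 predictors):

```python
import numpy as np

# Pure noise everywhere: y and all d predictor columns are independent N(0, 1).
rng = np.random.default_rng(42)
n, d = 50, 5000
y = rng.standard_normal(n)
X = rng.standard_normal((n, d))

# Sample correlation of y with each of the d independent noise columns.
yc = (y - y.mean()) / y.std()
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
corrs = Xc.T @ yc / n

# The typical correlation is small, but the largest one is far from zero,
# even though no predictor has any true relationship with y.
print(f"mean |corr| = {np.abs(corrs).mean():.3f}")
print(f"max  |corr| = {np.abs(corrs).max():.3f}")
```

    The maximum spurious correlation grows roughly like sqrt(2 log d / n), which is why naive variable screening over many candidates produces false discoveries.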

  7. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

    Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, identify linguistic correlations in large corpuses of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  8. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

    , modern astronomy requires big data know-how, in particular it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: Astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing......, and highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications....

  9. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  10. INTRAVENOUS REGIONAL ANTIBIOTIC PERFUSION THERAPY AS AN ADJUNCTIVE TREATMENT FOR DIGITAL LESIONS IN SEABIRDS.

    Science.gov (United States)

    Fiorello, Christine V

    2017-03-01

    Foot infections are a common problem among seabirds in wildlife rehabilitation. Pododermatitis and digital infections are often challenging to treat because of the presence of suboptimal substrates, abnormal weight-bearing due to injuries, and suboptimal nutritional or health status. Seabirds represent the majority of animals requiring rehabilitation after oil spills, and foot problems are a common reason for euthanasia among these birds. Antibiotic intravenous regional perfusion therapy is frequently used in humans and other species to treat infections of the distal extremities, but it has not been evaluated in seabirds. During the 2015 Refugio oil spill response, four birds with foot lesions (pododermatitis, osteomyelitis, or both) were treated with ampicillin/sulbactam administered intravenously to the affected limb(s) in addition to systemic antibiotics and anti-inflammatories. Three of the birds, all brown pelicans (Pelecanus occidentalis), recovered rapidly and were released. Two of these birds had acute pododermatitis and were treated once with intravenous regional perfusion. They were released approximately 3 wk after the perfusion therapy. The third pelican had osteomyelitis of a digit. It was treated twice with intravenous regional perfusion and was released about 1 mo after the initial perfusion therapy. The fourth bird, a Pacific loon (Gavia pacifica), was treated once with perfusion therapy but did not respond to treatment and was euthanatized. No serious adverse effects were observed. This technique should be explored further in avian species.

  11. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  12. Real-Time Information Extraction from Big Data

    Science.gov (United States)

    2015-10-01

    Introduction: Enormous amounts of data are being generated by a large number of sensors and devices (Internet of Things: IoT), and this data is ... brief summary in Section 7. Data Access Patterns for Current and Big Data Systems: Many current solution architectures rely on accessing data resident ... by highly skilled human experts based on their intuition and vast knowledge. We do not have, and cannot produce, enough experts to fill our

  13. Big Five Personality Factors and Facets as Predictors of Openness to Diversity.

    Science.gov (United States)

    Han, Suejung; Pistole, M Carole

    2017-11-17

    Openness to diversity is a crucial component of the cultural competence needed in an increasingly diversified modern society and a necessary condition for benefitting from diversity contacts and interventions (e.g., diversity training, cultural courses). Responding to the recent call for more research on personality and its relation to diversity outcomes, we examined the associations between Big Five personality (i.e., Openness to Experience, Agreeableness, Extraversion, Neuroticism, and Conscientiousness) higher-order factors and lower-order facets and universal-diverse orientation (i.e., an open attitude of appreciating human universality and diversity; Miville et al., 1999). In the Study 1 (N = 338) web survey on Big Five factors, Openness to Experience and Agreeableness were significantly associated with universal-diverse orientation. In the Study 2 (N = 176) paper survey on both Big Five factors and facets, Openness to Experience, low Neuroticism, and Conscientiousness, as well as various lower-order facets of all the Big Five factors, were significantly associated with universal-diverse orientation. Practical implications are suggested for how personality facets could be incorporated into current diversity interventions to enhance their effectiveness in promoting openness to diversity.

  14. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  15. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Finally, the methodology and working of a system that will use this data are briefly described.

  16. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us into the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms, with examples of both "unsupervised" and "supervised" approaches. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
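    As a toy illustration of the review's two themes, here is a minimal NumPy sketch of an unsupervised method (2-means clustering) next to a supervised one (nearest-centroid classification). The data, dimensions, and initialization scheme are all invented for the example:

```python
import numpy as np

# Synthetic 1-D data: two well-separated groups of 50 points each.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 0.3, 50), rng.normal(5.0, 0.3, 50)])

# Unsupervised: 2-means on unlabeled data, centroids seeded at the extremes.
centroids = np.array([x.min(), x.max()])
for _ in range(10):
    labels = np.abs(x[:, None] - centroids[None, :]).argmin(axis=1)
    centroids = np.array([x[labels == k].mean() for k in (0, 1)])

# Supervised: known labels fit per-class centroids, which classify new points.
y = np.array([0] * 50 + [1] * 50)
class_means = np.array([x[y == 0].mean(), x[y == 1].mean()])
new_points = np.array([0.1, 4.9])
pred = np.abs(new_points[:, None] - class_means[None, :]).argmin(axis=1)
print(centroids, pred)
```

    The contrast is the one the review draws: the unsupervised pass discovers the two groups from x alone, while the supervised pass uses the labels y to build a classifier for unseen points.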

  17. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    Science.gov (United States)

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived-notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  18. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? And what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second-highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  19. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  20. Cardiovascular effects of intravenous ghrelin infusion in healthy young men

    DEFF Research Database (Denmark)

    Vestergaard, Esben Thyssen; Andersen, Niels Holmark; Hansen, Troels Krarup

    2007-01-01

    Ghrelin infusion improves cardiac function in patients suffering from cardiac failure, and bolus administration of ghrelin increases cardiac output in healthy subjects. The cardiovascular effects of more continuous intravenous ghrelin exposure remain to be studied. We therefore studied the cardiovascular effects of a constant infusion of human ghrelin at a rate of 5 pmol/kg per minute for 180 min. Fifteen healthy, young (aged 23.2 ± 0.5 yr), normal-weight (23.0 ± 0.4 kg/m2) men volunteered in a randomized double-blind, placebo-controlled crossover study. With the subjects remaining fasting, peak myocardial systolic velocity S′, tissue tracking TT, left ventricular ejection fraction EF, and endothelium-dependent flow-mediated vasodilatation were measured. Ghrelin infusion increased S′ 9% (P = 0.002) and TT 10% (P

  1. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  2. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  3. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data imply also specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges, which ultimately seems to be of greater importance than the sheer data volume.

  4. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, relevant evidence is examined, including the discrepancies in the age of the big-bang Universe, the red shifts of quasars, the microwave background radiation, general-relativity aspects such as the change of the gravitational constant with time, and quantum-theory considerations. It is felt that the arguments considered show that the big-bang picture is not as soundly established, either theoretically or observationally, as it is usually claimed to be, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  5. The big five personality traits and environmental concern: the moderating roles of individualism/collectivism and gender

    OpenAIRE

    Abbas Abdollahi; Simin Hosseinian; Samaneh Karbalaei; Ahmad Beh‐Pajooh; Yousef Kesh; Mahmoud Najafi

    2017-01-01

    Environmental pollution has become a serious challenge for humanity and the environment. Therefore, this study aims to examine the relationships between the Big Five personality traits, individualism, collectivism, participant’s age, and environmental concern, and testing the moderating roles of individualism/collectivism and gender in the relationship between the Big Five personality traits and environmental concern. In this quantitative study, the multi-stage cluster random sampling method ...

  6. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  7. Microbiological quality of some brands of intravenous fluids ...

    African Journals Online (AJOL)

    Microbiological quality of some brands of intravenous fluids produced by some pharmaceutical companies in Nigeria was investigated. Membrane filtration method was used for concentration of contaminating organisms in the intravenous fluids. Thioglycollate medium, Tryptone Soya broth, Brilliant Green Agar ...

  8. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health record data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data-processing software platforms are given.

  9. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  10. Usefulness of high-dose intravenous human immunoglobulins treatment for refractory recurrent pericarditis.

    Science.gov (United States)

    Moretti, Michele; Buiatti, Alessandra; Merlo, Marco; Massa, Laura; Fabris, Enrico; Pinamonti, Bruno; Sinagra, Gianfranco

    2013-11-01

    The management of refractory recurrent pericarditis is challenging. Previous clinical reports have noted a beneficial effect of high-dose intravenous human immunoglobulins (IvIgs) in isolated and systemic inflammatory disease-related forms. In this article, we analyzed retrospectively our clinical experience with IvIg therapy in a series of clinical cases of pericarditis refractory to conventional treatment. We retrospectively analyzed 9 patients (1994 to 2010) with refractory recurrent pericarditis, who received high-dose IvIg as a part of their medical treatment. Nonsteroidal anti-inflammatory drugs (NSAIDs), steroids, or colchicine treatment was not discontinued during IvIg treatment. No patients had a history of autoimmune or connective tissue diseases. During an average period of 11 months from the first recurrence, patients had experienced a mean of 5 relapses before the first IvIg treatment. In 4 cases, patients showed complete clinical remission with no further relapse after the first IvIg cycle. Two patients experienced a single minor relapse, responsive to short-term nonsteroidal anti-inflammatory drugs. In 2 patients, we performed a second cycle of IvIg after a recurrence of pericarditis, with subsequent complete remission. One patient did not respond to 3 cycles of IvIg and subsequently underwent pericardial window and long-term immunosuppressive treatment. No major adverse effect was observed in consequence of IvIg administration in all the cases. In conclusion, although IvIg mode of action is still poorly understood in this setting, this treatment can be considered as an option in patients with recurrent pericarditis refractory to conventional medical treatment and, in our small series, has proved to be effective in 8 of 9 cases. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; we therefore compare and contrast the two geometries throughout.

  12. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

    Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research let alone the boundaries of organizations and the established practices of decision-making. Coined ‘open data’ and ‘big data......’, these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD......) by employing the dimensions of ‘order’ and ‘relationality’. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two...

  13. Administration costs of intravenous biologic drugs for rheumatoid arthritis

    OpenAIRE

    Soini, Erkki J; Leussu, Miina; Hallinen, Taru

    2013-01-01

    Background Cost-effectiveness studies explicitly reporting infusion times, drug-specific administration costs for infusions or real-payer intravenous drug cost are few in number. Yet, administration costs for infusions are needed in the health economic evaluations assessing intravenously-administered drugs. Objectives To estimate the drug-specific administration and total cost of biologic intravenous rheumatoid arthritis (RA) drugs in the adult population and to compare the obtained costs wit...

  14. Resources available for autism research in the big data era: a systematic review

    Directory of Open Access Journals (Sweden)

    Reem Al-jawahiri

    2017-01-01

    Full Text Available Recently, there has been a move encouraged by many stakeholders towards generating big, open data in many areas of research. One area where big, open data is particularly valuable is in research relating to complex heterogeneous disorders such as Autism Spectrum Disorder (ASD. The inconsistencies of findings and the great heterogeneity of ASD necessitate the use of big and open data to tackle important challenges such as understanding and defining the heterogeneity and potential subtypes of ASD. To this end, a number of initiatives have been established that aim to develop big and/or open data resources for autism research. In order to provide a useful data reference for autism researchers, a systematic search for ASD data resources was conducted using the Scopus database, the Google search engine, and the pages on ‘recommended repositories’ by key journals, and the findings were translated into a comprehensive list focused on ASD data. The aim of this review is to systematically search for all available ASD data resources providing the following data types: phenotypic, neuroimaging, human brain connectivity matrices, human brain statistical maps, biospecimens, and ASD participant recruitment. A total of 33 resources were found containing different types of data from varying numbers of participants. Description of the data available from each data resource, and links to each resource is provided. Moreover, key implications are addressed and underrepresented areas of data are identified.

  15. Low-dose intravenous lidocaine as treatment for proctalgia fugax.

    Science.gov (United States)

    Peleg, Roni; Shvartzman, Pesach

    2002-01-01

Proctalgia fugax is characterized by a sudden attack of pain of short duration in the internal anal sphincter and anorectal ring. We describe the influence of intravenous lidocaine treatment on proctalgia fugax. A 28-year-old patient had suffered from proctalgia fugax for 8 months. Conventional treatment efforts did not improve his condition. A single dose of an intravenous lidocaine infusion completely stopped his pain attacks. Based on the experience reported in this case and the potential benefit of this treatment for proctalgia fugax, controlled studies comparing intravenous lidocaine with placebo should be conducted to confirm the observation and to provide a more concrete basis for the use of intravenous lidocaine for this indication.

  16. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data focuses on the temporal stability of an association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are distinct not only from big data of other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.
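The abstract names propensity score analysis as one remedy for the confounding inherent in observational data. As a hedged illustration (not the method of any study cited here), a minimal sketch of inverse-probability-of-treatment weighting with made-up propensity scores and outcomes:

```python
def iptw_weights(treated, propensity):
    """Inverse-probability-of-treatment weights.

    treated: list of 0/1 treatment indicators
    propensity: list of estimated P(treatment=1 | covariates)
    """
    return [1.0 / p if t == 1 else 1.0 / (1.0 - p)
            for t, p in zip(treated, propensity)]

def weighted_mean(values, weights):
    """Weighted average of a list of values."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Illustrative (fabricated) data: treatment flag, estimated propensity, outcome.
treated    = [1, 1, 0, 0, 1, 0]
propensity = [0.8, 0.6, 0.3, 0.2, 0.5, 0.4]
outcome    = [3.0, 2.5, 1.0, 1.2, 2.8, 1.5]

w = iptw_weights(treated, propensity)
treated_mean = weighted_mean([y for y, t in zip(outcome, treated) if t == 1],
                             [wi for wi, t in zip(w, treated) if t == 1])
control_mean = weighted_mean([y for y, t in zip(outcome, treated) if t == 0],
                             [wi for wi, t in zip(w, treated) if t == 0])
effect = treated_mean - control_mean  # weighted difference in means
```

In practice the propensity scores would themselves be estimated, e.g. by logistic regression on measured confounders; the weighting step above is the same either way.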

  17. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing approaches. Big data focuses on the temporal stability of an association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are distinct not only from big data of other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  18. What is beyond the big five?

    Science.gov (United States)

    Saucier, G; Goldberg, L R

    1998-08-01

    Previous investigators have proposed that various kinds of person-descriptive content--such as differences in attitudes or values, in sheer evaluation, in attractiveness, or in height and girth--are not adequately captured by the Big Five Model. We report on a rather exhaustive search for reliable sources of Big Five-independent variation in data from person-descriptive adjectives. Fifty-three candidate clusters were developed in a college sample using diverse approaches and sources. In a nonstudent adult sample, clusters were evaluated with respect to a minimax criterion: minimum multiple correlation with factors from Big Five markers and maximum reliability. The most clearly Big Five-independent clusters referred to Height, Girth, Religiousness, Employment Status, Youthfulness and Negative Valence (or low-base-rate attributes). Clusters referring to Fashionableness, Sensuality/Seductiveness, Beauty, Masculinity, Frugality, Humor, Wealth, Prejudice, Folksiness, Cunning, and Luck appeared to be potentially beyond the Big Five, although each of these clusters demonstrated Big Five multiple correlations of .30 to .45, and at least one correlation of .20 and over with a Big Five factor. Of all these content areas, Religiousness, Negative Valence, and the various aspects of Attractiveness were found to be represented by a substantial number of distinct, common adjectives. Results suggest directions for supplementing the Big Five when one wishes to extend variable selection outside the domain of personality traits as conventionally defined.

  19. Big Data Analytics and Its Applications

    Directory of Open Access Journals (Sweden)

    Mashooque A. Memon

    2017-10-01

Full Text Available The term "Big Data" was coined to refer to the extensive volume of data that cannot be managed by traditional data-handling methods or techniques. The field of Big Data plays an indispensable role in many areas, such as agriculture, banking, data mining, education, chemistry, finance, cloud computing, marketing, healthcare and stocks. Big data analytics is the method of examining big data to reveal hidden patterns, unknown relationships and other important information that can be utilized to make better decisions. There has been a perpetually expanding interest in big data because of its rapid growth and because it covers so many areas of application. The open-source Apache Hadoop technology, written in Java and running on the Linux operating system, was used. The primary contribution of this research is to present an effective and free solution for big data applications in a distributed environment, with its advantages and its ease of use, and to provide an analytical review of new developments in big data technology. Healthcare is one of the foremost concerns of the world. Big data in healthcare refers to electronic health data sets related to patient health and well-being. Data in the healthcare area are growing beyond the management capacity of healthcare organizations and are expected to increase significantly in the coming years.
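The abstract credits Apache Hadoop, a Java framework, but the MapReduce pattern it implements can be sketched in a few lines of plain Python. This is the canonical word-count example, for illustration only, not Hadoop's actual API:

```python
from collections import defaultdict

def map_phase(documents):
    """Emit (word, 1) pairs, as a Hadoop mapper would."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle(pairs):
    """Group values by key between the map and reduce phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the counts for each word, as a Hadoop reducer would."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data in healthcare", "big data analytics"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts["big"] == 2, counts["data"] == 2
```

The point of the pattern is that map and reduce are independent per key, so a framework like Hadoop can distribute them across many machines.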

  20. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  1. Lecture 10: The European Bioinformatics Institute - "Big data" for biomedical sciences

    CERN Multimedia

    CERN. Geneva; Dana, Jose

    2013-01-01

Part 1: Big data for biomedical sciences (Tom Hancocks) Ten years ago witnessed the completion of the first international 'Big Biology' project, which sequenced the human genome. In the years since, the biological sciences have seen a vast growth in data. In the coming years, advances will come from the integration of experimental approaches and their translation into applied technologies in the hospital, the clinic and even the home. This talk will examine the development of infrastructure, physical and virtual, that will allow millions of life scientists across Europe better access to biological data. Tom studied Human Genetics at the University of Leeds and McMaster University, before completing an MSc in Analytical Genomics at the University of Birmingham. He has worked for the UK National Health Service in diagnostic genetics and in training healthcare scientists and clinicians in bioinformatics. Tom joined the EBI in 2012 and is responsible for the scientific development and delivery of training for the BioMedBridges pr...

  2. Cost-minimization of mabthera intravenous versus subcutaneous administration

    NARCIS (Netherlands)

    Bax, P.; Postma, M.J.

    2013-01-01

    Objectives: To identify and compare all costs related to preparing and administrating MabThera for the intravenous and subcutaneous formulations in Dutch hematological patients. The a priori notion is that the costs of subcutaneous MabThera injections are lower compared to intravenous infusion due

  3. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  4. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-α forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-α forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
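The quoted resolution R = λ/Δλ fixes the smallest wavelength interval the spectrograph can separate at a given wavelength. A quick check of that relation using the survey limits quoted in the abstract:

```python
def delta_lambda(wavelength_nm, resolving_power):
    """Smallest resolvable wavelength interval, from R = lambda / delta_lambda."""
    return wavelength_nm / resolving_power

# At the blue end of the BigBOSS range (340 nm) with R = 3000:
blue = delta_lambda(340.0, 3000.0)   # ~0.113 nm
# At the red end (1060 nm) with R = 4800:
red = delta_lambda(1060.0, 4800.0)   # ~0.221 nm
```

So across its band the instrument resolves features on the order of a tenth to a fifth of a nanometre, which is what makes redshift measurement from narrow emission lines like [OII] feasible.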

  5. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and advance education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education, and critically explores the perils of applying a data-driven approach to education. Despite the claimed value of the...

  6. Mayer Rokitansky Kuster Hauser (MRKH) syndrome with absent thumbs and big toes.

    Science.gov (United States)

    Yunus, Mahira

    2014-01-01

Mayer-Rokitansky-Kuster-Hauser (MRKH) syndrome is a rare developmental failure of the Müllerian ducts. The principal clinical features of MRKH syndrome are primary amenorrhoea associated with congenital absence of the vagina, uterine anomalies, normal ovaries, a 46 XX karyotype with normal female secondary sexual characteristics, and frequent association with renal, skeletal, and other congenital anomalies. A case of a 3-year-old child with congenitally absent thumbs and big toes is reported herein; she was brought in with complaints of urinary incontinence. Radiological investigation (ultrasound and magnetic resonance imaging (MRI) scan) revealed an absent uterus and vagina, while both ovaries were normal. An intravenous urography (IVU) study showed bifid pelvicalyceal systems bilaterally. Karyotyping revealed a 46 XX female phenotype. Laparoscopy confirmed normal ovaries bilaterally and small unfused uterine buds lying beside both ovaries on each side of the pelvis. Early diagnosis of MRKH syndrome is essential for timely planning of vaginal and (if possible) uterine reconstructive surgeries.

  7. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

    The paper discusses the rewards and challenges of employing commercial audience measurements data – gathered by media industries for profitmaking purposes – in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption...... communication systems, language and behavior appear as texts, outputs, and discourses (data to be ‘found’) – big data then documents things that in earlier research required interviews and observations (data to be ‘made’) (Jensen 2014). However, web-measurement enterprises build audiences according...... to a commercial logic (boyd & Crawford 2011) and is as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and ‘thick descriptions’ (Geertz 1973) scholars need to question how ethnographic fieldwork might map the ‘data not seen...

  8. Structural Insights into Arl1-Mediated Targeting of the Arf-GEF BIG1 to the trans-Golgi

    Directory of Open Access Journals (Sweden)

    Antonio Galindo

    2016-07-01

    Full Text Available The GTPase Arf1 is the major regulator of vesicle traffic at both the cis- and trans-Golgi. Arf1 is activated at the cis-Golgi by the guanine nucleotide exchange factor (GEF GBF1 and at the trans-Golgi by the related GEF BIG1 or its paralog, BIG2. The trans-Golgi-specific targeting of BIG1 and BIG2 depends on the Arf-like GTPase Arl1. We find that Arl1 binds to the dimerization and cyclophilin binding (DCB domain in BIG1 and report a crystal structure of human Arl1 bound to this domain. Residues in the DCB domain that bind Arl1 are required for BIG1 to locate to the Golgi in vivo. DCB domain-binding residues in Arl1 have a distinct conformation from those in known Arl1-effector complexes, and this plasticity allows Arl1 to interact with different effectors of unrelated structure. The findings provide structural insight into how Arf1 GEFs, and hence active Arf1, achieve their correct subcellular distribution.

  9. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  10. Comparison of the pharmacokinetics of imipenem after intravenous and intrathecal administration in rabbits.

    Science.gov (United States)

    Wang, Y; Qiu, L; Dong, J; Wang, B; Shi, Z; Liu, B; Wang, W; Zhang, J; Cai, S; Ye, G; Cai, X

    2013-03-01

Intrathecal administration of antibiotics is potentially highly effective for the treatment of severe intracranial infections, particularly nosocomial meningitis. The use of intrathecal injection of antibiotics has been reported mostly in case reports. However, there are sparse data regarding the pharmacokinetics of antibiotics after intrathecal administration. This study investigated whether intrathecal injection is an effective method for the administration of imipenem. The pharmacokinetics of imipenem after intrathecal and intravenous administration of 1:1 imipenem:cilastatin (IMI/CIL) to rabbits were compared. The AUC0-t in the cerebrospinal fluid for intrathecal administration was approximately twice that of an equal dose given by intravenous administration at doses of 0.35, 0.7, and 1.4 mg/kg. Brain concentrations of imipenem after intrathecal injection were three times greater than those observed after intravenous injection and remained high for at least 8 hours post-injection. Elimination of imipenem after administration by either route was primarily via urine, but a transient surge of imipenem in bile and intestinal tissue was observed. The results indicate that there is clinical potential for intrathecally administered IMI/CIL. Further studies are warranted to investigate the potential for seizure and to assess the translatability of the rabbit model to human treatment.
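The AUC0-t values compared above are conventionally computed from concentration-time curves with the linear trapezoidal rule. A generic sketch with made-up sample points (not data from this study):

```python
def auc_trapezoidal(times, concentrations):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    if len(times) != len(concentrations):
        raise ValueError("times and concentrations must have equal length")
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for t1, t2, c1, c2 in zip(times, times[1:],
                                         concentrations, concentrations[1:]))

# Hypothetical sampling times (h) and plasma concentrations (mg/L):
t = [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]
c = [0.0, 12.0, 9.0, 5.0, 2.0, 0.5]
auc = auc_trapezoidal(t, c)  # in mg*h/L
```

Comparing such AUCs for the two routes, as the study does for cerebrospinal fluid, gives the roughly twofold exposure difference it reports.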

  11. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

Full Text Available Big data is, and will increasingly be, used as a tool for everything that happens both online and offline. Online, of course, is where Big Data is most at home: it is found throughout this medium, offering many advantages and being a real help for all consumers. In this paper we discuss Big Data as an asset in developing new applications, by gathering useful information about users and their behaviour. We also present the key aspects of real-time monitoring and the architectural principles of this technology. The most important contribution of this paper is presented in the cloud section.

  12. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  13. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  14. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  15. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is the subject of many publications. Some pay attention to the many possibilities of big data; others warn us of its consequences. This special

  16. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policymakers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the ‘Big Data

  17. Personalized medicine beyond genomics: alternative futures in big data-proteomics, environtome and the social proteome.

    Science.gov (United States)

    Özdemir, Vural; Dove, Edward S; Gürsoy, Ulvi K; Şardaş, Semra; Yıldırım, Arif; Yılmaz, Şenay Görücü; Ömer Barlas, I; Güngör, Kıvanç; Mete, Alper; Srivastava, Sanjeeva

    2017-01-01

No field in science and medicine today remains untouched by Big Data, and psychiatry is no exception. Proteomics is a Big Data technology and a next-generation biomarker, supporting novel system diagnostics and therapeutics in psychiatry. Proteomics technology is, in fact, much older than genomics and dates to the 1970s, well before the launch of the international Human Genome Project. While the genome has long been framed as the master or "elite" executive molecule in cell biology, the proteome by contrast is humble. Yet the proteome is critical for life: it ensures the daily functioning of cells and whole organisms. In short, proteins are the blue-collar workers of biology, the down-to-earth molecules that we cannot live without. Since 2010, proteomics has found renewed meaning and international attention with the launch of the Human Proteome Project and the growing interest in Big Data technologies such as proteomics. This article presents an interdisciplinary technology foresight analysis and conceptualizes the terms "environtome" and "social proteome". We define "environtome" as the entire complement of elements external to the human host, from the microbiome, ambient temperature and weather conditions to government innovation policies, stock market dynamics, human values, political power and social norms that collectively shape the human host spatially and temporally. The "social proteome" is the subset of the environtome that influences the transition of proteomics technology to innovative applications in society. The social proteome encompasses, for example, new reimbursement schemes and business innovation models for proteomics diagnostics that depart from "once-in-a-lifetime" genotypic tests and the anticipated hype attendant to context- and time-sensitive proteomics tests. Building on the "nesting principle" for governance of complex systems as discussed by Elinor Ostrom, we propose here a 3-tiered organizational architecture for Big Data science such as

  18. Comparison of the Effectiveness of a Virtual Simulator With a Plastic Arm Model in Teaching Intravenous Catheter Insertion Skills.

    Science.gov (United States)

    Günay İsmailoğlu, Elif; Zaybak, Ayten

    2018-02-01

    The objective of this study was to compare the effectiveness of a virtual intravenous simulator with a plastic arm model in teaching intravenous catheter insertion skills to nursing students. We used a randomized controlled quasi-experimental trial design and recruited 65 students who were assigned to the experimental (n = 33) and control (n = 32) groups using the simple random sampling method. The experimental group received intravenous catheterization skills training on the virtual intravenous simulator, and the control group received the same training on a plastic model of a human arm. Data were collected using the personal information form, intravenous catheterization knowledge assessment form, Intravenous Catheterization Skill Test, Self-Confidence and Satisfaction Scale, and Fear Symptoms Scale. In the study, the mean scores in the control group were 20.44 for psychomotor skills, 15.62 for clinical psychomotor skills, 31.78 for self-confidence, and 21.77 for satisfaction. The mean scores in the experimental group were 45.18 for psychomotor skills, 16.28 for clinical psychomotor skills, 34.18 for self-confidence, and 43.89 for satisfaction. The results indicated that psychomotor skills and satisfaction scores were higher in the experimental group, while the clinical psychomotor skills and self-confidence scores were similar in both groups. More students in the control group reported experiencing symptoms such as cold and sweaty hands, significant restlessness, and tense muscles than those in the experimental group.

  19. 76 FR 67130 - Bridger-Teton National Forest; Big Piney Ranger District; Wyoming; Environmental Impact Statement...

    Science.gov (United States)

    2011-10-31

    ... management actions, and (2) minimize food and other types of habituation and bear/human conflicts. Updated... project area is within the DFC 10 (Simultaneous Development of Resources, Opportunities for Human Experiences and Support for Big-game and a Wide Variety of Wildlife Species). Approximately five percent of the...

  20. High Efficiency of Human Normal Immunoglobulin for Intravenous Administration in a Patient with Kawasaki Syndrome Diagnosed in the Later Stages

    Directory of Open Access Journals (Sweden)

    Tatyana V. Sleptsova

    2016-01-01

    The article describes a case of late diagnosis of mucocutaneous lymphonodular syndrome (Kawasaki syndrome). At the beginning of therapy, the child had fever, conjunctivitis, stomatitis, rash, solid swelling of the hands and feet, and coronaritis with the development of aneurysms. The article describes the successful use of normal human immunoglobulin for intravenous administration at a dose of 2 g/kg body weight per course in combination with acetylsalicylic acid at a dose of 80 mg/kg per day. After 3 days of treatment, the rash disappeared, limb swelling and symptoms of conjunctivitis were significantly reduced, and laboratory parameters of disease activity (erythrocyte sedimentation rate, C-reactive protein concentration) became normal. After 3 months, inflammation in the coronary arteries had stopped. After 6 months, regression of the coronary artery aneurysms was recorded. No adverse effects were observed during the immunoglobulin therapy.

  1. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  2. Small data in the era of big data

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2015-01-01

    Academic knowledge building has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will, however, we argue, continue to be popular and valuable in the fut...

  3. Big data analytics methods and applications

    CERN Document Server

    Rao, BLS; Rao, SB

    2016-01-01

    This book has a collection of articles written by Big Data experts to describe some of the cutting-edge methods and applications from their respective areas of interest, and provides the reader with a detailed overview of the field of Big Data Analytics as it is practiced today. The chapters cover technical aspects of key areas that generate and use Big Data such as management and finance; medicine and healthcare; genome, cytome and microbiome; graphs and networks; Internet of Things; Big Data standards; benchmarking of systems; and others. In addition to different applications, key algorithmic approaches such as graph partitioning, clustering and finite mixture modelling of high-dimensional data are also covered. The varied collection of themes in this volume introduces the reader to the richness of the emerging field of Big Data Analytics.

  4. The Big bang and the Quantum

    Science.gov (United States)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: the big bang is replaced by a quantum bounce; the 'horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in the presence of a standard inflation potential, initial conditions are naturally set for a long, slow-roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  5. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  6. Exploiting big data for critical care research.

    Science.gov (United States)

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years, the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine, and bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  7. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than 10 years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores for empathy and on the Big Five factors, with the exception of the factor neuroticism. They found associations between empathy and openness, agreeableness, conscientiousness, and extraversion. In our data, women likewise score significantly higher on both empathy and the Big Five...

  8. Advances in the use of intravenous techniques in ambulatory anesthesia

    Directory of Open Access Journals (Sweden)

    Eng MR

    2015-07-01

    Matthew R Eng,1 Paul F White1,2 1Department of Anesthesiology, Cedars-Sinai Medical Center, Los Angeles, CA, USA; 2White Mountain Institute, The Sea Ranch, CA, USA Summary statement: Advances in the use of intravenous techniques in ambulatory anesthesia have become important for the anesthesiologist as the key perioperative physician in outpatient surgery. Key techniques and choices of anesthetics are important in accomplishing the fast-track goals of ambulatory surgery. Purpose of review: The anesthesiologist in the outpatient environment must focus on improving perioperative efficiency and reducing recovery times while accounting for patients' well-being and safety. This review article focuses on recent intravenous anesthetic techniques to accomplish these goals. Recent findings: This review is an overview of techniques in intravenous anesthesia for ambulatory anesthesia. Intravenous techniques may be tailored to accomplish outpatient surgery goals for the type of surgical procedure and individual patient needs. Careful anesthetic planning and the application of the plans are critical to an anesthesiologist's success with fast-track ambulatory surgery. Conclusion: Careful planning and application of intravenous techniques are critical to an anesthesiologist's success with fast-track ambulatory surgery. Keywords: intravenous anesthesia, outpatient anesthesia, fast-track surgery

  9. The intravenous injection of illicit drugs and needle sharing: an historical perspective.

    Science.gov (United States)

    Zule, W A; Vogtsberger, K N; Desmond, D P

    1997-01-01

    This study reviewed the literature on the history of needle sharing and intravenous drug abuse. Reports suggest that needle sharing was practiced by drug abusers as early as 1902 in China and 1914 in the United States. Intravenous drug abuse was first mentioned in the literature in 1925. However other references suggest that some opioid users were injecting intravenously prior to 1920. Outbreaks of malaria in Egypt, the United States, and China between 1929 and 1937 were attributed to needle sharing and intravenous injection of opioids. These reports suggest that both needle sharing and intravenous drug use were common by 1937. Factors such as medical use of intravenous injections, enactment and zealous enforcement of antinarcotic laws, and interactions among drug users in institutional settings such as regional hospitals and prisons may have contributed to the spread of both needle sharing and the intravenous technique among drug abusers.

  10. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    Science.gov (United States)

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing bacterial immunoglobulin-like (Big) domains. The function of proteins which contain the Big fold is not known. Based on the possible similarities of immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted as Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four of the selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have a β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting the similarity of their fold. We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.
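    For intuition, the dissociation constants of 2-4 µM reported above translate into fractional occupancy through the standard 1:1 binding isotherm, θ = [Ca²⁺]/(Kd + [Ca²⁺]). The following is a minimal sketch assuming simple single-site binding (an illustration, not a calculation from the paper):

```python
def fraction_bound(ligand_uM, kd_uM):
    """Fractional saturation for simple 1:1 binding: theta = [L] / (Kd + [L])."""
    return ligand_uM / (kd_uM + ligand_uM)

# With Kd = 3 uM (mid-range of the reported 2-4 uM values), a domain is
# half-saturated at 3 uM free Ca2+ and about 77% saturated at 10 uM.
print(fraction_bound(3.0, 3.0))              # 0.5
print(round(fraction_bound(10.0, 3.0), 2))   # 0.77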

  11. Relações hierárquicas entre os traços amplos do Big Five Hierarchical relationship between the broad traits of the Big Five

    Directory of Open Access Journals (Sweden)

    Cristiano Mauro Assis Gomes

    2012-01-01

    The Big Five model holds that human personality is composed of dozens of specific factors. Despite this diversity, the specific factors converge on five broad traits that sit at the same hierarchical level. The current study presents an alternative hypothesis, arguing that there are hierarchical levels among the broad traits of the model. Six hundred and eighty-four junior high and high school students, aged 10 to 18 years (M = 13.71, SD = 2.11), from a private school in the city of Belo Horizonte, Minas Gerais, Brazil, participated in the study. The Big Five was measured by the Inventory of Personality Traits, initially named the Personality Adjective Inventory, developed by Pinheiro, Gomes and Braga (2009). This instrument measures eight of the ten polarities present in the five broad traits of the Big Five. Two models were compared via path analysis: a model with four hierarchical levels and a non-hierarchical model. The hierarchical model showed an adequate degree of fit to the data and proved superior to the non-hierarchical model, which did not fit the data. Implications for the Big Five model are discussed.

  12. Semantic Web Technologies and Big Data Infrastructures: SPARQL Federated Querying of Heterogeneous Big Data Stores

    OpenAIRE

    Konstantopoulos, Stasinos; Charalambidis, Angelos; Mouchakis, Giannis; Troumpoukis, Antonis; Jakobitsch, Jürgen; Karkaletsis, Vangelis

    2016-01-01

    The ability to cross-link large scale data with each other and with structured Semantic Web data, and the ability to uniformly process Semantic Web and other data adds value to both the Semantic Web and to the Big Data community. This paper presents work in progress towards integrating Big Data infrastructures with Semantic Web technologies, allowing for the cross-linking and uniform retrieval of data stored in both Big Data infrastructures and Semantic Web data. The technical challenges invo...

  13. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  14. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  15. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user's code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem's search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
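    The abstract describes IEJoin only at a high level (sorted arrays plus space-efficient bit-arrays). As a hedged illustration of the underlying idea rather than the published algorithm itself, a sort-based inequality self-join can be sketched as follows; the function name and simplifications are my own:

```python
def inequality_self_join(rows):
    """All index pairs (i, j) with rows[i][0] < rows[j][0] and rows[i][1] > rows[j][1].

    Simplified single-machine sketch of a sort-based inequality join: sort once
    per attribute, then scan in descending order of the second attribute while a
    bit-array over first-attribute positions marks already-visited tuples.
    """
    n = len(rows)
    by_a = sorted(range(n), key=lambda i: rows[i][0])                 # ascending a
    by_b = sorted(range(n), key=lambda i: rows[i][1], reverse=True)   # descending b
    pos_in_a = {idx: p for p, idx in enumerate(by_a)}
    seen = [False] * n      # bit-array over positions in a-order
    result = []
    for u in by_b:          # every previously seen tuple has b >= rows[u][1]
        p = pos_in_a[u]
        for q in range(p):  # candidates with smaller a come before position p
            if seen[q]:
                v = by_a[q]
                # explicit predicate check guards against ties in a or b
                if rows[v][0] < rows[u][0] and rows[v][1] > rows[u][1]:
                    result.append((v, u))
        seen[p] = True
    return result

pairs = inequality_self_join([(1, 5), (2, 3), (3, 4), (4, 1)])
print(sorted(pairs))  # [(0, 1), (0, 2), (0, 3), (1, 3), (2, 3)]
```

    The sort orders prune the search space; the real IEJoin additionally uses permutation arrays and bit-manipulation to avoid the inner scan.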

  16. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  17. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  18. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  19. Intelligent Test Mechanism Design of Worn Big Gear

    Directory of Open Access Journals (Sweden)

    Hong-Yu LIU

    2014-10-01

    With the continuous development of the national economy, big gears are widely applied in the metallurgy and mining domains, where they play an important role. In practical production, big gear abrasion and breakage often take place, affecting normal production and causing unnecessary economic loss. An intelligent test method for worn big gears was put forward, mainly aimed at the constraints of high production cost, long production cycle, and high-intensity manual repair welding. Measurement-equation transformations were made for the involute spur gear: the original polar coordinate equations were transformed into rectangular coordinate equations. The big gear abrasion measurement principle was introduced, a detection principle diagram was given, and the method for realizing the detection route was described. An OADM12 laser sensor was selected, and detection of the big gear abrasion area was realized by the detection mechanism. Measured data from an unworn gear and a worn gear were fed into a calculation program written in Visual Basic, from which the big gear abrasion quantity can be obtained. This provides a feasible method for intelligent testing and intelligent repair welding of worn big gears.
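    The polar-to-rectangular transformation of the involute equations mentioned in the abstract can be illustrated with the standard involute-of-a-circle parametrization; this is an assumption for illustration, since the authors' exact equations are not given here:

```python
import math

def involute_point(r_base, t):
    """Rectangular coordinates on the involute of a base circle of radius r_base.

    Standard parametrization by the roll angle t (radians):
        x = r_b * (cos t + t * sin t)
        y = r_b * (sin t - t * cos t)
    The distance from the gear center is r_b * sqrt(1 + t^2).
    """
    x = r_base * (math.cos(t) + t * math.sin(t))
    y = r_base * (math.sin(t) - t * math.cos(t))
    return x, y

# At t = 0 the involute starts on the base circle at (r_b, 0).
print(involute_point(50.0, 0.0))  # (50.0, 0.0)
```

    A laser-based inspection could compare measured tooth-flank points against such reference coordinates to estimate abrasion depth.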

  20. [Big data in medicine and healthcare].

    Science.gov (United States)

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves - e.g. in social networks - and digitalization is ever increasing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  1. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  2. Disposition of nasal, intravenous, and oral methadone in healthy volunteers.

    Science.gov (United States)

    Dale, Ola; Hoffer, Christine; Sheffels, Pamela; Kharasch, Evan D

    2002-11-01

    Nasal administration of many opioids demonstrates rapid uptake and fast onset of action. Nasal administration may be an alternative to intravenous and oral administration of methadone and was therefore studied in human volunteers. The study was approved by the Institutional Review Board of the University of Washington, Seattle. Eight healthy volunteers (6 men and 2 women) aged 19 to 33 years were enrolled after informed written consent was obtained. Subjects received 10 mg methadone hydrochloride nasally, orally, or intravenously on 3 separate occasions in a crossover design. Nasal methadone (50 mg/mL in aqueous solution) was given as a 100-microL spray in each nostril (Pfeiffer BiDose sprayer). Blood samples for liquid chromatography-mass spectrometry analyses of methadone and the metabolite 2-ethyl-1,5-dimethyl-3,3-diphenylpyrrolinium were drawn for up to 96 hours. The methadone effect was measured by noninvasive infrared pupilometry coincident with blood sampling. Nasal uptake of methadone was rapid, with maximum plasma concentrations occurring within 7 minutes. The maximum effects of intravenous, nasal, and oral methadone, on the basis of dark-adapted pupil diameter, were reached in about 15 minutes, 30 minutes, and 2 hours, respectively. The respective durations were 24, 10, and 8 hours. Both nasal and oral bioavailabilities were 0.85. Subjects reported that nasal methadone caused a burning sensation. Nasal administration of methadone results in rapid absorption and onset of effect and high bioavailability, which was greater than that reported for other nasal opioids, with a similar duration of effect. Nasal administration may be an alternative route of methadone administration; however, improved formulations are desirable to reduce nasal irritation.
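    The bioavailability of 0.85 reported above follows the usual dose-normalized AUC-ratio definition, F = (AUC_test/Dose_test) / (AUC_iv/Dose_iv). A minimal sketch with illustrative numbers chosen to reproduce that value (not the study's actual AUC data):

```python
def bioavailability(auc_test, dose_test, auc_iv, dose_iv):
    """Absolute bioavailability: dose-normalized AUC relative to the IV route."""
    return (auc_test / dose_test) / (auc_iv / dose_iv)

# Illustrative numbers only: equal 10 mg doses, with the nasal AUC at 85% of
# the IV AUC, giving F = 0.85 as reported for both nasal and oral methadone.
print(bioavailability(850.0, 10.0, 1000.0, 10.0))  # 0.85
```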

  3. Acute toxicity of intravenously administered titanium dioxide nanoparticles in mice.

    Directory of Open Access Journals (Sweden)

    Jiaying Xu

    BACKGROUND: With a wide range of applications, titanium dioxide (TiO₂) nanoparticles (NPs) are manufactured worldwide in large quantities. Recently, in the field of nanomedicine, intravenous injection of TiO₂ nanoparticulate carriers directly into the bloodstream has raised public concerns about their toxicity to humans. METHODS: In this study, mice were injected intravenously with a single dose of TiO₂ NPs at varying dose levels (0, 140, 300, 645, or 1387 mg/kg). Animal mortality, blood biochemistry, hematology, genotoxicity and histopathology were investigated 14 days after treatment. RESULTS: Death of mice in the highest dose (1387 mg/kg) group was observed at day two after TiO₂ NPs injection. At day 7, acute toxicity symptoms, such as decreased physical activity and decreased intake of food and water, were observed in the highest dose group. Hematological analysis and the micronucleus test showed no significant acute hematological or genetic toxicity except an increase in the white blood cell (WBC) count among mice in the 645 mg/kg dose group. However, the spleens of the TiO₂ NPs treated mice showed significantly higher tissue weight/body weight (BW) coefficients, and the livers and kidneys lower coefficients, compared to control. The biochemical parameters and histological tissue sections indicated that TiO₂ NPs treatment could induce different degrees of damage in the brain, lung, spleen, liver and kidneys. However, no pathological effects were observed in the hearts of TiO₂ NPs treated mice. CONCLUSIONS: Intravenous injection of TiO₂ NPs at high doses in mice could cause acute toxicity effects in the brain, lung, spleen, liver, and kidney. No significant hematological or genetic toxicity was observed.

  4. Making big sense from big data in toxicology by read-across.

    Science.gov (United States)

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among others.
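    Read-across as "local similarity-based intrapolation" can be sketched as a toy nearest-neighbour predictor over chemical fingerprints. This is a hedged illustration of the general idea only: the function names, the Tanimoto similarity metric, and the similarity-weighted mean are assumptions of this sketch, not the REACH-across tool described in the abstract:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two binary fingerprints given as sets of 'on' bits."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def read_across(query_fp, neighbors, k=2):
    """Predict a property as the similarity-weighted mean of the k most similar substances.

    neighbors: list of (fingerprint_set, property_value) pairs. Toy sketch of
    similarity-based gap filling, assuming at least one non-zero similarity.
    """
    scored = sorted(neighbors, key=lambda nv: tanimoto(query_fp, nv[0]), reverse=True)[:k]
    weights = [tanimoto(query_fp, fp) for fp, _ in scored]
    return sum(w * v for w, (_, v) in zip(weights, scored)) / sum(weights)

# Toy fingerprints: the query is structurally closest to the first two substances,
# so the prediction interpolates between their property values (10.0 and 12.0).
neighbors = [({1, 2, 3}, 10.0), ({1, 2, 4}, 12.0), ({7, 8, 9}, 50.0)]
print(read_across({1, 2, 3}, neighbors, k=2))  # ~10.67
```

    The structurally dissimilar third substance (value 50.0) is excluded by the k-nearest cut, which is the essence of "local" intrapolation.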

  5. Integrative Analysis of Omics Big Data.

    Science.gov (United States)

    Yu, Xiang-Tian; Zeng, Tao

    2018-01-01

    The diversity and sheer volume of omics data have taken biology and biomedicine research and application into a big data era, much as happened in wider society a decade ago. They pose a new challenge, from horizontal data ensembles (e.g., similar types of data collected from different labs or companies) to vertical data ensembles (e.g., different types of data collected for a group of persons with matched information), which requires integrative analysis in biology and biomedicine and calls for the rapid development of data integration to address the shift from population-guided to individual-guided investigations. Data integration is an effective approach to solving complex problems and understanding complicated systems. Several benchmark studies have revealed the heterogeneity and trade-offs that exist in the analysis of omics data. Integrative analysis can combine and investigate many datasets in a cost-effective, reproducible way. Current integration approaches on biological data have two modes: one is a "bottom-up integration" mode with follow-up manual integration, and the other is a "top-down integration" mode with follow-up in silico integration. This paper first summarizes combinatory analysis approaches, to offer a candidate protocol for biological experiment design for effective integrative studies on genomics, and then surveys data fusion approaches, to give helpful instruction on computational model development for detecting biological significance; these approaches have also provided new data resources and analysis tools to support precision medicine dependent on big biomedical data. Finally, problems and future directions are highlighted for integrative analysis of omics big data.

  6. A Big Data Guide to Understanding Climate Change: The Case for Theory-Guided Data Science.

    Science.gov (United States)

    Faghmous, James H; Kumar, Vipin

    2014-09-01

    Global climate change and its impact on human life has become one of our era's greatest challenges. Despite the urgency, data science has had little impact on furthering our understanding of our planet in spite of the abundance of climate data. This is in stark contrast to other fields such as advertising or electronic commerce, where big data has been a great success story. This discrepancy stems from the complex nature of climate data as well as the scientific questions climate science brings forth. This article introduces a data science audience to the challenges and opportunities of mining large climate datasets, with an emphasis on the nuanced differences between mining climate data and traditional big data approaches. We focus on data, methods, and application challenges that must be addressed in order for big data to fulfill its promise with regard to climate science applications. More importantly, we highlight research showing that relying solely on traditional big data techniques results in dubious findings, and we instead propose a theory-guided data science paradigm that uses scientific theory to constrain both the big data techniques and the results-interpretation process to extract accurate insight from large climate data.

  7. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  8. Development of a translational model to screen medications for cocaine use disorder II: Choice between intravenous cocaine and money in humans

    Science.gov (United States)

    Lile, Joshua A.; Stoops, William W.; Rush, Craig R.; Negus, S. Stevens; Glaser, Paul E. A.; Hatton, Kevin W.; Hays, Lon R.

    2016-01-01

    Background A medication for treating cocaine use disorder has yet to be approved. Laboratory-based evaluation of candidate medications in animals and humans is a valuable means to demonstrate safety, tolerability and initial efficacy of potential medications. However, animal-to-human translation has been hampered by a lack of coordination. Therefore, we designed homologous cocaine self-administration studies in rhesus monkeys (see companion article) and human subjects in an attempt to develop linked, functionally equivalent procedures for research on candidate medications for cocaine use disorder. Methods Eight (N=8) subjects with cocaine use disorder completed 12 experimental sessions in which they responded to receive money ($0.01, $1.00 and $3.00) or intravenous cocaine (0, 3, 10 and 30 mg/70 kg) under independent, concurrent progressive-ratio schedules. Prior to the completion of 9 choice trials, subjects sampled the cocaine dose available during that session and were informed of the monetary alternative value. Results The allocation of behavior varied systematically as a function of cocaine dose and money value. Moreover, a similar pattern of cocaine choice was demonstrated in rhesus monkeys and humans across different cocaine doses and magnitudes of the species-specific alternative reinforcers. The subjective and cardiovascular responses to IV cocaine were an orderly function of dose, although heart rate and blood pressure remained within safe limits. Conclusions These coordinated studies successfully established drug vs. non-drug choice procedures in humans and rhesus monkeys that yielded similar cocaine choice behavior across species. This translational research platform will be used in future research to enhance the efficiency of developing interventions to reduce cocaine use. PMID:27269368
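
    The choice procedure above rests on progressive-ratio schedules, in which the response requirement for each successive reinforcer escalates. The starting ratio and growth factor below are hypothetical placeholders (the abstract does not report them); the sketch only illustrates how the "price" of a reinforcer rises across the nine choice trials.

    ```python
    def pr_requirements(start=50, growth=2.0, n_trials=9):
        """Progressive-ratio sketch: the number of responses required for each
        successive reinforcer grows geometrically, so behavior is expected to
        reallocate toward the alternative as the cost rises. All parameters
        here are illustrative, not taken from the study."""
        reqs, r = [], float(start)
        for _ in range(n_trials):
            reqs.append(round(r))
            r *= growth
        return reqs

    print(pr_requirements())  # escalating requirements across 9 trials
    ```

    Under such a schedule, the ratio at which a subject stops responding for the drug (the breakpoint) provides one index of its relative reinforcing strength versus the monetary alternative.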

  9. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  10. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article presents an analysis of 'Big Data', which has been discussed over the last 10 years. The reasons and factors behind the issue are revealed. It is shown that the factors creating the 'Big Data' issue have existed for quite a long time and, from time to time, have caused informational barriers. Such barriers were successfully overcome through science and technology. The analysis classifies the 'Big Data' issue as a form of information barrier. This issue can be solved correctly and encourages the development of scientific and computational methods.

  11. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  12. Main Issues in Big Data Security

    Directory of Open Access Journals (Sweden)

    Julio Moreno

    2016-09-01

    Full Text Available Data is currently one of the most important assets for companies in every field. The continuous growth in the importance and volume of data has created a new problem: it cannot be handled by traditional analysis techniques. This problem was, therefore, solved through the creation of a new paradigm: Big Data. However, Big Data originated new issues related not only to the volume or the variety of the data, but also to data security and privacy. In order to obtain a full perspective of the problem, we decided to carry out an investigation with the objective of highlighting the main issues regarding Big Data security, and also the solutions proposed by the scientific community to solve them. In this paper, we explain the results obtained after applying a systematic mapping study to security in the Big Data ecosystem. It is almost impossible to carry out detailed research into the entire topic of security, and the outcome of this research is, therefore, a big picture of the main problems related to security in a Big Data system, along with the principal solutions to them proposed by the research community.

  13. Big Bayou Creek and Little Bayou Creek Watershed Monitoring Program

    Energy Technology Data Exchange (ETDEWEB)

    Kszos, L.A.; Peterson, M.J.; Ryon; Smith, J.G.

    1999-03-01

    Biological monitoring of Little Bayou and Big Bayou creeks, which border the Paducah Site, has been conducted since 1987. Biological monitoring was conducted by the University of Kentucky from 1987 to 1991 and by staff of the Environmental Sciences Division (ESD) at Oak Ridge National Laboratory (ORNL) from 1991 through March 1999. In March 1998, renewed Kentucky Pollutant Discharge Elimination System (KPDES) permits were issued to the US Department of Energy (DOE) and the US Enrichment Corporation. The renewed DOE permit requires that a watershed monitoring program be developed for the Paducah Site within 90 days of the effective date of the renewed permit. This plan outlines the sampling and analysis that will be conducted for the watershed monitoring program. The objectives of the watershed monitoring are to (1) determine whether discharges from the Paducah Site and the Solid Waste Management Units (SWMUs) associated with the Paducah Site are adversely affecting instream fauna, (2) assess the ecological health of Little Bayou and Big Bayou creeks, (3) assess the degree to which abatement actions ecologically benefit Big Bayou Creek and Little Bayou Creek, (4) provide guidance for remediation, (5) provide an evaluation of changes in potential human health concerns, and (6) provide data which could be used to assess the impact of inadvertent spills or fish kills. According to the permit, cleanup will result in these watersheds [Big Bayou and Little Bayou creeks] achieving compliance with the applicable water quality criteria.

  14. Principales parámetros para el estudio de la colaboración científica en Big Science

    Directory of Open Access Journals (Sweden)

    Ortoll, Eva

    2014-12-01

    Full Text Available In several scientific disciplines, research has shifted from experiments of a reduced scale to large and complex collaborations. Many recent scientific achievements, such as the sequencing of the human genome or the discovery of the Higgs boson, have taken place within the "big science" paradigm. The study of scientific collaboration needs to take into account the diverse factors that influence it. Big science experiments make some of those aspects particularly important: the number of institutions involved, cultural differences, the diversity of spaces and infrastructures, and the conceptualization of the research problem itself. Considering these specific factors, we present a set of parameters for the analysis of scientific collaboration in big science projects. The utility of these parameters is illustrated through a comparative study of two large big science projects: the ATLAS experiment and the Human Genome Project.

  15. Harnessing the Power of Big Data to Improve Graduate Medical Education: Big Idea or Bust?

    Science.gov (United States)

    Arora, Vineet M

    2018-06-01

    With the advent of electronic medical records (EMRs) fueling the rise of big data, the use of predictive analytics, machine learning, and artificial intelligence is touted as a transformational tool to improve clinical care. While major investments are being made in using big data to transform health care delivery, little effort has been directed toward exploiting big data to improve graduate medical education (GME). Because our current system relies on faculty observations of competence, it is not unreasonable to ask whether big data, in the form of clinical EMRs and other novel data sources, can answer questions of importance in GME, such as when a resident is ready for independent practice. The timing is ripe for such a transformation. A recent National Academy of Medicine report called for reforms to how GME is delivered and financed. While many agree on the need to ensure that GME meets our nation's health needs, there is little consensus on how to measure the performance of GME in meeting this goal. During a recent workshop at the National Academy of Medicine on GME outcomes and metrics in October 2017, a key theme emerged: Big data holds great promise to inform GME performance at individual, institutional, and national levels. In this Invited Commentary, several examples are presented, such as using big data to inform clinical experience and provide clinically meaningful data to trainees, and using novel data sources, including ambient data, to better measure the quality of GME training.

  16. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  17. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  18. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches.

  19. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Science.gov (United States)

    2013-01-17

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N259; FXRS1265030000-134-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive... significant impact (FONSI) for the environmental assessment (EA) for Big Stone National Wildlife Refuge...

  20. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig proteins.

    Directory of Open Access Journals (Sweden)

    Rajeev Raman

    Full Text Available BACKGROUND: Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing bacterial immunoglobulin-like (Big) domains. The function of proteins containing the Big fold is not known. Based on possible similarities between the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. PRINCIPAL FINDINGS: We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to a Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting the similarity of their folds. CONCLUSIONS: We demonstrate that the Lig proteins are Ca²+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, the Big motif binds Ca²+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.
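
    Given the reported dissociation constants of 2-4 µM, standard single-site equilibrium binding, f = [Ca²+] / (Kd + [Ca²+]), predicts the fraction of domains occupied at a given free Ca²+ concentration. A minimal sketch (the Ca²+ concentrations chosen are illustrative, not from the study):

    ```python
    def fraction_bound(ca_uM: float, kd_uM: float) -> float:
        """Fractional occupancy for single-site binding: f = [Ca] / (Kd + [Ca])."""
        return ca_uM / (kd_uM + ca_uM)

    # At a free Ca2+ concentration equal to Kd, half the sites are occupied.
    for kd in (2.0, 4.0):              # reported Kd range (uM)
        for ca in (1.0, 10.0, 100.0):  # illustrative free Ca2+ levels (uM)
            print(f"Kd={kd:.0f} uM, [Ca]={ca:.0f} uM -> f={fraction_bound(ca, kd):.2f}")
    ```

    A micromolar Kd means the domains would be largely saturated at the higher Ca²+ concentrations typical of extracellular environments, consistent with a functional role in host-pathogen interaction.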

  1. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe that is filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they make the single singular events predicted by the corresponding quintessential (phantom) models appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and future may be connected in a non-paradoxical manner in the universes described by the new symmetric solutions.

  2. 2nd INNS Conference on Big Data

    CERN Document Server

    Manolopoulos, Yannis; Iliadis, Lazaros; Roy, Asim; Vellasco, Marley

    2017-01-01

    The book offers a timely snapshot of neural network technologies as a significant component of big data analytics platforms. It promotes new advances and research directions in efficient and innovative algorithmic approaches to analyzing big data (e.g. deep networks, nature-inspired and brain-inspired algorithms); implementations on different computing platforms (e.g. neuromorphic, graphics processing units (GPUs), clouds, clusters); and big data analytics applications to solve real-world problems (e.g. weather prediction, transportation, energy management). The book, which reports on the second edition of the INNS Conference on Big Data, held on October 23–25, 2016, in Thessaloniki, Greece, depicts an interesting collaborative adventure of neural networks with big data and other learning technologies.

  3. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  4. Scalable privacy-preserving big data aggregation mechanism

    Directory of Open Access Journals (Sweden)

    Dapeng Wu

    2016-08-01

    Full Text Available As the massive sensor data generated by large-scale Wireless Sensor Networks (WSNs) have recently become an indispensable part of 'Big Data', the collection, storage, transmission and analysis of big sensor data attract considerable attention from researchers. Targeting the privacy requirements of large-scale WSNs and focusing on the energy-efficient collection of big sensor data, a Scalable Privacy-preserving Big Data Aggregation (Sca-PBDA) method is proposed in this paper. Firstly, according to the pre-established gradient topology structure, sensor nodes in the network are divided into clusters. Secondly, sensor data is modified by each node according to the privacy-preserving configuration message received from the sink. Subsequently, intra- and inter-cluster data aggregation is employed during the big sensor data reporting phase to reduce energy consumption. Lastly, aggregated results are recovered by the sink to complete the privacy-preserving big data aggregation. Simulation results validate the efficacy and scalability of Sca-PBDA and show that the big sensor data generated by large-scale WSNs is efficiently aggregated to reduce network resource consumption and that sensor data privacy is effectively protected to meet ever-growing application requirements.
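
    The abstract does not spell out the Sca-PBDA protocol itself, so as a generic illustration of how a sink can recover an aggregate while individual reports stay hidden, here is a standard additive-masking sketch: zero-sum random masks stand in for the privacy-preserving modification step, and the sink's sum stands in for the recovery step.

    ```python
    import random

    MOD = 2**32  # all arithmetic is done modulo a shared modulus

    def masked_reports(values):
        """Each node adds a random mask to its reading. The masks are chosen
        to sum to zero (mod MOD), so no single report is meaningful on its
        own, yet the sum of all reports equals the sum of the true readings."""
        masks = [random.randrange(MOD) for _ in values[:-1]]
        masks.append((-sum(masks)) % MOD)        # last mask cancels the rest
        return [(v + m) % MOD for v, m in zip(values, masks)]

    def sink_recover(reports):
        """The sink aggregates the masked reports; the masks cancel out."""
        return sum(reports) % MOD

    readings = [17, 4, 23, 9]                    # hypothetical sensor readings
    assert sink_recover(masked_reports(readings)) == sum(readings)
    ```

    Real in-network schemes must additionally handle node failures and collusion (a node that sees all other masks can deanonymize the last one), which is part of what dedicated protocols like Sca-PBDA address.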

  5. Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives. Contribution of the IMIA Social Media Working Group.

    Science.gov (United States)

    Hansen, M M; Miron-Shatz, T; Lau, A Y S; Paton, C

    2014-08-15

    As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives of the evolving use of Big Data in science and healthcare and, to examine some of the opportunities and challenges. A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Scientists and health-care providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts: The concept of Big Data and associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and health-care. Future exploration of issues surrounding data privacy, confidentiality, and education are needed. A greater focus on data from social media, the quantified self-movement, and the application of analytics to "small data" would also be useful.

  6. Intravenous immunoglobulin and Alzheimer's disease immunotherapy.

    Science.gov (United States)

    Solomon, Beka

    2007-02-01

    Amyloid-beta peptide (Abeta) contributes to the acute progression of Alzheimer's disease (AD) and has become the main target for therapeutics. Active immunization with Abeta in individuals with AD has been efficacious; however, some patients developed side effects, possibly related to an autoimmune response. Evidence that intravenous immunoglobulin (IVIg), an FDA-approved purified immunoglobulin fraction from normal human donor blood, shows promise of passive immunotherapy for AD is reviewed. Investigations into the molecular effects of IVIg on Abeta clearance, using the BV-2 cellular microglia line, demonstrate that IVIg dissolves Abeta fibrils in vitro, increases cellular tolerance to Abeta, enhances microglial migration toward Abeta deposits, and mediates phagocytosis of Abeta. Preliminary clinical results indicate that IVIg, which contains natural antibodies against the Abeta, warrants further study into its potential to deliver a controlled immune attack on the peptide, avoiding the immune toxicities that have had a negative impact on the first clinical trials of vaccine against Abeta.

  7. Ethische aspecten van big data

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it is also accompanied by all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  8. Epidemiology in wonderland: Big Data and precision medicine.

    Science.gov (United States)

    Saracci, Rodolfo

    2018-03-01

    Big Data and precision medicine, two major contemporary challenges for epidemiology, are critically examined from two different angles. In Part 1, Big Data collected for research purposes (Big research Data) and Big Data used for research although collected for other primary purposes (Big secondary Data) are discussed in the light of the fundamental common requirement of data validity, which prevails over "bigness". Precision medicine is treated by developing the key point that high relative risks are, as a rule, required to make a variable or combination of variables suitable for predicting disease occurrence, outcome or response to treatment; the commercial proliferation of allegedly predictive tests of unknown or poor validity is commented upon. Part 2 proposes a "wise epidemiology" approach to: (a) choosing, in a context imprinted by Big Data and precision medicine, epidemiological research projects actually relevant to population health, (b) training epidemiologists, (c) investigating the impact of the influx of Big Data and computerized medicine on clinical practices and the doctor-patient relationship and (d) clarifying whether today "health" may be redefined, as some maintain, in purely technological terms.

  9. Psycho-informatics: Big Data shaping modern psychometrics.

    Science.gov (United States)

    Markowetz, Alexander; Błaszkiewicz, Konrad; Montag, Christian; Switala, Christina; Schlaepfer, Thomas E

    2014-04-01

    For the first time in history, it is possible to study human behavior on a great scale and in fine detail simultaneously. Online services and ubiquitous computational devices, such as smartphones and modern cars, record our everyday activity. The resulting Big Data offers unprecedented opportunities for tracking and analyzing behavior. This paper hypothesizes the applicability and impact of Big Data technologies in the context of psychometrics, both for research and clinical applications. It first outlines the state of the art, including the severe shortcomings with respect to quality and quantity of the resulting data. It then presents a technological vision, comprised of (i) numerous data sources such as mobile devices and sensors, (ii) a central data store, and (iii) an analytical platform, employing techniques from data mining and machine learning. To further illustrate the dramatic benefits of the proposed methodologies, the paper then outlines two current projects logging and analyzing smartphone usage. One such study attempts thereby to quantify the severity of major depression dynamically; the other investigates (mobile) Internet addiction. Finally, the paper addresses some of the ethical issues inherent to Big Data technologies. In summary, the proposed approach is about to induce the single biggest methodological shift since the beginning of psychology or psychiatry. The resulting range of applications will dramatically shape the daily routines of researchers and medical practitioners alike. Indeed, transferring techniques from computer science to psychiatry and psychology is about to establish Psycho-Informatics, an entire research direction of its own. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Big Data and Analytics in Healthcare.

    Science.gov (United States)

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  11. Big Data for Business Ecosystem Players

    Directory of Open Access Journals (Sweden)

    Perko Igor

    2016-06-01

    Full Text Available In this research, some of the most promising Big Data usage domains are connected with distinguished player groups found in the business ecosystem. Literature analysis is used to identify the state of the art of Big Data related research in the major domains of its use, namely individual marketing, health treatment, work opportunities, financial services, and security enforcement. System theory was used to identify the business ecosystem's major player types disrupted by Big Data: individuals, small and mid-sized enterprises, large organizations, information providers, and regulators. Relationships between the domains and players are explained through new Big Data opportunities and threats and by players' responsive strategies. System dynamics was used to visualize the relationships in the provided model.

  12. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  13. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  14. [Reducing fear in preschool children receiving intravenous injections].

    Science.gov (United States)

    Hsieh, Yi-Chuan; Liu, Hui-Tzu; Cho, Yen-Hua

    2012-06-01

    Our pediatric medical ward administers an average of 80 intravenous injections to preschool children. We found that 91.1% exhibit behavior indicative of fear and anxiety. Over three-quarters (77.8%) of these children suffer severe fear and actively resist receiving injections. Such behavior places a greater than normal burden on human and material resources and often gives family members negative impressions that lower their trust in the healthcare service while raising nurse-patient tensions. Using observation and interviews, we found the primary factors in injection fear to be: past negative experiences, lack of adequate prior communication, measures taken to preemptively control child resistance, and a lack of cognitive behavioral strategies on the part of nursing staff. This project worked to develop a strategy to reduce cases of severe injection fear in preschool children from 77.8% to 38.9% and to achieve a 50% capability improvement target for team members. Our team identified several potential strategy solutions from research papers and books between August 1st, 2009 and April 30th, 2010. Our proposed method included therapeutic games, self-selection of injection position, and cognitive behavioral strategies to divert attention. Other measures were also specified as standard operating procedures for administering pediatric intravenous injections. We applied the strategy to 45 preschool children and identified a post-injection "severe fear" level of 37.8%. This project was designed to reduce fear in children to make them more accepting of vaccinations and to enhance children's positive treatment experience in order to raise nursing care quality.

  15. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  16. BIG DATA IN TAMIL: OPPORTUNITIES, BENEFITS AND CHALLENGES

    OpenAIRE

    R.S. Vignesh Raj; Babak Khazaei; Ashik Ali

    2015-01-01

    This paper gives an overall introduction to big data and has tried to introduce Big Data in Tamil. It discusses the potential opportunities, benefits and likely challenges from a very Tamil and Tamil Nadu perspective. The paper has also made an original contribution by proposing the 'big data' terminology in Tamil. The paper further suggests a few areas to explore using big data in Tamil on the lines of the Tamil Nadu Government 'vision 2023'. Whilst, big data has something to offer everyone, it ...

  17. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Big Data’s Role in Precision Public Health

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts. PMID:29594091

  19. Imaging of chest disease due to intravenous heroin abuse

    International Nuclear Information System (INIS)

    Lian Xuhui; Chen Zhong; Ye Wenqin

    2002-01-01

    Objective: To study the imaging findings of chest disease due to intravenous heroin abuse. Methods: Twenty-five cases of clinically confirmed chest disease due to intravenous heroin abuse were retrospectively analyzed. All 25 cases had conventional X-ray films, 6 cases had CT scans, and 6 cases had echocardiography. Results: On X-ray and CT, the following signs were found: increased lung markings (n = 5), small patchy shadows (n = 15), pneumatoceles (n = 16), small cavities (n = 16), small nodules (n = 7), pleural effusion (n = 8), pneumothorax (n = 2), hydropneumothorax (n = 6), pulmonary edema (n = 2), cardiomegaly (n = 11), and multiform lesions (n = 20). On echocardiography, tricuspid vegetations (n = 4) and tricuspid insufficiency (n = 4) were found. Conclusion: The X-ray and CT manifestations of chest inflammation due to intravenous heroin abuse are multiple. Multiple small cavities and pneumatoceles are of some value in the diagnosis of lung inflammation due to intravenous heroin abuse among young patients

  20. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B [Lancaster Univ. (UK)]

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors and an attempt is made to tackle the problem that participation in a public inquiry is far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  1. Cartography in the Age of Spatio-temporal Big Data

    Directory of Open Access Journals (Sweden)

    WANG Jiayao

    2017-10-01

    Full Text Available Cartography is an ancient science with almost as long a history as the world's oldest cultures. Since ancient times, the movement and change of all things and phenomena, including human activities, have taken place in a certain time and space. The development of science and technology and the progress of social civilization have made social management and governance more and more dependent on time and space. The information sources, themes, content, carriers, forms, production methods and application methods of maps have differed in different historical periods, and so has their overall value. With the arrival of the big data age, the scientific paradigm has entered the era of the "data-intensive" paradigm, and so has cartography, with obvious characteristics of big data science. All big data are caused by the movement and change of things and phenomena in the geographic world, so they have spatial and temporal characteristics and cannot be separated from spatial and temporal references. Therefore, big data is essentially big spatio-temporal data. Since the late 1950s and early 1960s, modern cartography, that is, cartography in the information age, has taken spatio-temporal data as its object and focused on the processing and expression of spatio-temporal data, but it has not faced the large-scale, multi-source, heterogeneous and multi-dimensional dynamic data flows (or flow data) from the sky to the sea. The real-time dynamic nature, thematic pertinence, content complexity, carrier diversification, personalized forms of expression, modernized production methods, and ubiquitous application of maps are incomparable to any past period, which leads to great changes in the theory, technology and application system of cartography. All these changes have occurred in the 60 years since the late 1950s and early 1960s, so this article was written to commemorate the 60th anniversary of the "Acta Geodaetica et Cartographica Sinica".

  2. Catheter indwell time and phlebitis development during peripheral intravenous catheter administration.

    Science.gov (United States)

    Pasalioglu, Kadriye Burcu; Kaya, Hatice

    2014-07-01

    Intravenous catheters have been indispensable tools of modern medicine. Although intravenous applications can be used for a multitude of purposes, they may cause complications, some of which have serious effects. Of these complications, the most commonly observed is phlebitis. This study was conducted to determine the effect of catheter indwell time on phlebitis development during peripheral intravenous catheter administration. The study included a total of 103 individuals who were administered 439 catheters and satisfied the study enrollment criteria at one infectious diseases clinic in Istanbul, Turkey. Data were compiled from Patient Information Forms, Peripheral Intravenous Catheter and Therapy Information Forms, reported grades based on the Visual Infusion Phlebitis Assessment Scale, and Peripheral Intravenous Catheter Nurse Observation Forms. The data were analyzed using SPSS. Results: The mean patient age was 53.75±15.54 (standard deviation) years, and 59.2% of the study participants were men. Phlebitis was detected in 41.2% of peripheral intravenous catheters, and the rate decreased with increased catheter indwell time. Analyses showed that catheter indwell time, antibiotic usage, sex, and catheterization sites were significantly associated with the development of phlebitis. The results of this study show that catheters can be used for longer periods of time when administered under optimal conditions and with appropriate surveillance.

  3. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. This book is also aimed at those who know Hadoop and want to build some intelligent applications over Big data with R packages. It would be helpful if readers have basic knowledge of R.

  4. Effects of Intravenous Administration of Human Umbilical Cord Blood Stem Cells in 3-Acetylpyridine-Lesioned Rats

    Science.gov (United States)

    Calatrava-Ferreras, Lucía; Gonzalo-Gobernado, Rafael; Herranz, Antonio S.; Reimers, Diana; Montero Vega, Teresa; Jiménez-Escrig, Adriano; Richart López, Luis Alberto; Bazán, Eulalia

    2012-01-01

    Cerebellar ataxias include a heterogeneous group of infrequent diseases characterized by lack of motor coordination caused by disturbances in the cerebellum and its associated circuits. Current therapies are based on the use of drugs that correct some of the molecular processes involved in their pathogenesis. Although these treatments have yielded promising results, there is not yet an effective therapy for these diseases. Cell replacement strategies using human umbilical cord blood mononuclear cells (HuUCBMCs) have emerged as a promising approach for restoration of function in neurodegenerative diseases. The aim of this work was to investigate the potential therapeutic activity of HuUCBMCs in the 3-acetylpyridine (3-AP) rat model of cerebellar ataxia. Intravenously administered HuUCBMCs reached the cerebellum and brain stem of 3-AP ataxic rats. Grafted cells reduced 3-AP-induced neuronal loss, promoted the activation of microglia in the brain stem, and prevented the overexpression of GFAP elicited by 3-AP in the cerebellum. In addition, HuUCBMCs upregulated the expression of proteins that are critical for cell survival, such as phospho-Akt and Bcl-2, in the cerebellum and brain stem of 3-AP ataxic rats. As all these effects were accompanied by a temporal but significant improvement in motor coordination, HuUCBMC grafts can be considered an effective cell replacement therapy for cerebellar disorders. PMID:23150735

  5. Big data in forensic science and medicine.

    Science.gov (United States)

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have got their own tribune on the topic. Perspectives and debates are flourishing, while a consensual definition for big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On one hand, techniques usually presented as specific to big data, such as machine learning techniques, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields, such as artificial intelligence, are often underestimated if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers, as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary, evidence-based approach in forensic science and medicine. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  6. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group with the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the TF including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  7. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  8. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    Science.gov (United States)

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than life time prevalence. This study was designed to assess the usability of the DSM-IV criteria based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.
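    The sensitivity and specificity reported above are simple ratios from the screening tool's confusion matrix. A minimal sketch, using hypothetical true/false positive counts chosen only to be consistent with the reported figures and sample sizes (n=300 patients, n=132 comparison); the exact counts are assumptions for illustration, not from the paper:

    ```python
    def sensitivity(tp: int, fn: int) -> float:
        """Proportion of true cases the screen detects: TP / (TP + FN)."""
        return tp / (tp + fn)

    def specificity(tn: int, fp: int) -> float:
        """Proportion of non-cases the screen rules out: TN / (TN + FP)."""
        return tn / (tn + fp)

    # Hypothetical counts: 299/300 cases screened positive, 127/132 controls negative.
    sens = sensitivity(tp=299, fn=1)   # ~0.997, matching the reported 99.7%
    spec = specificity(tn=127, fp=5)   # ~0.962, matching the reported 96.2%
    ```

    Any other split of the 300 cases and 132 controls would change these ratios, which is why screening papers usually report the raw counts alongside the percentages.
    
    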

  9. Estimation of absorbed doses in humans due to intravenous administration of fluorine-18-fluorodeoxyglucose in PET studies

    International Nuclear Information System (INIS)

    Mejia, A.A.; Nakamura, T.; Masatoshi, I.; Hatazawa, J.; Masaki, M.; Watanuki, S.

    1991-01-01

    Radiation absorbed doses due to intravenous administration of fluorine-18-fluorodeoxyglucose in positron emission tomography (PET) studies were estimated in normal volunteers. The time-activity curves were obtained for seven human organs (brain, heart, kidney, liver, lung, pancreas, and spleen) by using dynamic PET scans and for bladder content by using a single detector. These time-activity curves were used for the calculation of the cumulated activity in these organs. Absorbed doses were calculated by the MIRD method using the absorbed dose per unit of cumulated activity, the 'S' value, transformed for the Japanese physique and the organ masses of the Japanese reference man. The bladder wall and the heart were the organs receiving the highest doses, 1.2 × 10^-1 and 4.5 × 10^-2 mGy/MBq, respectively. The brain received a dose of 2.9 × 10^-2 mGy/MBq, and other organs received doses between 1.0 × 10^-2 and 3.0 × 10^-2 mGy/MBq. The effective dose equivalent was estimated to be 2.4 × 10^-2 mSv/MBq. These results were comparable to values of absorbed doses reported by other authors on the radiation dosimetry of this radiopharmaceutical.
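    Once per-administration dose coefficients like those above are available, the organ dose for a given injection is a single multiplication: administered activity (MBq) times the coefficient (mGy/MBq). A minimal sketch, using the coefficients quoted in the abstract and a hypothetical administered activity of 185 MBq (not from the paper):

    ```python
    # Dose coefficients (mGy/MBq) for 18F-FDG, as quoted in the abstract.
    DOSE_COEFFICIENTS = {
        "bladder_wall": 1.2e-1,
        "heart": 4.5e-2,
        "brain": 2.9e-2,
    }

    def absorbed_dose_mGy(organ: str, administered_MBq: float) -> float:
        """Organ absorbed dose = administered activity x dose coefficient."""
        return administered_MBq * DOSE_COEFFICIENTS[organ]

    # Example: a hypothetical 185 MBq administration.
    bladder_dose = absorbed_dose_mGy("bladder_wall", 185.0)  # ~22.2 mGy
    ```

    The MIRD formalism underneath the coefficients is the same shape: absorbed dose equals cumulated activity in the source organ times the S value, so precomputed mGy/MBq coefficients simply fold the biokinetics and S values into one number.
    
    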

  10. Spongiform leucoencephalopathy following intravenous heroin abuse: Radiological and histopathological findings

    International Nuclear Information System (INIS)

    Robertson, A.S.; Jain, S.; O'Neil, R.A.

    2001-01-01

    A case of spongiform leucoencephalopathy in a known intravenous heroin abuser is presented. To our knowledge, this is the only case of heroin-related spongiform leucoencephalopathy reported in Australia. The relationship to intravenous rather than inhaled heroin is particularly unusual, with only one other possible case documented in the literature. The imaging and histopathological findings are described. Neurological examination revealed disorientation in time and place, memory loss and cognitive impairment but no focal signs. Biochemical and haematological profiles were normal. Viral serology was positive for hepatitis C but negative for hepatitis B and human immunodeficiency virus (HIV). Cerebral CT revealed diffuse symmetrical hypodensity of the cerebral white matter. The ventricles and subarachnoid spaces were of normal size. Magnetic resonance imaging showed diffuse symmetrical signal abnormality in the cerebral white matter. These changes were hyperintense on proton density, T2-weighted, modified T2-weighted (FLAIR) and diffusion-weighted images. T1-weighted scans showed corresponding hypointensity. There was no enhancement after intravenous gadolinium. Cerebrospinal fluid (CSF) specimens were negative for a variety of virological, immunological and bacteriological markers. No viral or bacterial growth was demonstrated. Oligoclonal bands for multiple sclerosis and Protein 134 for Wilson's disease were negative. Right frontal brain biopsy showed spongiform white matter and degenerative change with prominent fibrous gliosis. In severely affected areas, loss of normal myelin staining and axonal loss were present, accompanied by scattered foamy macrophages. Loss of oligodendroglial nuclei was also present. There was no evidence of inflammation or progressive multifocal leucoencephalopathy. No bacteria or virus particles were seen on electron microscopic examination of the brain tissue.
Following the biopsy, the patient discharged himself from hospital and the

  11. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  12. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  13. Fremtidens landbrug bliver big business

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    The agricultural sector's external conditions and competitive environment are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized and more concentrated. Big business will be a dominant development in Danish agriculture - but not the only one...

  14. How universal is the Big Five? Testing the five-factor model of personality variation among forager-farmers in the Bolivian Amazon.

    Science.gov (United States)

    Gurven, Michael; von Rueden, Christopher; Massenkoff, Maxim; Kaplan, Hillard; Lero Vie, Marino

    2013-02-01

    The five-factor model (FFM) of personality variation has been replicated across a range of human societies, suggesting the FFM is a human universal. However, most studies of the FFM have been restricted to literate, urban populations, which are uncharacteristic of the majority of human evolutionary history. We present the first test of the FFM in a largely illiterate, indigenous society. Tsimane forager-horticulturalist men and women of Bolivia (n = 632) completed a translation of the 44-item Big Five Inventory (Benet-Martínez & John, 1998), a widely used metric of the FFM. We failed to find robust support for the FFM, based on tests of (a) internal consistency of items expected to segregate into the Big Five factors, (b) response stability of the Big Five, (c) external validity of the Big Five with respect to observed behavior, (d) factor structure according to exploratory and confirmatory factor analysis, and (e) similarity with a U.S. target structure based on Procrustes rotation analysis. Replication of the FFM was not improved in a separate sample of Tsimane adults (n = 430), who evaluated their spouses on the Big Five Inventory. Removal of reverse-scored items that may have elicited response biases produced factors suggestive of Extraversion, Agreeableness, and Conscientiousness, but fit to the FFM remained poor. Response styles may covary with exposure to education, but we found no better fit to the FFM among Tsimane who speak Spanish or have attended school. We argue that Tsimane personality variation displays 2 principal factors that may reflect socioecological characteristics common to small-scale societies. We offer evolutionary perspectives on why the structure of personality variation may not be invariant across human societies. (c) 2013 APA, all rights reserved.
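    The internal-consistency test in point (a) above is conventionally computed as Cronbach's alpha over the items assigned to each factor. A minimal sketch of that statistic (not the authors' code), using NumPy and a tiny synthetic item matrix invented for illustration:

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Synthetic 5-point responses to three items of one hypothetical factor.
    scores = np.array([
        [5, 4, 5],
        [3, 3, 4],
        [2, 2, 2],
        [4, 5, 4],
        [1, 2, 1],
    ])
    alpha = cronbach_alpha(scores)  # high alpha: the items move together
    ```

    When items expected to load on the same factor fail to covary in this way, alpha drops, which is one of the signals the study used to conclude that the Big Five structure did not replicate.
    
    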

  15. How Universal Is the Big Five? Testing the Five-Factor Model of Personality Variation Among Forager–Farmers in the Bolivian Amazon

    Science.gov (United States)

    Gurven, Michael; von Rueden, Christopher; Massenkoff, Maxim; Kaplan, Hillard; Vie, Marino Lero

    2014-01-01

    The five-factor model (FFM) of personality variation has been replicated across a range of human societies, suggesting the FFM is a human universal. However, most studies of the FFM have been restricted to literate, urban populations, which are uncharacteristic of the majority of human evolutionary history. We present the first test of the FFM in a largely illiterate, indigenous society. Tsimane forager–horticulturalist men and women of Bolivia (n = 632) completed a translation of the 44-item Big Five Inventory (Benet-Martínez & John, 1998), a widely used metric of the FFM. We failed to find robust support for the FFM, based on tests of (a) internal consistency of items expected to segregate into the Big Five factors, (b) response stability of the Big Five, (c) external validity of the Big Five with respect to observed behavior, (d) factor structure according to exploratory and confirmatory factor analysis, and (e) similarity with a U.S. target structure based on Procrustes rotation analysis. Replication of the FFM was not improved in a separate sample of Tsimane adults (n = 430), who evaluated their spouses on the Big Five Inventory. Removal of reverse-scored items that may have elicited response biases produced factors suggestive of Extraversion, Agreeableness, and Conscientiousness, but fit to the FFM remained poor. Response styles may covary with exposure to education, but we found no better fit to the FFM among Tsimane who speak Spanish or have attended school. We argue that Tsimane personality variation displays 2 principal factors that may reflect socioecological characteristics common to small-scale societies. We offer evolutionary perspectives on why the structure of personality variation may not be invariant across human societies. PMID:23245291
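The internal-consistency check listed under (a) above is conventionally quantified with Cronbach's alpha. A minimal sketch of that computation, using toy Likert data rather than anything from the study:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of respondents' total scores
    return k / (k - 1.0) * (1.0 - item_var_sum / total_var)

# Toy 6-respondent x 4-item Likert block (hypothetical data, not the Tsimane sample)
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
    [4, 4, 4, 3],
])
alpha = cronbach_alpha(scores)  # close to 1 when the items track a single factor
```

Values above roughly 0.7 are usually read as acceptable internal consistency; the abstract reports that the expected Big Five item groupings did not show this kind of robust consistency in the Tsimane sample.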

  16. Intravenous to oral conversion of fluoroquinolones: knowledge versus clinical practice patterns.

    Science.gov (United States)

    Conort, Ornella; Gabardi, Steven; Didier, Marie-Pauline; Hazebroucq, Georges; Cariou, Alain

    2002-04-01

    To assess the knowledge of prescribers regarding intravenous to oral conversions of fluoroquinolones, the frequency and time until conversion, and to compare prescriber knowledge with the data collected concerning the reasons stated for continuation of intravenous fluoroquinolones. Prospective chart review and questionnaire. Large teaching hospital in Paris, France. Fifty-one males and females. Data were collected on in-patients receiving intravenous fluoroquinolone for at least three days and hospitalized in one of six in-patient units. Patients receiving intravenous fluoroquinolone for less than three days were excluded. A questionnaire to assess the awareness of a potential conversion was distributed to those practitioners who had patients reviewed during the data-collection phase. The questionnaire revealed the ten most common reasons for continuing intravenous administration for more than three days. However, the physicians agreed that most patients should be converted as soon as possible. Practice patterns differed, with only 17 of 51 patients actually converted to oral therapy. In theory, the clinicians were aware of when to perform the conversion. However, in practice, the frequency of conversion was lower than optimum. Changes in clinical practice are needed to decrease the costs of intravenous therapy, without jeopardizing quality of care.

  17. The Ecology of Human Mobility

    KAUST Repository

    Meekan, Mark G.

    2017-02-03

    Mobile phones and other geolocated devices have produced unprecedented volumes of data on human movement. Analysis of pooled individual human trajectories using big data approaches has revealed a wealth of emergent features that have ecological parallels in animals across a diverse array of phenomena including commuting, epidemics, the spread of innovations and culture, and collective behaviour. Movement ecology, which explores how animals cope with and optimize variability in resources, has the potential to provide a theoretical framework to aid an understanding of human mobility and its impacts on ecosystems. In turn, big data on human movement can be explored in the context of animal movement ecology to provide solutions for urgent conservation problems and management challenges.

  18. Quantum nature of the big bang.

    Science.gov (United States)

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  19. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    Science.gov (United States)

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  20. Optimal timing for intravenous administration set replacement.

    Science.gov (United States)

    Gillies, D; O'Riordan, L; Wallen, M; Morrison, A; Rankin, K; Nagy, S

    2005-10-19

Administration of intravenous therapy is a common occurrence within the hospital setting. Routine replacement of administration sets has been advocated to reduce intravenous infusion contamination. If decreasing the frequency of changing intravenous administration sets does not increase infection rates, a change in practice could result in considerable cost savings. The objective of this review was to identify the optimal interval for the routine replacement of intravenous administration sets when infusate or parenteral nutrition (lipid and non-lipid) solutions are administered to people in hospital via central or peripheral venous catheters. We searched The Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, CINAHL, EMBASE: all from inception to February 2004; reference lists of identified trials, and bibliographies of published reviews. We also contacted researchers in the field. We did not have a language restriction. We included all randomized or quasi-randomized controlled trials addressing the frequency of replacing intravenous administration sets when parenteral nutrition (lipid and non-lipid containing solutions) or infusions (excluding blood) were administered to people in hospital via a central or peripheral catheter. Two authors assessed all potentially relevant studies. We resolved disagreements between the two authors by discussion with a third author. We collected data for the outcomes: infusate contamination; infusate-related bloodstream infection; catheter contamination; catheter-related bloodstream infection; all-cause bloodstream infection; and all-cause mortality. We identified 23 references for review. We excluded eight of these studies; five because they did not fit the inclusion criteria and three because of inadequate data. We extracted data from the remaining 15 references (13 studies) with 4783 participants. We conclude that there is no evidence that changing intravenous administration sets more often than every 96 hours

  1. The Big Five of Personality and structural imaging revisited: a VBM - DARTEL study.

    Science.gov (United States)

    Liu, Wei-Yin; Weber, Bernd; Reuter, Martin; Markett, Sebastian; Chu, Woei-Chyn; Montag, Christian

    2013-05-08

The present study focuses on the neurostructural foundations of the human personality. In a large sample of 227 healthy human individuals (168 women and 59 men), we used MRI to examine the relationship between personality traits and both regional gray and white matter volume, while controlling for age and sex. Personality was assessed using the German version of the NEO Five-Factor Inventory that measures individual differences in the 'Big Five of Personality': extraversion, neuroticism, agreeableness, conscientiousness, and openness to experience. In contrast to most previous studies on neural correlates of the Big Five, we used improved processing strategies: white and gray matter were independently assessed by segmentation steps before data analysis. In addition, customized sex-specific diffeomorphic anatomical registration using exponentiated Lie algebra templates were used. Our results did not show significant correlations between any dimension of the Big Five and regional gray matter volume. However, among others, higher conscientiousness scores correlated significantly with reductions in regional white matter volume in different brain areas, including the right insula, putamen, caudate, and left fusiformis. These correlations were driven by the female subsample. The present study suggests that many results from the literature on the neurostructural basis of personality should be reviewed carefully, considering the results when the sample size is larger, imaging methods are rigorously applied, and sex-related and age-related effects are controlled.

  2. Big data processing in the cloud - Challenges and platforms

    Science.gov (United States)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed: the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, the most important and most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
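The Lambda architecture mentioned in the abstract combines a batch layer that periodically recomputes views over the full dataset with a speed layer that tracks recent events incrementally; a serving layer merges the two at query time. A toy sketch of that pattern (illustrative only; the names and structure are our own, not taken from the paper):

```python
from collections import Counter

class LambdaSketch:
    """Toy Lambda architecture: batch view plus incremental speed view, merged on query."""

    def __init__(self):
        self.master = []             # immutable master dataset fed to the batch layer
        self.batch_view = Counter()  # precomputed view over the full dataset
        self.speed_view = Counter()  # covers events arriving since the last batch run

    def ingest(self, event):
        self.master.append(event)
        self.speed_view[event] += 1  # real-time increment in the speed layer

    def run_batch(self):
        # Recompute the view from the whole master dataset, then reset the speed layer
        self.batch_view = Counter(self.master)
        self.speed_view.clear()

    def query(self, key):
        # Serving layer: merge the precomputed batch view with recent deltas
        return self.batch_view[key] + self.speed_view[key]
```

In the Kappa architecture, by contrast, the batch layer is dropped entirely: one streaming codebase serves both roles, and historical recomputation is done by replaying the event log.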

  3. Intravenous human immunoglobulins for refractory recurrent pericarditis: a systematic review of all published cases.

    Science.gov (United States)

    Imazio, Massimo; Lazaros, George; Picardi, Elisa; Vasileiou, Panagiotis; Carraro, Mara; Tousoulis, Dimitrios; Belli, Riccardo; Gaita, Fiorenzo

    2016-04-01

Refractory recurrent pericarditis is a major clinical challenge after colchicine failure, especially in corticosteroid-dependent patients. Human intravenous immunoglobulins (IVIGs) have been proposed as possible therapeutic options for these cases. The goal of this systematic review is to assess the efficacy and safety of IVIGs in this context. Studies reporting the use of IVIG for the treatment of recurrent pericarditis and published up to October 2014 were searched in several databases. All references found, upon initial assessment at title and abstract level for suitability, were consequently retrieved as full reports for further appraisal. Among the 18 citations retrieved, 17 reports (4 case series and 13 single case reports, with an overall population of 30 patients) were included. The mean disease duration was 14 months and the mean number of recurrences before IVIG was 3. Approximately 47% of patients had idiopathic recurrent pericarditis, 10% had an infective cause, and the remainder a systemic inflammatory disease. Nineteen out of the 30 patients (63.3%) were on corticosteroids at IVIG commencement. IVIGs were generally administered at a dose of 400-500 mg/kg/day for 5 consecutive days with repeated cycles according to the clinical response. Complications were uncommon (headache in ~3%) and not life-threatening. After a mean follow-up of approximately 33 months, recurrences occurred in 26.6% of cases after the first IVIG cycle, and 22 of the 30 patients (73.3%) were recurrence-free. Five patients (16.6%) were on corticosteroids at the end of the follow-up. IVIGs are rapidly acting, well tolerated, and efficacious steroid-sparing agents in refractory pericarditis.

  4. Ethics and Epistemology in Big Data Research.

    Science.gov (United States)

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  5. Victoria Stodden: Scholarly Communication in the Era of Big Data and Big Computation

    OpenAIRE

    Stodden, Victoria

    2015-01-01

    Victoria Stodden gave the keynote address for Open Access Week 2015. "Scholarly communication in the era of big data and big computation" was sponsored by the University Libraries, Computational Modeling and Data Analytics, the Department of Computer Science, the Department of Statistics, the Laboratory for Interdisciplinary Statistical Analysis (LISA), and the Virginia Bioinformatics Institute. Victoria Stodden is an associate professor in the Graduate School of Library and Information Scien...

  6. Big data analytics a management perspective

    CERN Document Server

    Corea, Francesco

    2016-01-01

    This book is about innovation, big data, and data science seen from a business perspective. Big data is a buzzword nowadays, and there is a growing necessity within practitioners to understand better the phenomenon, starting from a clear stated definition. This book aims to be a starting reading for executives who want (and need) to keep the pace with the technological breakthrough introduced by new analytical techniques and piles of data. Common myths about big data will be explained, and a series of different strategic approaches will be provided. By browsing the book, it will be possible to learn how to implement a big data strategy and how to use a maturity framework to monitor the progress of the data science team, as well as how to move forward from one stage to the next. Crucial challenges related to big data will be discussed, where some of them are more general - such as ethics, privacy, and ownership – while others concern more specific business situations (e.g., initial public offering, growth st...

  7. Slaves to Big Data. Or Are We?

    Directory of Open Access Journals (Sweden)

    Mireille Hildebrandt

    2013-10-01

    In this contribution, the notion of Big Data is discussed in relation to the monetisation of personal data. The claim of some proponents, as well as adversaries, that Big Data implies that ‘n = all’, meaning that we no longer need to rely on samples because we have all the data, is scrutinised and found to be both overly optimistic and unnecessarily pessimistic. A set of epistemological and ethical issues is presented, focusing on the implications of Big Data for our perception, cognition, fairness, privacy and due process. The article then looks into the idea of user-centric personal data management to investigate to what extent it provides solutions for some of the problems triggered by the Big Data conundrum. Special attention is paid to the core principle of data protection legislation, namely purpose binding. Finally, this contribution seeks to inquire into the influence of Big Data politics on self, mind and society, and asks how we can prevent ourselves from becoming slaves to Big Data.

  8. Will Organization Design Be Affected By Big Data?

    Directory of Open Access Journals (Sweden)

    Giles Slinger

    2014-12-01

Computing power and analytical methods allow us to create, collate, and analyze more data than ever before. When datasets are unusually large in volume, velocity, and variety, they are referred to as “big data.” Some observers have suggested that in order to cope with big data (a) organizational structures will need to change and (b) the processes used to design organizations will be different. In this article, we differentiate big data from relatively slow-moving, linked people data. We argue that big data will change organizational structures as organizations pursue the opportunities presented by big data. The processes by which organizations are designed, however, will be relatively unaffected by big data. Instead, organization design processes will be more affected by the complex links found in people data.

  9. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  10. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

The aim of this bachelor's thesis is to describe the Big Data domain and the OLAP aggregation operations for decision support that are applied to it using the Apache Hadoop technology. The greater part of the thesis is devoted to describing this technology. The final chapter deals with how the aggregation operations are applied and with the issues involved in their implementation. An overall evaluation of the work and the possibilities for future use of the resulting system conclude the thesis.

  11. Effects of a transmitted light device for pediatric peripheral venipuncture and intravenous cannulation

    Directory of Open Access Journals (Sweden)

    Yamazaki S

    2011-10-01

Shinya Yamazaki,1 Shu Tomita,1 Masahiro Watanabe,1 Hiroyoshi Kawaai,1 Kazuhiro Shimamura2; 1Department of Dental Anesthesiology, 2Department of Pediatric Dentistry, Ohu University Dental Hospital, Koriyama City, Fukushima Prefecture, Japan. Abstract: Pediatric peripheral venipuncture and intravenous cannulation are difficult. However, successful venipuncture and intravenous cannulation are absolutely required for pediatric clinical risk management. This study assessed the success rate of venipuncture and intravenous cannulation when transmitted light was applied to the pediatric dorsum manus. The subjects included 100 young children who were scheduled for dental treatment or oral surgery under general anesthesia. Anesthesia was induced, and insertion of an intravenous catheter into the dorsum manus was attempted with or without using transmitted light. The patients were evaluated to determine whether the venipuncture was successful, and whether the intravenous cannulation of the external catheter was successful. The success rate of venipuncture was 100% when transmitted light was used, and 83% when it was not (P = 0.000016). In addition, the success rate of intravenous cannulation was 88% when transmitted light was used, and 55% when it was not (P = 0.0000002). The shape of the vein in the dorsum manus can be clearly recognized when transmitted light is used. The use of light significantly increased the success rate of intravenous cannulation, because it allowed direct confirmation of the direction to push the intravenous catheter forward. The use of transmitted light allows for more successful venipuncture and intravenous cannulation in young children. Keywords: transmitted light, pediatric peripheral venipuncture, pediatric peripheral intravenous cannulation

  12. Big Data solutions on a small scale: Evaluating accessible high-performance computing for social research

    Directory of Open Access Journals (Sweden)

    Dhiraj Murthy

    2014-11-01

Though full of promise, Big Data research success is often contingent on access to the newest, most advanced, and often expensive hardware systems and the expertise needed to build and implement such systems. As a result, the accessibility of the growing number of Big Data-capable technology solutions has often been the preserve of business analytics. Pay-as-you-store/process services such as Amazon Web Services have opened up possibilities for smaller scale Big Data projects. There is high demand for this type of research in the digital humanities and digital sociology, for example. However, scholars are increasingly finding themselves at a disadvantage as available data sets of interest continue to grow in size and complexity. Without a large amount of funding or the ability to form interdisciplinary partnerships, only a select few find themselves in the position to successfully engage Big Data. This article identifies several notable and popular Big Data technologies typically implemented using large and extremely powerful cloud-based systems and investigates the feasibility and utility of development of Big Data analytics systems implemented using low-cost commodity hardware in basic and easily maintainable configurations for use within academic social research. Through our investigation and experimental case study (in the growing field of social Twitter analytics), we found that not only are solutions like Cloudera’s Hadoop feasible, but that they can also enable robust, deep, and fruitful research outcomes in a variety of use-case scenarios across the disciplines.
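Jobs of the kind described, such as hashtag counting over Twitter data on a small Hadoop cluster, reduce to a map phase emitting (key, 1) pairs and a reduce phase summing them per key. A self-contained Python sketch of that logic (an in-process stand-in for a hypothetical Hadoop Streaming mapper/reducer pair, not code from the article):

```python
import re
from collections import defaultdict

HASHTAG = re.compile(r"#\w+")

def mapper(tweets):
    """Emit (hashtag, 1) pairs, as a Hadoop Streaming mapper would per input line."""
    for text in tweets:
        for tag in HASHTAG.findall(text.lower()):
            yield tag, 1

def reducer(pairs):
    """Sum counts per key, as the reduce phase does after the shuffle/sort."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

# Hypothetical sample tweets
counts = reducer(mapper([
    "Big Data on a budget #hadoop #research",
    "Commodity clusters work fine #Hadoop",
]))
```

Under Hadoop Streaming, the same two functions would read from stdin and write tab-separated key/value lines to stdout, with the framework handling partitioning and shuffling across the commodity nodes.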

  13. Heterogeneity of pituitary and plasma prolactin in man: decreased affinity of big prolactin in a radioreceptor assay and evidence for its secretion

    International Nuclear Information System (INIS)

    Garnier, P.E.; Aubert, M.L.; Kaplan, S.L.; Grumbach, M.M.

    1978-01-01

Molecular heterogeneity of immunoreactive human PRL (IR-hPRL) in plasma was assessed by exclusion chromatography in blood from 4 normal adults, 3 newborn infants, 2 late gestational women, 3 patients with primary hypothyroidism and high PRL levels, 2 with functional hyperprolactinemia, 3 with acromegaly, and 10 with PRL-secreting tumors. Three forms of PRL were detected: big-big hPRL, big hPRL, and little hPRL. In normal subjects, the proportion of big-big, big, and little hPRL components was 5.1%, 9.1%, and 85.8%, respectively, without change in the distribution after TRF stimulation. In 8 of 10 patients with PRL-secreting tumors, we detected a significantly higher proportion of big PRL. In 2 additional patients with prolactinomas, the proportion of big PRL was much higher. In 3 of 10 patients, the molecular heterogeneity of the tumor PRL was similar to that in plasma. In 1 acromegalic, there was a very high proportion of big-big hPRL. The PRL fractions were tested in a radioreceptor assay (RRA) using membranes from rabbit mammary gland. Big PRL was much less active than little PRL in the RRA. The fractions were rechromatographed after storage. Big PRL partially distributed as little or big-big PRL, while little PRL remained unchanged. Big-big PRL from tumor extract partially converted into big and little PRL. The big PRL obtained by rechromatography had low activity in the RRA. These observations suggest at least part of the receptor activity of big PRL may arise from generation of or contamination by little PRL. The decreased binding affinity of big PRL in the RRA also indicates that big PRL has little, if any, biological activity. The evidence suggests big PRL is a native PRL dimer linked by intermolecular disulfide bonds which arises in the lactotrope as a postsynthetic product or derivative and is not a true precursor prohormone

  14. BigDansing

    KAUST Repository

    Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané -Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

    of the underlying distributed platform. BigDansing takes these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized joins operators. Experimental results on both synthetic

  15. Leveraging Mobile Network Big Data for Developmental Policy ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Some argue that big data and big data users offer advantages to generate evidence. ... Supported by IDRC, this research focused on transportation planning in urban ... Using mobile network big data for land use classification CPRsouth 2015.

  16. Efficacy and safety of intravenous fentanyl administered by ambulance personnel

    DEFF Research Database (Denmark)

    Friesgaard, Kristian Dahl; Nikolajsen, Lone; Giebner, Matthias

    2016-01-01

BACKGROUND: Management of pain in the pre-hospital setting is often inadequate. In 2011, ambulance personnel were authorized to administer intravenous fentanyl in the Central Denmark Region. The aim of this study was to evaluate the efficacy and safety of intravenous fentanyl administered by ambulance personnel. METHODS: Pre-hospital medical charts from 2348 adults treated with intravenous fentanyl by ambulance personnel during a 6-month period were reviewed. The primary outcome was the change in pain intensity on a numeric rating scale (NRS) from before fentanyl treatment to hospital arrival … patients (1.3%) and hypotension observed in 71 patients (3.0%). CONCLUSION: Intravenous fentanyl caused clinically meaningful pain reduction in most patients and was safe in the hands of ambulance personnel. Many patients had moderate to severe pain at hospital arrival. As the protocol allowed higher doses …

  17. Transdisciplinary Perspectives in Bioethics: A Co-evolutionary Introduction from the Big History

    OpenAIRE

    Javier Collado-Ruano

    2016-01-01

    The main objective of this work is to expand the bioethics notion expressed in the Article 17th of the Universal Declaration on Bioethics and Human Rights, concerning the interconnections between human beings and other life forms. For this purpose, it is combined the transdisciplinary methodology with the theoretical framework of the “Big History” to approach the co-evolutionary phenomena that life is developing on Earth for some 3.8 billion years. As a result, the study introduces us to t...

  18. Intravenous infusion of H2-saline suppresses oxidative stress and elevates antioxidant potential in Thoroughbred horses after racing exercise.

    Science.gov (United States)

    Yamazaki, Masahiko; Kusano, Kanichi; Ishibashi, Toru; Kiuchi, Masataka; Koyama, Katsuhiro

    2015-10-23

    Upon intensive, exhaustive exercise, exercise-induced reactive oxygen species may exceed the antioxidant defence threshold, consequently resulting in muscular damage or late-onset chronic inflammation. Recently, the therapeutic antioxidant and anti-inflammatory effects of molecular hydrogen (H2) for human rheumatoid arthritis have been demonstrated. However, it is also important to clarify the effects of administrating H2 in large animals other than humans, as H2 is thought to reach the target organ by passive diffusion upon delivery from the blood flow, indicating that the distance from the administration point to the target is critical. However, data on the effects of H2 on oxidative stress in real-life exhaustive exercise in large animals are currently lacking. We here investigated 13 Thoroughbred horses administered intravenous 2-L saline with or without 0.6-ppm H2 (placebo, N = 6; H2, N = 7) before participating in a high-intensity simulation race. Intravenous H2-saline significantly suppressed oxidative stress immediately, 3 h, and 24 h after the race, although the antioxidant capability was not affected throughout the study. The serum creatine kinase, lactate, and uric acid levels were increased in both groups. Taken together, these results indicate that intravenous H2-saline can significantly and specifically suppress oxidative stress induced after exhaustive racing in Thoroughbred horses.

  19. Catheter fracture of intravenous ports and its management.

    Science.gov (United States)

    Wu, Ching-Yang; Fu, Jui-Ying; Feng, Po-Hao; Kao, Tsung-Chi; Yu, Sheng-Yueh; Li, Hao-Jui; Ko, Po-Jen; Hsieh, Hung-Chang

    2011-11-01

    Intravenous ports are widely used for oncology patients. However, catheter fractures may lead to the need for re-intervention. We aimed to identify the risk factors associated with catheter fractures. Between January 1 and December 31, 2006, we retrospectively reviewed the clinical data and plain chest films of 1,505 patients implanted with an intravenous port at Chang Gung Memorial Hospital. Different vascular sites were compared using the chi-square or Fisher's exact test for categorical variables, and the t test was used for continuous variables with normal distribution; P port type Arrow French (Fr.) 8.1 (P port and catheter removal is recommended. Female gender, intravenous port implantation via the subclavian route, and the Arrow Fr. 8.1 port were found to be risk factors. Patients with these risk factors should be monitored closely to avoid catheter fractures.

  20. The possible role of intravenous lipid emulsion in the treatment of chemical warfare agent poisoning

    Directory of Open Access Journals (Sweden)

    Arik Eisenkraft

Organophosphates (OPs) are cholinesterase inhibitors that lead to a characteristic toxidrome of hypersecretion, miosis, dyspnea, respiratory insufficiency, convulsions and, without proper and early antidotal treatment, death. Most of these compounds are highly lipophilic. Sulfur mustard is a toxic lipophilic alkylating agent, exerting its damage through alkylation of cellular macromolecules (e.g., DNA, proteins) and intense activation of pro-inflammatory pathways. Currently approved antidotes against OPs include the peripheral anticholinergic drug atropine and an oxime that reactivates the inhibited cholinesterase. Benzodiazepines are used to stop organophosphate-induced seizures. Despite these approved drugs, efforts have been made to introduce other medical countermeasures in order to attenuate both the short-term and long-term clinical effects following exposure. Currently, there is no antidote against sulfur mustard poisoning. Intravenous lipid emulsions are used as a source of calories in parenteral nutrition. In recent years, efficacy of lipid emulsions has been shown in the treatment of poisoning by fat-soluble compounds in animal models as well as clinically in humans. In this review we discuss the usefulness of intravenous lipid emulsions as an adjunct to the in-hospital treatment of chemical warfare agent poisoning. Keywords: Intravenous lipid emulsion, Organophosphates, Sulfur mustard, Antidotes, Poisoning, Chemical Warfare agents

  1. Practice Variation in Big-4 Transparency Reports

    DEFF Research Database (Denmark)

    Girdhar, Sakshi; Klarskov Jeppesen, Kim

    2018-01-01

    Purpose: The purpose of this paper is to examine the transparency reports published by the Big-4 public accounting firms in the UK, Germany and Denmark to understand the determinants of their content within the networks of big accounting firms. Design/methodology/approach: The study draws on a qualitative research approach, in which the content of transparency reports is analyzed and semi-structured interviews are conducted with key people from the Big-4 firms who are responsible for developing the transparency reports. Findings: The findings show that the content of transparency reports is inconsistent and the transparency reporting practice is not uniform within the Big-4 networks. Differences were found in the way in which the transparency reporting practices are coordinated globally by the respective central governing bodies of the Big-4. The content of the transparency reports …

  2. Successful outcome after intravenous gasoline injection.

    Science.gov (United States)

    Domej, Wolfgang; Mitterhammer, Heike; Stauber, Rudolf; Kaufmann, Peter; Smolle, Karl Heinz

    2007-12-01

    Gasoline, ingested intentionally or accidentally, is toxic. The majority of reported cases of gasoline intoxication involve oral ingestion or inhalation. Data are scarce on complications and outcomes following hydrocarbon poisoning by intravenous injection. Following a suicide attempt by intravenous self-injection of 10 ml of gasoline, a 26-year-old medical student was admitted to the intensive care unit (ICU) with hemoptysis, symptoms of acute respiratory failure, chest pain, and severe abdominal cramps. Gas exchange was severely impaired and a chest x-ray indicated chemical pneumonitis. Initial treatment consisted of mechanical ventilation, supportive hyperventilation, administration of nitric oxide (NO), and prednisone. Unfortunately, the patient developed multi-organ dysfunction syndrome (MODS) complicated by life-threatening severe vasoplegia within 24 hours after gasoline injection. High doses of vasopressors along with massive amounts of parenteral fluids were necessary. Despite fluid replacement, renal function worsened and required hemofiltration on 5 sequential days. After 12 days of intensive care management, the patient recovered completely and was discharged to a psychiatric care facility. Intravenous gasoline injection causes major injury to the lungs, the organ bearing the first capillary bed encountered. Treatment of gasoline poisoning is symptomatic because no specific antidote is available. Early and aggressive supportive care may be conducive to a favorable outcome with minimal residual pulmonary sequelae.

  3. A randomized clinical trial of recombinant human hyaluronidase-facilitated subcutaneous versus intravenous rehydration in mild to moderately dehydrated children in the emergency department.

    Science.gov (United States)

    Spandorfer, Philip R; Mace, Sharon E; Okada, Pamela J; Simon, Harold K; Allen, Coburn H; Spiro, David M; Friend, Keith; Harb, George; Lebel, Francois

    2012-11-01

    Alternative treatment of dehydration is needed when intravenous (IV) or oral rehydration therapy fails. Subcutaneous (SC) hydration facilitated by recombinant human hyaluronidase offers an alternative treatment for dehydration. This clinical trial is the first to compare recombinant human hyaluronidase-facilitated SC (rHFSC) rehydration with standard IV rehydration for use in dehydrated children. This Phase IV noninferiority trial evaluated whether rHFSC fluid administration can be given safely and effectively, with volumes similar to those delivered intravenously, to children who have mild to moderate dehydration. The study included mild to moderately dehydrated children (Gorelick dehydration score) aged 1 month to 10 years. They were randomized to receive 20 mL/kg of isotonic fluids using rHFSC or IV therapy over 1 hour and then as needed until clinically rehydrated. The primary outcome was total volume of fluid administered (emergency department [ED] plus inpatient hospitalization). Secondary outcomes included mean volume infused in the ED alone, postinfusion dehydration scores and weight changes, line placement success and time, safety, and provider and parent/guardian questionnaires. 148 patients (mean age, 2.3 [1.91] years; white, 53.4%; black, 31.8%) were enrolled in the intention-to-treat population (73 rHFSC; 75 IV). The primary outcome, mean total volume infused, was 365.0 (324.6) mL in the rHFSC group over 3.1 hours versus 455.8 (597.4) mL in the IV group over 6.6 hours (P = 0.51). The secondary outcome of mean volume infused in the ED alone was 334.3 (226.40) mL in the rHFSC group versus 299.6 (252.33) mL in the IV group (P = 0.03). Dehydration scores and weight changes postinfusion were similar. Successful line placement occurred in all 73 rHFSC-treated patients and 59 of 75 (78.7%) IV-treated patients. In mild to moderately dehydrated children, rHFSC was inferior to IV hydration for the primary outcome measure; however, rHFSC was noninferior in the ED phase of hydration.

  4. Big data and biomedical informatics: a challenging opportunity.

    Science.gov (United States)

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasms, to fully exploit the available technologies, and to improve data processing and data management regulations.

  5. Big data reduction framework for value creation in sustainable enterprises

    OpenAIRE

    Rehman, Muhammad Habib ur; Chang, Victor; Batool, Aisha; Teh, Ying Wah

    2016-01-01

    Value creation is a major sustainability factor for enterprises, in addition to profit maximization and revenue generation. Modern enterprises collect big data from various inbound and outbound data sources. The inbound data sources handle data generated from the results of business operations, such as manufacturing, supply chain management, marketing, and human resource management, among others. Outbound data sources handle customer-generated data which are acquired directly or indirectly fr...

  6. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and a search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  7. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013 CERN launches a scientific tourist trail through the Pays de Gex and the Canton of Geneva known as the Passport to the Big Bang. Poster and Programme.

  8. Intravenous Transplantation of Mesenchymal Stromal Cells to Enhance Peripheral Nerve Regeneration

    Directory of Open Access Journals (Sweden)

    Stella M. Matthes

    2013-01-01

    Full Text Available Peripheral nerve injury is a common and devastating complication after trauma and can cause irreversible impairment or even complete functional loss of the affected limb. While peripheral nerve repair results in some axonal regeneration and functional recovery, the clinical outcome is not optimal, and research continues to optimize functional recovery after nerve repair. Cell transplantation approaches are being used experimentally to enhance regeneration. Intravenous infusion of mesenchymal stromal cells (MSCs) in spinal cord injury and stroke was shown to improve functional outcome. However, the repair potential of intravenously transplanted MSCs in peripheral nerve injury has not yet been addressed. Here we describe the impact of intravenously infused MSCs on functional outcome in a peripheral nerve injury model. Rat sciatic nerves were transected, followed by intravenous MSC transplantation. Footprint analysis was carried out, and 21 days after transplantation the nerves were removed for histology. Labelled MSCs were found in the sciatic nerve lesion site after intravenous injection, and regeneration was improved. Intravenously infused MSCs after acute peripheral nerve injury targeted the lesion site and survived within the nerve, and the MSC-treated group showed greater functional improvement. The results of this study suggest that nerve repair with cell transplantation could lead to a greater functional outcome.

  9. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  10. Systematic review of the effect of intravenous lipid emulsion therapy for local anesthetic toxicity

    DEFF Research Database (Denmark)

    Høgberg, Lotte Christine Groth; Bania, Theodore C; Lavergne, Valéry

    2016-01-01

    BACKGROUND: Following national and regional recommendations, intravenous lipid emulsion (ILE) has become established in clinical practice as a treatment for acute local anesthetic (LA) toxicity, although evidence of efficacy is limited to animal studies and human case reports. A collaborative lipid emulsion workgroup therefore reviewed the evidence using pre-defined inclusion and exclusion criteria. Pre-treatment experiments, pharmacokinetic studies not involving toxicity and studies that did not address antidotal use of ILE were excluded. RESULTS: We included 113 studies and reports. Of these, 76 were human and 38 animal studies. One publication included both a human case report and an animal study. Human studies included one randomized controlled crossover trial involving 16 healthy volunteers. The subclinical LA toxicity design did not show a difference in the effects of ILE versus saline. There was one case series and 73 case reports of ILE use in the context of LA toxicity …

  11. Integrating R and Hadoop for Big Data Analysis

    OpenAIRE

    Bogdan Oancea; Raluca Mariana Dragoescu

    2014-01-01

    Analyzing and working with big data could be very difficult using classical means like relational database management systems or desktop software packages for statistics and visualization. Instead, big data requires large clusters with hundreds or even thousands of computing nodes. Official statistics is increasingly considering big data for deriving new statistics because big data sources could produce more relevant and timely statistics than traditional sources. One of the software tools ...

  12. Patient Privacy in the Era of Big Data.

    Science.gov (United States)

    Kayaalp, Mehmet

    2018-01-20

    Privacy was defined as a fundamental human right in the Universal Declaration of Human Rights at the 1948 United Nations General Assembly. However, there is still no consensus on what constitutes privacy. In this review, we look at the evolution of privacy as a concept from the era of Hippocrates to the era of social media and big data. To appreciate the modern measures of patient privacy protection and correctly interpret the current regulatory framework in the United States, we need to analyze and understand the concepts of individually identifiable information, individually identifiable health information, protected health information, and de-identification. The Privacy Rule of the Health Insurance Portability and Accountability Act defines the regulatory framework and casts a balance between protective measures and access to health information for secondary (scientific) use. The rule defines the conditions when health information is protected by law and how protected health information can be de-identified for secondary use. With the advent of artificial intelligence and computational linguistics, computational text de-identification algorithms produce de-identified results nearly as well as those produced by human experts, but much faster, more consistently and basically for free. Modern clinical text de-identification systems now pave the road to big data and enable scientists to access de-identified clinical information while firmly protecting patient privacy. However, clinical text de-identification is not a perfect process. In order to maximize the protection of patient privacy and to free clinical and scientific information from the confines of electronic healthcare systems, all stakeholders, including patients, health institutions and institutional review boards, scientists and the scientific communities, as well as regulatory and law enforcement agencies must collaborate closely. On the one hand, public health laws and privacy regulations define rules …
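The rule-based side of text de-identification described in this review can be illustrated in miniature; the patterns, tags, and sample note below are all invented for illustration, and production systems combine far richer lexicons with machine-learned models:

```python
import re

# Toy rule-based clinical text de-identification: each regex maps a class
# of identifier to a surrogate tag (hypothetical patterns, not a real system).
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),        # US SSN format
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"), "[DATE]"),   # m/d/yyyy dates
    (re.compile(r"\bDr\.\s+[A-Z][a-z]+\b"), "[PHYSICIAN]"), # titled surnames
]

def deidentify(text: str) -> str:
    """Replace every matched identifier with its surrogate tag."""
    for pattern, tag in PATTERNS:
        text = pattern.sub(tag, text)
    return text

note = "Seen by Dr. Smith on 03/14/2017, SSN 123-45-6789."
print(deidentify(note))  # Seen by [PHYSICIAN] on [DATE], SSN [SSN].
```

As the review notes, no rule set is perfect: recall depends entirely on how well the patterns cover the identifiers actually present in the notes.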

  13. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths, weaknesses, and risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  14. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  15. Physics with Big Karl Brainstorming. Abstracts

    International Nuclear Information System (INIS)

    Machner, H.; Lieb, J.

    2000-08-01

    Before summarizing details of the meeting, a short description of the spectrometer facility Big Karl is given. The facility is essentially a new instrument using refurbished dipole magnets from its predecessor. The large acceptance quadrupole magnets and the beam optics are new. Big Karl has a design very similar to the focusing spectrometers at MAMI (Mainz), AGOR (Groningen) and the high resolution spectrometer (HRS) in Hall A at Jefferson Laboratory, with ΔE/E = 10⁻⁴ but at a somewhat lower maximum momentum. The focal plane detectors, consisting of multiwire drift chambers and scintillating hodoscopes, are similar. Unlike HRS, Big Karl still needs Cerenkov counters and polarimeters in its focal plane; these detectors are necessary to perform some of the experiments proposed during the brainstorming. In addition, Big Karl allows emission angle reconstruction via track measurements in its focal plane with high resolution. In the following, the physics highlights and the proposed and potential experiments are summarized. During the meeting it became obvious that the physics to be explored at Big Karl can be grouped into five distinct categories, and this summary is organized accordingly. (orig.)

  16. The Ecology of Human Mobility.

    Science.gov (United States)

    Meekan, Mark G; Duarte, Carlos M; Fernández-Gracia, Juan; Thums, Michele; Sequeira, Ana M M; Harcourt, Rob; Eguíluz, Víctor M

    2017-03-01

    Mobile phones and other geolocated devices have produced unprecedented volumes of data on human movement. Analysis of pooled individual human trajectories using big data approaches has revealed a wealth of emergent features that have ecological parallels in animals across a diverse array of phenomena including commuting, epidemics, the spread of innovations and culture, and collective behaviour. Movement ecology, which explores how animals cope with and optimize variability in resources, has the potential to provide a theoretical framework to aid an understanding of human mobility and its impacts on ecosystems. In turn, big data on human movement can be explored in the context of animal movement ecology to provide solutions for urgent conservation problems and management challenges. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    Science.gov (United States)

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Laurenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  18. Application and Prospect of Big Data in Water Resources

    Science.gov (United States)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have been created and commonly applied in many fields. However, academic studies have only recently turned their attention to Big Data applications in water resources, so water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying big data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical frame, but we define "Water Big Data" and explain its three-dimensional properties: the time dimension, spatial dimension and intelligent dimension. Based on HBase, the classification system of Water Big Data is introduced: hydrology data, ecology data and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, are proposed. Finally, the prospect of applying big data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data Driven Decision) will be utilized more in water resources management in the future.
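The MapReduce model named above splits a computation into a map step that emits key-value pairs and a reduce step that aggregates them per key. A toy sketch in plain Python (standing in for a Hadoop cluster; the gauge stations and readings are invented):

```python
from collections import defaultdict

def map_phase(records):
    """Map step: emit (station, level) key-value pairs."""
    for station, level in records:
        yield station, level

def reduce_phase(pairs):
    """Reduce step: group pairs by key and aggregate (here: mean level)."""
    groups = defaultdict(list)
    for station, level in pairs:
        groups[station].append(level)
    return {station: sum(v) / len(v) for station, v in groups.items()}

# Hypothetical daily river-gauge readings keyed by station
readings = [("gauge-A", 2.0), ("gauge-B", 3.5), ("gauge-A", 3.0)]
print(reduce_phase(map_phase(readings)))  # {'gauge-A': 2.5, 'gauge-B': 3.5}
```

In a real Hadoop deployment the map and reduce steps run in parallel across many nodes, with the framework handling the shuffle of pairs between them; the per-key aggregation logic is the same.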

  19. Big Data in food and agriculture

    Directory of Open Access Journals (Sweden)

    Kelly Bronson

    2016-06-01

    Full Text Available Farming is undergoing a digital revolution. Our existing review of current Big Data applications in the agri-food sector has revealed several collection and analytics tools that may have implications for relationships of power between players in the food system (e.g. between farmers and large corporations). For example, who retains ownership of the data generated by applications like Monsanto Corporation's Weed I.D. “app”? Are there privacy implications with the data gathered by John Deere's precision agricultural equipment? Systematically tracing the digital revolution in agriculture, and charting the affordances as well as the limitations of Big Data applied to food and agriculture, should be a broad research goal for Big Data scholarship. Such a goal brings data scholarship into conversation with food studies and it allows for a focus on the material consequences of big data in society.

  20. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background to work with big data by introducing some novel optimization algorithms and codes capable of working in the big data setting, as well as introducing some applications in big data optimization for interested academics and practitioners, in order to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  1. Una aproximación a Big Data = An approach to Big Data

    OpenAIRE

    Puyol Moreno, Javier

    2014-01-01

    Big Data can be considered a trend in the advance of technology that has opened the door to a new approach to understanding and decision-making, and is used to describe the enormous quantities of data (structured, unstructured and semi-structured) that would take too long and cost too much to load into a relational database for analysis. Thus, the concept of Big Data applies to all information that cannot be processed or analyzed using conventional tools ...

  2. Toward a Literature-Driven Definition of Big Data in Healthcare.

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    The aim of this study was to provide a definition of big data in healthcare. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
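The paper's Log(n∗p) ≥ 7 criterion is easy to check mechanically; a small sketch, assuming the base-10 logarithm:

```python
import math

def is_big_data(n: int, p: int) -> bool:
    """True if a dataset with n statistical individuals and p variables
    meets the Log(n*p) >= 7 criterion (base-10 logarithm assumed)."""
    return math.log10(n * p) >= 7

# 1,000,000 patients with 20 variables each: n*p = 2e7, log10 ~ 7.3
print(is_big_data(1_000_000, 20))  # True
# 1,000 patients with 10 variables: n*p = 1e4, log10 = 4
print(is_big_data(1_000, 10))  # False
```

By this volume-only definition, a wide omics dataset (few individuals, very many variables) can qualify just as well as a tall EMR extract, which is consistent with the paper's point that big data should not be conflated with data reuse.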

  3. Big Data Analytic, Big Step for Patient Management and Care in Puerto Rico.

    Science.gov (United States)

    Borrero, Ernesto E

    2018-01-01

    This letter provides an overview of the application of big data in the health care system to improve quality of care, including predictive modelling for risk and resource use, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications, among others. The author delineates the tremendous potential of big data analytics and discusses how it can be successfully implemented in clinical practice as an important component of a learning health-care system.

  5. Big data governance an emerging imperative

    CERN Document Server

    Soares, Sunil

    2012-01-01

    Written by a leading expert in the field, this guide focuses on the convergence of two major trends in information management-big data and information governance-by taking a strategic approach oriented around business cases and industry imperatives. With the advent of new technologies, enterprises are expanding and handling very large volumes of data; this book, nontechnical in nature and geared toward business audiences, encourages the practice of establishing appropriate governance over big data initiatives and addresses how to manage and govern big data, highlighting the relevant processes,

  6. Big Data and historical social science

    Directory of Open Access Journals (Sweden)

    Peter Bearman

    2015-11-01

    Full Text Available “Big Data” can revolutionize historical social science if it arises from substantively important contexts and is oriented towards answering substantively important questions. Such data may be especially important for answering previously largely intractable questions about the timing and sequencing of events, and of event boundaries. That said, “Big Data” makes no difference for social scientists and historians whose accounts rest on narrative sentences. Since such accounts are the norm, the effects of Big Data on the practice of historical social science may be more limited than one might wish.

  7. Use of intravenous immunoglobulins in clinical practice

    Directory of Open Access Journals (Sweden)

    E.K. Donyush

    2011-01-01

    Full Text Available Immunoglobulins are a main component of immune defense; they take part in the anti-infectious resistance of the organism and regulate different immune reactions. Intravenous immunoglobulins are the most frequently used products made from donor blood plasma. The need for these drugs has been steadily increasing during the last 15-20 years, and indications are widening due to modern high-technology methods of production and purification. The article presents modern data on the formulation, mechanisms of action and indications for different groups of intravenous immunoglobulins (standard, hyperimmune and fortified), and a description of possible adverse events. Key words: immunoglobulins, prophylaxis, treatment, adverse reactions, children.

  8. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  9. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

    Full Text Available This paper's objective is to assess, in light of Minsky's main works, his view and analysis of what he called "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and of regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  10. Usefulness of modified intravenous analgesia: initial experience in uterine artery embolization for leiomyomata

    International Nuclear Information System (INIS)

    Yang, Seung Boo; Jung, Young Jin; Goo, Dong Erk; Jang, Yun Woo

    2006-01-01

    We wanted to evaluate the usefulness of modified intravenous analgesia for the management of pain during uterine artery embolization for leiomyomata. Between April 2004 and July 2004, 15 patients with symptomatic fibroids underwent uterine artery embolization and pain management. Excluding the three patients for whom the Visual Analogue Scale (VAS) score was not obtained, twelve patients were included in this study. For pain management, epidural PCA (Patient Controlled Analgesia) was used in two patients, intravenous PCA in two patients, and modified intravenous analgesia injection in eight patients. For all the patients, we used a 2.8 Fr coaxial microcatheter and 500–710 μm PVA particles as the embolic material. The protocol of the modified intravenous analgesia injection was as follows: 1) prior to femoral artery puncture, 30 mg of ketorolac tromethamine (Tarasyn) was injected via an intravenous route; 2) when embolization of the first uterine artery was finished, normal saline mixed with 150 mg meperidine (Demerol) was administered through the side port of the intravenous line used for hydration; 3) an additional 30 mg of ketorolac tromethamine was injected after 6 hours. The VAS score and side effects were then checked. After 12 hours, the VAS score was rechecked. A VAS score above 4 was considered a failure of pain management. The VAS scores, complications and side effects of the modified intravenous analgesia injection were compared with those of IV PCA and epidural PCA. The average VAS scores of the modified intravenous analgesia injection, intravenous PCA and epidural PCA were 1.4, 1 and 0, respectively; the numbers of additional intramuscular injections of analgesia were 0.5, 0.5 and 0, respectively. All the patients who underwent epidural PCA had back pain at the puncture site, and 1 patient who underwent modified intravenous analgesia injection experienced mild dyspnea, but they easily recovered.

  11. Usefulness of modified intravenous analgesia: initial experience in uterine artery embolization for leiomyomata

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Seung Boo; Jung, Young Jin [Soonchunhyang University, Gumi Hospital, Gumi (Korea, Republic of); Goo, Dong Erk; Jang, Yun Woo [Soonchunhyang University Hospital, Seoul (Korea, Republic of)

    2006-04-15

    We wanted to evaluate the usefulness of modified intravenous analgesia for the management of pain during uterine artery embolization for leiomyomata. Between April 2004 and July 2004, 15 patients with symptomatic fibroids underwent uterine artery embolization and pain management. Excluding the three patients for whom the Visual Analogue Scale (VAS) score was not obtained, twelve patients were included in this study. For pain management, epidural PCA (Patient Controlled Analgesia) was used in two patients, intravenous PCA in two patients, and modified intravenous analgesia injection in eight patients. For all the patients, we used a 2.8 Fr coaxial microcatheter and 500–710 μm PVA particles as the embolic material. The protocol of the modified intravenous analgesia injection was as follows: 1) prior to femoral artery puncture, 30 mg of ketorolac tromethamine (Tarasyn) was injected via an intravenous route; 2) when embolization of the first uterine artery was finished, normal saline mixed with 150 mg meperidine (Demerol) was administered through the side port of the intravenous line used for hydration; 3) an additional 30 mg of ketorolac tromethamine was injected after 6 hours. The VAS score and side effects were then checked. After 12 hours, the VAS score was rechecked. A VAS score above 4 was considered a failure of pain management. The VAS scores, complications and side effects of the modified intravenous analgesia injection were compared with those of IV PCA and epidural PCA. The average VAS scores of the modified intravenous analgesia injection, intravenous PCA and epidural PCA were 1.4, 1 and 0, respectively; the numbers of additional intramuscular injections of analgesia were 0.5, 0.5 and 0, respectively. All the patients who underwent epidural PCA had back pain at the puncture site, and 1 patient who underwent modified intravenous analgesia injection experienced mild dyspnea, but they easily recovered.

  12. Classical propagation of strings across a big crunch/big bang singularity

    International Nuclear Information System (INIS)

    Niz, Gustavo; Turok, Neil

    2007-01-01

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z₂, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001); P. J. Steinhardt and N. Turok, Science 296, 1436 (2002); N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004)]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example.

  13. Breast abscess after intravenous methamphetamine injection into the breast.

    Science.gov (United States)

    Kistler, Amanda; Ajkay, Nicolas

    2018-05-01

    Intravenous drug use is a problem plaguing our society. We present a case of a young female who injected methamphetamine into her mammary vein, resulting in the formation of a breast abscess. This case demonstrates a rare but dangerous complication of intravenous drug use and a possible differential diagnosis in a patient presenting with a breast abscess. © 2017 Wiley Periodicals, Inc.

  14. Increases in Intravenous Magnesium Use among Hospitalized Patients: An Institution Cross-Sectional Experience

    Directory of Open Access Journals (Sweden)

    Bryce A. Kiberd

    2015-06-01

    Full Text Available Background: Among hospitalized patients, indications for the measurement of magnesium levels and treatment of hypomagnesemia with intravenous magnesium are not well defined. Recently, there have been reports of worldwide shortages of intravenous magnesium sulphate. Objective: To examine secular trends in the administration of intravenous magnesium on hospital wards at a tertiary care institution. The secondary objective is to identify factors associated with magnesium use among admitted patients. Methods: Retrospective cross-sectional review of hospitalized patients at a single Canadian tertiary care center. Utilization of non-parenteral-nutrition intravenous magnesium from 2003 to 2013, stratified by hospital ward, was examined. In addition, patient-level data from select wards (including medical and surgical services) were examined at an early and a more recent time period (4/2006 versus 4/2013). Results: Among the 248,329 hospitalized patients, intravenous magnesium use increased 2.86-fold from 2003 to 2013. Not all wards had an increase, whereas some had nearly a 10-fold increase in use. In the sample (n = 769), adjusting for admission magnesium level, presence of an indication for intravenous magnesium, ward location, comorbidity and demographics, intravenous magnesium administration was higher in 2013 than in 2006 (25.8% versus 5.5%; OR 13.91, 95% CI 6.21–31.17, p < 0.001). Despite this increase in intravenous magnesium administration, <3% of patients were admitted on oral magnesium in 2006 and 2013. Of the patients receiving intravenous magnesium, only a minority were discharged on oral therapy despite low levels. Conclusions: This center has witnessed a considerable increase in the use of in-hospital intravenous magnesium over the last 6 years that cannot be explained by medical indications. The risks and benefits of this therapy deserve further study. If this change in practice is representative of other North American hospitals, it may be

  15. The Information Panopticon in the Big Data Era

    Directory of Open Access Journals (Sweden)

    Martin Berner

    2014-04-01

    Full Text Available Taking advantage of big data opportunities is challenging for traditional organizations. In this article, we take a panoptic view of big data – obtaining information from more sources and making it visible to all organizational levels. We suggest that big data requires the transformation from command and control hierarchies to post-bureaucratic organizational structures wherein employees at all levels can be empowered while simultaneously being controlled. We derive propositions that show how to best exploit big data technologies in organizations.

  16. Effectiveness of intravenous levetiracetam as an adjunctive treatment in pediatric refractory status epilepticus.

    Science.gov (United States)

    Kim, Jon Soo; Lee, Jeong Ho; Ryu, Hye Won; Lim, Byung Chan; Hwang, Hee; Chae, Jong-Hee; Choi, Jieun; Kim, Ki Joong; Hwang, Yong Seung; Kim, Hunmin

    2014-08-01

    Intravenous levetiracetam (LEV) has been shown to be effective and safe in treating adults with refractory status epilepticus (SE). We sought to investigate the efficacy and safety of intravenous LEV for pediatric patients with refractory SE. We performed a retrospective medical-record review of pediatric patients who were treated with intravenous LEV for refractory SE. Clinical information regarding age, sex, seizure type, and underlying neurological status was collected. We evaluated other anticonvulsants that were used prior to administration of intravenous LEV and assessed loading dose, response to treatment, and any adverse events from intravenous LEV administration. Fourteen patients (8 boys and 6 girls) received intravenous LEV for the treatment of refractory SE. The mean age of the patients was 4.4 ± 5.5 years (range, 4 days to 14.6 years). Ten of the patients were neurologically healthy prior to the refractory SE, and the other 4 had been previously diagnosed with epilepsy. The mean loading dose of intravenous LEV was 26 ± 4.6 mg/kg (range, 20-30 mg/kg). Seizure termination occurred in 6 (43%) of the 14 patients. In particular, 4 (57%) of the 7 patients younger than 2 years showed seizure termination. No immediate adverse events occurred during or after infusions. The current study demonstrated that the adjunctive use of intravenous LEV was effective and well tolerated in pediatric patients with refractory SE, even in patients younger than 2 years. Intravenous LEV should be considered as an effective and safe treatment option for refractory SE in pediatric patients.

  17. Usefulness of MR cholangiopancreatography after intravenous morphine administration

    International Nuclear Information System (INIS)

    Lee, So Jung; Ko, Ji Ho; Cho, Young Duk; Jung, Mi Hee; Yoon, Byung Chull

    2007-01-01

    We wanted to assess the usefulness of MRCP after intravenous morphine administration in the evaluation of the hepatopancreatic ductal system. We studied 15 patients who were suspected of having disease of the hepatopancreatic ductal system and who did not have any obstructive lesion on ultrasonography and/or CT. MRCP was acquired before and after morphine administration (0.04 mg/kg, intravenously). Three radiologists scored the quality of the images of the anatomic structures of the hepatopancreatic ductal system. We directly compared the quality of the images obtained with the two methods and the improvement of the artifacts caused by pulsatile vascular compression. The MRCP images obtained after intravenous morphine administration were better than those obtained before morphine administration for visualizing the hepatopancreatic ductal system. On direct comparison, the images obtained after morphine administration were better in 12 cases and equivocal in two cases, and the images obtained before morphine administration were better in only one case. In three patients, MRCP before morphine injection showed signal loss in the duct where it crosses the pulsatile hepatic artery. In two of these three patients, MRCP after morphine injection showed no signal loss in this ductal area. MRCP after intravenous morphine administration enables physicians to see the hepatopancreatic ductal system significantly better, and the artifacts caused by pulsation of the hepatic artery can be avoided.

  18. [Efficacy of intravenous phenobarbital treatment for status epilepticus].

    Science.gov (United States)

    Muramoto, Emiko; Mizobuchi, Masahiro; Sumi, Yoshihiro; Sako, Kazuya; Nihira, Atsuko; Takeuchi, Akiko; Nakamura, Hirohiko

    2013-08-01

    Intravenous phenobarbital (IV-PB) therapy was launched in Japan in October 2008. We retrospectively investigated its efficacy and tolerability in patients with status epilepticus. Forty-three consecutive patients received IV-PB for status epilepticus between June 2009 and April 2011. Among them, 39 patients had underlying diseases, which included acute diseases in 19 patients and chronic conditions in 20 patients. Although 18 patients had been taking antiepileptic drugs (AEDs) before the occurrence of status epilepticus, the blood AED concentrations in 8 patients were below therapeutic levels. Before the administration of IV-PB, 39 patients were treated with intravenous benzodiazepine, 17 with intravenous phenytoin, and 15 with intravenous infusion of lidocaine. The initial doses of IV-PB ranged from 125 to 1,250 mg (1.9–20.0 mg/kg). Additional doses of IV-PB were required in 12 patients. Seizures were controlled in 35 patients (81%) after IV-PB administration. Cessation of status epilepticus was attained in 24 patients after the initial dose and in 11 patients after additional doses. There were no serious adverse effects, although respiratory suppression was observed in 3 patients and drug eruption in 1 patient. IV-PB is relatively safe and effective for controlling status epilepticus. If the first dose is not effective, additional doses can be given up to the recommended maximum dose.

  19. WE-H-BRB-00: Big Data in Radiation Oncology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio of panel presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient quality care and to enhance the potential for research and collaboration with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: (1) to discuss current and future sources of big data for use in radiation oncology research; (2) to optimize our current data collection by adopting new strategies from outside radiation oncology; (3) to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI; Google Inc.

  20. WE-H-BRB-00: Big Data in Radiation Oncology

    International Nuclear Information System (INIS)

    2016-01-01

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio of panel presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient quality care and to enhance the potential for research and collaboration with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: (1) to discuss current and future sources of big data for use in radiation oncology research; (2) to optimize our current data collection by adopting new strategies from outside radiation oncology; (3) to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI; Google Inc.

  1. De impact van Big Data op Internationale Betrekkingen

    NARCIS (Netherlands)

    Zwitter, Andrej

    Big Data changes our daily lives, but does it also change international politics? In this contribution, Andrej Zwitter (NGIZ chair at Groningen University) argues that Big Data impacts on international relations in ways that we only now start to understand. To comprehend how Big Data influences

  2. Epidemiology in the Era of Big Data

    Science.gov (United States)

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  3. Hydrothorax, hydromediastinum and pericardial effusion: a complication of intravenous alimentation.

    Science.gov (United States)

    Damtew, B; Lewandowski, B

    1984-01-01

    Complications secondary to intravenous alimentation are rare but potentially lethal. Massive bilateral pleural effusions and a pericardial effusion developed in a patient receiving prolonged intravenous alimentation. Severe respiratory distress and renal failure ensued. He recovered with appropriate treatment. PMID:6428731

  4. Research Dilemmas with Behavioral Big Data.

    Science.gov (United States)

    Shmueli, Galit

    2017-06-01

    Behavioral big data (BBD) refers to very large and rich multidimensional data sets on human and social behaviors, actions, and interactions, which have become available to companies, governments, and researchers. A growing number of researchers in social science and management fields acquire and analyze BBD for the purpose of extracting knowledge and scientific discoveries. However, the relationships between the researcher, data, subjects, and research questions differ in the BBD context compared to traditional behavioral data. Behavioral researchers using BBD face not only methodological and technical challenges but also ethical and moral dilemmas. In this article, we discuss several dilemmas, challenges, and trade-offs related to acquiring and analyzing BBD for causal behavioral research.

  5. Big data and analytics strategic and organizational impacts

    CERN Document Server

    Morabito, Vincenzo

    2015-01-01

    This book presents and discusses the main strategic and organizational challenges posed by Big Data and analytics in a manner relevant to both practitioners and scholars. The first part of the book analyzes strategic issues relating to the growing relevance of Big Data and analytics for competitive advantage, which is also attributable to empowerment of activities such as consumer profiling, market segmentation, and development of new products or services. Detailed consideration is also given to the strategic impact of Big Data and analytics on innovation in domains such as government and education and to Big Data-driven business models. The second part of the book addresses the impact of Big Data and analytics on management and organizations, focusing on challenges for governance, evaluation, and change management, while the concluding part reviews real examples of Big Data and analytics innovation at the global level. The text is supported by informative illustrations and case studies, so that practitioners...

  6. Big Science and Long-tail Science

    CERN Document Server

    2008-01-01

    Jim Downing and I were privileged to be the guests of Salvatore Mele at CERN yesterday and to see the ATLAS detector of the Large Hadron Collider. This is a wow experience - although I knew it was big, I hadn't realised how big.

  7. Synchrotron-based intravenous cerebral angiography in a small animal model

    International Nuclear Information System (INIS)

    Kelly, Michael E; Schueltke, Elisabeth; Fiedler, Stephan; Nemoz, Christian; Guzman, Raphael; Corde, Stephanie; Esteve, Francois; LeDuc, Geraldine; Juurlink, Bernhard H J; Meguro, Kotoo

    2007-01-01

    K-edge digital subtraction angiography (KEDSA), a recently developed synchrotron-based technique, utilizes monochromatic radiation and allows acquisition of high-quality angiography images after intravenous administration of contrast agent. We tested KEDSA for its suitability for intravenous cerebral angiography in an animal model. Adult male New Zealand rabbits were subjected to either angiography with conventional x-ray equipment or synchrotron-based intravenous KEDSA, using an iodine-based contrast agent. Angiography with conventional x-ray equipment after intra-arterial administration of contrast agent demonstrated the major intracranial vessels but no smaller branches. KEDSA was able to visualize the major intracranial vessels as well as smaller branches in both radiography mode (planar images) and tomography mode. Visualization was achieved with as little as 0.5 ml kg⁻¹ of iodinated contrast material. We were thus able to obtain excellent visualization of the cerebral vasculature in an animal model after intravenous injection of contrast material, using synchrotron-based KEDSA.

  8. Experience in using intravenous thrombolysis in ischemic stroke in Tatarstan

    Directory of Open Access Journals (Sweden)

    Dina Rustemovna Khasanova

    2011-01-01

    Full Text Available The paper describes experience with intravenous thrombolysis used in a few vascular centers of the Republic of Tatarstan in the past 5 years. Intravenous thrombolysis with alteplase (Actilyse) was carried out in 300 patients (188 men and 112 women) aged 21 to 79 years (mean age 59.8±13.7 years) who had ischemic stroke (IS). Significant positive changes (a decrease in neurological deficit of ≥4 points on the NIHSS) were observed in 67.3% of cases; mortality was 6.7%. Hemorrhagic events in the form of asymptomatic hemorrhagic transformations were found in 19.3% of cases, with progressive neurological disorders in 4.6%. Recanalization of internal carotid artery occlusion was recorded in only 24.0% of the patients, and recanalization of occlusion of the proximal segments of the middle cerebral artery in 50.1%. Examples of effective intravenous thrombolysis in IS in the carotid and vertebrobasilar beds are given. Whether intravenous thrombolysis can be more extensively used in IS is discussed.

  9. Toward a Literature-Driven Definition of Big Data in Healthcare

    Directory of Open Access Journals (Sweden)

    Emilie Baro

    2015-01-01

    Full Text Available Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with log(n×p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
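    The volume criterion above is easy to apply mechanically. A minimal sketch, assuming the base-10 logarithm (which the cut-off of 7 implies) and using illustrative dataset sizes not taken from the paper:

    ```python
    import math

    def is_big_data(n: int, p: int, cutoff: float = 7.0) -> bool:
        """Volume criterion from the definition above: a dataset with n
        statistical individuals and p variables counts as 'big data'
        when log10(n * p) >= 7 (base-10 logarithm assumed)."""
        return math.log10(n * p) >= cutoff

    # Illustrative sizes, not from the paper:
    print(is_big_data(5_000, 500_000))  # genomics-scale: log10(2.5e9) ≈ 9.4 → True
    print(is_big_data(300, 40))         # small clinical trial: log10(1.2e4) ≈ 4.1 → False
    ```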

  10. Toward a Literature-Driven Definition of Big Data in Healthcare

    Science.gov (United States)

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with log(n×p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488

  11. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  12. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations

    OpenAIRE

    Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph

    2016-01-01

    Background A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationsh...

  13. Big Data - What is it and why it matters.

    Science.gov (United States)

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers. © 2016 Health Libraries Group.

  14. Research on information security in big data era

    Science.gov (United States)

    Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin

    2018-05-01

    Big data is becoming another hotspot in the field of information technology after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the era of big data. This paper analyzes the challenges to data security brought by big data and their causes, discusses the development trend of network attacks against the background of big data, and puts forward our own opinions on the development of security defense in technology, strategy and products.

  15. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    Full Text Available In recent years, dealing with the large volumes of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand and, thereby, to generate market advantages. Thus, companies that turn to Big Data have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing and analysing huge datasets in the business environment, offering a three-layer architecture based on actual software solutions. The article also refers to graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  16. Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Baljit Singh Khehra

    2015-03-01

Full Text Available The fuzzy 2-partition entropy approach has been widely used to select threshold values for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of membership-function parameters such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on Big Bang–Big Crunch Optimization (BBBCO) is proposed, called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of membership-function parameters that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by a theory of the evolution of the universe, namely the Big Bang and Big Crunch theory. The proposed algorithm is tested on a number of standard test images. For comparison, three other approaches are also implemented: Genetic Algorithm (GA)-based, Biogeography-Based Optimization (BBO)-based and recursive approaches. The experimental results show that the proposed algorithm is more effective than the GA-based, BBO-based and recursive approaches.
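The entropy criterion the abstract describes can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes a simple linear "dark" membership function with hypothetical parameters a and c, and replaces the BBBCO search with a brute-force search over (a, c) pairs.

```python
import numpy as np

def dark_membership(levels, a, c):
    # Linear membership: 1 below a, 0 above c, linear ramp in between.
    return np.clip((c - levels) / max(c - a, 1e-12), 0.0, 1.0)

def fuzzy_2partition_entropy(hist, a, c):
    # Entropy of the fuzzy 2-partition induced by (a, c) on a gray-level histogram.
    p = hist / hist.sum()                    # normalize to probabilities
    levels = np.arange(len(hist))
    P_dark = float(np.sum(p * dark_membership(levels, a, c)))
    P_bright = 1.0 - P_dark
    eps = 1e-12                              # guard against log(0)
    return -(P_dark * np.log(P_dark + eps) + P_bright * np.log(P_bright + eps))

def best_threshold(hist):
    # Exhaustive search over (a, c) stands in for the BBBCO search of the paper.
    best = (-1.0, 0, 1)
    L = len(hist)
    for a in range(0, L - 1):
        for c in range(a + 1, L):
            h = fuzzy_2partition_entropy(hist, a, c)
            if h > best[0]:
                best = (h, a, c)
    _, a, c = best
    return (a + c) // 2                      # threshold at the membership crossover
```

The entropy is maximized (at ln 2) when the fuzzy "dark" and "bright" masses are balanced, which for a bimodal histogram places the crossover between the two modes.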

  17. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-01

    Since the human genome draft sequence was in public for the first time in 2000, genomic analyses have been intensively extended to the population level. The following three international projects are good examples for large-scale studies of human

  18. The big five personality traits and environmental concern: the moderating roles of individualism/collectivism and gender

    Directory of Open Access Journals (Sweden)

    Abbas Abdollahi

    2017-06-01

Full Text Available Environmental pollution has become a serious challenge for humanity and the environment. Therefore, this study aims to examine the relationships between the Big Five personality traits, individualism, collectivism, participants' age, and environmental concern, and to test the moderating roles of individualism/collectivism and gender in the relationship between the Big Five personality traits and environmental concern. In this quantitative study, a multi-stage cluster random sampling method was used to recruit a total of 1,160 respondents (614 females and 546 males) from Kuala Lumpur, Malaysia. Structural Equation Modeling showed that respondents with higher neuroticism, conscientiousness, extraversion, collectivism, and older age were more conscious of environmental quality. The findings also showed that individualism, collectivism, and gender emerged as significant moderators of the link between the Big Five personality traits and environmental concern.

  19. A little big history of Tiananmen

    NARCIS (Netherlands)

    Quaedackers, E.; Grinin, L.E.; Korotayev, A.V.; Rodrigue, B.H.

    2011-01-01

    This contribution aims at demonstrating the usefulness of studying small-scale subjects such as Tiananmen, or the Gate of Heavenly Peace, in Beijing - from a Big History perspective. By studying such a ‘little big history’ of Tiananmen, previously overlooked yet fundamental explanations for why

  20. Addressing big data issues in Scientific Data Infrastructure

    NARCIS (Netherlands)

    Demchenko, Y.; Membrey, P.; Grosso, P.; de Laat, C.; Smari, W.W.; Fox, G.C.

    2013-01-01

    Big Data are becoming a new technology focus both in science and in industry. This paper discusses the challenges that are imposed by Big Data on the modern and future Scientific Data Infrastructure (SDI). The paper discusses a nature and definition of Big Data that include such features as Volume,

  1. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

Full Text Available The current era, commonly called the Modern Era, can also be named the era of Big Data in the field of information technology. The fields of science, engineering and technology are producing data at an exponential rate, leading to exabytes of data every day. Big data helps us to explore and re-invent many areas, not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis of the use of big data and analytics in healthcare. The main purpose is to emphasize not only the storage of big data, which allows us to look back at the history, but also its analysis to improve medication and services. Although many big data implementations happen to be in-house developments, this proposed implementation aims at a broader extent using Hadoop, which happens to be just the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also focuses on the strengths and drawbacks compared to the conventional techniques available.
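The Hadoop-based analysis the abstract proposes rests on the map/shuffle/reduce pattern. A minimal stand-in sketch in plain Python (the records and field names below are hypothetical, not the paper's dataset):

```python
from collections import defaultdict

# Hypothetical records: (patient_id, diagnosis_code, readmitted_within_30_days)
records = [
    ("p1", "I50", True), ("p2", "I50", False),
    ("p3", "E11", True), ("p4", "I50", True), ("p5", "E11", False),
]

# Map phase: emit (diagnosis, 1) for each readmission event.
mapped = [(dx, 1) for _, dx, readmitted in records if readmitted]

# Shuffle/reduce phase: sum the emitted counts per diagnosis key.
readmission_counts = defaultdict(int)
for dx, n in mapped:
    readmission_counts[dx] += n
# readmission_counts -> {"I50": 2, "E11": 1}
```

In an actual Hadoop deployment the map and reduce functions run distributed over HDFS blocks; the logic per key is the same.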

  2. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  3. About Big Data and its Challenges and Benefits in Manufacturing

    OpenAIRE

    Bogdan NEDELCU

    2013-01-01

The aim of this article is to show the importance of Big Data and its growing influence on companies. It also shows what kinds of big data are currently generated and how much big data is estimated to be generated. We can also see how much companies are willing to invest in big data and how much they are currently gaining from it. The article also shows some major influences that big data has on one major segment of industry (manufacturing) and the challenges that appear.

  4. Big Data Management in US Hospitals: Benefits and Barriers.

    Science.gov (United States)

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

Big data has been considered an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate hospitals' ability to use complex information effectively, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that hospitals expected a number of benefits from big data analytics, including cost savings and business intelligence. By using big data, many hospitals have recognized challenges, including lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect on health care providers.

  5. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. Analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big Data presents new methods to reverse this trend and improve profitability. The benefits of Big Data and ...

  6. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints on the use of big data include the cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to the effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  7. [Relevance of big data for molecular diagnostics].

    Science.gov (United States)

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or development of new ones. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economics or epidemiology. Molecular data are structured in the first order by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Ever more extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new management strategies in order to develop data-driven individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  8. Synchrotron-based intra-venous K-edge digital subtraction angiography in a pig model: A feasibility study

    International Nuclear Information System (INIS)

    Schueltke, Elisabeth; Fiedler, Stefan; Nemoz, Christian; Ogieglo, Lissa; Kelly, Michael E.; Crawford, Paul; Esteve, Francois; Brochard, Thierry; Renier, Michel; Requardt, Herwig; Le Duc, Geraldine; Juurlink, Bernhard; Meguro, Kotoo

    2010-01-01

    Background: K-edge digital subtraction angiography (KEDSA) combined with the tunability of synchrotron beam yields an imaging technique that is highly sensitive to low concentrations of contrast agents. Thus, contrast agent can be administered intravenously, obviating the need for insertion of a guided catheter to deliver a bolus of contrast agent close to the target tissue. With the high-resolution detectors used at synchrotron facilities, images can be acquired at high spatial resolution. Thus, the KEDSA appears particularly suited for studies of neurovascular pathology in animal models, where the vascular diameters are significantly smaller than in human patients. Materials and methods: This feasibility study was designed to test the suitability of KEDSA after intravenous injection of iodine-based contrast agent for use in a pig model. Four adult male pigs were used for our experiments. Neurovascular angiographic images were acquired using KEDSA with a solid state Germanium (Ge) detector at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France. Results: After intravenous injection of 0.9 ml/kg iodinated contrast agent (Xenetix), the peak iodine concentrations in the internal carotid and middle cerebral arteries reached 35 mg/ml. KEDSA images in radiography mode allowed the visualization of intracranial arteries of less than 1.5 mm diameter.

  9. Synchrotron-based intra-venous K-edge digital subtraction angiography in a pig model: a feasibility study.

    Science.gov (United States)

    Schültke, Elisabeth; Fiedler, Stefan; Nemoz, Christian; Ogieglo, Lissa; Kelly, Michael E; Crawford, Paul; Esteve, Francois; Brochard, Thierry; Renier, Michel; Requardt, Herwig; Le Duc, Geraldine; Juurlink, Bernhard; Meguro, Kotoo

    2010-03-01

K-edge digital subtraction angiography (KEDSA) combined with the tunability of synchrotron beam yields an imaging technique that is highly sensitive to low concentrations of contrast agents. Thus, contrast agent can be administered intravenously, obviating the need for insertion of a guided catheter to deliver a bolus of contrast agent close to the target tissue. With the high-resolution detectors used at synchrotron facilities, images can be acquired at high spatial resolution. Thus, the KEDSA appears particularly suited for studies of neurovascular pathology in animal models, where the vascular diameters are significantly smaller than in human patients. This feasibility study was designed to test the suitability of KEDSA after intravenous injection of iodine-based contrast agent for use in a pig model. Four adult male pigs were used for our experiments. Neurovascular angiographic images were acquired using KEDSA with a solid state Germanium (Ge) detector at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France. After intravenous injection of 0.9 ml/kg iodinated contrast agent (Xenetix), the peak iodine concentrations in the internal carotid and middle cerebral arteries reached 35 mg/ml. KEDSA images in radiography mode allowed the visualization of intracranial arteries of less than 1.5 mm diameter. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.

  10. Synchrotron-based intra-venous K-edge digital subtraction angiography in a pig model: A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Schueltke, Elisabeth [Departments of Surgery, University of Saskatchewan, Saskatoon, SK (Canada); Anatomy and Cell Biology, University of Saskatchewan, Saskatoon, SK (Canada); Department of Neurological Sciences, Walton Medical Centre, University of Liverpool, Liverpool L97 LJ (United Kingdom)], E-mail: e.schultke@usask.ca; Fiedler, Stefan [European Molecular Biology Laboratory (EMBL), Nottkestrasse 85, 22603 Hamburg (Germany); Nemoz, Christian [European Synchrotron Radiation Facility (ESRF), 6 rue Horowitz, 38043 Grenoble (France); Ogieglo, Lissa [Departments of Surgery, University of Saskatchewan, Saskatoon, SK (Canada); Kelly, Michael E. [Departments of Surgery, University of Saskatchewan, Saskatoon, SK (Canada); Department of Neurosurgery, Section of Cerebrovascular and Endovascular Neurosurgery, Cleveland Clinic, 9500 Euclid Avenue, Cleveland, OH (United States); Crawford, Paul [Royal Veterinary College, Hawkshead Lane, North Mymms, Hatfield, Herfordshire AL9 7TA (United Kingdom); Esteve, Francois [INSERM U836-ESRF, 6 rue Horowitz, 38043 Grenoble (France); Brochard, Thierry; Renier, Michel; Requardt, Herwig; Le Duc, Geraldine [European Synchrotron Radiation Facility (ESRF), 6 rue Horowitz, 38043 Grenoble (France); Juurlink, Bernhard [Anatomy and Cell Biology, University of Saskatchewan, Saskatoon, SK (Canada); Meguro, Kotoo [Departments of Surgery, University of Saskatchewan, Saskatoon, SK (Canada)

    2010-03-15

    Background: K-edge digital subtraction angiography (KEDSA) combined with the tunability of synchrotron beam yields an imaging technique that is highly sensitive to low concentrations of contrast agents. Thus, contrast agent can be administered intravenously, obviating the need for insertion of a guided catheter to deliver a bolus of contrast agent close to the target tissue. With the high-resolution detectors used at synchrotron facilities, images can be acquired at high spatial resolution. Thus, the KEDSA appears particularly suited for studies of neurovascular pathology in animal models, where the vascular diameters are significantly smaller than in human patients. Materials and methods: This feasibility study was designed to test the suitability of KEDSA after intravenous injection of iodine-based contrast agent for use in a pig model. Four adult male pigs were used for our experiments. Neurovascular angiographic images were acquired using KEDSA with a solid state Germanium (Ge) detector at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France. Results: After intravenous injection of 0.9 ml/kg iodinated contrast agent (Xenetix), the peak iodine concentrations in the internal carotid and middle cerebral arteries reached 35 mg/ml. KEDSA images in radiography mode allowed the visualization of intracranial arteries of less than 1.5 mm diameter.

  11. Big data in psychology: A framework for research advancement.

    Science.gov (United States)

    Adjerid, Idris; Kelley, Ken

    2018-02-22

    The potential for big data to provide value for psychology is significant. However, the pursuit of big data remains an uncertain and risky undertaking for the average psychological researcher. In this article, we address some of this uncertainty by discussing the potential impact of big data on the type of data available for psychological research, addressing the benefits and most significant challenges that emerge from these data, and organizing a variety of research opportunities for psychology. Our article yields two central insights. First, we highlight that big data research efforts are more readily accessible than many researchers realize, particularly with the emergence of open-source research tools, digital platforms, and instrumentation. Second, we argue that opportunities for big data research are diverse and differ both in their fit for varying research goals, as well as in the challenges they bring about. Ultimately, our outlook for researchers in psychology using and benefiting from big data is cautiously optimistic. Although not all big data efforts are suited for all researchers or all areas within psychology, big data research prospects are diverse, expanding, and promising for psychology and related disciplines. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. 'Big data' in pharmaceutical science: challenges and opportunities.

    Science.gov (United States)

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  13. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

Big data is an essential key to building a smart world, meaning the streaming, continuous integration of large-volume and high-velocity data from all sources to final destinations. Big data ranges across data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing-system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in the effective management of dynamic and large-scale data, and the efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering the big size of data, variety of data and frequent chan...

  14. Cabotegravir long acting injection protects macaques against intravenous challenge with SIVmac251.

    Science.gov (United States)

    Andrews, Chasity D; Bernard, Leslie St; Poon, Amanda Yee; Mohri, Hiroshi; Gettie, Natanya; Spreen, William R; Gettie, Agegnehu; Russell-Lodrigue, Kasi; Blanchard, James; Hong, Zhi; Ho, David D; Markowitz, Martin

    2017-02-20

We evaluated the effectiveness of cabotegravir (CAB; GSK1265744 or GSK744) long acting as preexposure prophylaxis (PrEP) against intravenous simian immunodeficiency virus (SIV) challenge in a model that mimics blood transfusions based on the per-act probability of infection. CAB long acting is an integrase strand transfer inhibitor formulated as a 200 mg/ml injectable nanoparticle suspension that is an effective PrEP agent against rectal and vaginal simian/human immunodeficiency virus transmission in macaques. Three groups of rhesus macaques (n = 8 per group) were injected intramuscularly with CAB long acting and challenged intravenously with 17 animal infectious doses (AID50) of SIVmac251 at week 2. Group 1 was injected with 50 mg/kg on weeks 0 and 4 to evaluate the protective efficacy of the CAB long-acting dose used in macaque studies mimicking sexual transmission. Group 2 was injected with 50 mg/kg on week 0 to evaluate the necessity of the second injection of CAB long acting for protection against intravenous challenge. Group 3 was injected with 25 mg/kg on week 0 and 50 mg/kg on week 4 to correlate CAB plasma concentrations at the time of challenge with protection. Five additional macaques remained untreated as controls. CAB long acting was highly protective, with 21 of the 24 CAB long-acting-treated macaques remaining aviremic, resulting in 88% protection. The plasma CAB concentration at the time of virus challenge appeared to be more important for protection than sustaining therapeutic plasma concentrations with the second CAB long-acting injection. These results support the clinical investigation of CAB long acting as PrEP in people who inject drugs.

  15. Biochemical characterization of individual human glycosylated pro-insulin-like growth factor (IGF)-II and big-IGF-II isoforms associated with cancer.

    Science.gov (United States)

    Greenall, Sameer A; Bentley, John D; Pearce, Lesley A; Scoble, Judith A; Sparrow, Lindsay G; Bartone, Nicola A; Xiao, Xiaowen; Baxter, Robert C; Cosgrove, Leah J; Adams, Timothy E

    2013-01-04

Insulin-like growth factor II (IGF-II) is a major embryonic growth factor belonging to the insulin-like growth factor family, which includes insulin and IGF-I. Its expression in humans is tightly controlled by maternal imprinting, a genetic restraint that is lost in many cancers, resulting in up-regulation of both mature IGF-II mRNA and protein expression. Additionally, increased expression of several longer isoforms of IGF-II, termed "pro" and "big" IGF-II, has been observed. To date, it remains unclear what role these IGF-II isoforms play in initiating and sustaining tumorigenesis and whether they are bioavailable. We have expressed each individual IGF-II isoform in its proper O-glycosylated format and established that all bind to the IGF-I receptor and both insulin receptors A and B, resulting in their activation and subsequent stimulation of fibroblast proliferation. We also confirmed that all isoforms are able to be sequestered into binary complexes with several IGF-binding proteins (IGFBP-2, IGFBP-3, and IGFBP-5). In contrast to this, ternary complex formation with IGFBP-3 or IGFBP-5 and the auxiliary protein, acid labile subunit, was severely diminished. Furthermore, big-IGF-II isoforms bound much more weakly to the purified ectodomain of the natural IGF-II scavenging receptor, IGF-IIR. IGF-II isoforms thus possess unique biological properties that may enable them to escape normal sequestration avenues and remain bioavailable in vivo to sustain oncogenic signaling.

  16. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios

  17. Anti-tetanus toxoid antibodies in intravenous gamma globulin: an alternative to tetanus immune globulin.

    Science.gov (United States)

    Lee, D C; Lederman, H M

    1992-09-01

    The levels of anti-tetanus toxoid IgG antibodies were measured in 29 lots of intravenous gamma globulin (IVIG). The antibody levels varied from 4 to 90 IU/mL (geometric mean, 18.6; 90% confidence interval, 9.7-35.7). The variation from manufacturer to manufacturer accounted for most of the observed differences among lots; there was relatively little variability among multiple lots from a single manufacturer. IVIG may be an acceptable alternative to horse or human tetanus immune globulin.
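The summary statistics quoted above (a geometric mean with a 90% interval) follow from log-transforming the antibody levels. A small illustration with made-up values, not the study's raw data (the 1.645 z-value assumes a log-normal spread):

```python
import math

def geometric_mean(values):
    # Geometric mean = exp of the arithmetic mean of the logs.
    logs = [math.log(v) for v in values]
    return math.exp(sum(logs) / len(logs))

def log_normal_interval(values, z=1.645):
    # 90% reference interval assuming log-normally distributed levels.
    logs = [math.log(v) for v in values]
    n = len(logs)
    mean = sum(logs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / (n - 1))
    return math.exp(mean - z * sd), math.exp(mean + z * sd)
```

For example, `geometric_mean([4, 16, 64])` returns 16, the cube root of 4096, while the arithmetic mean of the same values would be 28: log-transforming keeps a few high-titer lots from dominating the summary.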

  18. Dynamics of morphofunctional erythrocyte properties during intravenous glucose injection in patients with coronary heart disease

    Science.gov (United States)

    Malinova, Lidia I.; Simonenko, Georgy V.; Denisova, Tatyana P.; Tuchin, Valery V.

    2007-02-01

The dynamics of glucose concentration in the human organism is an important diagnostic characteristic, since its parameters correlate significantly with the severity of metabolic, vascular and perfusion disorders. 36 patients with stable angina pectoris of functional classes II and III were involved in this study. All of them were men in the age range of 45-59 years. 7 patients hospitalized with acute myocardial infarction (aged 49 to 59 years) formed the comparison group. The control group (n = 5) consisted of practically healthy men of comparable age. All patients received an intravenous injection of glucose solution (40%) at a standard loading dose. Capillary and venous blood samples were withdrawn before, and 5, 60, 120, 180 and 240 minutes after, the glucose load. At these time points blood pressure and glucose concentration were measured. In prepared blood smears the shape, deformability and size of erythrocytes, and the quantity and degree of shear-stress-resistant erythrocyte aggregates, were studied. The data were approximated by a high-degree polynomial to obtain a concentration function of the studied parameters, whose first derivative elucidates the velocity characteristics of morphofunctional erythrocyte properties during intravenous glucose injection in patients with coronary heart disease and practically healthy persons. The data show principal differences in the dynamics of morphofunctional erythrocyte properties during intravenous glucose injection in patients with coronary heart disease, as a possible mechanism of coronary blood flow destabilization.
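The curve-fitting step described here — fitting a polynomial to the sampled concentrations and differentiating it to obtain rate-of-change curves — can be sketched with NumPy. The time points match the protocol above; the concentration values and polynomial degree are hypothetical:

```python
import numpy as np

# Sampling times (minutes) from the protocol; glucose values (mmol/L) are made up:
# a rise after the intravenous load followed by a gradual return toward baseline.
t = np.array([0.0, 5.0, 60.0, 120.0, 180.0, 240.0])
glucose = np.array([5.0, 14.0, 9.5, 7.0, 6.0, 5.5])

# Fit a polynomial to obtain a smooth concentration function of time.
# Polynomial.fit rescales the domain internally, which keeps the fit well conditioned.
conc = np.polynomial.Polynomial.fit(t, glucose, deg=4)
rate = conc.deriv()  # first derivative: rate of concentration change (mmol/L per min)

# Evaluate on a fine grid to inspect the dynamics, e.g. locate the fitted peak.
grid = np.linspace(0.0, 240.0, 241)
peak_time = grid[np.argmax(conc(grid))]
```

The same `conc`/`rate` pair can be evaluated for each measured parameter (erythrocyte size, aggregate counts, and so on) to compare velocity characteristics between patient groups.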

  19. Comparative study on the results of consecutive oral cholecystography and intravenous cholangiography

    International Nuclear Information System (INIS)

    Lee, Sung Hee; Park, Yang Ok; Yoo, Ho Joon

    1974-01-01

Since its introduction in 1924, oral cholecystography has been used as a screening method in the diagnosis of gallbladder disease. Recently, intravenous cholangiography has become a most valuable method in the diagnosis of biliary tract pathology because of its advantage of simultaneous visualization of the gallbladder and bile ducts in a short time. However, opinions vary considerably as to the significance of nonvisualization of the gallbladder with oral cholecystography. In an attempt to evaluate how much intravenous cholangiography contributes to the diagnosis in cases where the gallbladder cannot be opacified, or is only faintly visualized, by the oral method, we made a clinical observation of 168 patients, in whom intravenous cholangiography had been performed within a week following oral cholecystography, at Korea General Hospital during the three years from January 1969 to December 1971. The results obtained are summarized as follows: 1. The results of oral cholecystography in the 168 cases were: good opacification of the gallbladder in 10 cases, faint opacification in 46 cases and nonopacification in 112 cases. 2. In 37.5% (42 cases) of the 112 gallbladders not opacified by the oral method, the gallbladder was subsequently opacified by the intravenous method, and 11.6% (14 cases) turned out to be normal when examined by the intravenous method. 3. Further demonstration of abnormalities could be obtained with the aid of intravenous cholangiography in 28 cases (16.6%): cholelithiasis in 12 cases and choledocholithiasis in 16 cases. 4. In every one of the 14 patients whose gallbladders were not opacified by either the oral or the intravenous method but whose common bile ducts could be opacified by intravenous cholangiography, definite abnormalities were identified in the gallbladder at surgery

  20. Genome Variation Map: a data repository of genome variations in BIG Data Center.

    Science.gov (United States)

    Song, Shuhui; Tian, Dongmei; Li, Cuiping; Tang, Bixia; Dong, Lili; Xiao, Jingfa; Bao, Yiming; Zhao, Wenming; He, Hang; Zhang, Zhang

    2018-01-04

    The Genome Variation Map (GVM; http://bigd.big.ac.cn/gvm/) is a public data repository of genome variations. As a core resource in the BIG Data Center, Beijing Institute of Genomics, Chinese Academy of Sciences, GVM is dedicated to collecting, integrating and visualizing genome variations for a wide range of species; it accepts submissions of different types of genome variations from all over the world and provides free open access to all publicly available data in support of worldwide research activities. Unlike existing related databases, GVM features the integration of a large number of genome variations for a broad diversity of species including humans, cultivated plants and domesticated animals. Specifically, the current implementation of GVM not only houses a total of ∼4.9 billion variants for 19 species including chicken, dog, goat, human, poplar, rice and tomato, but also incorporates 8669 individual genotypes and 13 262 manually curated high-quality genotype-to-phenotype associations for non-human species. In addition, GVM provides friendly, intuitive web interfaces for data submission, browsing, search and visualization. Collectively, GVM serves as an important resource for archiving genomic variation data, helpful for better understanding population genetic diversity and for deciphering the complex mechanisms associated with different phenotypes. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. [Big data and their perspectives in radiation therapy].

    Science.gov (United States)

    Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste

    2017-02-01

    The concept of big data indicates a change of scale in the use of data and data aggregation into large databases through improved computer technology. One of the current challenges in the creation of big data in the context of radiation therapy is the transformation of routine care items into dark data, i.e. data not yet collected, and the fusion of databases collecting different types of information (dose-volume histograms and toxicity data for example). Processes and infrastructures devoted to big data collection should not impact negatively on the doctor-patient relationship, the general process of care or the quality of the data collected. The use of big data requires a collective effort of physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond, oncology. Big data involve a new culture to build an appropriate infrastructure legally and ethically. Processes and issues are discussed in this article. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  3. Current applications of big data in obstetric anesthesiology.

    Science.gov (United States)

    Klumpner, Thomas T; Bauer, Melissa E; Kheterpal, Sachin

    2017-06-01

    This narrative review aims to highlight several recently published 'big data' studies pertinent to the field of obstetric anesthesiology. Big data has been used to study rare outcomes, to identify trends within the healthcare system, to identify variations in practice patterns, and to highlight potential inequalities in obstetric anesthesia care. Big data studies have helped define the risk of rare complications of obstetric anesthesia, such as the risk of neuraxial hematoma in thrombocytopenic parturients. Also, large national databases have been used to better understand trends in anesthesia-related adverse events during cesarean delivery as well as to outline potential racial/ethnic disparities in obstetric anesthesia care. Finally, real-time analysis of patient data across a number of disparate health information systems through the use of sophisticated clinical decision support and surveillance systems is one promising application of big data technology on the labor and delivery unit. 'Big data' research has important implications for obstetric anesthesia care and warrants continued study. Real-time electronic surveillance is a potentially useful application of big data technology on the labor and delivery unit.

  4. intravenous infusion of chlorimipramine (anafranil)

    African Journals Online (AJOL)

    the already extensive outpatient facilities at Johannesburg Hospital as well as the Tara Neuro-Psychiatric Hospital for long-term therapy. Technique of Chlorimipramine Infusion. Initially 1 ampoule of chlorimipramine 25 mg in 250 ml of 5% dextrose saline was administered intravenously at the rate of 60 drops per minute.

  5. Volume and Value of Big Healthcare Data.

    Science.gov (United States)

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require the collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase in computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions such as: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for the representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  6. Using Big Book to Teach Things in My House

    OpenAIRE

    Effrien, Intan; Lailatus, Sa’diyah; Nuruliftitah Maja, Neneng

    2017-01-01

    The purpose of this study was to determine students' interest in learning using big book media. A big book is an enlarged version of an ordinary book; it contains simple words and images that match the content of its sentences and spelling. From this, researchers can gauge students' interest and the development of their knowledge, as well as train themselves to remain creative in developing learning media for students.

  7. Trends in IT Innovation to Build a Next Generation Bioinformatics Solution to Manage and Analyse Biological Big Data Produced by NGS Technologies.

    Science.gov (United States)

    de Brevern, Alexandre G; Meyniel, Jean-Philippe; Fairhead, Cécile; Neuvéglise, Cécile; Malpertuy, Alain

    2015-01-01

    Sequencing the human genome began in 1994, and 10 years of work were necessary in order to provide a nearly complete sequence. Nowadays, NGS technologies allow sequencing of a whole human genome in a few days. This deluge of data challenges scientists in many ways: they face data-management issues, and analysis and visualization are hampered by the limitations of current bioinformatics tools. In this paper, we describe how the NGS Big Data revolution changes the way of managing and analysing data. We present how biologists are confronted with an abundance of methods, tools, and data formats. To overcome these problems, we focus on Big Data Information Technology innovations from the web and business intelligence. We underline the interest of NoSQL databases, which are much more efficient than relational databases. Since Big Data leads to a loss of interactivity with data during analysis due to high processing time, we describe solutions from Business Intelligence that allow one to regain interactivity whatever the volume of data. We illustrate this point with a focus on the Amadea platform. Finally, we discuss the visualization challenges posed by Big Data and present the latest innovations in JavaScript graphic libraries.
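    The abstract's point that NoSQL databases suit this workload better than relational ones is easiest to see with schema-free records: NGS annotations vary from variant to variant, which fits document stores naturally. A minimal sketch (hypothetical data, with plain Python dictionaries standing in for a real document database):

    ```python
    # Hypothetical illustration: schema-free "documents" let heterogeneous
    # NGS variant records coexist without a fixed relational schema.
    variants = [
        {"chrom": "chr1", "pos": 12345, "ref": "A", "alt": "G", "depth": 84},
        {"chrom": "chr2", "pos": 67890, "ref": "T", "alt": "C",
         # nested annotation present on only some records -- awkward in SQL,
         # natural in a document store
         "annotation": {"gene": "BRCA2", "impact": "missense"}},
    ]

    def query(docs, **criteria):
        """Return documents whose top-level fields match all criteria."""
        return [d for d in docs if all(d.get(k) == v for k, v in criteria.items())]

    hits = query(variants, chrom="chr2")
    print(hits[0]["annotation"]["gene"])  # BRCA2
    ```

    A relational design would need either sparse nullable columns or extra join tables for the optional annotation; the document model simply omits absent fields.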

  8. Small data, data infrastructures and big data (Working Paper 1)

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2014-01-01

    The production of academic knowledge has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will, however, continue to be important in the future because of their utility in answer...

  9. Big Data Analytics Methodology in the Financial Industry

    Science.gov (United States)

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  10. Big data: survey, technologies, opportunities, and challenges.

    Science.gov (United States)

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in the Big Data domain. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  12. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    Science.gov (United States)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data Analytics is a buzzword nowadays. As data generation grows more demanding and scalable, data acquisition and storage have become crucial issues. Cloud storage is a widely used platform, and the technology is becoming crucial to executives handling data powered by analytics. The trend towards "big data-as-a-service" is now discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve security and other real-time problems of big data migration onto cloud-based platforms. This article focuses on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, and the possibility of doing big data analytics on a cloud platform, are in demand for a new era of growth. This article also gives information about the available technologies and techniques for migrating big data to the cloud.

  13. Hot big bang or slow freeze?

    Science.gov (United States)

    Wetterich, C.

    2014-09-01

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze - a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple "crossover model" without a big bang singularity. In the infinite past space-time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.
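    The claim that growing particle masses and a shrinking gravitational constant can leave Newtonian attraction unchanged can be illustrated with a simple scaling argument (our illustration, not taken from the paper): if all masses are rescaled by a common factor $\lambda(t)$ while $G$ is rescaled by $\lambda(t)^{-2}$, the force between two bodies at fixed separation $r$ is invariant:

    ```latex
    F = \frac{G\, m_1 m_2}{r^2}
    \;\longrightarrow\;
    \frac{\bigl(G/\lambda^2\bigr)\,(\lambda m_1)(\lambda m_2)}{r^2}
    = \frac{G\, m_1 m_2}{r^2} = F .
    ```

    Only dimensionless combinations of the time-varying quantities are observable, which is why the two pictures can describe the same physical reality.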

  14. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seeds of new, valuable operational insights for private companies and public organizations. While optimistic pronouncements abound, research on Big Data in the public sector has so far been limited. This article examines how the public healthcare sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off not only between economic rationales and quality considerations, but also over control of sensitive personal data and the ethical implications for citizens. In the DAMD case, data is on the one hand used "in the service of a good cause" to…

  15. Curating Big Data Made Simple: Perspectives from Scientific Communities.

    Science.gov (United States)

    Sowe, Sulayman K; Zettsu, Koji

    2014-03-01

    The digital universe is exponentially producing an unprecedented volume of data that has brought benefits as well as fundamental challenges for enterprises and scientific communities alike. This trend is inherently exciting for the development and deployment of cloud platforms to support scientific communities curating big data. The excitement stems from the fact that scientists can now access and extract value from the big data corpus, establish relationships between bits and pieces of information from many types of data, and collaborate with a diverse community of researchers from various domains. However, despite these perceived benefits, to date, little attention is focused on the people or communities who are both beneficiaries and, at the same time, producers of big data. The technical challenges posed by big data are as big as understanding the dynamics of communities working with big data, whether scientific or otherwise. Furthermore, the big data era also means that big data platforms for data-intensive research must be designed in such a way that research scientists can easily search and find data for their research, upload and download datasets for onsite/offsite use, perform computations and analysis, share their findings and research experience, and seamlessly collaborate with their colleagues. In this article, we present the architecture and design of a cloud platform that meets some of these requirements, and a big data curation model that describes how a community of earth and environmental scientists is using the platform to curate data. Motivation for developing the platform, lessons learnt in overcoming some challenges associated with supporting scientists to curate big data, and future research directions are also presented.

  16. Intravenous digital subtraction angiography of transplanted kidney artery

    International Nuclear Information System (INIS)

    Tessier, J.P; Teyssou, H.; Verdier, J.P.; Tison, E.; Meyblum, J.; Marchal, M.

    1986-01-01

    Results of 351 intravenous digital subtraction angiographies (AN) of transplanted kidneys emphasized the reliability of this examination for the detection of renal artery stenosis. A prospective study of 219 patients (188 interpretable AN) showed significant stenosis of the grafted artery in 22% of cases: 17% of the 126 patients with normal blood pressure and 34% of the 62 cases with hypertension. Digital subtraction allows, with a single injection, assessment of the renal artery, nephrogram and excretory cavities, but it is not a substitute for conventional intravenous urography 1 to 2 months after grafting [fr]

  17. Big data analytics in healthcare: promise and potential.

    Science.gov (United States)

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however there remain challenges to overcome.

  18. Data warehousing in the age of big data

    CERN Document Server

    Krishnan, Krish

    2013-01-01

    Data Warehousing in the Age of Big Data will help you and your organization make the most of unstructured data with your existing data warehouse. As Big Data continues to revolutionize how we use data, it doesn't have to create more confusion. Expert author Krish Krishnan helps you make sense of how Big Data fits into the world of data warehousing in clear and concise detail. The book is presented in three distinct parts. Part 1 discusses Big Data, its technologies and use cases from early adopters. Part 2 addresses data warehousing, its shortcomings, and new architecture

  19. The Death of the Big Men

    DEFF Research Database (Denmark)

    Martin, Keir

    2010-01-01

    Recently Tolai people of Papua New Guinea have adopted the term 'Big Shot' to describe an emerging post-colonial political elite. The emergence of the term is a negative moral evaluation of new social possibilities that have arisen as a consequence of the Big Shots' privileged position within a glo...

  20. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

    Big Data Analytics and Software Defined Networking (SDN) are helping to manage the extraordinary growth in data usage enabled by the computer processing power of Cloud Data Centres (CDCs). This new book investigates areas where Big Data and SDN can help each other in delivering more efficient services.

  1. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Big data is the term for any collection of data sets so large and complex that it becomes difficult to process them using conventional data-processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, prevent diseases, combat conflict and so on, we require larger data sets rather than smaller ones. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we present an overview of the Hadoop architecture, the different tools used for big data, and its security issues.
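    The "massively parallel software running on tens, hundreds, or even thousands of servers" mentioned above is what MapReduce-style frameworks such as Hadoop provide. As a rough single-process illustration (a toy sketch of the pattern, not Hadoop's actual API), the map/shuffle/reduce stages for counting words look like this:

    ```python
    from collections import defaultdict
    from itertools import chain

    def map_phase(chunk):
        """Map: emit (key, 1) pairs for each word in one input chunk.
        In a real cluster, each chunk would be processed on a different node."""
        return [(word.lower(), 1) for word in chunk.split()]

    def shuffle(pairs):
        """Shuffle: group all values by key across every mapper's output."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Reduce: combine each key's values into a final result."""
        return {key: sum(values) for key, values in groups.items()}

    chunks = ["big data big servers", "data on many servers"]
    counts = reduce_phase(shuffle(chain.from_iterable(map_phase(c) for c in chunks)))
    print(counts["big"], counts["servers"])  # 2 2
    ```

    The appeal of the pattern is that map and reduce are independent per key, so the work parallelizes across machines with only the shuffle requiring communication.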

  2. Postarthroscopy analgesia using intraarticular levobupivacaine and intravenous dexketoprofen trometamol.

    Science.gov (United States)

    Sahin, Sevtap Hekimoglu; Memiş, Dilek; Celik, Erkan; Sut, Necdet

    2015-12-01

    The aim of this prospective study was to determine the efficacy of intraarticular levobupivacaine with and without intravenous dexketoprofen trometamol for post-arthroscopy analgesia. Sixty patients who underwent arthroscopic knee surgery were randomly assigned to three treatment groups. When the surgical procedure was completed, patients received the following treatments: group I (n = 20) received 20 mL intraarticular normal saline and 2 mL intravenous dexketoprofen trometamol (50 mg); group II (n = 20) received 20 mL intraarticular 0.5% levobupivacaine (100 mg) and 2 mL intravenous normal saline; and group III (n = 20) received 20 mL intraarticular 0.5% levobupivacaine (100 mg) and 2 mL intravenous dexketoprofen trometamol (50 mg). The visual analogue scale (VAS) was used, and total analgesic consumption was assessed at 1, 2, 4, 6, 12, and 24 h post-operatively. The VAS scores at 1, 2, 4, 6, 12, and 24 h post-operatively were significantly increased in groups I and II compared with group III (p < 0.05). Intraarticular levobupivacaine combined with intravenous dexketoprofen trometamol provided better pain relief and a lower analgesic requirement during the first 24 h after arthroscopic knee surgery than either intravenous dexketoprofen alone or intraarticular levobupivacaine alone. Level of evidence: II.

  3. Organizational and technological correlates of nurses' trust in a smart intravenous pump.

    Science.gov (United States)

    Montague, Enid; Asan, Onur; Chiou, Erin

    2013-03-01

    The aim of this study was to understand technology and system characteristics that contribute to nurses' ratings of trust in a smart intravenous pump. Nurses' trust in new technologies can influence how technologies are used. Trust in technology is defined as a person's belief that a technology will not fail them. Potential outcomes of trust in technology are appropriate trust, overtrust, distrust, and mistrust. Trust in technology is also related to several use-specific outcomes, including appropriate use and inappropriate use such as overreliance, disuse or rejection, or misuse. Understanding trust in relation to outcomes can contribute to designs that facilitate appropriate trust in new technologies. A survey was completed by 391 nurses a year after the implementation of a new smart intravenous pump. The survey assessed trust in the intravenous pump and other elements of the sociotechnical system, individual characteristics, technology characteristics, and organizational characteristics. Results show that perceptions of usefulness, safety, ease of use, and usability are related to ratings of trust in smart intravenous pumps. Other work system factors such as perception of work environment, age, experience, quality of work, and perception of work performance are also related to ratings of trust. Nurses' trust in smart intravenous pumps is influenced by both characteristics of the technology and the sociotechnical system. Findings from this research have implications for the design of future smart intravenous pumps and health systems. Recommendations for appropriately trustworthy smart intravenous pumps are discussed. Findings also have implications for how trust in health technologies can be measured and conceptualized in complex sociotechnical systems.

  4. Big Data Analytics, Infectious Diseases and Associated Ethical Impacts

    OpenAIRE

    Garattini, C.; Raffle, J.; Aisyah, D. N.; Sartain, F.; Kozlakidis, Z.

    2017-01-01

    The exponential accumulation, processing and accrual of big data in healthcare are only possible through an equally rapidly evolving field of big data analytics. The latter offers the capacity to rationalize, understand and use big data to serve many different purposes, from improved services modelling to prediction of treatment outcomes, to greater patient and disease stratification. In the area of infectious diseases, the application of big data analytics has introduced a number of changes ...

  5. Evaluation of Data Management Systems for Geospatial Big Data

    OpenAIRE

    Amirian, Pouria; Basiri, Anahid; Winstanley, Adam C.

    2014-01-01

    Big Data encompasses collection, management, processing and analysis of the huge amount of data that varies in types and changes with high frequency. Often data component of Big Data has a positional component as an important part of it in various forms, such as postal address, Internet Protocol (IP) address and geographical location. If the positional components in Big Data extensively used in storage, retrieval, analysis, processing, visualization and knowledge discovery (geospatial Big Dat...

  6. Are intravenous injections of contrast media really less nephrotoxic than intra-arterial injections?

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, Ulf [University of Lund, Department of Diagnostic Radiology, Trelleborg (Sweden); Almen, Torsten [Skaane University Hospital, Department of Clinical Sciences/Medical Radiology, University of Lund, Malmoe (Sweden); Jacobsson, Bo [University of Gothenburg and the Sahlgrenska Academy, Department of Diagnostic Radiology, The Queen Silvia Children' s Hospital, Goeteborg (Sweden); Aspelin, Peter [Karolinska Institute and University Hospital, Division of Medical Imaging and Technology, Department of Clinical Science, Intervention and Technology (CLINTEC), Stockholm (Sweden)

    2012-06-15

    We oppose the opinion that the intra-arterial administration of iodine-based contrast media (CM) appears to pose a greater risk of contrast medium-induced nephropathy (CIN) than intravenous administration since (1) in intra-arterial coronary procedures and most other intra-arterial angiographic examinations, CM injections are also intravenous relative to the kidneys, (2) there is a lack of comparative trials studying the risk of CIN between intra-arterial and intravenous procedures with matched risk factors and CM doses, (3) a selection bias towards patients with fewer risk factors may explain the seemingly lower rate of CIN after CT in comparison with coronary interventions, (4) the rate of CIN following intra-arterial coronary procedures may also be exaggerated owing to other causes of acute kidney failure, such as haemodynamic instability and microembolisation, (5) roughly the same gram-iodine/GFR ratio (≈1:1) as a limit of relatively safe CM doses has preliminarily been found for both intravenous CT and intra-arterial coronary procedures and (6) the substantially higher injected intravenous CM dose rate during CT relative to an intra-arterial coronary procedure might actually pose a higher risk of CIN following CT. Key Points: (1) Most intra-arterial injections of contrast media are intravenous relative to the kidneys. (2) There is no evidence that intravenous CM injections are less nephrotoxic than intra-arterial ones. (3) Considerably higher dose rates of CM are used for CT than for intra-arterial procedures. (4) Higher dose rates may pose a higher nephrotoxic risk for intravenous CT studies. (orig.)

  7. Comparison of analgesic efficacy of intravenous Paracetamol and intravenous dexketoprofen trometamol in multimodal analgesia after hysterectomy.

    Science.gov (United States)

    Unal, Ciğdem; Cakan, Türkay; Baltaci, Bülent; Başar, Hülya

    2013-10-01

    [corrected] We aimed to evaluate the analgesic efficacy, opioid-sparing effect, and opioid-related adverse effects of intravenous paracetamol and intravenous dexketoprofen trometamol in combination with IV morphine after total abdominal hysterectomy. Sixty American Society of Anesthesiologists Physical Status I-II patients scheduled for total abdominal hysterectomy were enrolled in this double-blinded, randomized, placebo-controlled, prospective study. Patients were divided into three groups, paracetamol, dexketoprofen trometamol, and placebo (0.9% NaCl), according to their post-operative analgesic. Intravenous patient-controlled analgesia with morphine was used as rescue analgesia in all groups. Pain scores, hemodynamic parameters, morphine consumption, patient satisfaction, and side-effects were evaluated. Visual Analogue Scale (VAS) scores were not statistically significantly different among the groups at any evaluation time, but the decrease in VAS scores after the 12(th)-hour evaluation was statistically significant in all groups. Total morphine consumption (morphine concentration = 0.2 mg/ml) in group paracetamol (72.3 ± 38.0 ml) and group dexketoprofen trometamol (69.3 ± 24.1 ml) was significantly lower than in group placebo (129.3 ± 22.6 ml) (P < 0.05). Patient satisfaction increased in groups paracetamol and dexketoprofen trometamol after surgery, and the increase in the global satisfaction score was significant only in group placebo. Dexketoprofen trometamol and paracetamol did not significantly change pain scores but increased patients' comfort. Although total morphine consumption was significantly decreased by both drugs, the incidence of nausea and vomiting was similar among the groups. According to the results of the present study, the routine addition of dexketoprofen trometamol or paracetamol to patient-controlled analgesia with morphine after hysterectomy is not recommended.

  8. Predisposing factors for peripheral intravenous puncture failure in children

    OpenAIRE

    Negri,Daniela Cavalcante de; Avelar,Ariane Ferreira Machado; Andreoni,Solange; Pedreira,Mavilde da Luz Gonçalvez

    2012-01-01

    OBJECTIVE: To identify predisposing factors for peripheral intravenous puncture failure in children. METHODS: Cross-sectional cohort study conducted with 335 children in a pediatric ward of a university hospital after approval of the ethics committee. The Wald Chi-squared, Prevalence Ratio (PR) and backward procedure (p≤0.05) tests were applied. RESULTS: Success of peripheral intravenous puncture was obtained in 300 (89.5%) children and failure in 35 (10.4%). The failure rates were sign...

  9. A New Look at Big History

    Science.gov (United States)

    Hawkey, Kate

    2014-01-01

The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  10. West Virginia's big trees: setting the record straight

    Science.gov (United States)

    Melissa Thomas-Van Gundy; Robert. Whetsell

    2016-01-01

People love big trees, people love to find big trees, and people love to find big trees in the place they call home. Having been suspicious for years, my coauthor, historian Rob Whetsell, approached me with a species identification challenge. There are several photographs of giant trees used by many people to illustrate the past forests of West Virginia,...

  11. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

This thesis examines social customer relationship management and the benefits that big data can bring to it. Social customer relationship management is a new term, unfamiliar to many. The research is motivated by the scarcity of prior work on the topic, the complete absence of Finnish-language research, and the potentially essential role social customer relationship management may play in companies' operations in the future. Studies of big data often concentrate on its technical side rather than on applicati...

  12. Conversion from intravenous to oral medications: assessment of a computerized intervention for hospitalized patients.

    Science.gov (United States)

    Fischer, Michael A; Solomon, Daniel H; Teich, Jonathan M; Avorn, Jerry

    2003-11-24

    Many hospitalized patients continue to receive intravenous medications longer than necessary. Earlier conversion from the intravenous to the oral route could increase patient safety and comfort, reduce costs, and facilitate earlier discharge from the hospital without compromising clinical care. We examined the effect of a computer-based intervention to prompt physicians to switch appropriate patients from intravenous to oral medications. This study was performed at Brigham and Women's Hospital, an academic tertiary care hospital at which all medications are ordered online. We targeted 5 medications with equal oral and intravenous bioavailability: fluconazole, levofloxacin, metronidazole, ranitidine, and amiodarone. We used the hospital's computerized order entry system to prompt physicians to convert appropriate intravenous medications to the oral route. We measured the total use of the targeted medications via each route in the 4 months before and after the implementation of the intervention. We also measured the rate at which physicians responded to the intervention when prompted. The average intravenous defined daily dose declined by 11.1% (P =.002) from the preintervention to the postintervention period, while the average oral defined daily dose increased by 3.7% (P =.002). Length of stay, case-mix index, and total drug use at the hospital increased during the study period. The average total monthly use of the intravenous preparation of all of the targeted medications declined in the 4 months after the intervention began, compared with the 4 months before. In 35.6% of 1045 orders for which a prompt was generated, the physician either made a conversion from the intravenous to the oral version or canceled the order altogether. Computer-generated reminders can produce a substantial reduction in excessive use of targeted intravenous medications. As online prescribing becomes more common, this approach can be used to reduce excess use of intravenous medications
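The intervention described above amounts to a screening rule applied at order entry: flag active intravenous orders for the five targeted high-bioavailability drugs when oral administration is feasible. A minimal sketch of such a rule follows; the `Order` record, the `tolerating_oral` flag, and the function name are hypothetical illustrations, not the hospital system's actual schema or logic.

```python
from dataclasses import dataclass

# Drugs with roughly equal oral and intravenous bioavailability (from the study)
TARGETED = {"fluconazole", "levofloxacin", "metronidazole", "ranitidine", "amiodarone"}

@dataclass
class Order:
    drug: str   # medication name as entered
    route: str  # "IV" or "PO"

def conversion_prompts(orders, tolerating_oral):
    """Return drugs for which a prompt to switch from IV to oral should fire."""
    if not tolerating_oral:
        return []
    return [o.drug for o in orders
            if o.route == "IV" and o.drug.lower() in TARGETED]

orders = [Order("Levofloxacin", "IV"),
          Order("Vancomycin", "IV"),     # not targeted: poor oral bioavailability
          Order("Ranitidine", "PO")]     # already oral
print(conversion_prompts(orders, tolerating_oral=True))  # ['Levofloxacin']
```

In the study, the prompt still left the decision to the physician, who could convert the order, cancel it, or continue the intravenous route; the rule only surfaces candidates.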

  13. D-branes in a big bang/big crunch universe: Misner space

    International Nuclear Information System (INIS)

    Hikida, Yasuaki; Nayak, Rashmi R.; Panigrahi, Kamal L.

    2005-01-01

    We study D-branes in a two-dimensional lorentzian orbifold R 1,1 /Γ with a discrete boost Γ. This space is known as Misner or Milne space, and includes big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. Open string 2→2 scattering amplitude on a D1-brane is also computed and found to be less singular than closed string case

  14. D-branes in a big bang/big crunch universe: Misner space

    Energy Technology Data Exchange (ETDEWEB)

    Hikida, Yasuaki [Theory Group, High Energy Accelerator Research Organization (KEK), Tukuba, Ibaraki 305-0801 (Japan); Nayak, Rashmi R. [Dipartimento di Fisica and INFN, Sezione di Roma 2, ' Tor Vergata' , Rome 00133 (Italy); Panigrahi, Kamal L. [Dipartimento di Fisica and INFN, Sezione di Roma 2, ' Tor Vergata' , Rome 00133 (Italy)

    2005-09-01

    We study D-branes in a two-dimensional lorentzian orbifold R{sup 1,1}/{gamma} with a discrete boost {gamma}. This space is known as Misner or Milne space, and includes big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. Open string 2{yields}2 scattering amplitude on a D1-brane is also computed and found to be less singular than closed string case.

  15. Astroinformatics: the big data of the universe

    OpenAIRE

    Barmby, Pauline

    2016-01-01

    In astrophysics we like to think that our field was the originator of big data, back when it had to be carried around in big sky charts and books full of tables. These days, it's easier to move astrophysics data around, but we still have a lot of it, and upcoming telescope  facilities will generate even more. I discuss how astrophysicists approach big data in general, and give examples from some Western Physics & Astronomy research projects.  I also give an overview of ho...

  16. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

The features of three big solar flares observed at Tokyo Observatory are described in this paper. The active region, McMath 14943, caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. Two-ribbon structure was seen. The electron density of the flare observed at Norikura Corona Observatory was 3 × 10^12 /cc. Several arc lines which connect both bright regions of different magnetic polarity were seen in H-α monochrome image. The active region, McMath 15056, caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between two main solar spots. Then, the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the solar spot in the active region, McMath 15221. Then, these bright spots developed to a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed to a two-ribbon flare. The time required for growth was more than one hour. A bright arc which connects two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  17. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  18. Efficacy and Tolerability of Intravenous Levetiracetam in Children

    Directory of Open Access Journals (Sweden)

    Jose eAceves

    2013-08-01

Full Text Available Intractable epilepsy in children poses a serious medical challenge. Acute repetitive seizures and status epilepticus lead to frequent emergency room visits and hospital admissions. Permanent neurological damage can occur if there is delay in treatment. It has been shown that these children continue to remain intractable even after acute seizure management with FDA-approved agents. Intravenous levetiracetam, a second-generation anticonvulsant, was approved by the FDA in 2006 in patients 16 years and older as an alternative when oral treatment is not an option. It has been shown that oral levetiracetam can be used in the treatment of status epilepticus and acute repetitive seizures. Data have been published showing that intravenous levetiracetam is safe and efficacious, and can be used in an acute inpatient setting. This current review will discuss the recent data about the safety and tolerability of intravenous levetiracetam in children and neonates, and emphasize the need for a larger prospective multicenter trial to prove the efficacy of this agent in acute seizure management.

  19. Inflated granularity: Spatial “Big Data” and geodemographics

    Directory of Open Access Journals (Sweden)

    Craig M Dalton

    2015-08-01

Full Text Available Data analytics, particularly the current rhetoric around “Big Data”, tend to be presented as new and innovative, emerging ahistorically to revolutionize modern life. In this article, we situate one branch of Big Data analytics, spatial Big Data, through a historical predecessor, geodemographic analysis, to help develop a critical approach to current data analytics. Spatial Big Data promises an epistemic break in marketing, a leap from targeting geodemographic areas to targeting individuals. Yet it inherits characteristics and problems from geodemographics, including a justification through the market, and a process of commodification through the black-boxing of technology. As researchers develop sustained critiques of data analytics and its effects on everyday life, we must do so with a grounding in the cultural and historical contexts from which data technologies emerged. This article and others (Barnes and Wilson, 2014) develop a historically situated, critical approach to spatial Big Data. This history illustrates connections to the critical issues of surveillance, redlining, and the production of consumer subjects and geographies. The shared histories and structural logics of spatial Big Data and geodemographics create the space for a continued critique of data analyses’ role in society.

  20. Big data analysis for smart farming

    NARCIS (Netherlands)

    Kempenaar, C.; Lokhorst, C.; Bleumer, E.J.B.; Veerkamp, R.F.; Been, Th.; Evert, van F.K.; Boogaardt, M.J.; Ge, L.; Wolfert, J.; Verdouw, C.N.; Bekkum, van Michael; Feldbrugge, L.; Verhoosel, Jack P.C.; Waaij, B.D.; Persie, van M.; Noorbergen, H.

    2016-01-01

    In this report we describe results of a one-year TO2 institutes project on the development of big data technologies within the milk production chain. The goal of this project is to ‘create’ an integration platform for big data analysis for smart farming and to develop a show case. This includes both

  1. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data .... Concept Drifting: concept drifting mean the classifier .... transactions generated by a prefix tree structure. EstDec ...

  2. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  3. Toward a manifesto for the 'public understanding of big data'.

    Science.gov (United States)

    Michael, Mike; Lupton, Deborah

    2016-01-01

    In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article. © The Author(s) 2015.

  4. Intravenous voriconazole after toxic oral administration

    NARCIS (Netherlands)

    Alffenaar, J.W.C.; Van Assen, S.; De Monchy, J.G.R.; Uges, D.R.A.; Kosterink, J.G.W.; Van Der Werf, T.S.

    In a male patient with rhinocerebral invasive aspergillosis, prolonged high-dosage oral administration of voriconazole led to hepatotoxicity combined with a severe cutaneous reaction while intravenous administration in the same patient did not. High concentrations in the portal blood precipitate

  5. Principales parámetros para el estudio de la colaboración científica en Big Science

    OpenAIRE

    Ortoll, Eva; Canals, Agustí; Garcia, Montserrat; Cobarsí, Josep

    2014-01-01

    In several scientific disciplines research has shifted from experiments of a reduced scale to large and complex collaborations. Many recent scientific achievements like the human genome sequencing or the discovery of the Higgs boson have taken place within the “big science” paradigm. The study of scientific collaboration needs to take into account all the diverse factors that have an influence on it. In the case of big science experiments, some of those aspects are particularly important: num...

  6. Big Data in Caenorhabditis elegans: quo vadis?

    Science.gov (United States)

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. © 2015 Hutter and Moerman. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  7. 76 FR 7810 - Big Horn County Resource Advisory Committee

    Science.gov (United States)

    2011-02-11

    ..., Wyoming 82801. Comments may also be sent via e-mail to [email protected] , with the words Big... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee...

  8. Hot big bang or slow freeze?

    Energy Technology Data Exchange (ETDEWEB)

    Wetterich, C.

    2014-09-07

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  9. Hot big bang or slow freeze?

    International Nuclear Information System (INIS)

    Wetterich, C.

    2014-01-01

    We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe

  10. Hot big bang or slow freeze?

    Directory of Open Access Journals (Sweden)

    C. Wetterich

    2014-09-01

    Full Text Available We confront the big bang for the beginning of the universe with an equivalent picture of a slow freeze — a very cold and slowly evolving universe. In the freeze picture the masses of elementary particles increase and the gravitational constant decreases with cosmic time, while the Newtonian attraction remains unchanged. The freeze and big bang pictures both describe the same observations or physical reality. We present a simple “crossover model” without a big bang singularity. In the infinite past space–time is flat. Our model is compatible with present observations, describing the generation of primordial density fluctuations during inflation as well as the present transition to a dark energy-dominated universe.

  11. Big Data: Survey, Technologies, Opportunities, and Challenges

    Directory of Open Access Journals (Sweden)

    Nawsher Khan

    2014-01-01

    Full Text Available Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  12. Pre-big bang cosmology and quantum fluctuations

    International Nuclear Information System (INIS)

    Ghosh, A.; Pollifrone, G.; Veneziano, G.

    2000-01-01

    The quantum fluctuations of a homogeneous, isotropic, open pre-big bang model are discussed. By solving exactly the equations for tensor and scalar perturbations we find that particle production is negligible during the perturbative Pre-Big Bang phase

  13. Adherence of radiopharmaceuticals and labeled cells to intravenous tubing

    International Nuclear Information System (INIS)

    Segall, G.M.; Gurevich, N.; McDougall, I.R.

    1986-01-01

    A survey of 67 nuclear medicine departments revealed no agreement on which radiolabeled agents could be injected through intravenous lines (IVs) and which required direct venipuncture. Labeled cells and several common radiopharmaceuticals were tested for adherence to intravenous tubing. Residual activity remaining in the tubing after an adequate flush was less than 1% of the injected dose in each case. Administration of radiolabeled agents through existing IVs is an acceptable alternative to direct venipuncture in many cases

  14. Modes of Action of Intravenous Immunoglobulin in Bullous Pemphigoid.

    Science.gov (United States)

    Li, Ning; Culton, Donna; Diaz, Luis A; Liu, Zhi

    2018-06-01

    Bullous pemphigoid is an autoantibody-mediated skin blistering disease. Previous studies revealed that intravenous Ig is therapeutic in animal models of bullous pemphigoid by saturating the IgG-protective receptor FcRn, thereby accelerating degradation of pathogenic IgG. Sasaoka et al. demonstrate that the inhibitory effects of intravenous Ig on bullous pemphigoid are also associated with negative modulation of cytokine production by keratinocytes. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Analysis of Big Data Maturity Stage in Hospitality Industry

    OpenAIRE

    Shabani, Neda; Munir, Arslan; Bose, Avishek

    2017-01-01

Big data analytics has an extremely significant impact on many areas in all businesses and industries, including hospitality. This study aims to guide information technology (IT) professionals in hospitality on their big data expedition. In particular, the purpose of this study is to identify the maturity stage of big data in the hospitality industry in an objective way, so that hotels are able to understand their progress and realize what it will take to get to the next stage of big data matur...

  16. A Multidisciplinary Perspective of Big Data in Management Research

    OpenAIRE

    Sheng, Jie; Amankwah-Amoah, J.; Wang, X.

    2017-01-01

    In recent years, big data has emerged as one of the prominent buzzwords in business and management. In spite of the mounting body of research on big data across the social science disciplines, scholars have offered little synthesis on the current state of knowledge. To take stock of academic research that contributes to the big data revolution, this paper tracks scholarly work's perspectives on big data in the management domain over the past decade. We identify key themes emerging in manageme...

  17. Unconjugated oestetrol in plasma in response to an intravenous load of dehydroepiandrosterone sulphate (DHAS) in uncomplicated and complicated human pregnancy

    International Nuclear Information System (INIS)

    Axelsson, Ove

    1978-01-01

A non-chromatographic radioimmunoassay for estimation of unconjugated oestetrol in plasma from pregnant women is described. The antiserum has a high specificity to oestetrol. The technical procedure is simple and rapid. Only small amounts of plasma (0.2-0.4 ml) are needed for the analysis. The method has been applied to the measurement of oestetrol in plasma from pregnant women before and after an intravenous injection of 50 mg DHAS. In women with uncomplicated pregnancies a rise of plasma oestetrol was found 60 min after the injection. From 120 to 360 min there was a plateau level; at 600 min a decrease from this level was observed. No changes in the oestetrol response were found with advancing gestational age from the 33rd to the 40th week of pregnancy. A great spread in the individual responses was recorded. Patients with pre-eclampsia and intrauterine growth retardation had a tendency to a lower increase, and patients with diabetes a tendency to a higher increase, of plasma oestetrol after the DHAS administration. From the data obtained it is concluded that the increase of plasma oestetrol after an intravenous injection of DHAS is in most cases secondary to the increase of plasma oestradiol. The results suggest that measurement of unconjugated oestetrol in plasma after an intravenous load of DHAS is not a safe way to assess foetal wellbeing. In women with intrauterine growth retardation (IUGR) the simultaneous measurement of plasma oestradiol and oestetrol after an injection of DHAS indicates a possibility to distinguish placental from foetal causes of this syndrome. (author)

  18. An embedding for the big bang

    Science.gov (United States)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  19. Structural Basis of Human Parechovirus Neutralization by Human Monoclonal Antibodies

    NARCIS (Netherlands)

    Shakeel, Shabih; Westerhuis, Brenda M.; Ora, Ari; Koen, Gerrit; Bakker, Arjen Q.; Claassen, Yvonne; Wagner, Koen; Beaumont, Tim; Wolthers, Katja C.; Butcher, Sarah J.

    2015-01-01

    Since it was first recognized in 2004 that human parechoviruses (HPeV) are a significant cause of central nervous system and neonatal sepsis, their clinical importance, primarily in children, has started to emerge. Intravenous immunoglobulin treatment is the only treatment available in such

  20. Big Data as Governmentality in International Development

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

    2017-01-01

Statistics have long shaped the field of visibility for the governance of development projects. The introduction of big data has altered the field of visibility. Employing Dean's “analytics of government” framework, we analyze two cases—malaria tracking in Kenya and monitoring of food prices in Indonesia. Our analysis shows that big data introduces a bias toward particular types of visualizations. What problems are being made visible through big data depends to some degree on how the underlying data is visualized and who is captured in the visualizations. It is also influenced by technical factors...