WorldWideScience

Sample records for intravenous human big

  1. The human experience with intravenous levodopa

    Directory of Open Access Journals (Sweden)

    Shan H Siddiqi

    2016-01-01

Full Text Available Objective: To compile a comprehensive summary of published human experience with levodopa given intravenously, with a focus on information required by regulatory agencies. Background: While safe intravenous (IV) use of levodopa has been documented for over 50 years, regulatory supervision for pharmaceuticals given by a route other than that approved by the U.S. Food and Drug Administration (FDA) has become increasingly cautious. If delivering a drug by an alternate route raises the risk of adverse events, an investigational new drug (IND) application is required, including a comprehensive review of toxicity data. Methods: Over 200 articles referring to IV levodopa were examined for details of administration, pharmacokinetics, benefit and side effects. Results: We identified 142 original reports describing IV levodopa (IVLD) use in humans, beginning with psychiatric research in 1959-1960, before the development of peripheral decarboxylase inhibitors. Over 2,750 subjects have received IV levodopa, and reported outcomes include parkinsonian signs, sleep variables, hormone levels, hemodynamics, CSF amino acid composition, regional cerebral blood flow, cognition, perception and complex behavior. Mean pharmacokinetic variables were summarized for 49 healthy subjects and 190 with Parkinson’s disease. Side effects were those expected from clinical experience with oral levodopa and dopamine agonists. No articles reported deaths or induction of psychosis. Conclusion: Over 2,750 patients have received IV levodopa with a safety profile comparable to that seen with oral administration.

  2. Human factors in Big Data

    NARCIS (Netherlands)

    Boer, J. de

    2016-01-01

Since 2014 I have been involved in various (research) projects that try to make the hype around Big Data more concrete and tangible for industry and government. Big Data is about multiple sources of (real-time) data that can be analysed, transformed into information and used to make 'smart' decisions.

  3. Statistical Challenges in "Big Data" Human Neuroimaging.

    Science.gov (United States)

    Smith, Stephen M; Nichols, Thomas E

    2018-01-17

    Smith and Nichols discuss "big data" human neuroimaging studies, with very large subject numbers and amounts of data. These studies provide great opportunities for making new discoveries about the brain but raise many new analytical challenges and interpretational risks. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Exploring Data in Human Resources Big Data

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2016-01-01

Nowadays, social networks and informatics technologies and infrastructures are constantly developing and affecting each other. In this context, the HR recruitment process has become complex, and many multinational organizations have encountered selection issues. The objective of this paper is to develop a prototype system for assisting the selection of candidates for intelligent management of human resources. Such a system can be a starting point for the efficient organization of semi-structured and unstructured data on recruitment activities. The article extends the research presented at the 14th International Conference on Informatics in Economy (IE 2015) in the scientific paper "Big Data challenges for human resources management".

  5. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-25

Since the human genome draft sequence was first made public in 2000, genomic analyses have been extended intensively to the population level. The following three international projects are good examples of large-scale studies of human genome variations: 1) HapMap data (1,417 individuals) (http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/2010-08_phaseII+III/forward/), 2) HGDP (Human Genome Diversity Project) data (940 individuals) (http://www.hagsc.org/hgdp/files.html), and 3) 1000 Genomes data (2,504 individuals) (http://ftp.1000genomes.ebi.ac.uk/vol1/ftp/release/20130502/). If we can integrate all three into a single dataset, we should be able to conduct a more detailed analysis of human genome variations for a total of 4,861 individuals (= 1,417 + 940 + 2,504). In fact, we successfully integrated these three data sets by using information on the reference human genome sequence, and we conducted a big data analysis. In particular, we constructed a phylogenetic tree of about 5,000 human individuals at the genome level. As a result, we were able to identify clusters of ethnic groups, with detectable admixture, that could not be identified by analyzing each of the three data sets separately. Here, we report the outcome of these big data analyses and discuss the evolutionary significance of human genomic variations. Note that the present study was conducted in collaboration with Katsuhiko Mineta and Kosuke Goto at KAUST.
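The integration step this abstract describes (merging genotype tables from different projects via shared reference-genome coordinates, then computing distances for tree building) can be sketched roughly as follows. The data structures, function names, and toy genotypes are illustrative assumptions, not the authors' actual pipeline:

```python
# Illustrative sketch: integrate genotype matrices from several projects by
# intersecting variant sites defined against the same reference genome, then
# compute pairwise genetic distances as input for tree building.

def integrate_genotypes(datasets):
    """datasets: list of dicts mapping (chrom, pos) -> {sample: genotype}.
    Returns (sites, merged): the sites present in every dataset, and a map
    from each sample to its genotype vector over those shared sites."""
    shared = set(datasets[0])
    for d in datasets[1:]:
        shared &= set(d)          # keep only coordinates common to all
    sites = sorted(shared)
    merged = {}
    for d in datasets:
        samples = set()
        for site_genotypes in d.values():
            samples |= set(site_genotypes)
        for s in samples:
            merged[s] = [d[site].get(s) for site in sites]
    return sites, merged

def distance(a, b):
    """Fraction of jointly observed sites at which two genotype vectors differ."""
    pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
    return sum(x != y for x, y in pairs) / len(pairs)

# Toy example: two tiny "projects" sharing one reference coordinate system
hapmap = {("1", 100): {"A": 0, "B": 1}, ("1", 200): {"A": 1, "B": 1}}
hgdp = {("1", 100): {"C": 0}, ("1", 200): {"C": 0}}
sites, merged = integrate_genotypes([hapmap, hgdp])
print(sites)                               # [('1', 100), ('1', 200)]
print(distance(merged["A"], merged["C"]))  # 0.5
```

The resulting distance matrix could then feed a standard tree-building method such as neighbor joining; real pipelines would of course work from VCF-scale data rather than in-memory dicts.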

  6. The Human Genome Project: big science transforms biology and medicine

    OpenAIRE

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called ‘big science’ - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and a...

  7. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-01

Since the human genome draft sequence was first made public in 2000, genomic analyses have been extended intensively to the population level. The following three international projects are good examples of large-scale studies of human

  8. Pharmacokinetics of high-dose intravenous melatonin in humans

    DEFF Research Database (Denmark)

    Andersen, Lars P H; Werner, Mads U; Rosenkilde, Mette Marie

    2016-01-01

This crossover study investigated the pharmacokinetics and adverse effects of high-dose intravenous melatonin. Volunteers participated in 3 identical study sessions, receiving an intravenous bolus of 10 mg melatonin, 100 mg melatonin, or placebo. Blood samples were collected at baseline and 0, 60, 120, 180, 240, 300, 360, and 420 minutes after the bolus. Quantitative determination of plasma melatonin concentrations was performed using a radioimmunoassay technique. Pharmacokinetic parameters were estimated by a compartmental pharmacokinetic analysis. Adverse effects included assessments of sedation and registration of other symptoms. Sedation, evaluated as simple reaction times, was measured at baseline and 120, 180, 300, and 420 minutes after the bolus. Twelve male volunteers completed the study. Median (IQR) Cmax after the bolus injections of 10 mg and 100 mg of melatonin were 221...
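As background, the compartmental analysis mentioned in this abstract can be illustrated with its simplest case, a one-compartment model for an intravenous bolus, where C(t) = (dose/V) * exp(-k * t). The parameter values below are arbitrary placeholders for illustration, not the estimates from this melatonin study:

```python
import math

def one_compartment_iv(dose_mg, v_liters, k_per_min, t_min):
    """Plasma concentration after an IV bolus in a one-compartment model:
    C(t) = (dose / V) * exp(-k * t)."""
    return dose_mg / v_liters * math.exp(-k_per_min * t_min)

# Arbitrary illustrative parameters (NOT this study's estimates)
dose, v, k = 10.0, 40.0, 0.017   # mg, liters, 1/min

c0 = one_compartment_iv(dose, v, k, 0)   # Cmax for a bolus = dose / V
half_life = math.log(2) / k              # elimination half-life from k
print(round(c0, 3), round(half_life, 1))
```

Fitting such a model to the sampled concentrations (e.g., by nonlinear least squares) yields the reported parameters such as Cmax, volume of distribution, and elimination half-life.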

  9. Think Big! The Human Condition Project

    Science.gov (United States)

    Metcalfe, Gareth

    2014-01-01

    How can educators provide children with a genuine experience of carrying out an extended scientific investigation? And can teachers change the perception of what it means to be a scientist? These were key questions that lay behind "The Human Condition" project, an initiative funded by the Primary Science Teaching Trust to explore a new…

  10. The Human Genome Project: big science transforms biology and medicine.

    Science.gov (United States)

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  11. Big Data and Intelligence: Applications, Human Capital, and Education

    Directory of Open Access Journals (Sweden)

    Michael Landon-Murray

    2016-06-01

The potential for big data to contribute to the US intelligence mission goes beyond bulk collection, social media and counterterrorism. Applications will speak to a range of issues of major concern to intelligence agencies, from military operations to climate change to cyber security. There are challenges too: procurement lags, data stovepiping, separating signal from noise, sources and methods, a range of normative issues and, central to managing these challenges, human capital. These potential applications and challenges are discussed, and a closer look at what data scientists do in the Intelligence Community (IC) is offered. Effectively filling the ranks of the IC’s data science workforce will depend on the provision of well-trained data scientists from the higher education system. Program offerings at America’s top fifty universities will thus be surveyed (just a few years ago there were reportedly no degrees in data science). One Master’s program that has melded data science with intelligence is examined, as well as a university big data research center focused on security and intelligence. This discussion goes a long way toward clarifying the prospective uses of data science in intelligence while probing perhaps the key challenge to the optimal application of big data in the IC.

  12. Big History or the 13800 million years from the Big Bang to the Human Brain

    Science.gov (United States)

    Gústafsson, Ludvik E.

    2017-04-01

Big History is the integrated history of the Cosmos, Earth, Life, and Humanity. It is an attempt to understand our existence as a continuous unfolding of processes leading to ever more complex structures. Three major steps in the development of the Universe can be distinguished: the first was the creation of matter/energy and forces in the context of an expanding universe, while the second and third steps were reached when completely new qualities of matter came into existence.
1. Matter comes out of nothing. Quantum fluctuations and the inflation event are thought to be responsible for the creation of stable matter particles in what is called the Big Bang. Along with simple particles, the universe is formed. Later, larger particles like atoms and the simplest chemical elements, hydrogen and helium, evolved. Gravitational contraction of hydrogen and helium formed the first stars and later the first galaxies. Massive stars ended their lives in violent explosions, releasing heavier elements like carbon, oxygen, nitrogen, sulfur and iron into the universe. Subsequent star formation led to star systems with bodies containing these heavier elements.
2. Matter starts to live. About 9,200 million years after the Big Bang, a rather inconspicuous star of middle size formed in one of a billion galaxies. The leftovers of the star formation clumped into bodies rotating around the central star. In some of them, elements like silicon, oxygen, iron and many others became the dominant matter. On the third of these bodies from the central star, much of the surface was covered with an already very common chemical compound in the universe, water. Fluid water and plenty of various elements, especially carbon, were the ingredients of very complex chemical compounds that made up even more complex structures. These were able to replicate themselves. Life had appeared, the only occasion that we human beings know of. Life evolved subsequently, leading eventually to the formation of multicellular

  13. A Big Bang model of human colorectal tumor growth.

    Science.gov (United States)

    Sottoriva, Andrea; Kang, Haeyoun; Ma, Zhicheng; Graham, Trevor A; Salomon, Matthew P; Zhao, Junsong; Marjoram, Paul; Siegmund, Kimberly; Press, Michael F; Shibata, Darryl; Curtis, Christina

    2015-03-01

    What happens in early, still undetectable human malignancies is unknown because direct observations are impractical. Here we present and validate a 'Big Bang' model, whereby tumors grow predominantly as a single expansion producing numerous intermixed subclones that are not subject to stringent selection and where both public (clonal) and most detectable private (subclonal) alterations arise early during growth. Genomic profiling of 349 individual glands from 15 colorectal tumors showed an absence of selective sweeps, uniformly high intratumoral heterogeneity (ITH) and subclone mixing in distant regions, as postulated by our model. We also verified the prediction that most detectable ITH originates from early private alterations and not from later clonal expansions, thus exposing the profile of the primordial tumor. Moreover, some tumors appear 'born to be bad', with subclone mixing indicative of early malignant potential. This new model provides a quantitative framework to interpret tumor growth dynamics and the origins of ITH, with important clinical implications.

  14. First-pass metabolism of ethanol in human beings: effect of intravenous infusion of fructose

    DEFF Research Database (Denmark)

    Parlesak, Alexandr; Billinger, MH; Schäfer, C.

    2004-01-01

Intravenous infusion of fructose has been shown to enhance the reoxidation of the reduced form of nicotinamide adenine dinucleotide and, thereby, to enhance the metabolism of ethanol. In the current study, the effect of fructose infusion on the first-pass metabolism of ethanol was studied in human volunteers. A significantly higher first-pass metabolism of ethanol was obtained after administration of fructose in comparison with findings for control experiments with an equimolar dose of glucose. Because fructose is metabolized predominantly in the liver and can be presumed to have virtually no effects in the stomach...

  15. Hepatic glycogen in humans. II. Gluconeogenetic formation after oral and intravenous glucose

    International Nuclear Information System (INIS)

    Radziuk, J.

    1989-01-01

The amount of glycogen that is formed by gluconeogenetic pathways during glucose loading was quantitated in human subjects. Oral glucose loading was compared with its intravenous administration. Overnight-fasted subjects received a constant infusion of [3-3H]glucose and a marker for gluconeogenesis, [U-14C]lactate or sodium [14C]bicarbonate. An unlabeled glucose load was then administered. Postabsorptively, or after glucose infusion was terminated, a third tracer ([6-3H]glucose) infusion was initiated along with a three-step glucagon infusion. Without correcting for background stimulation of [14C]glucose production or for dilution of 14C with citric acid cycle carbon in the oxaloacetate pool, the amount of glycogen mobilized by the glucagon infusion that was produced by gluconeogenesis during oral glucose loading was 2.9 +/- 0.7 g calculated from [U-14C]lactate incorporation and 7.4 +/- 1.3 g calculated using [14C]bicarbonate as a gluconeogenetic marker. During intravenous glucose administration the latter measurement also yielded 7.2 +/- 1.1 g. When the two corrections above are applied, the respective quantities become 5.3 +/- 1.7 g for [U-14C]lactate as tracer and 14.7 +/- 4.3 and 13.9 +/- 3.6 g for oral and intravenous glucose with [14C]bicarbonate as tracer (P less than 0.05 vs. [14C]lactate as tracer). When [2-14C]acetate was infused, the same amount of label was incorporated into mobilized glycogen regardless of which route of glucose administration was used. Comparison with previous data also suggests that 14CO2 is a potentially useful marker for the gluconeogenetic process in vivo.

  16. Implicit transitive inference and the human hippocampus: does intravenous midazolam function as a reversible hippocampal lesion?

    Directory of Open Access Journals (Sweden)

    Greene Anthony J

    2007-09-01

Recent advances have led to an understanding that the hippocampus is involved more broadly than in explicit or declarative memory alone. Tasks that involve the acquisition of complex associations involve the hippocampus whether the learning is explicit or implicit. One hippocampal-dependent implicit task is transitive inference (TI). Recently it was suggested that implicit transitive inference does not depend upon the hippocampus (Frank, M. J., O'Reilly, R. C., & Curran, T. (2006). When memory fails, intuition reigns: midazolam enhances implicit inference in humans. Psychological Science, 17, 700-707). The authors demonstrated that intravenous midazolam, which is thought to inactivate the hippocampus, may enhance TI performance. Three critical assumptions are required but not met: 1) that deactivations of other regions could not account for the effect, 2) that intravenous midazolam does indeed deactivate the hippocampus, and 3) that midazolam influences explicit but not implicit memory. Each of these assumptions is seriously flawed. Consequently, the suggestion that implicit TI does not depend upon the hippocampus is unfounded.

  17. How Big Data Fast Tracked Human Mobility Research and the Lessons for Animal Movement Ecology

    Directory of Open Access Journals (Sweden)

    Michele Thums

    2018-02-01

The rise of the internet, coupled with technological innovations such as smartphones, has generated massive volumes of geo-referenced data (big data) on human mobility. This has allowed the number of studies of human mobility to rapidly overtake those of animal movement. Today, telemetry studies of animals are also approaching big data status. Here, we review recent advances in studies of human mobility and identify the opportunities they present for advancing our understanding of animal movement. We describe key analytical techniques, potential bottlenecks and a roadmap for progress toward a synthesis of movement patterns of wild animals.

  18. How Big Data Fast Tracked Human Mobility Research and the Lessons for Animal Movement Ecology

    KAUST Repository

Thums, Michele; Fernández-Gracia, Juan; Sequeira, Ana M. M.; Eguíluz, Víctor M.; Duarte, Carlos M.; Meekan, Mark G.

    2018-01-01

The rise of the internet, coupled with technological innovations such as smartphones, has generated massive volumes of geo-referenced data (big data) on human mobility. This has allowed the number of studies of human mobility to rapidly overtake those of animal movement. Today, telemetry studies of animals are also approaching big data status. Here, we review recent advances in studies of human mobility and identify the opportunities they present for advancing our understanding of animal movement. We describe key analytical techniques, potential bottlenecks and a roadmap for progress toward a synthesis of movement patterns of wild animals.

  19. How Big Data Fast Tracked Human Mobility Research and the Lessons for Animal Movement Ecology

    KAUST Repository

    Thums, Michele

    2018-02-13

The rise of the internet, coupled with technological innovations such as smartphones, has generated massive volumes of geo-referenced data (big data) on human mobility. This has allowed the number of studies of human mobility to rapidly overtake those of animal movement. Today, telemetry studies of animals are also approaching big data status. Here, we review recent advances in studies of human mobility and identify the opportunities they present for advancing our understanding of animal movement. We describe key analytical techniques, potential bottlenecks and a roadmap for progress toward a synthesis of movement patterns of wild animals.

  20. Using Big Data to Understand the Human Condition: The Kavli HUMAN Project.

    Science.gov (United States)

    Azmak, Okan; Bayer, Hannah; Caplin, Andrew; Chun, Miyoung; Glimcher, Paul; Koonin, Steven; Patrinos, Aristides

    2015-09-01

Until now, most large-scale studies of humans have either focused on very specific domains of inquiry or have relied on between-subjects approaches. While these previous studies have been invaluable for revealing important biological factors in cardiac health or social factors in retirement choices, no single repository contains anything like a complete record of the health, education, genetics, environmental, and lifestyle profiles of a large group of individuals at the within-subject level. This seems critical today because emerging evidence about the dynamic interplay between biology, behavior, and the environment points to a pressing need for just the kind of large-scale, long-term synoptic dataset that does not yet exist at the within-subject level. At the same time that the need for such a dataset is becoming clear, there is also growing evidence that just such a synoptic dataset may now be obtainable, at least at moderate scale, using contemporary big data approaches. To this end, we introduce the Kavli HUMAN Project (KHP), an effort to aggregate data from 2,500 New York City households in all five boroughs (roughly 10,000 individuals) whose biology and behavior will be measured using an unprecedented array of modalities over 20 years. It will also richly measure the environmental conditions and events that KHP members experience, using a geographic information system database of unparalleled scale, currently under construction in New York. In this manner, KHP will offer both synoptic and granular views of how human health and behavior coevolve over the life cycle and why they evolve differently for different people. In turn, we argue that this will allow for new discovery-based scientific approaches, rooted in big data analytics, to improving the health and quality of human life, particularly in urban contexts.

  1. Effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function in healthy humans

    DEFF Research Database (Denmark)

    Madsen, Jan Lysgård; Fuglsang, Stefan; Graff, J

    2006-01-01

OBJECTIVE: To examine the effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function after a meal in healthy humans. METHODS: Nine healthy volunteers participated in a placebo-controlled, double-blind, crossover study. Each volunteer was examined during intravenous infusion of glyceryl trinitrate 1 microg/kg x min or saline. A gamma camera technique was used to measure gastric emptying and small intestinal transit after a 1600-kJ mixed liquid and solid meal. Furthermore, duodenal motility was assessed by manometry. RESULTS: Glyceryl trinitrate did not change gastric mean emptying time, gastric half emptying time, gastric retention at 15 min or small intestinal mean transit time. Glyceryl trinitrate did not influence the frequency of duodenal contractions, the amplitude of duodenal contractions or the duodenal motility index. CONCLUSIONS: Intravenous infusion of glyceryl...

  2. Pharmacokinetics and pharmacodynamics of eltanolone (pregnanolone), a new steroid intravenous anaesthetic, in humans

    DEFF Research Database (Denmark)

    Carl, Peder; Høgskilde, S; Lang-Jensen, T

    1994-01-01

Eltanolone, a new intravenous steroid anaesthetic agent, was administered intravenously at a dose of 0.6 mg.kg-1 over 45 s to eight healthy male volunteers to evaluate some of its pharmacokinetic and pharmacodynamic effects. Drug concentration-time data were analysed by PCNONLIN, a non...

  3. A chromatographic method for the production of a human immunoglobulin G solution for intravenous use

    Directory of Open Access Journals (Sweden)

    K. Tanaka

    1998-11-01

Immunoglobulin G (IgG) of excellent quality for intravenous use was obtained from the cryosupernatant of human plasma by a chromatographic method based on a combination of ion-exchange (DEAE-Sepharose FF) and arginine Sepharose 4B affinity chromatography, with a final purification step by Sephacryl S-300 HR gel filtration. The yield of the 10 experimental batches produced was 3.5 g IgG per liter of plasma. A solvent/detergent combination of 1% tri(n-butyl) phosphate and 1% Triton X-100 was used to inactivate lipid-coated viruses. Analysis of the final product (5% liquid IgG), based on the mean for 10 batches, showed 94% monomers, 5.5% dimers and 0.5% polymers and aggregates. Anticomplementary activity was 0.3 CH50/mg IgG and prekallikrein activator levels were less than 5 IU/ml. Stability at 37ºC for 30 days in the liquid state was satisfactory. IgG was stored in flasks (2.5 g/flask) at 4 to 8ºC. All the characteristics of the product were consistent with the requirements of the 1997 Pharmacopée Européenne.

  4. Deserts in the Deluge: TerraPopulus and Big Human-Environment Data.

    Science.gov (United States)

    Manson, S M; Kugler, T A; Haynes, D

    2016-01-01

    Terra Populus, or TerraPop, is a cyberinfrastructure project that integrates, preserves, and disseminates massive data collections describing characteristics of the human population and environment over the last six decades. TerraPop has made a number of GIScience advances in the handling of big spatial data to make information interoperable between formats and across scientific communities. In this paper, we describe challenges of these data, or 'deserts in the deluge' of data, that are common to spatial big data more broadly, and explore computational solutions specific to microdata, raster, and vector data models.

  5. Big Hat, No Cattle: Managing Human Resources, Part 2.

    Science.gov (United States)

    Skinner, Wickham

    1982-01-01

    The author discusses why business has difficulty in motivating its employees and proposes a new approach to developing human resources. Discusses mistaken premises, personnel and supervision, setting a long-term goal, changing management's philosophy, and selling human resource development as a company priority. (CT)

  6. Big Hat, No Cattle: Managing Human Resources, Part 1.

    Science.gov (United States)

    Skinner, Wickham

    1982-01-01

    Presents an in-depth analysis of problems and a suggested approach to developing human resources which goes beyond identifying symptoms and provides a comprehensive perspective for building an effective work force. (JOW)

  7. Usefulness of high-dose intravenous human immunoglobulins treatment for refractory recurrent pericarditis.

    Science.gov (United States)

    Moretti, Michele; Buiatti, Alessandra; Merlo, Marco; Massa, Laura; Fabris, Enrico; Pinamonti, Bruno; Sinagra, Gianfranco

    2013-11-01

The management of refractory recurrent pericarditis is challenging. Previous clinical reports have noted a beneficial effect of high-dose intravenous human immunoglobulins (IvIgs) in isolated and systemic inflammatory disease-related forms. In this article, we retrospectively analyzed our clinical experience with IvIg therapy in a series of clinical cases of pericarditis refractory to conventional treatment. We retrospectively analyzed 9 patients (1994 to 2010) with refractory recurrent pericarditis, who received high-dose IvIg as a part of their medical treatment. Nonsteroidal anti-inflammatory drug (NSAID), steroid, or colchicine treatment was not discontinued during IvIg treatment. No patients had a history of autoimmune or connective tissue diseases. During an average period of 11 months from the first recurrence, patients had experienced a mean of 5 relapses before the first IvIg treatment. In 4 cases, patients showed complete clinical remission with no further relapse after the first IvIg cycle. Two patients experienced a single minor relapse, responsive to short-term NSAIDs. In 2 patients, we performed a second cycle of IvIg after a recurrence of pericarditis, with subsequent complete remission. One patient did not respond to 3 cycles of IvIg and subsequently underwent pericardial window and long-term immunosuppressive treatment. No major adverse effect was observed in consequence of IvIg administration in all the cases. In conclusion, although the IvIg mode of action is still poorly understood in this setting, this treatment can be considered as an option in patients with recurrent pericarditis refractory to conventional medical treatment and, in our small series, has proved to be effective in 8 of 9 cases. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Intravenous human immunoglobulins for refractory recurrent pericarditis: a systematic review of all published cases.

    Science.gov (United States)

    Imazio, Massimo; Lazaros, George; Picardi, Elisa; Vasileiou, Panagiotis; Carraro, Mara; Tousoulis, Dimitrios; Belli, Riccardo; Gaita, Fiorenzo

    2016-04-01

Refractory recurrent pericarditis is a major clinical challenge after colchicine failure, especially in corticosteroid-dependent patients. Human intravenous immunoglobulins (IVIGs) have been proposed as possible therapeutic options for these cases. The goal of this systematic review is to assess the efficacy and safety of IVIGs in this context. Studies reporting the use of IVIG for the treatment of recurrent pericarditis and published up to October 2014 were searched in several databases. All references found, upon initial assessment at title and abstract level for suitability, were consequently retrieved as full reports for further appraisal. Among the 18 citations retrieved, 17 reports (4 case series and 13 single case reports, with an overall population of 30 patients) were included. The mean disease duration was 14 months and the mean number of recurrences before IVIG was 3. Approximately 47% of patients had idiopathic recurrent pericarditis, 10% had an infective cause, and the remainder a systemic inflammatory disease. Nineteen out of the 30 patients (63.3%) were on corticosteroids at IVIG commencement. IVIGs were generally administered at a dose of 400-500 mg/kg/day for 5 consecutive days, with repeated cycles according to the clinical response. Complications were uncommon (headache in ~3%) and not life-threatening. After a mean follow-up of approximately 33 months, recurrences occurred in 26.6% of cases after the first IVIG cycle, and 22 of the 30 patients (73.3%) were recurrence-free. Five patients (16.6%) were on corticosteroids at the end of the follow-up. IVIGs are rapidly acting, well tolerated, and efficacious steroid-sparing agents in refractory pericarditis.

  9. Schooling for Humanity: When Big Brother Isn't Watching.

    Science.gov (United States)

    Solmitz, David O.

    Most educational reform initiatives of the past 20 years are geared towards ensuring that the United States dominates the emerging global economy. What is lost in this rush to the top of the materialist heap is an education for the more enduring human values: creativity, intellectual development, care, social justice, and democracy. In this book,…

  10. Systemic administration of antiretrovirals prior to exposure prevents rectal and intravenous HIV-1 transmission in humanized BLT mice.

    Directory of Open Access Journals (Sweden)

    Paul W Denton

    2010-01-01

    Full Text Available Successful antiretroviral pre-exposure prophylaxis (PrEP) for mucosal and intravenous HIV-1 transmission could reduce new infections among targeted high-risk populations including discordant couples, injection drug users, high-risk women and men who have sex with men. Targeted antiretroviral PrEP could be particularly effective at slowing the spread of HIV-1 if a single antiretroviral combination were found to be broadly protective across multiple routes of transmission. Therefore, we designed our in vivo preclinical study to systematically investigate whether rectal and intravenous HIV-1 transmission can be blocked by antiretrovirals administered systemically prior to HIV-1 exposure. We performed these studies using a highly relevant in vivo model of mucosal HIV-1 transmission, humanized Bone marrow/Liver/Thymus (BLT) mice. BLT mice are susceptible to HIV-1 infection via three major physiological routes of viral transmission: vaginal, rectal and intravenous. Our results show that BLT mice given systemic antiretroviral PrEP are efficiently protected from HIV-1 infection regardless of the route of exposure. Specifically, systemic antiretroviral PrEP with emtricitabine and tenofovir disoproxil fumarate prevented both rectal (Chi square = 8.6, df = 1, p = 0.003) and intravenous (Chi square = 13, df = 1, p = 0.0003) HIV-1 transmission. Our results indicate that antiretroviral PrEP has the potential to be broadly effective at preventing new rectal or intravenous HIV transmissions in targeted high risk individuals. These in vivo preclinical findings provide strong experimental evidence supporting the potential clinical implementation of antiretroviral based pre-exposure prophylactic measures to prevent the spread of HIV/AIDS.
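The chi-square statistics quoted in this abstract (both with df = 1) can be checked against their reported p-values without a statistics library: for one degree of freedom the upper-tail probability reduces to a complementary error function. A minimal sketch (the function name is ours, for illustration):

```python
import math

def chi2_sf_df1(x):
    """Upper-tail p-value of a chi-square statistic with 1 degree of freedom.
    For df = 1, P(X >= x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2.0))

# Values reported in the abstract
p_rectal = chi2_sf_df1(8.6)   # ~0.003, matching the reported rectal p-value
p_iv = chi2_sf_df1(13.0)      # ~0.0003, matching the reported intravenous p-value
```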

  11. The dynamics of big data and human rights: the case of scientific research.

    Science.gov (United States)

    Vayena, Effy; Tasioulas, John

    2016-12-28

    In this paper, we address the complex relationship between big data and human rights. Because this is a vast terrain, we restrict our focus in two main ways. First, we concentrate on big data applications in scientific research, mostly health-related research. And, second, we concentrate on two human rights: the familiar right to privacy and the less well-known right to science. Our contention is that human rights interact in potentially complex ways with big data, not only constraining it, but also enabling it in various ways; and that such rights are dynamic in character, rather than fixed once and for all, changing in their implications over time in line with changes in the context we inhabit, and also as they interact among themselves in jointly responding to the opportunities and risks thrown up by a changing world. Understanding this dynamic interaction of human rights is crucial for formulating an ethic tailored to the realities (the new capabilities and risks) of the rapidly evolving digital environment. This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).

  12. Short faces, big tongues: developmental origin of the human chin.

    Directory of Open Access Journals (Sweden)

    Michael Coquerelle

    Full Text Available During the course of human evolution, the retraction of the face underneath the braincase, and closer to the cervical column, has reduced the horizontal dimension of the vocal tract. By contrast, the relative size of the tongue has not been reduced, implying a rearrangement of the space at the back of the vocal tract to allow breathing and swallowing. This may have left a morphological signature such as a chin (mental prominence) that can potentially be interpreted in Homo. Long considered an autapomorphic trait of Homo sapiens, various extinct hominins show different forms of mental prominence. These features may be the evolutionary by-product of equivalent developmental constraints correlated with an enlarged tongue. In order to investigate developmental mechanisms related to this hypothesis, we compare 34 modern human infants against 8 chimpanzee fetuses, whose development of the mandibular symphysis passes through similar stages. The study sets out to test whether the shared ontogenetic shape changes of the symphysis observed in both species are driven by the same factor: space restriction at the back of the vocal tract and the associated arrangement of the tongue and hyoid bone. We apply geometric morphometric methods to an extensive three-dimensional configuration of anatomical landmarks and semilandmarks, capturing the geometry of the cervico-craniofacial complex including the hyoid bone, tongue muscle and the mandible. We demonstrate that in both species, the forward displacement of the mental region derives from the arrangement of the tongue and hyoid bone, in order to cope with the relative horizontal narrowing of the oral cavity. Because humans and chimpanzees share this pattern of developmental integration, the different forms of mental prominence seen in some extinct hominids likely originate from equivalent ontogenetic constraints. Variations in this process could account for similar morphologies.

  13. Effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function in healthy humans

    DEFF Research Database (Denmark)

    Madsen, Jan Lysgård; Fuglsang, Stefan; Graff, J

    2006-01-01

    BACKGROUND: Glyceryl trinitrate is a donor of nitric oxide that relaxes smooth muscle cells of the gastrointestinal tract. Little is known about the effect of glyceryl trinitrate on gastric emptying and no data exist on the possible effect of glyceryl trinitrate on small intestinal transit. AIM: To examine the effect of intravenous infusion of glyceryl trinitrate on gastric and small intestinal motor function after a meal in healthy humans. METHODS: Nine healthy volunteers participated in a placebo-controlled, double-blind, crossover study. Each volunteer was examined during intravenous infusion of glyceryl trinitrate 1 microg/kg x min or saline. A gamma camera technique was used to measure gastric emptying and small intestinal transit after a 1600-kJ mixed liquid and solid meal. Furthermore, duodenal motility was assessed by manometry. RESULTS: Glyceryl trinitrate did not change gastric mean…

  14. Dose-Dependent Effect of Intravenous Administration of Human Umbilical Cord-Derived Mesenchymal Stem Cells in Neonatal Stroke Mice

    Science.gov (United States)

    Tanaka, Emi; Ogawa, Yuko; Mukai, Takeo; Sato, Yoshiaki; Hamazaki, Takashi; Nagamura-Inoue, Tokiko; Harada-Shiba, Mariko; Shintaku, Haruo; Tsuji, Masahiro

    2018-01-01

    Neonatal brain injury induced by stroke causes significant disability, including cerebral palsy, and there is no effective therapy for stroke. Recently, mesenchymal stem cells (MSCs) have emerged as a promising tool for stem cell-based therapies. In this study, we examined the safety and efficacy of intravenously administered human umbilical cord-derived MSCs (UC-MSCs) in neonatal stroke mice. Pups underwent permanent middle cerebral artery occlusion at postnatal day 12 (P12), and low-dose (1 × 10⁴) or high-dose (1 × 10⁵) UC-MSCs were administered intravenously 48 h after the insult (P14). To evaluate the effect of the UC-MSC treatment, neurological behavior and cerebral blood flow were measured, and neuroanatomical analysis was performed at P28. To investigate the mechanisms of intravenously injected UC-MSCs, systemic blood flowmetry, in vivo imaging and human brain-derived neurotrophic factor (BDNF) measurements were performed. Functional disability was significantly improved in the high-dose UC-MSC group when compared with the vehicle group, but cerebral blood flow and cerebral hemispheric volume were not restored by UC-MSC therapy. The level of exogenous human BDNF was elevated only in the cerebrospinal fluid of one pup 24 h after UC-MSC injection, and in vivo imaging revealed that most UC-MSCs were trapped in the lungs and disappeared in a week without migration toward the brain or other organs. We found that systemic blood flow was stable over the 10 min after cell administration and that there were no differences in mortality among the groups. Immunohistopathological assessment showed that the percent area of Iba1-positive staining in the peri-infarct cortex was significantly reduced with the high-dose UC-MSC treatment compared with the vehicle treatment. These results suggest that intravenous administration of UC-MSCs is safe for a mouse model of neonatal stroke and improves dysfunction after middle cerebral artery occlusion by modulating

  15. Dose-Dependent Effect of Intravenous Administration of Human Umbilical Cord-Derived Mesenchymal Stem Cells in Neonatal Stroke Mice

    Directory of Open Access Journals (Sweden)

    Emi Tanaka

    2018-03-01

    Full Text Available Neonatal brain injury induced by stroke causes significant disability, including cerebral palsy, and there is no effective therapy for stroke. Recently, mesenchymal stem cells (MSCs) have emerged as a promising tool for stem cell-based therapies. In this study, we examined the safety and efficacy of intravenously administered human umbilical cord-derived MSCs (UC-MSCs) in neonatal stroke mice. Pups underwent permanent middle cerebral artery occlusion at postnatal day 12 (P12), and low-dose (1 × 10⁴) or high-dose (1 × 10⁵) UC-MSCs were administered intravenously 48 h after the insult (P14). To evaluate the effect of the UC-MSC treatment, neurological behavior and cerebral blood flow were measured, and neuroanatomical analysis was performed at P28. To investigate the mechanisms of intravenously injected UC-MSCs, systemic blood flowmetry, in vivo imaging and human brain-derived neurotrophic factor (BDNF) measurements were performed. Functional disability was significantly improved in the high-dose UC-MSC group when compared with the vehicle group, but cerebral blood flow and cerebral hemispheric volume were not restored by UC-MSC therapy. The level of exogenous human BDNF was elevated only in the cerebrospinal fluid of one pup 24 h after UC-MSC injection, and in vivo imaging revealed that most UC-MSCs were trapped in the lungs and disappeared in a week without migration toward the brain or other organs. We found that systemic blood flow was stable over the 10 min after cell administration and that there were no differences in mortality among the groups. Immunohistopathological assessment showed that the percent area of Iba1-positive staining in the peri-infarct cortex was significantly reduced with the high-dose UC-MSC treatment compared with the vehicle treatment. These results suggest that intravenous administration of UC-MSCs is safe for a mouse model of neonatal stroke and improves dysfunction after middle cerebral artery occlusion by

  16. How They Move Reveals What Is Happening: Understanding the Dynamics of Big Events from Human Mobility Pattern

    Directory of Open Access Journals (Sweden)

    Jean Damascène Mazimpaka

    2017-01-01

    Full Text Available The context in which a moving object moves contributes to the movement pattern observed. Likewise, the movement pattern reflects the properties of the movement context. In particular, big events influence human mobility depending on the dynamics of the events. However, this influence has not been explored to understand big events. In this paper, we propose a methodology for learning about big events from human mobility pattern. The methodology involves extracting and analysing the stopping, approaching, and moving-away interactions between public transportation vehicles and the geographic context. The analysis is carried out at two different temporal granularity levels to discover global and local patterns. The results of evaluating this methodology on bus trajectories demonstrate that it can discover occurrences of big events from mobility patterns, roughly estimate the event start and end time, and reveal the temporal patterns of arrival and departure of event attendees. This knowledge can be usefully applied in transportation and event planning and management.
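The stopping, approaching, and moving-away interactions described above can be illustrated with a toy classifier over consecutive distances between a vehicle and an event venue. The function name, the stop radius, and the noise threshold below are our illustrative assumptions, not the paper's actual implementation:

```python
def classify_interaction(d_prev, d_curr, stop_radius=50.0, eps=1.0):
    """Classify one step of a vehicle trajectory relative to a venue.

    d_prev, d_curr: distances (in metres) from the vehicle to the venue
    at two consecutive samples. stop_radius and eps are assumed thresholds.
    """
    if d_curr <= stop_radius:
        return "stopping"           # vehicle is at (or very near) the venue
    if d_prev - d_curr > eps:
        return "approaching"        # distance shrinking beyond noise
    if d_curr - d_prev > eps:
        return "moving-away"        # distance growing beyond noise
    return "steady"                 # no meaningful change

# Counting these labels per time window would expose the arrival/departure
# peaks around an event, as in the paper's two-granularity analysis.
```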

  17. Human kinetics of orally and intravenously administered low-dose 1,2-(13)C-dichloroacetate.

    Science.gov (United States)

    Jia, Minghong; Coats, Bonnie; Chadha, Monisha; Frentzen, Barbara; Perez-Rodriguez, Javier; Chadik, Paul A; Yost, Richard A; Henderson, George N; Stacpoole, Peter W

    2006-12-01

    Dichloroacetate (DCA) is a putative environmental hazard, owing to its ubiquitous presence in the biosphere and its association with animal and human toxicity. We sought to determine the kinetics of environmentally relevant concentrations of 1,2-(13)C-DCA administered to healthy adults. Subjects received an oral or intravenous dose of 2.5 microg/kg of 1,2-(13)C-DCA. Plasma and urine concentrations of 1,2-(13)C-DCA were measured by a modified gas chromatography-tandem mass spectrometry method. 1,2-(13)C-DCA kinetics was determined by modeling using WinNonlin 4.1 software. Plasma concentrations of 1,2-(13)C-DCA peaked 10 minutes and 30 minutes after intravenous or oral administration, respectively. Plasma kinetic parameters varied as a function of dose and duration. Very little unchanged 1,2-(13)C-DCA was excreted in urine. Trace amounts of DCA alter its own kinetics after short-term exposure. These findings have important implications for interpreting the impact of this xenobiotic on human health.

  18. Sandwich-type enzyme immunoassay for big endothelin-I in plasma: concentrations in healthy human subjects unaffected by sex or posture.

    Science.gov (United States)

    Aubin, P; Le Brun, G; Moldovan, F; Villette, J M; Créminon, C; Dumas, J; Homyrda, L; Soliman, H; Azizi, M; Fiet, J

    1997-01-01

    A sandwich-type enzyme immunoassay has been developed for measuring human big endothelin-1 (big ET-1) in human plasma and supernatant fluids from human cell cultures. Big ET-1 is the precursor of endothelin 1 (ET-1), the most potent vasoconstrictor known. A rabbit antibody raised against the big ET-1 COOH-terminus fragment was used as an immobilized antibody (anti-P16). The Fab' fragment of a monoclonal antibody (1B3) raised against the ET-1 loop fragment was used as the enzyme-labeled antibody, after being coupled to acetylcholinesterase. The lowest detectable value in the assay was 1.2 pg/mL (0.12 pg/well). The assay was highly specific for big ET-1, demonstrating no cross-reactivity with ET-1, big endothelin-2 (big ET-2), and big endothelin-3 (big ET-3). We used this assay to evaluate the effect of two different postural positions (supine and standing) on plasma big ET-1 concentrations in 11 male and 11 female healthy subjects. Data analysis revealed that neither sex nor body position influenced plasma big ET-1 concentrations. This assay should thus permit the detection of possible variations in plasma concentrations of big ET-1 in certain pathologies and, in association with ET-1 assay, make possible in vitro study of endothelin-converting enzyme activity in cell models. Such studies could clarify the physiological and clinical roles of this family of peptides.

  19. Hepatic glycogen in humans. I. Direct formation after oral and intravenous glucose or after a 24-h fast

    International Nuclear Information System (INIS)

    Radziuk, J.

    1989-01-01

    The formation of hepatic glycogen by the direct pathway is assessed in humans after a 12-h fast and oral loading (100 g) or intravenous infusion (90 g) and after a 24-h fast and the same oral glucose load. The methodology used is based on the double tracer method. [3-3H]glucose is infused at a constant rate for the determination of the metabolic clearance of glucose. [1-14C]glucose is administered with the glucose load. One hour after absorption or the intravenous glucose infusion is terminated, a glucagon infusion is initiated to mobilize the glycogen labeled with [1-14C]glucose and formed during the absorptive period. At this time a third tracer, [6-3H]glucose, is administered to measure glucose clearance. It was found that after the 12-h fast and oral glucose loading 7.2 +/- 1.1 g of hepatic glycogen appears to be formed directly from glucose compared with 8.4 +/- 1.0 g after the same load and a 24-h fast and 8.5 +/- 0.4 g after a 12-h fast and an equivalent intravenous glucose infusion. When the amount of label ([14C]glucose) mobilized that was not corrected for metabolic recycling was calculated, the data suggested that the amount of glycogen formed by gluconeogenic pathways was probably at least equal to that formed by direct uptake. It was also approximately 60% greater after a 24-h fast. It can be concluded that the amount of hepatic glycogen formed directly from glucose during glucose loading is not significantly altered by the route of entry or the extension of the fasting period to 24 h. The data suggest, however, that gluconeogenetic formation of glycogen increases with fasting

  20. Long-term intravenous treatment of Pompe disease with recombinant human alpha-glucosidase from milk.

    Science.gov (United States)

    Van den Hout, Johanna M P; Kamphoven, Joep H J; Winkel, Léon P F; Arts, Willem F M; De Klerk, Johannes B C; Loonen, M Christa B; Vulto, Arnold G; Cromme-Dijkhuis, Adri; Weisglas-Kuperus, Nynke; Hop, Wim; Van Hirtum, Hans; Van Diggelen, Otto P; Boer, Marijke; Kroos, Marian A; Van Doorn, Pieter A; Van der Voort, Edwin; Sibbles, Barbara; Van Corven, Emiel J J M; Brakenhoff, Just P J; Van Hove, Johan; Smeitink, Jan A M; de Jong, Gerard; Reuser, Arnold J J; Van der Ploeg, Ans T

    2004-05-01

    Recent reports warn that the worldwide cell culture capacity is insufficient to fulfill the increasing demand for human protein drugs. Production in milk of transgenic animals is an attractive alternative. Kilogram quantities of product per year can be obtained at relatively low costs, even in small animals such as rabbits. We tested the long-term safety and efficacy of recombinant human alpha-glucosidase (rhAGLU) from rabbit milk for the treatment of the lysosomal storage disorder Pompe disease. The disease occurs with an estimated frequency of 1 in 40,000 and is designated as an orphan disease. The classic infantile form leads to death at a median age of 6 to 8 months and is diagnosed by absence of alpha-glucosidase activity and presence of fully deleterious mutations in the alpha-glucosidase gene. Cardiac hypertrophy is characteristically present. Loss of muscle strength prevents infants from achieving developmental milestones such as sitting, standing, and walking. Milder forms of the disease are associated with less severe mutations and partial deficiency of alpha-glucosidase. In the beginning of 1999, 4 critically ill patients with infantile Pompe disease (2.5-8 months of age) were enrolled in a single-center open-label study and treated intravenously with rhAGLU in a dose of 15 to 40 mg/kg/week. Genotypes of patients were consistent with the most severe form of Pompe disease. Additional molecular analysis failed to detect processed forms of alpha-glucosidase (95, 76, and 70 kDa) in 3 of the 4 patients and revealed only a trace amount of the 95-kDa biosynthetic intermediate form in the fourth (patient 1). With the more sensitive detection method, 35S-methionine incorporation, we could detect low-level synthesis of alpha-glucosidase in 3 of the 4 patients (patients 1, 2, and 4) with some posttranslational modification from 110 kDa to 95 kDa in 1 of them (patient 1). One patient (patient 3) remained totally deficient with both detection methods (negative for cross

  1. Clearance of 131I-labeled murine monoclonal antibody from patients' blood by intravenous human anti-murine immunoglobulin antibody

    International Nuclear Information System (INIS)

    Stewart, J.S.; Sivolapenko, G.B.; Hird, V.; Davies, K.A.; Walport, M.; Ritter, M.A.; Epenetos, A.A.

    1990-01-01

    Five patients treated with intraperitoneal 131I-labeled mouse monoclonal antibody for ovarian cancer also received i.v. exogenous polyclonal human anti-murine immunoglobulin antibody. The pharmacokinetics of 131I-labeled monoclonal antibody in these patients were compared with those of 28 other patients receiving i.p. radiolabeled monoclonal antibody for the first time without exogenous human anti-murine immunoglobulin, and who had no preexisting endogenous human anti-murine immunoglobulin antibody. Patients receiving i.v. human anti-murine immunoglobulin antibody demonstrated a rapid clearance of 131I-labeled monoclonal antibody from their circulation. The (mean) maximum 131I blood content was 11.4% of the injected activity in patients receiving human anti-murine immunoglobulin antibody compared to 23.3% in patients not given human anti-murine immunoglobulin antibody. Intravenous human anti-murine immunoglobulin antibody decreased the radiation dose to bone marrow (from 131I-labeled monoclonal antibody in the vascular compartment) 4-fold. Following the injection of human anti-murine immunoglobulin antibody, 131I-monoclonal/human anti-murine immunoglobulin antibody immune complexes were rapidly transported to the liver. Antibody dehalogenation in the liver was rapid, with 87% of the injected 131I excreted in 5 days. Despite the efficient hepatic uptake of immune complexes, dehalogenation of monoclonal antibody was so rapid that the radiation dose to liver parenchyma from circulating 131I was decreased 4-fold rather than increased. All patients developed endogenous human anti-murine immunoglobulin antibody 2 to 3 weeks after treatment.

  2. High Efficiency of Human Normal Immunoglobulin for Intravenous Administration in a Patient with Kawasaki Syndrome Diagnosed in the Later Stages

    Directory of Open Access Journals (Sweden)

    Tatyana V. Sleptsova

    2016-01-01

    Full Text Available The article describes a case of late diagnosis of mucocutaneous lymphonodular syndrome (Kawasaki syndrome). At the beginning of the therapy, the child had fever, conjunctivitis, stomatitis, rash, solid swelling of hands and feet, and coronaritis with the development of aneurysms. The article describes the successful use of normal human immunoglobulin for intravenous administration at a dose of 2 g/kg body weight per course in combination with acetylsalicylic acid at a dose of 80 mg/kg per day. After 3 days of treatment, the rash disappeared; limb swelling and symptoms of conjunctivitis significantly reduced; and laboratory parameters of disease activity (erythrocyte sedimentation rate, C-reactive protein concentration) became normal. After 3 months, inflammation in the coronary arteries was stopped. After 6 months, a regression of coronary artery aneurysms was recorded. No adverse effects during the immunoglobulin therapy were observed.

  3. Effects of intravenous glucose on dopaminergic function in the human brain in vivo.

    Science.gov (United States)

    Haltia, Lauri T; Rinne, Juha O; Merisaari, Harri; Maguire, Ralph P; Savontaus, Eriika; Helin, Semi; Någren, Kjell; Kaasinen, Valtteri

    2007-09-01

    Dopamine is known to regulate food intake by modulating food reward via the mesolimbic circuitry of the brain. The objective of this study was to compare the effects of high energy input (i.v. glucose) on striatal and thalamic dopamine release in overweight and lean individuals. We hypothesized that glucose would induce dopamine release and positive ratings (e.g., satiety) in Behavioral Analog Scales, particularly in food-deprived lean subjects. [(11)C]raclopride PET was performed for 12 lean (mean BMI = 22 kg/m(2)) and 12 overweight (mean BMI = 33 kg/m(2)) healthy subjects. Each subject was imaged twice in a blinded counter-balanced setting, after 300 mg/kg i.v. glucose and after i.v. placebo. Dopamine D2 receptor binding potentials (BPs) were estimated. The voxel-based analysis of the baseline scans indicated lower striatal BPs in the overweight group and a negative correlation between BMIs and BPs. Intravenous glucose did not have a significant effect on BPs in overweight or lean subjects (male and female groups combined). However, BP changes were opposite in the two gender groups. In male subjects, significant BP reductions after glucose were seen in the right and left caudate nucleus, left putamen, and right thalamus. In female subjects, increases in BP secondary to glucose were seen in the right caudate nucleus and right and left putamen. The sexually dimorphic effect of glucose was seen in both overweight and lean subjects. Although gender differences were not among the a priori hypotheses of the present study and, therefore, they must be considered to be preliminary findings, we postulate that this observation is a reflection of an interaction between glucose, sex steroids (estrogen), leptin, and dopamine.

  4. Noninvasive quantification of human brain antioxidant concentrations after an intravenous bolus of vitamin C

    Science.gov (United States)

    Background: Until now, antioxidant based initiatives for preventing dementia have lacked a means to detect deficiency or measure pharmacologic effect in the human brain in situ. Objective: Our objective was to apply a novel method to measure key human brain antioxidant concentrations throughout the ...

  5. Human factors/ergonomics implications of big data analytics: Chartered Institute of Ergonomics and Human Factors annual lecture.

    Science.gov (United States)

    Drury, Colin G

    2015-01-01

    In recent years, advances in sensor technology, connectedness and computational power have come together to produce huge data-sets. The treatment and analysis of these data-sets is known as big data analytics (BDA); the somewhat related term data mining is also used. Fields allied to human factors/ergonomics (HFE), e.g. statistics, have developed computational methods to derive meaningful, actionable conclusions from these databases. This paper examines BDA, often characterised by volume, velocity and variety, giving examples of successful BDA use. This examination provides context by considering examples of using BDA on human data, using BDA in HFE studies, and studies of how people perform BDA. Significant issues for HFE are the reliance of BDA on correlation rather than hypotheses and theory, the ethics of BDA and the use of HFE in data visualisation.

  6. Intravenous Lipid Emulsion as an Antidote for the Treatment of Acute Poisoning: A Bibliometric Analysis of Human and Animal Studies.

    Science.gov (United States)

    Zyoud, Sa'ed H; Waring, W Stephen; Al-Jabi, Samah W; Sweileh, Waleed M; Rahhal, Belal; Awang, Rahmat

    2016-11-01

    In recent years, there has been increasing interest in the role of intravenous lipid formulations as potential antidotes in patients with severe cardiotoxicity caused by drug toxicity. The aim of this study was to conduct a comprehensive bibliometric analysis of all human and animal studies featuring lipid emulsion as an antidote for the treatment of acute poisoning. The Scopus database search was performed on 5 February 2016 to analyse the research output related to intravenous lipid emulsion as an antidote for the treatment of acute poisoning. Research indicators used for analysis included total number of articles, date (year) of publication, total citations, value of the h-index, document types, countries of publication, journal names, collaboration patterns and institutions. A total of 594 articles were retrieved from Scopus database for the period of 1955-2015. The percentage share of global intravenous lipid emulsion research output showed that research output was 85.86% in 2006-2015 with yearly average growth in this field of 51 articles per year. The USA, United Kingdom (UK), France, Canada, New Zealand, Germany, Australia, China, Turkey and Japan accounted for 449 (75.6%) of all the publications. The total number of citations for all documents was 9,333, with an average of 15.7 citations per document. The h-index of the retrieved documents for lipid emulsion research as antidote for the treatment of acute poisoning was 49. The USA and the UK achieved the highest h-indices, 34 and 14, respectively. New Zealand produced the greatest number of documents with international collaboration (51.9%) followed by Australia (50%) and Canada (41.4%) out of the total number of publications for each country. In summary, we found an increase in the number of publications in the field of lipid emulsion after 2006. The results of this study demonstrate that the majority of publications in the field of lipid emulsion were published by high-income countries. Researchers from
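The h-index used above (49 for the retrieved document set) is straightforward to compute from a list of per-document citation counts. The sketch below is a generic implementation of the definition, not the authors' Scopus workflow:

```python
def h_index(citations):
    """Largest h such that at least h documents have h or more citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank       # the rank-th most-cited document still has >= rank citations
        else:
            break          # sorted descending, so no later document can qualify
    return h
```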

  7. A pharmacokinetic evaluation of five H(1) antagonists after an oral and intravenous microdose to human subjects.

    Science.gov (United States)

    Madan, Ajay; O'Brien, Zhihong; Wen, Jianyun; O'Brien, Chris; Farber, Robert H; Beaton, Graham; Crowe, Paul; Oosterhuis, Berend; Garner, R Colin; Lappin, Graham; Bozigian, Haig P

    2009-03-01

    To evaluate the pharmacokinetics (PK) of five H(1) receptor antagonists in human volunteers after a single oral and intravenous (i.v.) microdose (0.1 mg). Five H(1) receptor antagonists, namely NBI-1, NBI-2, NBI-3, NBI-4 and diphenhydramine, were administered to human volunteers as a single 0.1-mg oral and i.v. dose. Blood samples were collected up to 48 h, and the parent compound in the plasma extract was quantified by high-performance liquid chromatography and accelerator mass spectroscopy. The median clearance (CL), apparent volume of distribution (V(d)) and apparent terminal elimination half-life (t(1/2)) of diphenhydramine after an i.v. microdose were 24.7 l h(-1), 302 l and 9.3 h, and the oral C(max) and AUC(0-infinity) were 0.195 ng ml(-1) and 1.52 ng h ml(-1), respectively. These data were consistent with previously published diphenhydramine data at 500 times the microdose. The rank order of oral bioavailability of the five compounds was as follows: NBI-2 > NBI-1 > NBI-3 > diphenhydramine > NBI-4, whereas the rank order for CL was NBI-4 > diphenhydramine > NBI-1 > NBI-3 > NBI-2. Human microdosing provided estimates of clinical PK of four structurally related compounds, which were deemed useful for compound selection.

  8. A pharmacokinetic evaluation of five H1 antagonists after an oral and intravenous microdose to human subjects

    Science.gov (United States)

    Madan, Ajay; O'Brien, Zhihong; Wen, Jianyun; O'Brien, Chris; Farber, Robert H; Beaton, Graham; Crowe, Paul; Oosterhuis, Berend; Garner, R Colin; Lappin, Graham; Bozigian, Haig P

    2009-01-01

    AIMS To evaluate the pharmacokinetics (PK) of five H1 receptor antagonists in human volunteers after a single oral and intravenous (i.v.) microdose (0.1 mg). METHODS Five H1 receptor antagonists, namely NBI-1, NBI-2, NBI-3, NBI-4 and diphenhydramine, were administered to human volunteers as a single 0.1-mg oral and i.v. dose. Blood samples were collected up to 48 h, and the parent compound in the plasma extract was quantified by high-performance liquid chromatography and accelerator mass spectroscopy. RESULTS The median clearance (CL), apparent volume of distribution (Vd) and apparent terminal elimination half-life (t1/2) of diphenhydramine after an i.v. microdose were 24.7 l h−1, 302 l and 9.3 h, and the oral Cmax and AUC0–∞ were 0.195 ng ml−1 and 1.52 ng h ml−1, respectively. These data were consistent with previously published diphenhydramine data at 500 times the microdose. The rank order of oral bioavailability of the five compounds was as follows: NBI-2 > NBI-1 > NBI-3 > diphenhydramine > NBI-4, whereas the rank order for CL was NBI-4 > diphenhydramine > NBI-1 > NBI-3 > NBI-2. CONCLUSIONS Human microdosing provided estimates of clinical PK of four structurally related compounds, which were deemed useful for compound selection. PMID:19523012
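The diphenhydramine numbers in the two records above can be cross-checked under standard linear-kinetics assumptions. The sketch below assumes a one-compartment model (so the computed half-life differs slightly from the reported noncompartmental terminal t1/2 of 9.3 h) and uses AUC_iv = Dose_iv / CL to estimate oral bioavailability; function names are ours:

```python
import math

# Diphenhydramine i.v. microdose values reported in the abstract
CL = 24.7      # clearance, L/h
VD = 302.0     # apparent volume of distribution, L
DOSE_MG = 0.1  # 0.1-mg microdose, given both i.v. and orally

def one_compartment_half_life(cl_l_h, vd_l):
    # t1/2 = ln(2) * Vd / CL under a one-compartment assumption
    return math.log(2) * vd_l / cl_l_h

def oral_bioavailability(auc_oral_ng_h_ml, dose_oral_mg, cl_l_h, dose_iv_mg):
    # Linear kinetics: AUC_iv = Dose_iv / CL, then F = (AUC_oral/Dose_oral) / (AUC_iv/Dose_iv)
    auc_iv_ng_h_ml = dose_iv_mg * 1e6 / (cl_l_h * 1e3)  # mg -> ng, L -> ml
    return (auc_oral_ng_h_ml / dose_oral_mg) / (auc_iv_ng_h_ml / dose_iv_mg)

t_half = one_compartment_half_life(CL, VD)                  # ~8.5 h vs the reported 9.3 h
f_oral = oral_bioavailability(1.52, DOSE_MG, CL, DOSE_MG)   # ~0.38 from the reported oral AUC
```

The ~0.38 estimate is broadly consistent with diphenhydramine ranking below three of the NBI compounds in oral bioavailability.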

  9. A big data approach to the concordance of the toxicity of pharmaceuticals in animals and humans.

    Science.gov (United States)

    Clark, Matthew; Steger-Hartmann, Thomas

    2018-07-01

Although lack of efficacy is an important cause of late-stage attrition in drug development, the shortcomings in the translation of toxicities observed during preclinical development to observations in clinical trials or post-approval are an ongoing topic of research. The concordance between preclinical and clinical safety observations has been analyzed only on relatively small data sets, mostly over short time periods of drug approvals. We therefore explored the feasibility of a big-data analysis on a set of 3,290 approved drugs and formulations for which 1,637,449 adverse events were reported for both humans and animal species in regulatory submissions over a period of more than 70 years. The events reported in five species - rat, dog, mouse, rabbit, and cynomolgus monkey - were treated as diagnostic tests for human events, and the diagnostic power was computed for each event/species pair using likelihood ratios. The animal-to-human translation of many key observations, such as QT prolongation and arrhythmias in dog, is confirmed as being predictive. Our study confirmed the general predictivity of animal safety observations for humans, but also identified issues with such automated analyses, related on the one hand to data curation and controlled vocabularies, and on the other to methodological changes over the course of time. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
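The likelihood-ratio treatment of an animal finding as a diagnostic test for the corresponding human event can be sketched as follows (the 2×2 counts below are invented for illustration, not taken from the study's data):

```python
def likelihood_ratios(tp, fp, fn, tn):
    """Positive and negative likelihood ratios from a 2x2 table.

    tp: animal event present, human event present; fp: animal present, human absent;
    fn: animal absent, human present; tn: animal absent, human absent.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1 - specificity)   # >1: the animal finding raises the odds of the human event
    lr_neg = (1 - sensitivity) / specificity   # <1: its absence lowers those odds
    return lr_pos, lr_neg

# Hypothetical counts, e.g. drugs with/without a dog finding vs. the human event
lr_pos, lr_neg = likelihood_ratios(tp=40, fp=10, fn=20, tn=130)
print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")
```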

10. Acute effect of intravenously applied alcohol in the human striatal and extrastriatal D2/D3 dopamine system.

    Science.gov (United States)

    Pfeifer, Philippe; Tüscher, Oliver; Buchholz, Hans Georg; Gründer, Gerhard; Vernaleken, Ingo; Paulzen, Michael; Zimmermann, Ulrich S; Maus, Stephan; Lieb, Klaus; Eggermann, Thomas; Fehr, Christoph; Schreckenberger, Mathias

    2017-09-01

Investigations on the acute effects of alcohol in the human mesolimbic dopamine D2/D3 receptor system have yielded conflicting results. With respect to the effects of alcohol on extrastriatal D2/D3 dopamine receptors, no investigations have been reported yet. Therefore we applied PET imaging using the postsynaptic dopamine D2/D3 receptor ligand [18F]fallypride, addressing the question of whether intravenously applied alcohol stimulates the extrastriatal and striatal dopamine system. We measured subjective effects of alcohol and made correlation analyses with the striatal and extrastriatal D2/D3 binding potential. Twenty-four healthy male μ-opioid receptor (OPRM1) 118G allele carriers underwent a standardized intravenous and placebo alcohol administration. The subjective effects of alcohol were measured with a visual analogue scale. For the evaluation of the dopamine response we calculated the binding potential (BPND) using the simplified reference tissue model (SRTM). In addition, we calculated distribution volumes (target and reference regions) in 10 subjects for whom metabolite-corrected arterial samples were available. In the alcohol condition, no significant dopamine response in terms of a reduction of BPND was observed in striatal and extrastriatal brain regions. We found a positive correlation between 'liking' alcohol and the BPND in extrastriatal brain regions (inferior frontal cortex (IFC): r = 0.533, p = 0.007; orbitofrontal cortex (OFC): r = 0.416, p = 0.043; prefrontal cortex (PFC): r = 0.625, p = 0.001). The acute alcohol effects on the D2/D3 dopamine receptor binding potential of the striatal and extrastriatal system in our experiment were insignificant. A positive correlation of the subjective effect of 'liking' alcohol with cortical D2/D3 receptors may hint at an addiction-relevant trait. © 2016 Society for the Study of Addiction.
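For the subjects with arterial sampling, the binding potential can equivalently be expressed through distribution volumes of the target and reference regions, BPND = VT(target)/VT(reference) − 1 (standard PET nomenclature). A one-line sketch with hypothetical volumes, not the study's values:

```python
def binding_potential(vt_target, vt_reference):
    """BP_ND as the distribution volume ratio minus one."""
    return vt_target / vt_reference - 1.0

# Hypothetical distribution volumes (mL/cm^3) for a receptor-rich ROI
# and a receptor-poor reference region such as the cerebellum
bp_nd = binding_potential(vt_target=30.0, vt_reference=1.5)
print(bp_nd)
```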

  11. Long-term intravenous treatment of Pompe disease with recombinant human alpha-glucosidase from milk.

    NARCIS (Netherlands)

    Hout, J.M. van den; Kamphoven, J.H.; Winkel, L.P.; Arts, W.F.M.; Klerk, J.B.C. de; Loonen, M.C.B.; Vulto, A.G.; Cromme-Dijkhuis, A.H.; Weisglas-Kuperus, N.; Hop, W.C.J.; Hirtum, H. van; Diggelen, O.P. van; Boer, M. de; Kroos, M.A.; Doorn, P.A. van; Voort, E.I. van der; Sibbles, B.; Corven, E.J. van; Brakenhoff, J.P.; Hove, J.L. van; Smeitink, J.A.M.; Jong, G. de; Reuser, A.J.J.; Ploeg, A.T. van der

    2004-01-01

    OBJECTIVE: Recent reports warn that the worldwide cell culture capacity is insufficient to fulfill the increasing demand for human protein drugs. Production in milk of transgenic animals is an attractive alternative. Kilogram quantities of product per year can be obtained at relatively low costs,

  12. Long-term intravenous treatment of Pompe disease with recombinant human alpha-glucosidase from milk

    NARCIS (Netherlands)

    J.M.P. van den Hout (Johanna); B. Sibbles (Barbara); J.P. Brakenhoff (Just); A.H. Cromme-Dijkhuis (Adri); N. Weisglas-Kuperus (Nynke); A.J.J. Reuser (Arnold); M.A. Boer (Marijke); J.A.M. Smeitink (Jan); O.P. van Diggelen (Otto); E. van der Voort (Edwin); E.J.J.M. van Corven (Emiel); H. van Hirtum (Hans); J.H.J. Kamphoven (Joep); A.T. van der Ploeg (Ans); J. van Hove (Johan); W.F.M. Arts (Willem Frans); P.A. van Doorn (Pieter); J.B.C. de Klerk (Johannes); M.C.B. Loonen (Christa); A.G. Vulto (Arnold); M.A. Kroos (Marian); W.C.J. Hop (Wim); L.P.F. Winkel (Léon); G. de Jong (Gerard)

    2004-01-01

    textabstractOBJECTIVE: Recent reports warn that the worldwide cell culture capacity is insufficient to fulfill the increasing demand for human protein drugs. Production in milk of transgenic animals is an attractive alternative. Kilogram quantities of product per year can be

  13. Dosimetry of intravenously administered oxygen-15 labelled water in man: a model based on experimental human data from 21 subjects

    International Nuclear Information System (INIS)

    Smith, T.; Tong, C.; Lammertsma, A.A.; Butler, K.R.; Schnorr, L.; Watson, J.D.G.; Ramsay, S.; Clark, J.C.; Jones, T.

    1994-01-01

Models based on uniform distribution of tracer in total body water underestimate the absorbed dose from H₂¹⁵O because of the short half-life (2.04 min) of ¹⁵O, which leads to non-uniform distribution of absorbed dose and also complicates the direct measurement of organ retention curves. However, organ absorbed doses can be predicted by the present kinetic model based on the convolution technique. The measured time course of arterial H₂¹⁵O concentration following intravenous administration represents the input function to organs. The impulse response of a given organ is its transit time function, determined by blood flow and the partition of water between tissue and blood. Values of these two parameters were taken from the literature. Integrals of the arterial input function and organ transit time functions were used to derive integrals of organ retention functions (organ residence times). The latter were used with absorbed dose calculation software (MIRDOSE-2) to obtain estimates for 24 organs. From the mean values of organ absorbed doses, the effective dose equivalent (EDE) and effective dose (ED) were calculated. From measurements on 21 subjects, the average value for both EDE and ED was calculated to be 1.2 μSv·MBq⁻¹, compared with a value of about 0.5 μSv·MBq⁻¹ predicted by uniform water distribution models. Based on the human data, a method of approximating H₂¹⁵O absorbed dose values from body surface area is described. (orig.)
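The convolution step described here can be sketched with NumPy. The input and impulse-response curves below are synthetic placeholders (the study used measured arterial curves and literature flow/partition values), and the variable names are ours:

```python
import numpy as np

HALF_LIFE_MIN = 2.04                       # physical half-life of oxygen-15
decay_const = np.log(2) / HALF_LIFE_MIN    # decay constant (1/min)

dt = 0.01                                  # time step (min)
t = np.arange(0.0, 20.0, dt)

# Synthetic, decay-corrected arterial input and a single-organ impulse response
arterial_input = np.exp(-t / 1.5)                       # placeholder bolus washout
mean_transit = 0.5                                      # min; set by flow and water partition
impulse_response = np.exp(-t / mean_transit) / mean_transit

# Organ retention = input convolved with the impulse response; then re-apply decay
retention = np.convolve(arterial_input, impulse_response)[: len(t)] * dt
retention_decayed = retention * np.exp(-decay_const * t)

# Integral of the organ retention curve -> residence-time surrogate (min)
residence_time = retention_decayed.sum() * dt
print(f"residence time surrogate: {residence_time:.3f} min")
```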

  14. Estimation of absorbed doses in humans due to intravenous administration of fluorine-18-fluorodeoxyglucose in PET studies

    International Nuclear Information System (INIS)

    Mejia, A.A.; Nakamura, T.; Masatoshi, I.; Hatazawa, J.; Masaki, M.; Watanuki, S.

    1991-01-01

Radiation absorbed doses due to intravenous administration of fluorine-18-fluorodeoxyglucose in positron emission tomography (PET) studies were estimated in normal volunteers. The time-activity curves were obtained for seven human organs (brain, heart, kidney, liver, lung, pancreas, and spleen) by using dynamic PET scans, and for bladder content by using a single detector. These time-activity curves were used for the calculation of the cumulated activity in these organs. Absorbed doses were calculated by the MIRD method using the absorbed dose per unit of cumulated activity, the 'S' value, transformed for the Japanese physique and the organ masses of the Japanese reference man. The bladder wall and the heart were the organs receiving the highest doses, 1.2 × 10⁻¹ and 4.5 × 10⁻² mGy/MBq, respectively. The brain received a dose of 2.9 × 10⁻² mGy/MBq, and other organs received doses between 1.0 × 10⁻² and 3.0 × 10⁻² mGy/MBq. The effective dose equivalent was estimated to be 2.4 × 10⁻² mSv/MBq. These results were comparable to values of absorbed doses reported by other authors on the radiation dosimetry of this radiopharmaceutical.
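The MIRD calculation underlying such estimates reduces to a sum of cumulated activities weighted by S values; a toy sketch with placeholder numbers (not the study's data):

```python
# MIRD-style absorbed dose to a target organ:
#   D(target) = sum over source organs of A~ (cumulated activity, MBq·s)
#               times S (absorbed dose per unit cumulated activity, mGy per MBq·s)
def absorbed_dose(cumulated_activity_mbq_s, s_values_mgy_per_mbq_s):
    return sum(a * s for a, s in zip(cumulated_activity_mbq_s, s_values_mgy_per_mbq_s))

# Hypothetical: two source organs contributing to one target organ's dose
dose = absorbed_dose([5000.0, 800.0], [2.0e-5, 5.0e-6])
print(f"{dose:.3f} mGy")
```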

  15. Effects of Intravenous Administration of Human Umbilical Cord Blood Stem Cells in 3-Acetylpyridine-Lesioned Rats

    Science.gov (United States)

    Calatrava-Ferreras, Lucía; Gonzalo-Gobernado, Rafael; Herranz, Antonio S.; Reimers, Diana; Montero Vega, Teresa; Jiménez-Escrig, Adriano; Richart López, Luis Alberto; Bazán, Eulalia

    2012-01-01

Cerebellar ataxias include a heterogeneous group of infrequent diseases characterized by lack of motor coordination caused by disturbances in the cerebellum and its associated circuits. Current therapies are based on the use of drugs that correct some of the molecular processes involved in their pathogenesis. Although these treatments have yielded promising results, there is not yet an effective therapy for these diseases. Cell replacement strategies using human umbilical cord blood mononuclear cells (HuUCBMCs) have emerged as a promising approach for restoration of function in neurodegenerative diseases. The aim of this work was to investigate the potential therapeutic activity of HuUCBMCs in the 3-acetylpyridine (3-AP) rat model of cerebellar ataxia. Intravenously administered HuUCBMCs reached the cerebellum and brain stem of 3-AP ataxic rats. Grafted cells reduced 3-AP-induced neuronal loss, promoted the activation of microglia in the brain stem, and prevented the overexpression of GFAP elicited by 3-AP in the cerebellum. In addition, HuUCBMCs upregulated the expression of proteins that are critical for cell survival, such as phospho-Akt and Bcl-2, in the cerebellum and brain stem of 3-AP ataxic rats. As all these effects were accompanied by a temporary but significant improvement in motor coordination, HuUCBMC grafts can be considered an effective cell replacement therapy for cerebellar disorders. PMID:23150735

16. Production of intravenous human dengue immunoglobulin from Brazilian blood donors

    Directory of Open Access Journals (Sweden)

    Frederico Leite Gouveia

    2013-12-01

Full Text Available Dengue represents an important health problem in Brazil, and therefore there is a great need to develop a vaccine or treatment. The neutralization of the dengue virus by a specific antibody can potentially be applied to therapy. The present paper describes, for the first time, the preparation of an immunoglobulin specific for the dengue virus (anti-DENV IgG) from screened Brazilian blood donations. Production was performed using the classic Cohn-Oncley process with minor modifications. The anti-DENV IgG was biochemically and biophysically characterized and fulfilled the requirements defined by the European Pharmacopoeia. The finished product was able to neutralize different virus serotypes (DENV-1, DENV-2, and DENV-3), while a commercial IgG collected from American blood donations was found to have low anti-dengue antibody titers. Overall, this anti-DENV IgG represents an important step in the study of the therapeutic potential and safety of a specific antibody that neutralizes the dengue virus in humans.

  17. Where are human subjects in Big Data research? The emerging ethics divide

    Directory of Open Access Journals (Sweden)

    Jacob Metcalf

    2016-06-01

    Full Text Available There are growing discontinuities between the research practices of data science and established tools of research ethics regulation. Some of the core commitments of existing research ethics regulations, such as the distinction between research and practice, cannot be cleanly exported from biomedical research to data science research. Such discontinuities have led some data science practitioners and researchers to move toward rejecting ethics regulations outright. These shifts occur at the same time as a proposal for major revisions to the Common Rule—the primary regulation governing human-subjects research in the USA—is under consideration for the first time in decades. We contextualize these revisions in long-running complaints about regulation of social science research and argue data science should be understood as continuous with social sciences in this regard. The proposed regulations are more flexible and scalable to the methods of non-biomedical research, yet problematically largely exclude data science methods from human-subjects regulation, particularly uses of public datasets. The ethical frameworks for Big Data research are highly contested and in flux, and the potential harms of data science research are unpredictable. We examine several contentious cases of research harms in data science, including the 2014 Facebook emotional contagion study and the 2016 use of geographical data techniques to identify the pseudonymous artist Banksy. To address disputes about application of human-subjects research ethics in data science, critical data studies should offer a historically nuanced theory of “data subjectivity” responsive to the epistemic methods, harms and benefits of data science and commerce.

18. Characterization of ornidazole metabolites in human bile after intravenous doses by ultraperformance liquid chromatography/quadrupole time-of-flight mass spectrometry

    Directory of Open Access Journals (Sweden)

    Jiangbo Du

    2012-04-01

Full Text Available Ultraperformance liquid chromatography/quadrupole time-of-flight mass spectrometry (UPLC/Q-TOF MS) was used to characterize ornidazole metabolites in human bile after intravenous doses. A liquid chromatography tandem mass spectrometry (LC–MS/MS) assay was developed for the determination of the bile level of ornidazole. Bile samples, collected from four patients with T-tube drainage after biliary tract surgery, were prepared by protein precipitation with acetonitrile before analysis. A total of 12 metabolites, including 10 novel metabolites, were detected and characterized. The metabolites of ornidazole in human bile were the products of hydrochloride (HCl) elimination, oxidative dechlorination, hydroxylation, sulfation, diastereoisomeric glucuronation, substitution of the NO2 group or Cl atom by cysteine or N-acetylcysteine, and oxidative dechlorination followed by further carboxylation. The bile levels of ornidazole at 12 h after multiple intravenous infusions were well above its minimal inhibitory concentration for common strains of anaerobic bacteria.

  19. Early Intravenous Delivery of Human Brain Stromal Cells Modulates Systemic Inflammation and Leads to Vasoprotection in Traumatic Spinal Cord Injury.

    Science.gov (United States)

    Badner, Anna; Vawda, Reaz; Laliberte, Alex; Hong, James; Mikhail, Mirriam; Jose, Alejandro; Dragas, Rachel; Fehlings, Michael

    2016-08-01

Spinal cord injury (SCI) is a life-threatening condition with multifaceted complications and limited treatment options. In SCI, the initial physical trauma is closely followed by a series of secondary events, including inflammation and blood spinal cord barrier (BSCB) disruption, which further exacerbate injury. This secondary pathology is partially mediated by the systemic immune response to trauma, in which cytokine production leads to the recruitment/activation of inflammatory cells. Because early intravenous delivery of mesenchymal stromal cells (MSCs) has been shown to mitigate inflammation in various models of neurologic disease, this study aimed to assess these effects in a rat model of SCI (C7-T1, 35-gram clip compression) using human brain-derived stromal cells. Quantitative polymerase chain reaction for a human-specific DNA sequence was used to assess cell biodistribution/clearance and confirmed that only a small proportion (approximately 0.001%-0.002%) of cells are delivered to the spinal cord, with the majority residing in the lung, liver, and spleen. Intriguingly, although cell populations drastically declined in all aforementioned organs, a persistent population remained in the spleen at 7 days. Furthermore, the cell infusion significantly increased splenic and circulating levels of interleukin-10, a potent anti-inflammatory cytokine. Through this suppression of the systemic inflammatory response, the cells also reduced acute spinal cord BSCB permeability, hemorrhage, and lesion volume. These early effects further translated into enhanced functional recovery and tissue sparing 10 weeks after SCI. This work demonstrates an exciting therapeutic approach whereby a minimally invasive cell-transplantation procedure can effectively reduce secondary damage after SCI through systemic immunomodulation. Central nervous system pericytes (perivascular stromal cells) have recently gained significant attention within the scientific community.
In addition to

  20. Rapid intravenous infusion of 20 mL/kg saline alters the distribution of perfusion in healthy supine humans.

    Science.gov (United States)

    Henderson, A C; Sá, R C; Barash, I A; Holverda, S; Buxton, R B; Hopkins, S R; Prisk, G K

    2012-03-15

Rapid intravenous saline infusion, a model meant to replicate the initial changes leading to pulmonary interstitial edema, increases pulmonary arterial pressure in humans. We hypothesized that this would alter lung perfusion distribution. Six healthy subjects (29 ± 6 years) underwent magnetic resonance imaging to quantify perfusion using arterial spin labeling. Regional proton density was measured using a fast-gradient echo sequence, allowing blood delivered to the slice to be normalized for density and quantified in mL/min/g. Contributions from flow in large conduit vessels were minimized using a flow cutoff value (blood delivered > 35% of maximum in mL/min/cm³) in order to obtain an estimate of blood delivered to the capillary bed (perfusion). Images were acquired supine at baseline, after infusion of 20 mL/kg saline, and after a short upright recovery period, for a single sagittal slice in the right lung during breath-holds at functional residual capacity. Thoracic fluid content measured by impedance cardiography was elevated post-infusion by up to 13%. Aside from changes in conduit vessels, there were no significant changes in perfusion in dependent lung following infusion (7.8 ± 1.9 mL/min/g baseline, 7.9 ± 2.0 post, 8.5 ± 2.1 recovery, p = 0.36). There were no significant changes in lung density. These data suggest that saline infusion increased perfusion to nondependent lung, consistent with an increase in intravascular pressures. Dependent lung may have been "protected" from increases in perfusion following infusion due to gravitational compression of the pulmonary vasculature. Copyright © 2011 Elsevier B.V. All rights reserved.
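The conduit-vessel cutoff described above amounts to masking voxels whose blood delivery exceeds 35% of the slice maximum; a minimal NumPy sketch with toy values, not study data:

```python
import numpy as np

def capillary_perfusion(blood_delivered, cutoff_fraction=0.35):
    """Mask voxels above a fraction of the slice maximum to suppress conduit vessels.

    blood_delivered: array in mL/min/cm^3. Returns the masked array and the mean
    over the retained (parenchymal) voxels.
    """
    threshold = cutoff_fraction * np.nanmax(blood_delivered)
    masked = np.where(blood_delivered > threshold, np.nan, blood_delivered)
    return masked, np.nanmean(masked)

# Toy 2x2 slice: one bright "vessel" voxel among parenchyma
slice_ = np.array([[1.0, 1.2],
                   [1.1, 10.0]])
masked, mean_perf = capillary_perfusion(slice_)
print(mean_perf)  # mean of the three retained voxels
```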

  1. Characterizing Big Data Management

    Directory of Open Access Journals (Sweden)

    Rogério Rossi

    2015-06-01

Full Text Available Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by three dimensions: technology, people and processes. Hence, this article discusses each of these dimensions: the technological dimension, related to the storage, analytics and visualization of big data; the human dimension, concerning the people involved in big data work; and the process management dimension, which approaches big data management from both a technological and a business perspective.

  2. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    Science.gov (United States)

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

Aerial imagery captured via unmanned aerial vehicles (UAVs) is playing an increasingly important role in disaster response. Unlike satellite imagery, aerial imagery can be captured and processed within hours rather than days. In addition, the spatial resolution of aerial imagery is an order of magnitude higher than the imagery produced by the most sophisticated commercial satellites today. Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge. The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster response in a time-sensitive manner. Crowdsourcing can be used to annotate features of interest in aerial images (such as damaged shelters and roads blocked by debris). These human-annotated features can then be used to train a supervised machine learning system to recognize such features in new, unseen images. In this article, we describe how this hybrid solution for image analysis can be implemented as a module (i.e., Aerial Clicker) to extend an existing platform called Artificial Intelligence for Disaster Response (AIDR), which has already been deployed to classify microblog messages during disasters using its Text Clicker module, including in response to Cyclone Pam, a category 5 cyclone that devastated Vanuatu in March 2015. The hybrid solution we present can be applied to both aerial and satellite imagery and has applications beyond disaster response, such as wildlife protection, human rights, and archeological exploration. As a proof of concept, we recently piloted this solution using very high-resolution aerial photographs of a wildlife reserve in Namibia to support rangers with their wildlife conservation efforts (SAVMAP project, http://lasig.epfl.ch/savmap). The
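The crowd-to-classifier step can be caricatured as follows: human-annotated examples define class prototypes that then score new images. The synthetic features, labels, and nearest-centroid rule below are our own simplification, not the AIDR implementation:

```python
import numpy as np

# Crowd-labeled feature vectors for two classes (synthetic 2-D features standing
# in for image descriptors of "damaged shelter" vs. "intact shelter")
rng = np.random.default_rng(1)
damaged = rng.normal(loc=[0.8, 0.2], scale=0.1, size=(30, 2))
intact = rng.normal(loc=[0.2, 0.7], scale=0.1, size=(30, 2))

# Training = computing one centroid per class from the human annotations
centroids = {"damaged": damaged.mean(axis=0), "intact": intact.mean(axis=0)}

def classify(feature_vec):
    """Assign a new image's features to the nearest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(feature_vec - centroids[c]))

print(classify(np.array([0.75, 0.25])))  # lands near the "damaged" prototype
```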

  3. A Big Video Manifesto

    DEFF Research Database (Denmark)

    Mcilvenny, Paul Bruce; Davidsen, Jacob

    2017-01-01

For the last few years, we have witnessed a hype about the potential results and insights that quantitative big data can bring to the social sciences. The wonder of big data has moved into education, traffic planning, and disease control with a promise of making things better with big numbers and beautiful visualisations. However, we also need to ask what the tools of big data can do both for the Humanities and for more interpretative approaches and methods. Thus, we prefer to explore how the power of computation, new sensor technologies and massive storage can also help with video-based qualitative …

  4. Near Real Time Analytics of Human Sensor Networks in the Realm of Big Data

    Science.gov (United States)

    Aulov, O.; Halem, M.

    2012-12-01

With the prolific development of social media, emergency responders have an increasing interest in harvesting social media from outlets such as Flickr, Twitter, and Facebook in order to assess the scale and specifics of extreme events including wildfires, earthquakes, terrorist attacks, oil spills, etc. A number of experimental platforms have successfully been implemented to demonstrate the utilization of social media data in extreme events, including Twitter Earthquake Detector, which relied on tweets for earthquake monitoring; AirTwitter, which used tweets for air quality reporting; and our previous work, using Flickr data as boundary value forcings to improve the forecast of oil beaching in the aftermath of the Deepwater Horizon oil spill. The majority of these platforms addressed a narrow, specific type of emergency and harvested data from a particular outlet. We demonstrate an interactive framework for monitoring, mining and analyzing a plethora of heterogeneous social media sources for a diverse range of extreme events. Our framework consists of three major parts: a real-time social media aggregator, a data processing and analysis engine, and a web-based visualization and reporting tool. The aggregator gathers tweets, Facebook comments from fan pages, Google+ posts, forum discussions, blog posts (such as LiveJournal and Blogger.com), images from photo-sharing platforms (such as Flickr and Picasa), videos from video-sharing platforms (YouTube, Vimeo), and so forth. The data processing and analysis engine pre-processes the aggregated information and annotates it with geolocation and sentiment information. In many cases, the metadata of the social media posts does not contain geolocation information; however, a human reader can easily guess from the body of the text what location is discussed. We are automating this task by use of Named Entity Recognition (NER) algorithms and a gazetteer service. The visualization and reporting tool provides a web-based, user
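The gazetteer step mentioned above can be sketched as matching candidate place names against a table of known coordinates (a toy table and matcher of our own; a real pipeline would first run an NER model to propose the candidate spans):

```python
# Toy gazetteer: place name -> (latitude, longitude)
GAZETTEER = {
    "vanuatu": (-15.38, 166.96),
    "port vila": (-17.73, 168.32),
}

def geolocate(text):
    """Return (name, coordinates) for every gazetteer entry found in the text."""
    text = text.lower()
    return [(name, coords) for name, coords in GAZETTEER.items() if name in text]

hits = geolocate("Severe roof damage reported near Port Vila after the cyclone")
print(hits)  # [('port vila', (-17.73, 168.32))]
```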

  5. Sexual dimorphism in relation to big-game hunting and economy in modern human populations.

    Science.gov (United States)

    Collier, S

    1993-08-01

    Postcranial skeletal data from two recent Eskimo populations are used to test David Frayer's model of sexual dimorphism reduction in Europe between the Upper Paleolithic and Mesolithic. Frayer argued that a change from big-game hunting and adoption of new technology in the Mesolithic reduced selection for large body size in males and led to a reduction in skeletal sexual dimorphism. Though aspects of Frayer's work have been criticized in the literature, the association of big-game hunting and high sexual dimorphism is untested. This study employs univariate and multivariate analysis to test that association by examining sexual dimorphism of cranial and postcranial bones of two recent Alaskan Eskimo populations, one being big-game (whale and other large marine mammal) hunting people, and the second being salmon fishing, riverine people. While big-game hunting influences skeletal robusticity, it cannot be said to lead to greater sexual dimorphism generally. The two populations had different relative sexual dimorphism levels for different parts of the body. Notably, the big-game hunting (whaling) Eskimos had the lower multivariate dimorphism in the humerus, which could be expected to be the structure under greatest exertion by such hunting in males. While the exertions of the whale hunting economic activities led to high skeletal robusticity, as predicted by Frayer's model, this was true of the females as well as the males, resulting in low sexual dimorphism in some features. Females are half the sexual dimorphism equation, and they cannot be seen as constants in any model of economic behavior.

  6. HARNESSING BIG DATA VOLUMES

    Directory of Open Access Journals (Sweden)

    Bogdan DINU

    2014-04-01

Full Text Available Big Data can revolutionize humanity. Hidden within the huge amounts and variety of the data we are creating, we may find information, facts, social insights and benchmarks that were once virtually impossible to find or simply nonexistent. Large volumes of data allow organizations to tap in real time the full potential of all the internal or external information they possess. Big data calls for quick decisions and innovative ways to assist customers and society as a whole. Big data platforms and product portfolios will help customers harness the full value of big data volumes. This paper deals with technical and technological issues related to handling big data volumes in the Big Data environment.

7. Intraarticular and intravenous administration of 99mTc-HMPAO-labeled human mesenchymal stem cells (99mTc-ahMSCs): In vivo imaging and biodistribution

    International Nuclear Information System (INIS)

    Meseguer-Olmo, Luis; Montellano, Antonio Jesús; Martínez, Teresa; Martínez, Carlos M.; Revilla-Nuin, Beatriz; Roldán, Marta; Mora, Cristina Fuente; López-Lucas, Maria Dolores; Fuente, Teodomiro

    2017-01-01

Introduction: Therapeutic application of intravenously administered (IV) human bone marrow-derived mesenchymal stem cells (ahMSCs) appears to have as its main drawback the massive retention of cells in the lung parenchyma, questioning the suitability of this route of administration. Intraarticular administration (IAR) could be considered as an alternative route for therapy in degenerative and traumatic joint lesions. Our work is outlined as a comparative study of the biodistribution of 99mTc-ahMSCs after IV and IAR administration, via a scintigraphic study in an animal model. Methods: An isolated primary culture of adult human mesenchymal stem cells was labeled with 99mTc-HMPAO for scintigraphic study of in vivo distribution after intravenous and intra-articular (knee) administration in rabbits. Results: IV administration of radiolabeled ahMSCs showed the bulk of radioactivity in the lung parenchyma, while IAR images showed activity mainly in the injected cavity and complete absence of uptake in the pulmonary bed. Conclusions: Our study shows that IAR administration overcomes the limitations of IV injection, in particular those related to cell destruction in the lung parenchyma. After IAR administration, cells remain within the joint cavity, as expected given their size and adhesion properties. Advances in knowledge: Intra-articular administration of adult human mesenchymal stem cells could be a suitable route for achieving a therapeutic effect in joint lesions. Implications for patient care: Local administration of adult human mesenchymal stem cells could improve their therapeutic effects, minimizing side effects in patients.

  8. Big universe, big data

    DEFF Research Database (Denmark)

    Kremer, Jan; Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian

    2017-01-01

Modern astronomy requires big data know-how; in particular, it demands highly efficient machine learning and image analysis algorithms. But scalability is not the only challenge: astronomy applications touch several current machine learning research questions, such as learning from biased data and dealing with …. The authors highlight some recent methodological advancements in machine learning and image analysis triggered by astronomical applications.

  9. A data analysis framework for biomedical big data: Application on mesoderm differentiation of human pluripotent stem cells.

    Science.gov (United States)

    Ulfenborg, Benjamin; Karlsson, Alexander; Riveiro, Maria; Améen, Caroline; Åkesson, Karolina; Andersson, Christian X; Sartipy, Peter; Synnergren, Jane

    2017-01-01

    The development of high-throughput biomolecular technologies has resulted in the generation of vast omics data at an unprecedented rate. This is transforming biomedical research into a big data discipline, where the main challenges relate to the analysis and interpretation of data into new biological knowledge. The aim of this study was to develop a framework for biomedical big data analytics and apply it to analyzing transcriptomics time series data from early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. To this end, transcriptome profiling by microarray was performed on differentiating human pluripotent stem cells sampled on eleven consecutive days. The gene expression data was analyzed using the five-stage analysis framework proposed in this study: data preparation, exploratory data analysis, confirmatory analysis, biological knowledge discovery, and visualization of the results. Clustering analysis revealed several distinct expression profiles during differentiation. Genes with an early transient response were strongly related to embryonic and mesendoderm development, for example CER1 and NODAL. Pluripotency genes, such as NANOG and SOX2, exhibited substantial downregulation shortly after onset of differentiation. Rapid induction of genes related to metal ion response, cardiac tissue development, and muscle contraction was observed around days five and six. Several transcription factors were identified as potential regulators of these processes, e.g. POU1F1, TCF4 and TBP for muscle contraction genes. Pathway analysis revealed temporal activity of several signaling pathways, for example the inhibition of WNT signaling on day 2 and its reactivation on day 4. This study provides a comprehensive characterization of biological events and key regulators of the early differentiation of human pluripotent stem cells towards the mesoderm and cardiac lineages. The proposed analysis framework can be used to structure
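    The clustering step of such a framework can be illustrated with a toy sketch. The profiles, centroid values, and the nearest-centroid assignment below are invented for illustration; the study's actual pipeline is not specified in this record.

```python
# Toy sketch (assumed, not the study's pipeline): grouping gene expression
# time profiles by nearest centroid, the kind of clustering step used in
# exploratory analysis of time-series omics data.

def dist(a, b):
    """Squared Euclidean distance between two equal-length profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def assign(profiles, centroids):
    """Assign each gene's time profile to its nearest centroid."""
    clusters = {i: [] for i in range(len(centroids))}
    for name, prof in profiles.items():
        idx = min(range(len(centroids)), key=lambda i: dist(prof, centroids[i]))
        clusters[idx].append(name)
    return clusters

# Illustrative profiles over 4 time points: early-transient vs downregulated.
profiles = {
    "CER1":  [0.1, 1.0, 0.2, 0.1],   # early transient response
    "NODAL": [0.2, 0.9, 0.3, 0.1],
    "NANOG": [1.0, 0.5, 0.2, 0.1],   # pluripotency gene, downregulated
    "SOX2":  [0.9, 0.6, 0.3, 0.2],
}
centroids = [[0.0, 1.0, 0.2, 0.1], [1.0, 0.5, 0.2, 0.1]]
clusters = assign(profiles, centroids)
```

    With these invented centroids, the early-transient genes and the downregulated pluripotency genes fall into separate clusters.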

  10. Big Data for Global History: The Transformative Promise of Digital Humanities

    Directory of Open Access Journals (Sweden)

    Joris van Eijnatten

    2013-12-01

    Full Text Available This article discusses the promises and challenges of digital humanities methodologies for historical inquiry. In order to address the great outstanding question of whether big data will re-invigorate macro-history, a number of research projects are described that use cultural text mining to explore big data repositories of digitised newspapers. The advantages of quantitative analysis, visualisation and named entity recognition in both exploration and analysis are illustrated in the study of public debates on drugs, drug trafficking, and drug users in the early twentieth century (wahsp), the comparative study of discourses about heredity, genetics, and eugenics in Dutch and German newspapers, 1863-1940 (biland), and the study of trans-Atlantic discourses (Translantis). While many technological and practical obstacles remain, advantages over traditional hermeneutic methodology are found in heuristics, analytics, quantitative trans-disciplinarity, and reproducibility, offering a quantitative and trans-national perspective on the history of mentalities.

  11. Big data, big responsibilities

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2014-01-01

    Full Text Available Big data refers to the collection and aggregation of large quantities of data produced by and about people, things or the interactions between them. With the advent of cloud computing, specialised data centres with powerful computational hardware and software resources can be used for processing and analysing a humongous amount of aggregated data coming from a variety of different sources. The analysis of such data is all the more valuable to the extent that it allows for specific patterns to be found and new correlations to be made between different datasets, so as to eventually deduce or infer new information, as well as to potentially predict behaviours or assess the likelihood for a certain event to occur. This article will focus specifically on the legal and moral obligations of online operators collecting and processing large amounts of data, to investigate the potential implications of big data analysis on the privacy of individual users and on society as a whole.

  12. Unconjugated oestetrol in plasma in response to an intravenous load of dehydroepiandrosterone sulphate (DHAS) in uncomplicated and complicated human pregnancy

    International Nuclear Information System (INIS)

    Axelsson, Ove

    1978-01-01

    A non-chromatographic radioimmunoassay for the estimation of unconjugated oestetrol in plasma from pregnant women is described. The antiserum has a high specificity for oestetrol. The technical procedure is simple and rapid. Only small amounts of plasma (0.2-0.4 ml) are needed for the analysis. The method has been applied to the measurement of oestetrol in plasma from pregnant women before and after an intravenous injection of 50 mg DHAS. In women with uncomplicated pregnancies a rise of plasma oestetrol was found 60 min after the injection. From 120 to 360 min there was a plateau level; at 600 min a decrease from this level was observed. No changes in the oestetrol response were found with advancing gestational age from the 33rd to the 40th week of pregnancy. A great spread in the individual responses was recorded. Patients with pre-eclampsia and intrauterine growth retardation showed a tendency towards a lower increase, and patients with diabetes a tendency towards a higher increase, of plasma oestetrol after the DHAS administration. From the data obtained it is concluded that the increase of plasma oestetrol after an intravenous injection of DHAS is in most cases secondary to the increase of plasma oestradiol. The results suggest that measurement of unconjugated oestetrol in plasma after an intravenous load of DHAS is not a safe way to assess foetal wellbeing. In women with intrauterine growth retardation (IUGR), the simultaneous measurement of plasma oestradiol and oestetrol after an injection of DHAS indicates a possibility of distinguishing placental from foetal causes of this syndrome. (author)

  13. Big data challenges in decoding cortical activity in a human with quadriplegia to inform a brain computer interface.

    Science.gov (United States)

    Friedenberg, David A; Bouton, Chad E; Annetta, Nicholas V; Skomrock, Nicholas; Mingming Zhang; Schwemmer, Michael; Bockbrader, Marcia A; Mysiw, W Jerry; Rezai, Ali R; Bresler, Herbert S; Sharma, Gaurav

    2016-08-01

    Recent advances in Brain Computer Interfaces (BCIs) have created hope that one day paralyzed patients will be able to regain control of their paralyzed limbs. As part of an ongoing clinical study, we have implanted a 96-electrode Utah array in the motor cortex of a paralyzed human. The array generates almost 3 million data points from the brain every second. This presents several big data challenges towards developing algorithms that should not only process the data in real-time (for the BCI to be responsive) but are also robust to temporal variations and non-stationarities in the sensor data. We demonstrate an algorithmic approach to analyze such data and present a novel method to evaluate such algorithms. We present our methodology with examples of decoding human brain data in real-time to inform a BCI.
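    The data-rate figure can be checked with quick arithmetic. The 30 kHz sampling rate below is an assumption (typical for spike-band neural recording), not stated in the abstract:

```python
# Plausibility check (assumed sampling rate): a 96-electrode Utah array
# sampled at the common 30 kHz neural recording rate yields roughly the
# "almost 3 million data points per second" cited in the abstract.
electrodes = 96
sampling_rate_hz = 30_000   # assumption; typical spike-band rate
samples_per_second = electrodes * sampling_rate_hz
print(samples_per_second)  # prints 2880000
```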

  14. Big science

    CERN Multimedia

    Nadis, S

    2003-01-01

    " "Big science" is moving into astronomy, bringing large experimental teams, multi-year research projects, and big budgets. If this is the wave of the future, why are some astronomers bucking the trend?" (2 pages).

  15. Kinetics of intravenous radiographic contrast medium injections as used on CT: simulation with time delay differential equations in a basic human cardiovascular multicompartment model.

    Science.gov (United States)

    Violon, D

    2012-12-01

    To develop a multicompartment model, comprising only essential human body components, that predicts the contrast medium concentration vs time curve in a chosen compartment after an intravenous injection; also to show that the model can be used to adequately time contrast-enhanced CT series. A system of linked time delay differential equations, rather than ordinary differential equations, described the model and was solved with a Matlab program (Matlab v. 6.5; The Mathworks, Inc., Natick, MA). All injection and physiological parameters could be modified to cope with normal or pathological situations. In vivo time-concentration curves from the literature were recalculated to validate the model. The recalculated contrast medium time-concentration curves and parameters are given. The results of the statistical analysis of the study findings, expressed as the median prediction error and the median absolute prediction error for both the time delay and ordinary differential equation systems, are situated well below the generally accepted maximum 20% limit. The presented program correctly predicts the time-concentration curve of an intravenous contrast medium injection and, consequently, allows an individually tailored approach to CT examinations with optimised use of the injected contrast medium volume, as long as time delay rather than ordinary differential equations are used. The presented program offers good preliminary knowledge of the time-concentration curve after any intravenous injection, allowing adequate timing of a CT examination, as required by the short scan times of present-day scanners. The injected volume of contrast medium can be tailored to the individual patient with no more contrast medium than is strictly needed.
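    The time-delay formulation can be sketched in miniature. The following toy model is not the author's Matlab implementation and all parameters (delay, rate constants, infusion) are illustrative assumptions; it shows how a history buffer stands in for the delayed term when integrating a single compartment with a fixed recirculation delay by forward Euler:

```python
# Minimal sketch (not the paper's model): one-compartment kinetics with a
# fixed recirculation delay, dC/dt = I(t) - k_el*C(t) + k_rec*C(t - tau),
# with C(t) = 0 for t <= 0, integrated by forward Euler.

def simulate(t_end=600.0, dt=0.1, tau=20.0, k_el=0.02, k_rec=0.01,
             infusion_rate=2.0, infusion_end=30.0):
    """Return (times, concentrations); units are arbitrary/illustrative."""
    n = round(t_end / dt) + 1
    c = [0.0] * n
    lag = round(tau / dt)            # delay expressed in integration steps
    for i in range(n - 1):
        t = i * dt
        inflow = infusion_rate if t < infusion_end else 0.0
        delayed = c[i - lag] if i >= lag else 0.0   # history is zero pre-injection
        c[i + 1] = c[i] + dt * (inflow - k_el * c[i] + k_rec * delayed)
    times = [i * dt for i in range(n)]
    return times, c

times, conc = simulate()
peak = max(conc)
t_peak = times[conc.index(peak)]     # peak occurs when the infusion stops
```

    With these illustrative parameters the concentration rises during the infusion, peaks as the infusion ends, and then decays, with the delayed recirculation term slowing the late washout.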

  16. Pharmacokinetics of flomoxef in mucosal tissue of the middle ear and mastoid following intravenous administration in humans.

    Science.gov (United States)

    Saito, H; Kimura, T; Takeda, T; Kishimoto, S; Oguma, T; Shimamura, K

    1990-01-01

    The pharmacokinetics of flomoxef in serum and in the mucosal tissue of the middle ear and mastoid were studied in 9 patients undergoing tympanoplasties. All patients received 1 g of flomoxef intravenously. Flomoxef levels in serum and in mucosal tissue were determined by a bioassay method. The peak mean concentration of flomoxef in the mucosal tissue was 30.3 +/- 11.7 micrograms/ml at 10 min after administration. Pharmacokinetic analyses showed that the concentration of flomoxef in the mucosal tissue remained above 1.56 micrograms/ml (the MIC90 for the common pathogens of otitis media) for more than 2 h and decreased in parallel with the serum concentration, with a half-life of about 40 min.
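    Assuming pure first-order elimination (a simplifying assumption, not stated in the abstract), the reported figures can be cross-checked: the time a concentration stays above a threshold follows t = t½ · log₂(C_peak/MIC). This back-of-envelope sketch uses only values quoted above:

```python
import math

# Back-of-envelope check using the abstract's values, assuming pure
# first-order elimination from the peak.
t_half = 40.0   # min, reported tissue half-life
c_peak = 30.3   # micrograms/ml, peak mucosal concentration
mic90 = 1.56    # micrograms/ml, MIC90 for common otitis media pathogens

time_above_mic = t_half * math.log2(c_peak / mic90)
print(round(time_above_mic))  # prints 171 (minutes)
```

    About 171 minutes, consistent with the abstract's "more than 2 h".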

  17. Crystallization and preliminary crystallographic analysis of the fourth FAS1 domain of human BigH3

    International Nuclear Information System (INIS)

    Yoo, Ji-Ho; Kim, EungKweon; Kim, Jongsun; Cho, Hyun-Soo

    2007-01-01

    The crystallization and X-ray diffraction analysis of the fourth FAS1 domain of human BigH3 are reported. The protein BigH3 is a cell-adhesion molecule induced by transforming growth factor-β (TGF-β). It consists of four homologous repeat domains known as FAS1 domains; mutations in these domains have been linked to corneal dystrophy. The fourth FAS1 domain was expressed in Escherichia coli B834 (DE3) (a methionine auxotroph) and purified by DEAE anion-exchange and gel-filtration chromatography. The FAS1 domain was crystallized using the vapour-diffusion method. A SAD diffraction data set was collected to a resolution of 2.5 Å at 100 K. The crystal belonged to space group P6₁ or P6₅ and had two molecules per asymmetric unit, with unit-cell parameters a = b = 62.93, c = 143.27 Å, α = β = 90.0, γ = 120.0°

  18. Reduction of microhemorrhages in the spinal cord of symptomatic ALS mice after intravenous human bone marrow stem cell transplantation accompanies repair of the blood-spinal cord barrier

    Science.gov (United States)

    Eve, David J.; Steiner, George; Mahendrasah, Ajay; Sanberg, Paul R.; Kurien, Crupa; Thomson, Avery; Borlongan, Cesar V.; Garbuzova-Davis, Svitlana

    2018-01-01

    Blood-spinal cord barrier (BSCB) alterations, including capillary rupture, have been demonstrated in animal models of amyotrophic lateral sclerosis (ALS) and ALS patients. To date, treatment to restore BSCB in ALS is underexplored. Here, we evaluated whether intravenous transplantation of human bone marrow CD34+ (hBM34+) cells into symptomatic ALS mice leads to restoration of capillary integrity in the spinal cord as determined by detection of microhemorrhages. Three different doses of hBM34+ cells (5 × 10⁴, 5 × 10⁵ or 1 × 10⁶) or media were intravenously injected into symptomatic G93A SOD1 mice at 13 weeks of age. Microhemorrhages were determined in the cervical and lumbar spinal cords of mice at 4 weeks post-treatment, as revealed by Perls’ Prussian blue staining for ferric iron. Numerous microhemorrhages were observed in the gray and white matter of the spinal cords in media-treated mice, with a greater number of capillary ruptures within the ventral horn of both segments. In cell-treated mice, microhemorrhage numbers in the cervical and lumbar spinal cords were inversely related to administered cell doses. In particular, the pervasive microvascular ruptures determined in the spinal cords in late symptomatic ALS mice were significantly decreased by the highest cell dose, suggestive of BSCB repair by grafted hBM34+ cells. The study results provide translational outcomes supporting transplantation of hBM34+ cells at an optimal dose as a potential therapeutic strategy for BSCB repair in ALS patients. PMID:29535831

  19. Spiral CT and optimization of the modalities of the iodinated intravenous contrast material: Experimental studies in human pathology

    International Nuclear Information System (INIS)

    Bonaldi, V.

    1998-01-01

    Spiral (or helical) CT represents the most recent improvement in the field of computer-assisted tomography (CT scanning). The capabilities of this new imaging modality are much superior to those of conventional CT scanning; they result from the rapid acquisition and from the volumetric nature of the derived data set. The short data acquisition time has made revision of the protocols for intravenous administration of iodinated contrast material mandatory. By means of several studies, carried out on pathologic and healthy patients, we have attempted to improve knowledge of the factors influencing CT attenuation values after injection of contrast material, with the aim of improving contrast administration during spiral CT scanning. The anatomical regions studied so far are the liver, the pancreas, the kidney and the cervical spine. In addition, a paired-study methodology has been used. The volumetric data set derived from spiral CT scanning allows optimal post-processing, the most interesting options being cine-display and multiplanar reformatting; both modalities have been evaluated, for the pancreas and the musculo-skeletal system respectively. Conversely, this new modality, as with other imaging modalities, incurs additional costs derived from the relentless increase in the number of images to be dealt with and from the emergence of new tasks (in post-processing particularly). The place of spiral CT in diagnostic strategies among other modern imaging modalities should be assessed, especially with respect to Magnetic Resonance Imaging (MRI). (author)

  20. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  1. Reduction of microhemorrhages in the spinal cord of symptomatic ALS mice after intravenous human bone marrow stem cell transplantation accompanies repair of the blood-spinal cord barrier.

    Science.gov (United States)

    Eve, David J; Steiner, George; Mahendrasah, Ajay; Sanberg, Paul R; Kurien, Crupa; Thomson, Avery; Borlongan, Cesar V; Garbuzova-Davis, Svitlana

    2018-02-13

    Blood-spinal cord barrier (BSCB) alterations, including capillary rupture, have been demonstrated in animal models of amyotrophic lateral sclerosis (ALS) and ALS patients. To date, treatment to restore BSCB in ALS is underexplored. Here, we evaluated whether intravenous transplantation of human bone marrow CD34+ (hBM34+) cells into symptomatic ALS mice leads to restoration of capillary integrity in the spinal cord as determined by detection of microhemorrhages. Three different doses of hBM34+ cells (5 × 10⁴, 5 × 10⁵ or 1 × 10⁶) or media were intravenously injected into symptomatic G93A SOD1 mice at 13 weeks of age. Microhemorrhages were determined in the cervical and lumbar spinal cords of mice at 4 weeks post-treatment, as revealed by Perls' Prussian blue staining for ferric iron. Numerous microhemorrhages were observed in the gray and white matter of the spinal cords in media-treated mice, with a greater number of capillary ruptures within the ventral horn of both segments. In cell-treated mice, microhemorrhage numbers in the cervical and lumbar spinal cords were inversely related to administered cell doses. In particular, the pervasive microvascular ruptures determined in the spinal cords in late symptomatic ALS mice were significantly decreased by the highest cell dose, suggestive of BSCB repair by grafted hBM34+ cells. The study results provide translational outcomes supporting transplantation of hBM34+ cells at an optimal dose as a potential therapeutic strategy for BSCB repair in ALS patients.

  2. The big challenges in modeling human and environmental well-being.

    Science.gov (United States)

    Tuljapurkar, Shripad

    2016-01-01

    This article is a selective review of quantitative research, historical and prospective, that is needed to inform sustainable development policy. I start with a simple framework to highlight how demography and productivity shape human well-being. I use that to discuss three sets of issues and corresponding challenges to modeling: first, population prehistory and early human development and their implications for the future; second, the multiple distinct dimensions of human and environmental well-being and the meaning of sustainability; and, third, inequality as a phenomenon triggered by development and models to examine changing inequality and its consequences. I conclude with a few words about other important factors: political, institutional, and cultural.

  3. Urbanising Big

    DEFF Research Database (Denmark)

    Ljungwall, Christer

    2013-01-01

    Development in China raises the question of how big a city can become, and at the same time be sustainable, writes Christer Ljungwall of the Swedish Agency for Growth Policy Analysis.

  4. Big Argumentation?

    Directory of Open Access Journals (Sweden)

    Daniel Faltesek

    2013-08-01

    Full Text Available Big Data is nothing new. Public concern regarding the mass diffusion of data has appeared repeatedly with computing innovations; in the formulation before Big Data it was most recently referred to as the "information explosion". In this essay, I argue that the appeal of Big Data is not a function of computational power, but of a synergistic relationship between aesthetic order and a politics evacuated of meaningful public deliberation. Understanding, and challenging, Big Data requires attention to the aesthetics of data visualization and the ways in which those aesthetics would seem to depoliticize information. The conclusion proposes an alternative argumentative aesthetic as the appropriate response to the depoliticization posed by the popular imaginary of Big Data.

  5. Big data

    DEFF Research Database (Denmark)

    Madsen, Anders Koed; Flyverbom, Mikkel; Hilbert, Martin

    2016-01-01

    The claim that big data can revolutionize strategy and governance in the context of international relations is increasingly hard to ignore. Scholars of international political sociology have mainly discussed this development through the themes of security and surveillance. The aim of this paper is to outline a research agenda that can be used to raise a broader set of sociological and practice-oriented questions about the increasing datafication of international relations and politics. First, it proposes a way of conceptualizing big data that is broad enough to open fruitful investigations into the emerging use of big data in these contexts. This conceptualization includes the identification of three moments contained in any big data practice. Second, it suggests a research agenda built around a set of subthemes that each deserve dedicated scrutiny when studying the interplay between big data...

  6. Development of a translational model to screen medications for cocaine use disorder II: Choice between intravenous cocaine and money in humans

    Science.gov (United States)

    Lile, Joshua A.; Stoops, William W.; Rush, Craig R.; Negus, S. Stevens; Glaser, Paul E. A.; Hatton, Kevin W.; Hays, Lon R.

    2016-01-01

    Background A medication for treating cocaine use disorder has yet to be approved. Laboratory-based evaluation of candidate medications in animals and humans is a valuable means to demonstrate safety, tolerability and initial efficacy of potential medications. However, animal-to-human translation has been hampered by a lack of coordination. Therefore, we designed homologous cocaine self-administration studies in rhesus monkeys (see companion article) and human subjects in an attempt to develop linked, functionally equivalent procedures for research on candidate medications for cocaine use disorder. Methods Eight (N=8) subjects with cocaine use disorder completed 12 experimental sessions in which they responded to receive money ($0.01, $1.00 and $3.00) or intravenous cocaine (0, 3, 10 and 30 mg/70 kg) under independent, concurrent progressive-ratio schedules. Prior to the completion of 9 choice trials, subjects sampled the cocaine dose available during that session and were informed of the monetary alternative value. Results The allocation of behavior varied systematically as a function of cocaine dose and money value. Moreover, a similar pattern of cocaine choice was demonstrated in rhesus monkeys and humans across different cocaine doses and magnitudes of the species-specific alternative reinforcers. The subjective and cardiovascular responses to IV cocaine were an orderly function of dose, although heart rate and blood pressure remained within safe limits. Conclusions These coordinated studies successfully established drug vs. non-drug choice procedures in humans and rhesus monkeys that yielded similar cocaine choice behavior across species. This translational research platform will be used in future research to enhance the efficiency of developing interventions to reduce cocaine use. PMID:27269368

  7. A randomized clinical trial of recombinant human hyaluronidase-facilitated subcutaneous versus intravenous rehydration in mild to moderately dehydrated children in the emergency department.

    Science.gov (United States)

    Spandorfer, Philip R; Mace, Sharon E; Okada, Pamela J; Simon, Harold K; Allen, Coburn H; Spiro, David M; Friend, Keith; Harb, George; Lebel, Francois

    2012-11-01

    Alternative treatment of dehydration is needed when intravenous (IV) or oral rehydration therapy fails. Subcutaneous (SC) hydration facilitated by recombinant human hyaluronidase offers an alternative treatment for dehydration. This clinical trial is the first to compare recombinant human hyaluronidase-facilitated SC (rHFSC) rehydration with standard IV rehydration for use in dehydrated children. This Phase IV noninferiority trial evaluated whether rHFSC fluid administration can be given safely and effectively, with volumes similar to those delivered intravenously, to children who have mild to moderate dehydration. The study included mild to moderately dehydrated children (Gorelick dehydration score) aged 1 month to 10 years. They were randomized to receive 20 mL/kg of isotonic fluids using rHFSC or IV therapy over 1 hour and then as needed until clinically rehydrated. The primary outcome was total volume of fluid administered (emergency department [ED] plus inpatient hospitalization). Secondary outcomes included mean volume infused in the ED alone, postinfusion dehydration scores and weight changes, line placement success and time, safety, and provider and parent/guardian questionnaire. 148 patients (mean age, 2.3 [1.91] years; white, 53.4%; black, 31.8%) were enrolled in the intention-to-treat population (73 rHFSC; 75 IV). The primary outcome, mean total volume infused, was 365.0 (324.6) mL in the rHFSC group over 3.1 hours versus 455.8 (597.4) mL in the IV group over 6.6 hours (P = 0.51). The secondary outcome of mean volume infused in the ED alone was 334.3 (226.40) mL in the rHFSC group versus 299.6 (252.33) mL in the IV group (P = 0.03). Dehydration scores and weight changes postinfusion were similar. Successful line placement occurred in all 73 rHFSC-treated patients and 59 of 75 (78.7%) IV-treated patients (P dehydrated children, rHFSC was inferior to IV hydration for the primary outcome measure. However, rHFSC was noninferior in the ED phase of hydration

  8. GH receptor signaling in skeletal muscle and adipose tissue in human subjects following exposure to an intravenous GH bolus

    DEFF Research Database (Denmark)

    Jørgensen, Jens O L; Jessen, Niels; Pedersen, Steen Bønløkke

    2006-01-01

    Growth hormone (GH) regulates muscle and fat metabolism, which impacts on body composition and insulin sensitivity, but the underlying GH signaling pathways have not been studied in vivo in humans. We investigated GH signaling in biopsies from muscle and abdominal fat obtained 30 (n = 3) or 60 (n...... was measured by in vitro phosphorylation of PI. STAT5 DNA binding activity was assessed with EMSA, and the expression of IGF-I and SOCS mRNA was measured by real-time RT-PCR. GH induced a 52% increase in circulating FFA levels with peak values after 155 min (P = 0.03). Tyrosine-phosphorylated STAT5...... tended to increase after GH in muscle and fat, respectively. We conclude that 1) STAT5 is acutely activated in human muscle and fat after a GH bolus, but additional downstream GH signaling was significant only in fat; 2) the direct GH effects in muscle need further characterization; and 3) this human...

  9. Zooniverse: Combining Human and Machine Classifiers for the Big Survey Era

    Science.gov (United States)

    Fortson, Lucy; Wright, Darryl; Beck, Melanie; Lintott, Chris; Scarlata, Claudia; Dickinson, Hugh; Trouille, Laura; Willi, Marco; Laraia, Michael; Boyer, Amy; Veldhuis, Marten; Zooniverse

    2018-01-01

    Many analyses of astronomical data sets, ranging from morphological classification of galaxies to identification of supernova candidates, have relied on humans to classify data into distinct categories. Crowdsourced galaxy classifications via the Galaxy Zoo project provided a solution that scaled visual classification for extant surveys by harnessing the combined power of thousands of volunteers. However, the much larger data sets anticipated from upcoming surveys will require a different approach. Automated classifiers using supervised machine learning have improved considerably over the past decade but their increasing sophistication comes at the expense of needing ever more training data. Crowdsourced classification by human volunteers is a critical technique for obtaining these training data. But several improvements can be made on this zeroth order solution. Efficiency gains can be achieved by implementing a “cascade filtering” approach whereby the task structure is reduced to a set of binary questions that are more suited to simpler machines while demanding lower cognitive loads for humans. Intelligent subject retirement based on quantitative metrics of volunteer skill and subject label reliability also leads to dramatic improvements in efficiency. We note that human and machine classifiers may retire subjects differently leading to trade-offs in performance space. Drawing on work with several Zooniverse projects including Galaxy Zoo and Supernova Hunter, we will present recent findings from experiments that combine cohorts of human and machine classifiers. We show that the most efficient system results when appropriate subsets of the data are intelligently assigned to each group according to their particular capabilities. With sufficient online training, simple machines can quickly classify “easy” subjects, leaving more difficult (and discovery-oriented) tasks for volunteers. We also find humans achieve higher classification purity while samples
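    The cascade-filtering idea, where a cheap machine classifier handles confident cases and defers the rest to human volunteers, can be sketched as follows. This is a toy illustration; the thresholds, labels and function names are invented, not Zooniverse code:

```python
# Toy cascade filter: the machine labels subjects it is confident about
# and defers the ambiguous middle of the score range to humans.

def machine_classify(score):
    """Toy machine stage: return a label when confident, else None."""
    if score >= 0.9:
        return "smooth"
    if score <= 0.1:
        return "featured"
    return None  # defer to human volunteers

def cascade(subjects, human_label):
    """Split (id, score) pairs into machine-labelled and human-labelled sets."""
    machine_done, deferred = [], []
    for subject_id, score in subjects:
        label = machine_classify(score)
        if label is not None:
            machine_done.append((subject_id, label))
        else:
            deferred.append((subject_id, human_label(subject_id)))
    return machine_done, deferred

subjects = [("s1", 0.95), ("s2", 0.5), ("s3", 0.05)]
done, deferred = cascade(subjects, human_label=lambda s: "human-labelled")
```

    Here the machine retires the two "easy" subjects and only the ambiguous one reaches the volunteer pool, which is the efficiency gain the abstract describes.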

  10. Big cats in our backyards: persistence of large carnivores in a human dominated landscape in India.

    Directory of Open Access Journals (Sweden)

    Vidya Athreya

    Full Text Available Protected areas are extremely important for the long term viability of biodiversity in a densely populated country like India, where land is a scarce resource. However, protected areas cover only 5% of the land area in India, and in the case of large carnivores that range widely, human-use landscapes will function as important habitats required for gene flow to occur between protected areas. In this study, we used photographic capture-recapture analysis to assess the density of large carnivores in a human-dominated agricultural landscape with a density >300 people/km² in western Maharashtra, India. We found evidence of a wide suite of wild carnivores inhabiting a cropland landscape devoid of wilderness and wild herbivore prey. Furthermore, the large carnivores, leopard (Panthera pardus) and striped hyaena (Hyaena hyaena), occurred at relatively high densities of 4.8±1.2 (sd) adults/100 km² and 5.03±1.3 (sd) adults/100 km² respectively. This situation has never been reported before, where 10 large carnivores/100 km² are sharing space with dense human populations in a completely modified landscape. Human attacks by leopards were rare despite a potentially volatile situation, considering that the leopard has been involved in serious conflict, including human deaths, in adjoining areas. The results of our work push the frontiers of our understanding of the adaptability of both humans and wildlife to each other's presence. The results also highlight the urgent need to shift from a PA-centric to a landscape-level conservation approach, where issues are more complex and the potential for conflict is also very high. It also highlights the need for a serious rethink of conservation policy, law and practice, where the current management focus is restricted to wildlife inside Protected Areas.

  11. How Do Small Things Make a Big Difference? Activities to Teach about Human-Microbe Interactions.

    Science.gov (United States)

    Jasti, Chandana; Hug, Barbara; Waters, Jillian L; Whitaker, Rachel J

    2014-11-01

    Recent scientific studies are providing increasing evidence for how microbes living in and on us are essential to our good health. However, many students still think of microbes only as germs that harm us. The classroom activities presented here are designed to shift student thinking on this topic. In these guided inquiry activities, students investigate human-microbe interactions as they work together to interpret and analyze authentic data from published articles and develop scientific models. Through the activities, students learn and apply ecological concepts as they come to see the human body as a fascinatingly complex ecosystem.

  12. How Big Is Too Big?

    Science.gov (United States)

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  13. A big blank white canvas? Mapping and modeling human impact in Antarctica

    Science.gov (United States)

    Steve Carver; Tina Tin

    2015-01-01

    Antarctica is certainly what most people would consider being the world's last great wilderness; largely untouched and undeveloped by humans. Yet it is not inviolate - there are scientific bases, tourist operations, expeditions, airstrips and even roads. Although these impacts are by and large limited in extent, their very presence in an otherwise "blank...

  14. Digital Humanities: the Next Big Thing? Enkele notities bij een ontluikend debat

    NARCIS (Netherlands)

    Besser, S.; Vaessens, T.

    2013-01-01

    In the form of provisional notes, the authors offer suggestions for an intensification of the theoretical debate on the digital humanities and computational literary studies in particular. From the perspective of poststructuralist theory, they address some of the epistemological underpinnings of

  15. Distribution of 131I-labeled recombinant human erythropoietin in maternal and fetal organs following intravenous administration in pregnant rats

    International Nuclear Information System (INIS)

    Yilmaz, O.; Lambrecht, F.Y.; Durkan, K.; Gokmen, N.; Erbayraktar, S.

    2007-01-01

    The aim of the present study was to demonstrate the possible transplacental transmission of 131I-labeled recombinant human erythropoietin (131I-rh-EPO) in pregnant rats and its distribution through maternal and fetal organs. Six Wistar albino rats at 18 days of pregnancy were used. 131I-labeled recombinant human erythropoietin (specific activity = 2.4 μCi/IU) was injected into the tail vein of each rat. Thirty minutes after infusion of the labeled erythropoietin, the maternal stomach, kidney, lung, liver, brain and heart, as well as the fetuses, were removed; the same organs were then removed from each fetus. Maternal and fetal organs and the placentas were weighed, and their radioactivity was counted with a Cd(Te) detector. 131I-labeled recombinant human erythropoietin was found to be able to cross the rat placenta, and its distribution among fetal organs was similar to that among maternal organs. Moreover, as measurements were performed closer to the cornu uteri, uptake decreased in each fetus and its corresponding placenta. (author)
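The biodistribution arithmetic behind records like this one (tissue uptake expressed as a percentage of the injected dose per gram, from detector counts and organ weights) can be sketched as follows; the counts and weights below are invented for illustration and are not data from the study.

```python
def percent_id_per_gram(tissue_counts, tissue_weight_g, injected_counts):
    """Uptake as percent of injected dose per gram of tissue (%ID/g)."""
    return 100.0 * tissue_counts / (injected_counts * tissue_weight_g)

# Hypothetical decay-corrected detector counts and organ weights.
injected = 2_400_000          # counts equivalent of the injected dose
organs = {                    # organ: (counts, weight in g)
    "maternal liver": (180_000, 8.5),
    "placenta": (24_000, 0.6),
    "fetal liver": (3_600, 0.25),
}
uptake = {name: percent_id_per_gram(c, w, injected)
          for name, (c, w) in organs.items()}
```

Comparing `uptake` values across organs (and across fetuses ordered by position along the uterine horn) is how a distribution gradient like the one reported here would be quantified.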

  16. Pharmacokinetic-Pharmacodynamic Modelling of the Analgesic and Antihyperalgesic Effects of Morphine after Intravenous Infusion in Human Volunteers

    DEFF Research Database (Denmark)

    Ravn, Pernille; Foster, David J. R.; Kreilgaard, Mads

    2014-01-01

    Using a modelling approach, this study aimed to (i) examine whether the pharmacodynamics of the analgesic and antihyperalgesic effects of morphine differ; (ii) investigate the influence of demographic, pain sensitivity and genetic (OPRM1) variables on between-subject variability of morphine...... pharmacokinetics and pharmacodynamics in human experimental pain models. The study was a randomized, double-blind, 5-arm, cross-over, placebo-controlled study. The psychophysical cutaneous pain tests, electrical pain tolerance (EPTo) and secondary hyperalgesia areas (2HA) were studied in 28 healthy individuals (15...

  17. Intentional intravenous mercury injection

    African Journals Online (AJOL)

    In this case report, intravenous complications, treatment strategies and possible ... Mercury toxicity is commonly associated with vapour inhalation or oral ingestion, for which there exist definite treatment options. Intravenous mercury ... personality, anxiousness, irritability, insomnia, depression and drowsiness.[1] However ...

  18. The plasma and cerebrospinal fluid pharmacokinetics of erlotinib and its active metabolite (OSI-420) after intravenous administration of erlotinib in non-human primates.

    Science.gov (United States)

    Meany, Holly J; Fox, Elizabeth; McCully, Cynthia; Tucker, Chris; Balis, Frank M

    2008-08-01

    Erlotinib hydrochloride is a small molecule inhibitor of epidermal growth factor receptor (EGFR). EGFR is over-expressed in primary brain tumors and solid tumors that metastasize to the central nervous system. We evaluated the plasma and cerebrospinal fluid (CSF) pharmacokinetics of erlotinib and its active metabolite OSI-420 after an intravenous (IV) dose in a non-human primate model. Erlotinib was administered as a 1 h IV infusion to four adult rhesus monkeys. Serial blood and CSF samples were drawn over 48 h and erlotinib and OSI-420 were quantified with an HPLC/tandem mass spectrometric assay. Pharmacokinetic parameters were estimated using non-compartmental and compartmental methods. CSF penetration was calculated from the AUC(CSF):AUC(plasma). Erlotinib disappearance from plasma after a short IV infusion was biexponential with a mean terminal half-life of 5.2 h and a mean clearance of 128 ml/min per m(2). OSI-420 exposure (AUC) in plasma was 30% (range 12-59%) of erlotinib, and OSI-420 clearance was more than 5-fold higher than erlotinib. Erlotinib and OSI-420 were detectable in CSF. The CSF penetration (AUC(CSF):AUC(plasma)) of erlotinib and OSI-420 was limited, but both are measurable in CSF after an IV dose. The drug exposure (AUC) in the CSF is limited relative to total plasma concentrations but is substantial relative to the free drug exposure in plasma.
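The noncompartmental quantities reported in records like this one (terminal half-life from a log-linear fit, AUC by the trapezoidal rule, and CSF penetration as an AUC ratio) can be sketched as follows; the sampling times and concentrations are made-up illustrative numbers, not data from the study.

```python
import math

def trapezoidal_auc(times, conc):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(zip(times, conc),
                                             zip(times[1:], conc[1:])))

def terminal_half_life(times, conc):
    """Half-life from an unweighted log-linear regression on terminal points."""
    logs = [math.log(c) for c in conc]
    n = len(times)
    mean_t = sum(times) / n
    mean_l = sum(logs) / n
    slope = (sum((t - mean_t) * (l - mean_l) for t, l in zip(times, logs))
             / sum((t - mean_t) ** 2 for t in times))
    return math.log(2) / -slope

# Hypothetical sampling schedule (h) and concentrations (ng/ml).
t = [1, 2, 4, 8, 24, 48]
plasma = [1000.0, 620.0, 380.0, 220.0, 26.0, 1.1]
csf = [50.0, 40.0, 28.0, 17.0, 2.0, 0.1]

auc_plasma = trapezoidal_auc(t, plasma)
auc_csf = trapezoidal_auc(t, csf)
penetration = auc_csf / auc_plasma              # AUC(CSF):AUC(plasma)
t_half = terminal_half_life(t[3:], plasma[3:])  # fit the last three points
```

In practice the terminal phase is chosen by inspecting the log-concentration plot, and the AUC is extrapolated to infinity by adding the last concentration divided by the terminal rate constant; the sketch above omits both refinements.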

  19. Phase I dose escalation pharmacokinetic assessment of intravenous humanized anti-MUC1 antibody AS1402 in patients with advanced breast cancer.

    Science.gov (United States)

    Pegram, Mark D; Borges, Virginia F; Ibrahim, Nuhad; Fuloria, Jyotsna; Shapiro, Charles; Perez, Susan; Wang, Karen; Schaedli Stark, Franziska; Courtenay Luck, Nigel

    2009-01-01

    MUC1 is a cell-surface glycoprotein that establishes a molecular barrier at the epithelial surface and engages in morphogenetic signal transduction. Alterations in MUC1 glycosylation accompany the development of cancer and influence cellular growth, differentiation, transformation, adhesion, invasion, and immune surveillance. A 20-amino-acid tandem repeat that forms the core protein of MUC1 is overexpressed and aberrantly glycosylated in the majority of epithelial tumors. AS1402 (formerly R1550) is a humanized IgG1κ monoclonal antibody that binds to PDTR sequences within this tandem repeat that are not exposed in normal cells. AS1402 is a potent inducer of antibody-dependent cellular cytotoxicity (ADCC), specifically against MUC1-expressing tumor cells. The objective of this study was to determine the safety, tolerability, and pharmacokinetic (PK) characteristics of AS1402 monotherapy in patients with locally advanced or metastatic MUC1-positive breast cancer that had progressed after anthracycline- and taxane-based therapy. Patients received AS1402 over a 1- to 3-hour intravenous (i.v.) infusion at doses between 1 and 16 mg/kg, with repeated dosing every 1 to 3 weeks (based on patient-individualized PK assessment) until disease progression. Serum AS1402 levels were measured at multiple times after i.v. administration. Human anti-human antibody (HAHA) responses were measured to determine the immunogenicity of AS1402. Noncompartmental pharmacokinetic parameters were determined and were used to assess dose dependency across the dose range studied. Twenty-six patients were treated. AS1402 was generally well tolerated. Two grade 3/4 drug-related adverse events were reported, both at the 3-mg/kg dose. Neither was observed in expanded or subsequent dosing cohorts. No anti-human antibodies were detected. Plasma concentrations of AS1402 appeared to be proportional to dose within the 1- to 16-mg/kg dose range assessed, with a mean terminal half-life of 115.4 +/- 37.1 hours.

  20. Big Surveys, Big Data Centres

    Science.gov (United States)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements for data management. If the term "big data" is defined as data collections that are too large to move then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  1. Big Opportunities and Big Concerns of Big Data in Education

    Science.gov (United States)

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  2. Big Dreams

    Science.gov (United States)

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  3. Big Science

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1986-05-15

    Astronomy, like particle physics, has become Big Science where the demands of front line research can outstrip the science budgets of whole nations. Thus came into being the European Southern Observatory (ESO), founded in 1962 to provide European scientists with a major modern observatory to study the southern sky under optimal conditions.

  4. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness.

    Science.gov (United States)

    Dove, Edward S; Özdemir, Vural

    2015-09-01

    The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, "extreme centrism", and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics-separate and together-have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.

  5. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness

    Science.gov (United States)

    Dove, Edward S.; Özdemir, Vural

    2015-01-01

    The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, “extreme centrism”, and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics—separate and together—have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit. PMID:26345196

  6. What Role for Law, Human Rights, and Bioethics in an Age of Big Data, Consortia Science, and Consortia Ethics? The Importance of Trustworthiness

    Directory of Open Access Journals (Sweden)

    Edward S. Dove

    2015-08-01

    Full Text Available The global bioeconomy is generating new paradigm-shifting practices of knowledge co-production, such as collective innovation; large-scale, data-driven global consortia science (Big Science); and consortia ethics (Big Ethics). These bioeconomic and sociotechnical practices can be forces for progressive social change, but they can also raise predicaments at the interface of law, human rights, and bioethics. In this article, we examine one such double-edged practice: the growing, multivariate exploitation of Big Data in the health sector, particularly by the private sector. Commercial exploitation of health data for knowledge-based products is a key aspect of the bioeconomy and is also a topic of concern among publics around the world. It is exacerbated in the current age of globally interconnected consortia science and consortia ethics, which is characterized by accumulating epistemic proximity, diminished academic independence, “extreme centrism”, and conflicted/competing interests among innovation actors. Extreme centrism is of particular importance as a new ideology emerging from consortia science and consortia ethics; this relates to invariably taking a middle-of-the-road populist stance, even in the event of human rights breaches, so as to sustain the populist support needed for consortia building and collective innovation. What role do law, human rights, and bioethics—separate and together—have to play in addressing these predicaments and opportunities in early 21st century science and society? One answer we propose is an intertwined ethico-legal normative construct, namely trustworthiness. By considering trustworthiness as a central pillar at the intersection of law, human rights, and bioethics, we enable others to trust us, which in turn allows different actors (both nonprofit and for-profit) to operate more justly in consortia science and ethics, as well as to access and responsibly use health data for public benefit.

  7. Dose-Related Modulation of Event-Related Potentials to Novel and Target Stimuli by Intravenous Δ9-THC in Humans

    Science.gov (United States)

    D'Souza, Deepak Cyril; Fridberg, Daniel J; Skosnik, Patrick D; Williams, Ashley; Roach, Brian; Singh, Nagendra; Carbuto, Michelle; Elander, Jacqueline; Schnakenberg, Ashley; Pittman, Brian; Sewell, R Andrew; Ranganathan, Mohini; Mathalon, Daniel

    2012-01-01

    Cannabinoids induce a host of perceptual alterations and cognitive deficits in humans. However, the neural correlates of these deficits have remained elusive. The current study examined the acute, dose-related effects of delta-9-tetrahydrocannabinol (Δ9-THC) on psychophysiological indices of information processing in humans. Healthy subjects (n=26) completed three test days during which they received intravenous Δ9-THC (placebo, 0.015 and 0.03 mg/kg) in a within-subject, double-blind, randomized, cross-over, and counterbalanced design. Psychophysiological data (electroencephalography) were collected before and after drug administration while subjects engaged in an event-related potential (ERP) task known to be a valid index of attention and cognition (a three-stimulus auditory 'oddball' P300 task). Δ9-THC dose-dependently reduced the amplitude of both the target P300b and the novelty P300a. Δ9-THC did not have any effect on the latency of either the P300a or P300b, or on early sensory-evoked ERP components preceding the P300 (the N100). Concomitantly, Δ9-THC induced psychotomimetic effects, perceptual alterations, and subjective 'high' in a dose-dependent manner. Δ9-THC-induced reductions in P3b amplitude correlated with Δ9-THC-induced perceptual alterations. Lastly, exploratory analyses examining cannabis use status showed that whereas recent cannabis users had blunted behavioral effects to Δ9-THC, there were no dose-related effects of Δ9-THC on P300a/b amplitude between cannabis-free and recent cannabis users. Overall, these data suggest that at doses that produce behavioral and subjective effects consistent with the known properties of cannabis, Δ9-THC reduced P300a and P300b amplitudes without altering the latency of these ERPs. Cannabinoid agonists may therefore disrupt cortical processes responsible for context updating and the automatic orientation of attention, while leaving processing speed and earlier sensory ERP components intact.

  8. Big Egos in Big Science

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Jeppesen, Jacob

    In this paper we investigate the micro-mechanisms governing structural evolution and performance of scientific collaboration. Scientific discovery tends not to be led by so-called lone "stars", or big egos, but instead by collaboration among groups of researchers, from a multitude of institutions...

  9. Big Data and Big Science

    OpenAIRE

    Di Meglio, Alberto

    2014-01-01

    Brief introduction to the challenges of big data in scientific research based on the work done by the HEP community at CERN and how the CERN openlab promotes collaboration among research institutes and industrial IT companies. Presented at the FutureGov 2014 conference in Singapore.

  10. Big data need big theory too.

    Science.gov (United States)

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  11. Nursing Needs Big Data and Big Data Needs Nursing.

    Science.gov (United States)

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Data science methods that are emerging ensure that these data be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives, and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science has the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  12. Big inquiry

    Energy Technology Data Exchange (ETDEWEB)

    Wynne, B. [Lancaster Univ. (UK)]

    1979-06-28

    The recently published report entitled 'The Big Public Inquiry' from the Council for Science and Society and the Outer Circle Policy Unit is considered, with especial reference to any future enquiry which may take place into the first commercial fast breeder reactor. Proposals embodied in the report include stronger rights for objectors and an attempt is made to tackle the problem that participation in a public inquiry is far too late to be objective. It is felt by the author that the CSS/OCPU report is a constructive contribution to the debate about big technology inquiries but that it fails to understand the deeper currents in the economic and political structure of technology which so influence the consequences of whatever formal procedures are evolved.

  13. Big Data

    OpenAIRE

    Bútora, Matúš

    2017-01-01

    The aim of the bachelor thesis is to describe the Big Data issue and the OLAP aggregation operations for decision support that are applied to it using the Apache Hadoop technology. The major part of the thesis is devoted to describing this technology. The last chapter deals with the way the aggregation operations are applied and the issues involved in their implementation. This is followed by an overall evaluation of the work and the possibilities for future use of the resulting system.

  14. BIG DATA

    OpenAIRE

    Abhishek Dubey

    2018-01-01

    The term 'Big Data' describes innovative methods and technologies to capture, store, distribute, manage and analyze petabyte-sized or larger sets of data with high velocity and diverse structures. Big data can be structured, unstructured or semi-structured, rendering conventional data management techniques inadequate. Data is generated from various distinct sources and can arrive in the system at various rates. In order to handle this...

  15. Keynote: Big Data, Big Opportunities

    OpenAIRE

    Borgman, Christine L.

    2014-01-01

    The enthusiasm for big data is obscuring the complexity and diversity of data in scholarship and the challenges for stewardship. Inside the black box of data are a plethora of research, technology, and policy issues. Data are not shiny objects that are easily exchanged. Rather, data are representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Data practices are local, varying from field to field, individual to indiv...

  16. Networking for big data

    CERN Document Server

    Yu, Shui; Misic, Jelena; Shen, Xuemin (Sherman)

    2015-01-01

    Networking for Big Data supplies an unprecedented look at cutting-edge research on the networking and communication aspects of Big Data. Starting with a comprehensive introduction to Big Data and its networking issues, it offers deep technical coverage of both theory and applications.The book is divided into four sections: introduction to Big Data, networking theory and design for Big Data, networking security for Big Data, and platforms and systems for Big Data applications. Focusing on key networking issues in Big Data, the book explains network design and implementation for Big Data. It exa

  17. Clinical Evaluation of Ciprofloxacin Intravenous Preparation ...

    African Journals Online (AJOL)

    The most common site of bacteria infection in humans is the urinary tract. For nosocomial infections it is the catheterized urinary tract. Compromised immune responses in hospitalized patients contribute to the difficulties encountered in treating their infections. In these patients, administration of intravenous antibiotic is ...

  18. Big Data

    DEFF Research Database (Denmark)

    Aaen, Jon; Nielsen, Jeppe Agger

    2016-01-01

    Big Data presents itself as one of the most hyped technological innovations of our time, proclaimed to hold the seeds of new, valuable operational insights for private companies and public organizations. While the optimistic pronouncements are many, research on Big Data in the public sector has so far been limited. This article examines how the public health sector can reuse and exploit an ever-growing amount of data while taking public values into account. The article builds on a case study of the use of large amounts of health data in the Danish General Practice Database (DAMD). The analysis shows that (re)use of data in new contexts is a multifaceted trade-off not only between economic rationales and quality considerations, but also control over sensitive personal data and the ethical implications for the citizen. In the DAMD case, data is used on the one hand "in the service of a good cause" to...

  19. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  20. Computed tomography intravenous cholangiography

    International Nuclear Information System (INIS)

    Nascimento, S.; Murray, W.; Wilson, P.

    1997-01-01

    Indications for direct visualization of the bile ducts include bile duct dilatation demonstrated by ultrasound or computed tomography (CT) scanning, where the cause of the bile duct dilatation is uncertain or where the anatomy of bile duct obstruction needs further clarification. Another indication is right upper quadrant pain, particularly in a post-cholecystectomy patient, where choledocholithiasis is suspected. A possible new indication is pre-operative evaluation prior to laparoscopic cholecystectomy. The bile ducts are usually studied by endoscopic retrograde cholangiopancreatography (ERCP), or, less commonly, trans-hepatic cholangiography. The old technique of intravenous cholangiography has fallen into disrepute because of inconsistent bile-duct opacification. The advent of spiral CT scanning has renewed interest in intravenous cholangiography. The CT technique is very sensitive to the contrast agent in the bile ducts, and angiographic and three-dimensional reconstructions of the biliary tree can readily be obtained using the CT intravenous cholangiogram technique (CT IVC). Seven patients have been studied using this CT IVC technique, between February 1995 and June 1996, and are the subject of the present report. Eight further studies have since been performed. The results suggest that CT IVC could replace ERCP as the primary means of direct cholangiography, where pancreatic duct visualization is not required. (authors)

  1. Big Data, Big Opportunities, and Big Challenges.

    Science.gov (United States)

    Frelinger, Jeffrey A

    2015-11-01

    High-throughput assays have begun to revolutionize modern biology and medicine. The advent of cheap next-generation sequencing (NGS) has made it possible to interrogate cells and human populations as never before. Although this has allowed us to investigate the genetics, gene expression, and impacts of the microbiome, there remain both practical and conceptual challenges. These include data handling, storage, and statistical analysis, as well as an inherent problem of the analysis of heterogeneous cell populations.

  2. Risk-benefit evaluation of fish from Chinese markets: Nutrients and contaminants in 24 fish species from five big cities and related assessment for human health

    Energy Technology Data Exchange (ETDEWEB)

    Du, Zhen-Yu, E-mail: zdu@nifes.no [National Institute of Nutrition and Seafood Research (NIFES), N-5817 Bergen (Norway); Zhang, Jian [National Institute of Nutrition and Seafood Research (NIFES), N-5817 Bergen (Norway); Institute of Nutrition and Food Safety, Chinese Center for Disease Control and Prevention, Beijing, 100050 (China); Department of Biomedicine, University of Bergen (Norway); Wang, Chunrong; Li, Lixiang; Man, Qingqing [Institute of Nutrition and Food Safety, Chinese Center for Disease Control and Prevention, Beijing, 100050 (China); Lundebye, Anne-Katrine; Froyland, Livar [National Institute of Nutrition and Seafood Research (NIFES), N-5817 Bergen (Norway)

    2012-02-01

    The risks and benefits of fish from markets in Chinese cities have not previously been fully evaluated. In the present study, 24 common fish species with more than 400 individual samples were collected from markets from five big Chinese cities in 2007. The main nutrients and contaminants were measured and the risk-benefit was evaluated based on recommended nutrient intakes and risk level criteria set by relevant authorities. The comprehensive effects of nutrients and contaminants in marine oily fish were also evaluated using the data of two related human dietary intervention trials performed in dyslipidemic Chinese men and women in 2008 and 2010, respectively. The results showed that concentrations of contaminants analyzed including DDT, PCB₇, arsenic and cadmium were much lower than their corresponding maximum limits with the exception of the mercury concentration in common carp. Concentrations of POPs and n-3 LCPUFA, mainly EPA and DHA, were positively associated with the lipid content of the fish. With a daily intake of 80-100 g marine oily fish, the persistent organic pollutants in fish would not counteract the beneficial effects of n-3 LCPUFA in reducing cardiovascular disease (CVD) risk markers. Marine oily fish provided more effective protection against CVD than lean fish, particularly for the dyslipidemic populations. The risk-benefit assessment based on the present daily aquatic product intake in Chinese urban residents (44.9 and 62.3 g for the average values for all cities and big cities, respectively) indicated that fish, particularly marine oily fish, can be regularly consumed to achieve optimal nutritional benefits from n-3 LCPUFA, without causing significant contaminant-related health risks. However, the potential health threat from contaminants in fish should still be emphasized for the populations consuming large quantities of fish, particularly wild fish.
- Highlights: ► We collected 24 fish species with more than

  3. Risk–benefit evaluation of fish from Chinese markets: Nutrients and contaminants in 24 fish species from five big cities and related assessment for human health

    International Nuclear Information System (INIS)

    Du, Zhen-Yu; Zhang, Jian; Wang, Chunrong; Li, Lixiang; Man, Qingqing; Lundebye, Anne-Katrine; Frøyland, Livar

    2012-01-01

    The risks and benefits of fish from markets in Chinese cities have not previously been fully evaluated. In the present study, 24 common fish species with more than 400 individual samples were collected from markets from five big Chinese cities in 2007. The main nutrients and contaminants were measured and the risk–benefit was evaluated based on recommended nutrient intakes and risk level criteria set by relevant authorities. The comprehensive effects of nutrients and contaminants in marine oily fish were also evaluated using the data of two related human dietary intervention trials performed in dyslipidemic Chinese men and women in 2008 and 2010, respectively. The results showed that concentrations of contaminants analyzed including DDT, PCB₇, arsenic and cadmium were much lower than their corresponding maximum limits with the exception of the mercury concentration in common carp. Concentrations of POPs and n-3 LCPUFA, mainly EPA and DHA, were positively associated with the lipid content of the fish. With a daily intake of 80–100 g marine oily fish, the persistent organic pollutants in fish would not counteract the beneficial effects of n-3 LCPUFA in reducing cardiovascular disease (CVD) risk markers. Marine oily fish provided more effective protection against CVD than lean fish, particularly for the dyslipidemic populations. The risk–benefit assessment based on the present daily aquatic product intake in Chinese urban residents (44.9 and 62.3 g for the average values for all cities and big cities, respectively) indicated that fish, particularly marine oily fish, can be regularly consumed to achieve optimal nutritional benefits from n-3 LCPUFA, without causing significant contaminant-related health risks. However, the potential health threat from contaminants in fish should still be emphasized for the populations consuming large quantities of fish, particularly wild fish. - Highlights: ► We collected 24 fish species with more than 400 individual samples
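
    As context for the kind of calculation behind such risk-benefit assessments, the following is a minimal sketch of the generic intake and hazard-quotient arithmetic commonly used in fish contaminant screening. The mercury concentration and body weight below are illustrative placeholders (not values from the study); only the 62.3 g/day big-city intake figure comes from the abstract, and the 0.1 µg/kg/day reference dose is the US EPA value for methylmercury.

```python
# Sketch of the generic exposure arithmetic used in fish risk-benefit
# assessments. All concentrations here are hypothetical placeholders.

def estimated_daily_intake(conc_ug_per_g, intake_g_per_day, body_weight_kg):
    """Estimated daily intake (EDI) in ug per kg body weight per day."""
    return conc_ug_per_g * intake_g_per_day / body_weight_kg

def hazard_quotient(edi, reference_dose):
    """HQ < 1 is conventionally read as no appreciable non-cancer risk."""
    return edi / reference_dose

# Hypothetical mercury concentration (0.05 ug/g wet weight), the paper's
# average big-city aquatic-product intake (62.3 g/day), a 60 kg adult,
# and the US EPA reference dose for methylmercury (0.1 ug/kg/day).
edi = estimated_daily_intake(0.05, 62.3, 60.0)
hq = hazard_quotient(edi, 0.1)
print(f"EDI = {edi:.4f} ug/kg/day, HQ = {hq:.3f}")
```

    With these placeholder numbers the hazard quotient stays well below 1, which is the pattern the study reports for most species apart from mercury in common carp.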

  4. Big data naturally rescaled

    International Nuclear Information System (INIS)

    Stoop, Ruedi; Kanders, Karlis; Lorimer, Tom; Held, Jenny; Albert, Carlo

    2016-01-01

    We propose that a handle could be put on big data by looking at the systems that actually generate the data, rather than at the data itself, realizing that there may be only a few generic processes involved, each one imprinting its very specific structures in the space of systems, the traces of which translate into feature space. From this, we propose a practical computational clustering approach, optimized for coping with such data, inspired by how the human cortex is known to approach the problem.

  5. [Effect of compound danshen dripping pill combined with intravenous transplantation of human umbilical cord blood mononuclear cells on local inflammatory response in the myocardium of rabbits with acute myocardial infarction].

    Science.gov (United States)

    Deng, Liu-xia; Yu, Guo-long; Al, Qi; Yuan, Chun-ju

    2013-11-01

    To investigate the effect of Compound Danshen Dripping Pill (CDDP) on the inflammatory response of the myocardium of acute myocardial infarction (AMI) rabbits, to observe the therapeutic effect of CDDP combined with intravenous transplantation of human umbilical cord blood mononuclear cells (HUCBMCs) on the inflammatory response, the pro-inflammatory cytokine tumor necrosis factor alpha (TNF-alpha), and heart function in the myocardium of AMI rabbits, and to explore the possible protective mechanisms of the combined therapy. The AMI model was successfully established by ligation of the left anterior descending coronary artery (LAD) in 40 healthy rabbits. They were then randomly divided into four groups, i.e., the control group, the CDDP group, the transplantation group, and the combined group, 10 in each group. Rabbits in the control group received intravenous injection of 0.5 mL normal saline via the ear vein within 24 h after AMI and then intragastric infusion of normal saline at 5 mL per day. Rabbits in the CDDP group received intravenous injection of 0.5 mL normal saline via the ear vein within 24 h after AMI and then intragastric infusion of a solution obtained by dissolving 270 mg CDDP in 5 mL normal saline per day. Rabbits in the transplantation group received intravenous injection of 0.5 mL normal saline containing 3 × 10⁷ green fluorescent protein (GFP)-labeled HUCBMCs via the ear vein within 24 h after AMI and then intragastric infusion of normal saline at 5 mL per day. Rabbits in the combined group received intravenous injection of 0.5 mL normal saline containing 3 × 10⁷ GFP-labeled HUCBMCs via the ear vein within 24 h after AMI and then intragastric infusion of a solution obtained by dissolving 270 mg CDDP in 5 mL normal saline per day. At weeks 1 and 4 after treatment, cardiac function indices such as left ventricular fractional shortening (LVFS) and left ventricular ejection fraction (LVEF) were assessed by echocardiography; the number of transplanted cells in the myocardium was found

  6. Biochemical characterization of individual human glycosylated pro-insulin-like growth factor (IGF)-II and big-IGF-II isoforms associated with cancer.

    Science.gov (United States)

    Greenall, Sameer A; Bentley, John D; Pearce, Lesley A; Scoble, Judith A; Sparrow, Lindsay G; Bartone, Nicola A; Xiao, Xiaowen; Baxter, Robert C; Cosgrove, Leah J; Adams, Timothy E

    2013-01-04

    Insulin-like growth factor II (IGF-II) is a major embryonic growth factor belonging to the insulin-like growth factor family, which includes insulin and IGF-I. Its expression in humans is tightly controlled by maternal imprinting, a genetic restraint that is lost in many cancers, resulting in up-regulation of both mature IGF-II mRNA and protein expression. Additionally, increased expression of several longer isoforms of IGF-II, termed "pro" and "big" IGF-II, has been observed. To date, it remains unclear what role these IGF-II isoforms play in initiating and sustaining tumorigenesis and whether they are bioavailable. We have expressed each individual IGF-II isoform in its proper O-glycosylated format and established that all bind to the IGF-I receptor and both insulin receptors A and B, resulting in their activation and subsequent stimulation of fibroblast proliferation. We also confirmed that all isoforms are able to be sequestered into binary complexes with several IGF-binding proteins (IGFBP-2, IGFBP-3, and IGFBP-5). In contrast to this, ternary complex formation with IGFBP-3 or IGFBP-5 and the auxiliary protein, acid labile subunit, was severely diminished. Furthermore, big-IGF-II isoforms bound much more weakly to purified ectodomain of the natural IGF-II scavenging receptor, IGF-IIR. IGF-II isoforms thus possess unique biological properties that may enable them to escape normal sequestration avenues and remain bioavailable in vivo to sustain oncogenic signaling.

  7. The International Big History Association

    Science.gov (United States)

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  8. Intravenous versus oral etoposide

    DEFF Research Database (Denmark)

    Ali, Abir Salwa; Grönberg, Malin; Langer, Seppo W.

    2018-01-01

    High-grade gastroenteropancreatic neuroendocrine neoplasms (GEP-NENs, G3) are aggressive cancers of the digestive system with poor prognosis and survival. Platinum-based chemotherapy (cisplatin/carboplatin + etoposide) is considered the first-line palliative treatment. Etoposide is frequently...... administered intravenously; however, oral etoposide may be used as an alternative. Concerns for oral etoposide include decreased bioavailability, inter- and intra-patient variability and patient compliance. We aimed to evaluate possible differences in progression-free survival (PFS) and overall survival (OS......) in patients treated with oral etoposide compared to etoposide given as infusion. Patients (n = 236) from the Nordic NEC study were divided into three groups receiving etoposide as a long infusion (24 h, n = 170), short infusion (≤ 5 h, n = 33) or oral etoposide (n = 33) according to hospital tradition. PFS...

  9. Intraarterial reteplase and intravenous abciximab for treatment of acute ischemic stroke. A preliminary feasibility and safety study in a non-human primate model

    International Nuclear Information System (INIS)

    Qureshi, Adnan I.; Suri, M. Fareed K.; Ali, Zulfiqar; Ringer, Andrew J.; Boulos, Alan S.; Guterman, Lee R.; Hopkins, L. Nelson; Nakada, Marian T.; Alberico, Ronald A.; Martin, Lisa B.E.

    2005-01-01

    We performed a preliminary feasibility and safety study using intravenous (IV) administration of a platelet glycoprotein IIb/IIIa inhibitor (abciximab) in conjunction with intraarterial (IA) administration of a thrombolytic agent (reteplase) in a primate model of intracranial thrombosis. We introduced thrombus through superselective catheterization of the intracranial segment of the internal carotid artery in 16 primates. The animals were randomly assigned to receive IA reteplase and IV abciximab (n =4), IA reteplase and IV placebo (n =4), IA placebo and IV abciximab (n =4) or IA and IV placebo (n =4). Recanalization was assessed by serial angiography during the 6-h period after initiation of treatment. Postmortem magnetic resonance (MR) imaging was performed to determine the presence of cerebral infarction or intracranial hemorrhage. Partial or complete recanalization at 6 h after initiation of treatment (decrease of two or more points in pre-treatment angiographic occlusion grade) was observed in two animals treated with IA reteplase and IV abciximab, three animals treated with IA reteplase alone and one animal treated with IV abciximab alone. No improvement in perfusion was observed in animals that received IV and IA placebo. Cerebral infarction was demonstrated on postmortem MR imaging in three animals that received IA and IV placebo and in one animal each from the groups that received IA reteplase and IV abciximab or IV abciximab alone. One animal that received IV abciximab alone had a small intracerebral hemorrhage on MR imaging. (orig.)

  10. The challenges of big data.

    Science.gov (United States)

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  11. Big data for bipolar disorder.

    Science.gov (United States)

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  12. Capillary electrophoresis of Big-Dye terminator sequencing reactions for human mtDNA Control Region haplotyping in the identification of human remains.

    Science.gov (United States)

    Montesino, Marta; Prieto, Lourdes

    2012-01-01

    Cycle sequencing with Big-Dye terminators provides the methodology to analyze mtDNA Control Region amplicons by means of capillary electrophoresis. DNA sequencing with ddNTPs, or terminators, was developed by (1). The progressive automation of the method, combining the use of fluorescent-dye terminators with cycle sequencing, has made it possible to increase the sensitivity and efficiency of the method and hence has allowed its introduction into the forensic field. PCR-generated mitochondrial DNA products are the templates for sequencing reactions. Different sets of primers can be used to generate amplicons of different sizes according to the quality and quantity of the DNA extract, providing sequence data for different ranges inside the Control Region.

  13. Big Data, Big Problems: A Healthcare Perspective.

    Science.gov (United States)

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare, such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system with improved healthcare outcomes. More recently, however, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare in general and, specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up as the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  14. Big Game Reporting Stations

    Data.gov (United States)

    Vermont Center for Geographic Information — Point locations of big game reporting stations. Big game reporting stations are places where hunters can legally report harvested deer, bear, or turkey. These are...

  15. Stalin's Big Fleet Program

    National Research Council Canada - National Science Library

    Hauner, Milan

    2002-01-01

    Although Dr. Milan Hauner's study 'Stalin's Big Fleet program' has focused primarily on the formation of Big Fleets during the Tsarist and Soviet periods of Russia's naval history, there are important lessons...

  16. Big Data Semantics

    NARCIS (Netherlands)

    Ceravolo, Paolo; Azzini, Antonia; Angelini, Marco; Catarci, Tiziana; Cudré-Mauroux, Philippe; Damiani, Ernesto; Mazak, Alexandra; van Keulen, Maurice; Jarrar, Mustafa; Santucci, Giuseppe; Sattler, Kai-Uwe; Scannapieco, Monica; Wimmer, Manuel; Wrembel, Robert; Zaraket, Fadi

    2018-01-01

    Big Data technology has discarded traditional data modeling approaches as no longer applicable to distributed data processing. It is, however, largely recognized that Big Data impose novel challenges in data and infrastructure management. Indeed, multiple components and procedures must be

  17. Privacy as human flourishing: could a shift towards virtue ethics strengthen privacy protection in the age of Big Data?

    NARCIS (Netherlands)

    van der Sloot, B.

    2014-01-01

    Privacy is commonly seen as an instrumental value in relation to negative freedom, human dignity and personal autonomy. Article 8 ECHR, protecting the right to privacy, was originally coined as a doctrine protecting the negative freedom of citizens in vertical relations, that is between citizen and

  18. Acute Toxicity of Intravenously Administered Titanium Dioxide Nanoparticles in Mice

    OpenAIRE

    Xu, Jiaying; Shi, Hongbo; Ruth, Magaye; Yu, Hongsheng; Lazar, Lissy; Zou, Baobo; Yang, Cui; Wu, Aiguo; Zhao, Jinshun

    2013-01-01

    BACKGROUND: With a wide range of applications, titanium dioxide (TiO₂) nanoparticles (NPs) are manufactured worldwide in large quantities. Recently, in the field of nanomedicine, intravenous injection of TiO₂ nanoparticulate carriers directly into the bloodstream has raised public concerns on their toxicity to humans. METHODS: In this study, mice were injected intravenously with a single dose of TiO₂ NPs at varying dose levels (0, 140, 300, 645, or 1387 mg/kg). Animal mortality, blood biochem...

  19. Ultrasonography versus intravenous urography

    International Nuclear Information System (INIS)

    Aslaksen, A.

    1991-01-01

    The present study was performed to compare the clinical value of urography and ultrasonography in a non-selected group of patients referred for urography to a university hospital. The conclusions and clinical implications of the study are as follows: Intravenous urography remains the cornerstone imaging examination in the evaluation of ureteral calculi. Ultrasonography is a valuable adjunct in cases of non-visualization of the kidneys, in distal obstruction and in known contrast media allergy. When women with recurrent urinary tract infection are referred for imaging of the urinary tract, ultrasonography should be used. Ultrasonography should replace urography for screening of non-acute hydronephrosis, as in female genital cancer and benign prostatic hyperplasia. There is good correlation between urography and ultrasonography in assessing the degree of hydronephrosis. However, more research on the relationship between hydronephrosis and obstruction is necessary. Ultrasonography should be used as the only imaging method of the upper urinary tract in patients with microscopic hematuria. In patients less than 50 years with macroscopic hematuria, ultrasonography should be used as the only imaging of the upper urinary tract, and an examination of the urinary bladder should be included. In patients over 50 years, urography supplemented with ultrasonography should be used, but more research is necessary on the subject of imaging method and age. 158 refs

  20. Characterizing Big Data Management

    OpenAIRE

    Rogério Rossi; Kechi Hirama

    2015-01-01

    Big data management is a reality for an increasing number of organizations in many areas and represents a set of challenges involving big data modeling, storage and retrieval, analysis and visualization. However, technological resources, people and processes are crucial to facilitate the management of big data in any kind of organization, allowing information and knowledge from a large volume of data to support decision-making. Big data management can be supported by these three dimensions: t...

  1. Social big data mining

    CERN Document Server

    Ishikawa, Hiroshi

    2015-01-01

    Social Media. Big Data and Social Data. Hypotheses in the Era of Big Data. Social Big Data Applications. Basic Concepts in Data Mining. Association Rule Mining. Clustering. Classification. Prediction. Web Structure Mining. Web Content Mining. Web Access Log Mining, Information Extraction and Deep Web Mining. Media Mining. Scalability and Outlier Detection.

  2. The mobilize center: an NIH big data to knowledge center to advance human movement research and improve mobility.

    Science.gov (United States)

    Ku, Joy P; Hicks, Jennifer L; Hastie, Trevor; Leskovec, Jure; Ré, Christopher; Delp, Scott L

    2015-11-01

    Regular physical activity helps prevent heart disease, stroke, diabetes, and other chronic diseases, yet a broad range of conditions impair mobility at great personal and societal cost. Vast amounts of data characterizing human movement are available from research labs, clinics, and millions of smartphones and wearable sensors, but integration and analysis of this large quantity of mobility data are extremely challenging. The authors have established the Mobilize Center (http://mobilize.stanford.edu) to harness these data to improve human mobility and help lay the foundation for using data science methods in biomedicine. The Center is organized around 4 data science research cores: biomechanical modeling, statistical learning, behavioral and social modeling, and integrative modeling. Important biomedical applications, such as osteoarthritis and weight management, will focus the development of new data science methods. By developing these new approaches, sharing data and validated software tools, and training thousands of researchers, the Mobilize Center will transform human movement research. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  3. Temporal trend in the levels of polycyclic aromatic hydrocarbons emitted in a big tire landfill fire in Spain: Risk assessment for human health.

    Science.gov (United States)

    Rovira, Joaquim; Domínguez-Morueco, Noelia; Nadal, Martí; Schuhmacher, Marta; Domingo, José L

    2018-02-23

    In May 2016, a big fire occurred at an illegal landfill located in Seseña (Toledo, Spain), where between 70,000 and 90,000 tons of tires had accumulated over the years. Just after the fire, and because of the increase in airborne PAHs, we found that cancer risks for the population living in the neighborhood of the landfill were 3-5 times higher than for the rest of the inhabitants of Seseña. Some months after our initial (June 2016) study, two sampling campaigns (December 2016 and May 2017) were performed to assess the temporal trends of the environmental levels of PAHs, as well as to verify that these chemicals did not pose any risk to the health of Seseña's inhabitants. In soils (December 2016), the total concentrations of the 16 PAHs, as well as the sum of the seven carcinogenic PAHs, showed values between 8.5 and 94.7 ng g⁻¹ and between 1.0 and 42.3 ng g⁻¹, respectively. In May 2017, a significant decrease (between 4 and 38 times) in the levels of PAHs in air was observed, with total concentrations ranging between 3.49 and 5.06 ng m⁻³. One year after the fire, the cancer risk at different zones of Seseña was similar, being lower than that found in June 2016, and negligible according to national and international agencies.
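
    For context, airborne-PAH cancer risks of this kind are commonly screened by converting the measured concentrations to a benzo[a]pyrene equivalent (BaPeq) via toxic equivalency factors (TEFs) and multiplying by an inhalation unit risk. The sketch below uses the widely cited Nisbet and LaGoy TEFs and the WHO unit risk for benzo[a]pyrene; the air concentrations themselves are hypothetical, not the study's data, and this is a generic illustration rather than the authors' exact procedure.

```python
# Generic BaP-equivalent cancer-risk screen for airborne PAHs.
# Concentrations are illustrative placeholders.

WHO_UNIT_RISK = 8.7e-5  # lifetime cancer risk per ng/m3 of BaP (WHO)

# Nisbet & LaGoy toxic equivalency factors for a few PAHs (dimensionless)
TEF = {"benzo[a]pyrene": 1.0, "benz[a]anthracene": 0.1, "chrysene": 0.01}

def bap_equivalent(conc_ng_m3):
    """Sum of concentration x TEF over the measured PAHs."""
    return sum(conc_ng_m3[p] * TEF[p] for p in conc_ng_m3)

def lifetime_cancer_risk(bapeq, unit_risk=WHO_UNIT_RISK):
    """Incremental lifetime cancer risk from continuous inhalation."""
    return bapeq * unit_risk

# Hypothetical post-fire air concentrations (ng/m3)
air = {"benzo[a]pyrene": 0.5, "benz[a]anthracene": 1.2, "chrysene": 2.0}
risk = lifetime_cancer_risk(bap_equivalent(air))
print(f"BaPeq = {bap_equivalent(air):.3f} ng/m3, risk = {risk:.2e}")
```

    Risks below about 1e-5 to 1e-6 are conventionally treated as negligible by national and international agencies, which is the threshold the abstract alludes to.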

  4. Granulomatous interstitial pneumonia in a miniature swine associated with repeated intravenous injections of Tc-99m human serum albumin: concise communication

    International Nuclear Information System (INIS)

    Whinnery, J.E.; Young, J.T.

    1980-01-01

    Albumin lung-scanning agents have a proven high degree of safety, with the only contraindication to their use being allergic hypersensitivity. We have used these agents to investigate the physiologic effects of high +Gz acceleration forces on pulmonary perfusion using the miniature swine. Multiple doses of human macroaggregated albumin and human-albumin microspheres were given to a miniature swine at various levels of centrifugal acceleration over a 6-wk period. The dosages given were the same per kilogram as those used for routine clinical human studies. The animal subsequently died from a severe granulomatous interstitial pneumonia. The granulomatous lesions suggest that the pathogenesis may have involved a cell-mediated delayed hypersensitivity. This interstitial pneumonia may represent the end point in a chronic hypersensitivity response to the human-albumin lung-scanning agents

  5. Tumor tropism of intravenously injected human-induced pluripotent stem cell-derived neural stem cells and their gene therapy application in a metastatic breast cancer model.

    Science.gov (United States)

    Yang, Jing; Lam, Dang Hoang; Goh, Sally Sallee; Lee, Esther Xingwei; Zhao, Ying; Tay, Felix Chang; Chen, Can; Du, Shouhui; Balasundaram, Ghayathri; Shahbazi, Mohammad; Tham, Chee Kian; Ng, Wai Hoe; Toh, Han Chong; Wang, Shu

    2012-05-01

    Human pluripotent stem cells can serve as an accessible and reliable source for the generation of functional human cells for medical therapies. In this study, we used a conventional lentiviral transduction method to derive human-induced pluripotent stem (iPS) cells from primary human fibroblasts and then generated neural stem cells (NSCs) from the iPS cells. Using a dual-color whole-body imaging technology, we demonstrated that after tail vein injection, these human NSCs displayed a robust migratory capacity outside the central nervous system in both immunodeficient and immunocompetent mice and homed in on established orthotopic 4T1 mouse mammary tumors. To investigate whether the iPS cell-derived NSCs can be used as a cellular delivery vehicle for cancer gene therapy, the cells were transduced with a baculoviral vector containing the herpes simplex virus thymidine kinase suicide gene and injected through tail vein into 4T1 tumor-bearing mice. The transduced NSCs were effective in inhibiting the growth of the orthotopic 4T1 breast tumor and the metastatic spread of the cancer cells in the presence of ganciclovir, leading to prolonged survival of the tumor-bearing mice. The use of iPS cell-derived NSCs for cancer gene therapy bypasses the sensitive ethical issue surrounding the use of cells derived from human fetal tissues or human embryonic stem cells. This approach may also help to overcome problems associated with allogeneic transplantation of other types of human NSCs. Copyright © 2012 AlphaMed Press.

  6. Big Data: Understanding Big Data

    OpenAIRE

    Taylor-Sakyi, Kevin

    2016-01-01

    Steve Jobs, one of the greatest visionaries of our time, was quoted in 1996 saying "a lot of times, people do not know what they want until you show it to them" [38], indicating he advocated products be developed based on human intuition rather than research. With the advancements of mobile devices, social networks and the Internet of Things, enormous amounts of complex data, both structured and unstructured, are being captured in the hope of allowing organizations to make better business decisions ...

  7. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  8. Microsoft big data solutions

    CERN Document Server

    Jorgensen, Adam; Welch, John; Clark, Dan; Price, Christopher; Mitchell, Brian

    2014-01-01

    Tap the power of Big Data with Microsoft technologies Big Data is here, and Microsoft's new Big Data platform is a valuable tool to help your company get the very most out of it. This timely book shows you how to use HDInsight along with HortonWorks Data Platform for Windows to store, manage, analyze, and share Big Data throughout the enterprise. Focusing primarily on Microsoft and HortonWorks technologies but also covering open source tools, Microsoft Big Data Solutions explains best practices, covers on-premises and cloud-based solutions, and features valuable case studies. Best of all,

  9. Population specific biomarkers of human aging: a big data study using South Korean, Canadian and Eastern European patient populations.

    Science.gov (United States)

    Mamoshina, Polina; Kochetov, Kirill; Putin, Evgeny; Cortese, Franco; Aliper, Alexander; Lee, Won-Suk; Ahn, Sung-Min; Uhn, Lee; Skjodt, Neil; Kovalchuk, Olga; Scheibye-Knudsen, Morten; Zhavoronkov, Alex

    2018-01-11

    Accurate and physiologically meaningful biomarkers for human aging are key to assessing anti-aging therapies. Given ethnic differences in health, diet, lifestyle, behaviour, environmental exposures and even average rate of biological aging, it stands to reason that aging clocks trained on datasets obtained from specific ethnic populations are more likely to account for these potential confounding factors, resulting in an enhanced capacity to predict chronological age and quantify biological age. Here we present a deep learning-based hematological aging clock modeled using the large combined dataset of Canadian, South Korean and Eastern European population blood samples, which shows increased predictive accuracy in individual populations compared to population-specific hematologic aging clocks. The performance of the models was also evaluated on publicly available samples of the American population from the National Health and Nutrition Examination Survey (NHANES). In addition, we explored the association between age predicted by both population-specific and combined hematological clocks and all-cause mortality. Overall, this study suggests (a) the population-specificity of aging patterns and (b) that hematologic clocks predict all-cause mortality. The proposed models were added to the freely available Aging.AI system, allowing improved assessment of human aging. © The Author(s) 2018. Published by Oxford University Press on behalf of The Gerontological Society of America.
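
    The study's clock is a deep neural network over many hematological features; the bare idea of an "aging clock", however, is just regression from blood markers to chronological age, with the residual read as age acceleration. The sketch below illustrates that idea with a single synthetic marker and ordinary least squares on invented data; it is a conceptual toy, not the authors' model.

```python
# Toy "aging clock": fit age ~ one synthetic blood marker by ordinary
# least squares, then read the residual as biological age acceleration.
# All data below are synthetic.
import random

random.seed(0)

# Synthetic cohort: a marker that drifts linearly with age plus noise
ages = [random.uniform(20, 80) for _ in range(200)]
marker = [0.8 * a + 10 + random.gauss(0, 5) for a in ages]

# Closed-form simple linear regression: predict age from the marker
n = len(ages)
mx = sum(marker) / n
my = sum(ages) / n
beta = (sum((x - mx) * (y - my) for x, y in zip(marker, ages))
        / sum((x - mx) ** 2 for x in marker))
alpha = my - beta * mx

def predicted_age(m):
    """Clock prediction for a marker value m."""
    return alpha + beta * m

# Predicted minus chronological age = "age acceleration"
resid = [predicted_age(x) - y for x, y in zip(marker, ages)]
mae = sum(abs(r) for r in resid) / n
print(f"slope = {beta:.3f}, MAE = {mae:.2f} years")
```

    A real hematologic clock replaces the single marker with dozens of blood parameters and the linear fit with a deep network, but the evaluation logic (predictive accuracy in years, residuals linked to mortality) is the same.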

  10. A novel double-tracer technique to characterize absorption, distribution, metabolism and excretion (ADME) of [14C]tofogliflozin after oral administration and concomitant intravenous microdose administration of [13C]tofogliflozin in humans.

    Science.gov (United States)

    Schwab, Dietmar; Portron, Agnes; Backholer, Zoe; Lausecker, Berthold; Kawashima, Kosuke

    2013-06-01

    Human mass balance studies and the assessment of absolute oral bioavailability (F) are usually conducted in separate studies. Intravenous microdose administration of an isotope tracer concomitant with an unlabeled oral dose is an emerging technique to assess F. We report a novel double-tracer approach implemented for tofogliflozin, combining oral administration of a radiolabel tracer with concomitant intravenous administration of a stable isotope tracer. Tofogliflozin is a potent and selective sodium/glucose cotransporter 2 inhibitor for the treatment of type 2 diabetes mellitus currently in clinical development. The objectives of the present study were to assess the systemic exposure of major circulating metabolites, excretion balance, F and contribution of renal clearance (CLR) to total clearance (CL) of tofogliflozin in healthy subjects within one study applying a novel double-tracer technique. Six healthy male subjects received 20 mg [(12)C/(14)C]tofogliflozin (3.73 MBq) orally and a concomitant microdose of 0.1 mg [(13)C]tofogliflozin intravenously. Pharmacokinetics of tofogliflozin were determined for the oral and intravenous routes; the pharmacokinetics of the metabolites M1 and M5 were determined for the oral route. Quantification of [(12)C]tofogliflozin in plasma and urine and [(13)C]tofogliflozin in plasma was performed by selective LC-MS/MS methods. For the pre-selected metabolites of tofogliflozin, M1 and M5, a validated liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was applied to plasma and urine samples. Total radioactivity was assessed in plasma, urine and feces. Pharmacokinetic analysis was conducted by non-compartmental methods. The pharmacokinetics of tofogliflozin in healthy subjects were characterized by an F of 97.5 ± 12.3 %, CL of 10.0 ± 1.3 l/h and a volume of distribution at steady state (V(ss)) of 50.6 ± 6.7 l. The main route of elimination of total drug-related material was excretion into urine (77.0 ± 4.1 % of the dose). The
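The headline numbers (F of about 97.5 %, CL of about 10 l/h) follow from standard non-compartmental relations: F is the dose-normalized ratio of oral to intravenous AUC, and CL is the intravenous dose divided by its AUC. A sketch with illustrative AUC values chosen only to make the arithmetic concrete, not taken from the study's measured concentrations:

```python
# Non-compartmental PK relations behind a double-tracer bioavailability
# study. The AUC values used below are illustrative placeholders, not
# measurements from the tofogliflozin study.

def absolute_bioavailability(auc_oral, dose_oral, auc_iv, dose_iv):
    """F = (AUC_oral / Dose_oral) / (AUC_iv / Dose_iv)."""
    return (auc_oral / dose_oral) / (auc_iv / dose_iv)

def clearance(dose_iv, auc_iv):
    """CL = Dose_iv / AUC_iv (an intravenous dose has F = 1 by definition)."""
    return dose_iv / auc_iv

# Hypothetical example: 20 mg oral dose, 0.1 mg IV microdose.
f = absolute_bioavailability(auc_oral=3900.0, dose_oral=20.0,
                             auc_iv=20.0, dose_iv=0.1)
print(f"F = {f:.1%}")  # → F = 97.5%
```

The double-tracer design makes this ratio especially clean: both AUCs come from the same dosing occasion, so within-subject variability in clearance cancels out.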

  11. Different pressor and bronchoconstrictor properties of human big-endothelin-1, 2 (1-38) and 3 in ketamine/xylazine-anaesthetized guinea-pigs.

    OpenAIRE

    Gratton, J P; Rae, G A; Claing, A; Télémaque, S; D'Orléans-Juste, P

    1995-01-01

    1. In the present study, the precursors of endothelin-1, endothelin-2 and endothelin-3 were tested for their pressor and bronchoconstrictor properties in the anaesthetized guinea-pig. In addition, the effects of big-endothelin-1 and endothelin-1 were assessed under urethane or ketamine/xylazine anaesthesia. 2. When compared to ketamine/xylazine, urethane markedly depressed the pressor and bronchoconstrictor properties of endothelin-1 and big-endothelin-1. 3. Under ketamine/xylazine anaesthesi...

  12. Methods of preparing and using intravenous nutrient compositions

    International Nuclear Information System (INIS)

    Beigler, M.A.; Koury, A.J.

    1983-01-01

    A method for preparing a stable, dry-packaged, sterile, nutrient composition which upon addition of sterile, pyrogen-free water is suitable for intravenous administration to a mammal, including a human, is described. The method comprises providing the nutrients in a specific dry form and state of physical purity acceptable for intravenous administration, sealing the nutrients in a particular type of container adapted to receive and dispense sterile fluids and subjecting the container and its sealed contents to a sterilizing, nondestructive dose of ionizing radiation. The method results in a packaged, sterile nutrient composition which may be dissolved by the addition of sterile pyrogen-free water. The resulting aqueous intravenous solution may be safely administered to a mammal in need of nutrient therapy. The packaged nutrient compositions of the invention exhibit greatly extended storage life and provide an economical method of providing intravenous solutions which are safe and efficacious for use. (author)

  13. Biochemical Characterization of Individual Human Glycosylated pro-Insulin-like Growth Factor (IGF)-II and big-IGF-II Isoforms Associated with Cancer

    Science.gov (United States)

    Greenall, Sameer A.; Bentley, John D.; Pearce, Lesley A.; Scoble, Judith A.; Sparrow, Lindsay G.; Bartone, Nicola A.; Xiao, Xiaowen; Baxter, Robert C.; Cosgrove, Leah J.; Adams, Timothy E.

    2013-01-01

    Insulin-like growth factor II (IGF-II) is a major embryonic growth factor belonging to the insulin-like growth factor family, which includes insulin and IGF-I. Its expression in humans is tightly controlled by maternal imprinting, a genetic restraint that is lost in many cancers, resulting in up-regulation of both mature IGF-II mRNA and protein expression. Additionally, increased expression of several longer isoforms of IGF-II, termed “pro” and “big” IGF-II, has been observed. To date, it remains unclear what role these IGF-II isoforms have in initiating and sustaining tumorigenesis and whether they are bioavailable. We have expressed each individual IGF-II isoform in their proper O-glycosylated format and established that all bind to the IGF-I receptor and both insulin receptors A and B, resulting in their activation and subsequent stimulation of fibroblast proliferation. We also confirmed that all isoforms are able to be sequestered into binary complexes with several IGF-binding proteins (IGFBP-2, IGFBP-3, and IGFBP-5). In contrast to this, ternary complex formation with IGFBP-3 or IGFBP-5 and the auxiliary protein, acid labile subunit, was severely diminished. Furthermore, big-IGF-II isoforms bound much more weakly to purified ectodomain of the natural IGF-II scavenging receptor, IGF-IIR. IGF-II isoforms thus possess unique biological properties that may enable them to escape normal sequestration avenues and remain bioavailable in vivo to sustain oncogenic signaling. PMID:23166326

  14. The big bang

    International Nuclear Information System (INIS)

    Chown, Marcus.

    1987-01-01

    The paper concerns the 'Big Bang' theory of the creation of the Universe 15 thousand million years ago, and traces events which physicists predict occurred soon after the creation. The unified theory of the moment of creation, evidence of an expanding Universe, the X-boson (the particle produced very soon after the big bang, which vanished from the Universe one-hundredth of a second after the big bang), and the fate of the Universe are all discussed. (U.K.)

  15. Summary big data

    CERN Document Server

    2014-01-01

    This work offers a summary of the book "Big Data: A Revolution That Will Transform How We Live, Work, and Think" by Viktor Mayer-Schönberger and Kenneth Cukier. The summary explains that big data is where we use huge quantities of data to make better predictions based on the fact that we identify patterns in the data, rather than trying to understand the underlying causes in more detail. This summary highlights that big data will be a source of new economic value and innovation in the future. Moreover, it shows that it will

  16. Data: Big and Small.

    Science.gov (United States)

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  17. Digital humanitarians how big data is changing the face of humanitarian response

    CERN Document Server

    Meier, Patrick

    2015-01-01

    The Rise of Digital Humanitarians: Mapping Haiti Live; Supporting Search and Rescue Efforts; Preparing for the Long Haul; Launching an SMS Life Line; Sending in the Choppers; OpenStreetMap to the Rescue; Post-Disaster Phase; The Human Story; Doing Battle with Big Data; Rise of Digital Humanitarians; This Book and You. The Rise of Big (Crisis) Data: Big (Size) Data; Finding Needles in Big (Size) Data; Policy, Not Simply Technology; Big (False) Data; Unpacking Big (False) Data; Calling 991 and 999; Big (

  18. Orthostatic stability with intravenous levodopa

    Directory of Open Access Journals (Sweden)

    Shan H. Siddiqi

    2015-08-01

    Full Text Available Intravenous levodopa has been used in a multitude of research studies due to its more predictable pharmacokinetics compared to the oral form, which is used frequently as a treatment for Parkinson’s disease (PD. Levodopa is the precursor for dopamine, and intravenous dopamine would strongly affect vascular tone, but peripheral decarboxylase inhibitors are intended to block such effects. Pulse and blood pressure, with orthostatic changes, were recorded before and after intravenous levodopa or placebo—after oral carbidopa—in 13 adults with a chronic tic disorder and 16 tic-free adult control subjects. Levodopa caused no statistically or clinically significant changes in blood pressure or pulse. These data add to previous data that support the safety of i.v. levodopa when given with adequate peripheral inhibition of DOPA decarboxylase.

  19. What do Big Data do in Global Governance?

    DEFF Research Database (Denmark)

    Krause Hansen, Hans; Porter, Tony

    2017-01-01

    Two paradoxes associated with big data are relevant to global governance. First, while promising to increase the capacities of humans in governance, big data also involve an increasingly independent role for algorithms, technical artifacts, the Internet of things, and other objects, which can reduce the control of human actors. Second, big data involve new boundary transgressions as data are brought together from multiple sources while also creating new boundary conflicts as powerful actors seek to gain advantage by controlling big data and excluding competitors. These changes are not just about new data sources for global decision-makers, but instead signal more profound changes in the character of global governance.

  20. Noninvasive assessment of coronary stenoses by myocardial imaging during pharmacologic coronary vasodilation. VI. Detection of coronary artery disease in human beings with intravenous N-13 ammonia and positron computed tomography

    International Nuclear Information System (INIS)

    Schelbert, H.R.; Wisenberg, G.; Phelps, M.E.; Gould, K.L.; Henze, E.; Hoffman, E.J.; Gomes, A.; Kuhl, D.E.

    1982-01-01

    The possibility of detecting mild coronary stenoses with positron computed tomography and nitrogen (N-13) ammonia administered during pharmacologic coronary vasodilation was previously demonstrated in chronically instrumented dogs. The feasibility of using this technique in human beings and its sensitivity in determining the degree and extent of coronary artery disease were examined in 13 young normal healthy volunteers and 32 patients with angiographically documented coronary artery disease. N-13 ammonia was administered intravenously and its distribution in the left ventricular myocardium recorded at rest and during dipyridamole-induced coronary hyperemia. In the 13 volunteers, N-13 activity was homogeneous at rest and during hyperemia, whereas 31 of the 32 patients had regional defects on the hyperemic images not present during rest. All six patients with double, all 10 with triple and 15 of 16 patients with single vessel disease (97 percent) were correctly identified with the technique. Two vessel involvement was correctly identified in five of the six patients with double vessel disease and three vessel disease in six of 10 patients. Of all 58 coronary stenoses, 52 (90 percent) were correctly identified. In a subgroup of 11 patients, the technique was compared with exercise thallium-201 planar images, which were abnormal in 10 (91 percent) whereas N-13 images were abnormal in all 11. Of the 19 stenosed coronary arteries in this subgroup, 11 (58 percent) were correctly identified with thallium-201 and 17 (89 percent) with tomography (p less than 0.01). It is concluded that cross-sectional imaging of the myocardial distribution of N-13 ammonia administered during pharmacologic coronary vasodilation is a highly sensitive and accurate means for noninvasive detection of coronary stenoses in human beings and for estimating the extent of coronary artery disease

  1. Bliver big data til big business?

    DEFF Research Database (Denmark)

    Ritter, Thomas

    2015-01-01

    Denmark has a digital infrastructure, a registration culture, and IT-competent employees and customers, which make a leading position possible, but only if companies ready themselves for the next big data wave.

  2. Dual of big bang and big crunch

    International Nuclear Information System (INIS)

    Bak, Dongsu

    2007-01-01

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory

  3. Comparison between Intravenously Delivered Human Fetal Bone Marrow Mesenchymal Stromal Cells and Mononuclear Cells in the Treatment of Rat Cerebral Infarct.

    Science.gov (United States)

    Huang, Ai-Hua; Zhang, Ping-Ping; Zhang, Bin; Ma, Bu-Qing; Guan, Yun-Qian; Zhou, Yi-Dan

    2016-10-10

    Objective To compare the efficacy of human mesenchymal stromal cells (hMSC) with human mononuclear cells (hMNC) in treating rat cerebral infarct. Methods SD rat models of cerebral infarct were established by distal middle cerebral artery occlusion (dMCAO). Rats were divided into four groups: sham, ischemia vehicle, MSC, and MNC transplantation groups. For the transplantation groups, 1×10⁶ hMSCs or hMNCs were intravascularly transplanted into the tail vein 1 hour after the ischemia onset. The ischemia vehicle group received dMCAO surgery and intravascular saline injection. At 1, 3, 5, and 7 days after the ischemia onset, behavioral tests were performed. At 48 h after the ischemia onset, the abundance of Iba-1, the marker of activated microglia, was evaluated in the peri-ischemia striatum area; meanwhile, neurotrophic factors such as glial cell line-derived neurotrophic factor (GDNF) and brain-derived neurotrophic factor (BDNF) in the ipsilateral peri-ischemia striatum area were also measured. Results The relative infarct volumes in the ischemia vehicle, hMSC, and hMNC transplantation groups were (37.85±4.40)%, (33.41±3.82)%, and (30.23±3.63)%, respectively. The infarct volumes of the MSC group (t=2.100, P=0.034) and MNC group (t=2.109, P=0.0009) were significantly smaller than that of the ischemia vehicle group, and that of the MNC group was significantly smaller than that of the MSC group (t=1.743, P=0.043). One day after transplantation, the score of the ischemia vehicle group in the limb placing test was (4.32±0.71)%, which was significantly lower than that of the sham group, (9.73±0.36)% (t=2.178, P=8.61×10⁻¹¹). The scores of the MSC and MNC groups, which were (5.09±0.62)% (t=2.1009, P=0.024) and (5.90±0.68)% (t=2.1008, P=0.0001), respectively, were significantly higher than that of the ischemia vehicle group; also, the score of the MNC group was significantly higher than that of the MSC group (t=2.1009, P=0.0165). The contralateral forelimb scores of the MSC and MNC groups in the beam walking test were (5.56±0.86)% (t=2
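The group comparisons reported in this abstract are two-sample t tests on means. A minimal sketch of the statistic, with synthetic samples drawn to match the reported mean ± SD values (the group sizes and raw data points are invented for illustration):

```python
# Welch's two-sample t statistic, the kind of test behind the group
# comparisons above. Samples are synthetic, drawn to match the reported
# mean ± SD values; group sizes are invented for illustration.
import math
import random

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

random.seed(1)
vehicle = [random.gauss(37.85, 4.40) for _ in range(10)]  # % infarct volume
mnc = [random.gauss(30.23, 3.63) for _ in range(10)]
print(f"t = {welch_t(vehicle, mnc):.2f}")
```

Welch's variant is used here because it does not assume the two groups share a common variance, which the differing SDs above suggest they may not.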

  4. intravenous infusion of chlorimipramine (anafranil)

    African Journals Online (AJOL)

    the already extensive outpatient facilities at Johannesburg Hospital as well as the Tara Neuro-Psychiatric Hospital for long-term therapy. Technique of Chlorimipramine Infusion. Initially, 1 ampoule of chlorimipramine 25 mg in 250 ml of 5% dextrose saline was administered intravenously at the rate of 60 drops per minute.

  5. Big Data and Neuroimaging.

    Science.gov (United States)

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  6. Big data for health.

    Science.gov (United States)

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  7. Big Data in Medicine is Driving Big Changes

    Science.gov (United States)

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  8. From Big Data to Big Business

    DEFF Research Database (Denmark)

    Lund Pedersen, Carsten

    2017-01-01

    Idea in Brief: Problem: There is an enormous profit potential for manufacturing firms in big data, but one of the key barriers to obtaining data-driven growth is the lack of knowledge about which capabilities are needed to extract value and profit from data. Solution: We (BDBB research group at C...

  9. Big data, big knowledge: big data for personalized healthcare.

    Science.gov (United States)

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  10. Phase 1A safety assessment of intravenous amitriptyline

    NARCIS (Netherlands)

    Fridrich, Peter; Colvin, Hans Peter; Zizza, Anthony; Wasan, Ajay D.; Lukanich, Jean; Lirk, Philipp; Saria, Alois; Zernig, Gerald; Hamp, Thomas; Gerner, Peter

    2007-01-01

    The antidepressant amitriptyline is used as an adjuvant in the treatment of chronic pain. Among its many actions, amitriptyline blocks Na+ channels and nerves in several animal and human models. As perioperative intravenous lidocaine has been suggested to decrease postoperative pain, amitriptyline,

  11. Big Data in industry

    Science.gov (United States)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large amount, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles. Moreover, Big Data is recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  12. Big data a primer

    CERN Document Server

    Bhuyan, Prachet; Chenthati, Deepak

    2015-01-01

    This book is a collection of chapters written by experts on various aspects of big data. The book aims to explain what big data is and how it is stored and used. The book starts from  the fundamentals and builds up from there. It is intended to serve as a review of the state-of-the-practice in the field of big data handling. The traditional framework of relational databases can no longer provide appropriate solutions for handling big data and making it available and useful to users scattered around the globe. The study of big data covers a wide range of issues including management of heterogeneous data, big data frameworks, change management, finding patterns in data usage and evolution, data as a service, service-generated data, service management, privacy and security. All of these aspects are touched upon in this book. It also discusses big data applications in different domains. The book will prove useful to students, researchers, and practicing database and networking engineers.

  13. Recht voor big data, big data voor recht

    NARCIS (Netherlands)

    Lafarre, Anne

    Big data is a phenomenon in our society that can no longer be ignored. It is past the hype cycle, and the first implementations of big data techniques are being carried out. But what exactly is big data? What do the five V's, which are often mentioned in relation to big data, entail? As an introduction to

  14. Intravenous polyclonal human immunoglobulins in multiple sclerosis

    DEFF Research Database (Denmark)

    Sørensen, Per Soelberg

    2008-01-01

    to methylprednisolone does not make remission of symptoms faster or more complete. IVIG does not seem to be of any benefit to chronic visual or motor symptoms in MS. In secondary progressive MS, IVIG has not shown any effect on disease progression, relapses or new magnetic resonance imaging lesions. Experimental studies in the MS model experimental autoimmune encephalomyelitis in rats demonstrate that IVIG has to be administered at the time of induction of a relapse in order to be effective. In conclusion, IVIG can be considered as a second-line treatment to approved therapies for relapsing-remitting MS ... and magnetic resonance imaging outcome measures. Publication date: 2008

  15. Assessing Big Data

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2015-01-01

    In recent years, big data has been one of the most controversially discussed technologies in terms of its possible positive and negative impact. Therefore, the need for technology assessments is obvious. This paper first provides, based on the results of a technology assessment study, an overview of the potential and challenges associated with big data and then describes the problems experienced during the study as well as methods found helpful to address them. The paper concludes with reflections on how the insights from the technology assessment study may have an impact on the future governance of big data.

  16. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Boyd, Richard N.

    2001-01-01

    The precision of measurements in modern cosmology has made huge strides in recent years, with measurements of the cosmic microwave background and the determination of the Hubble constant now rivaling the level of precision of the predictions of big bang nucleosynthesis. However, these results are not necessarily consistent with the predictions of the Standard Model of big bang nucleosynthesis. Reconciling these discrepancies may require extensions of the basic tenets of the model, and possibly of the reaction rates that determine the big bang abundances

  17. Big data for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kaufman, Marcia

    2013-01-01

    Find the right big data solution for your business or organization Big data management is one of the major challenges facing business, industry, and not-for-profit organizations. Data sets such as customer transactions for a mega-retailer, weather patterns monitored by meteorologists, or social network activity can quickly outpace the capacity of traditional data management tools. If you need to develop or manage big data solutions, you'll appreciate how these four experts define, explain, and guide you through this new and often confusing concept. You'll learn what it is, why it m

  18. Big Data, indispensable today

    Directory of Open Access Journals (Sweden)

    Radu-Ioan ENACHE

    2015-10-01

    Full Text Available Big data is and will be used more in the future as a tool for everything that happens both online and offline. Of course, online is a real habit; Big Data is found in this medium, offering many advantages and being a real help for all consumers. In this paper we talk about Big Data as being a plus in developing new applications, by gathering useful information about users and their behaviour. We have also presented the key aspects of real-time monitoring and the architecture principles of this technology. The most important benefit brought to this paper is presented in the cloud section.

  19. Intravenous Therapy: Hazards, Complications and Their Prevention ...

    African Journals Online (AJOL)

    Breaks in aseptic techniques, faulty handling of parenteral fluid containers, failure to discard out-dated intravenous solutions and tubings contribute to occurrence of intravenous-associated sepsis. Improper technique and lack of pharmaceutical knowledge when adding drugs into intravenous fluids contribute to ...

  20. A tomographic approach to intravenous coronary arteriography

    International Nuclear Information System (INIS)

    Ritman, E.L.; Bove, A.A.

    1986-01-01

    Coronary artery anatomy can be visualized using high speed, volume scanning X-ray CT. A single scan during a bolus injection of contrast medium provides image data for display of all angles of view of the opacified coronary arterial tree. Due to the tomographic nature of volume image data the superposition of contrast filled cardiac chambers, such as would occur in the levophase of an intravenous injection of contrast agent, can be eliminated. Data are presented which support these statements. The Dynamic Spatial Reconstructor (DSR) was used to scan a life-like radiologic phantom of an adult human thorax in which the left atrial and ventricular chambers and the major epicardial coronary arteries were opacified so as to simulate the levophase of an intravenous injection of contrast agent. A catheter filled with diluted contrast agent and with regions of luminal narrowing (i.e. 'stenoses') was advanced along a tract equivalent to a right ventricular catheterization. Ease of visualization of the catheter 'stenoses' and the accuracy with which they can be measured are presented. (Auth.)

  1. Intravenous Antiepileptic Drugs in Russia

    Directory of Open Access Journals (Sweden)

    P. N. Vlasov

    2014-01-01

    Full Text Available Launching four intravenous antiepileptic drugs, valproate (Depakene and Convulex), lacosamide (Vimpat), and levetiracetam (Keppra), into the Russian market has significantly broadened the possibilities of rendering care to patients in seizure emergency situations. The chemical structure, mechanisms of action, indications/contraindications, clinical effectiveness and tolerability, advantages/disadvantages, and adverse events of using these drugs in urgent and elective neurology are discussed.

  2. Big Data in der Cloud

    DEFF Research Database (Denmark)

    Leimbach, Timo; Bachlechner, Daniel

    2014-01-01

Technology assessment of big data, in particular cloud-based big data services, for the Office for Technology Assessment at the German federal parliament (Bundestag).

  3. Small Big Data Congress 2017

    NARCIS (Netherlands)

    Doorn, J.

    2017-01-01

    TNO, in collaboration with the Big Data Value Center, presents the fourth Small Big Data Congress! Our congress aims at providing an overview of practical and innovative applications based on big data. Do you want to know what is happening in applied research with big data? And what can already be

  4. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

Cryptography for Big Data Security. Book chapter for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Ariel Hamlin et al.

  5. Big data opportunities and challenges

    CERN Document Server

    2014-01-01

    This ebook aims to give practical guidance for all those who want to understand big data better and learn how to make the most of it. Topics range from big data analysis, mobile big data and managing unstructured data to technologies, governance and intellectual property and security issues surrounding big data.

  6. Citizens’ Media Meets Big Data: The Emergence of Data Activism

    NARCIS (Netherlands)

    Milan, S.; Gutiérrez, M.

    2015-01-01

    Big data presents citizens with new challenges and opportunities. ‘Data activism’ practices emerge at the intersection of the social and technological dimension of human action, whereby citizens take a critical approach to big data, and appropriate and manipulate data for advocacy and social change.

  7. Crisis analytics : big data-driven crisis response

    NARCIS (Netherlands)

    Qadir, Junaid; ur Rasool, Raihan; Zwitter, Andrej; Sathiaseelan, Arjuna; Crowcroft, Jon

    2016-01-01

    Disasters have long been a scourge for humanity. With the advances in technology (in terms of computing, communications, and the ability to process, and analyze big data), our ability to respond to disasters is at an inflection point. There is great optimism that big data tools can be leveraged to

  8. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Madsen, Anders Koed; Rasche, Andreas

This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data's impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects … shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can deteriorate its impact …

  9. Big Data Revisited

    DEFF Research Database (Denmark)

    Kallinikos, Jannis; Constantiou, Ioanna

    2015-01-01

We elaborate on key issues of our paper New games, new rules: big data and the changing context of strategy as a means of addressing some of the concerns raised by the paper's commentators. We initially deal with the issue of social data and the role it plays in the current data revolution and the technological recording of facts. We further discuss the significance of the very mechanisms by which big data is produced, as distinct from the attributes of big data often discussed in the literature. In the final section of the paper, we qualify the alleged importance of algorithms and claim that the structures of data capture and the architectures in which data generation is embedded are fundamental to the phenomenon of big data.

  10. The Big Bang Singularity

    Science.gov (United States)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetrical assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetrical assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.

  11. BigDansing

    KAUST Repository

Khayyat, Zuhair; Ilyas, Ihab F.; Jindal, Alekh; Madden, Samuel; Ouzzani, Mourad; Papotti, Paolo; Quiané-Ruiz, Jorge-Arnulfo; Tang, Nan; Yin, Si

    2015-01-01

of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic

  12. Boarding to Big data

    Directory of Open Access Journals (Sweden)

    Oana Claudia BRATOSIN

    2016-05-01

Today Big data is an emerging topic, as the quantity of information grows exponentially, laying the foundation for its main challenge: the value of the information. The information value is defined not only by value extraction from huge data sets, as fast and optimal as possible, but also by value extraction from uncertain and inaccurate data, in an innovative manner, using Big data analytics. At this point, the main challenge for businesses that use Big data tools is to clearly define the scope and the necessary output of the business so that real value can be gained. This article aims to explain the Big data concept, its various classification criteria and architecture, as well as its impact on processes worldwide.

  13. Big Creek Pit Tags

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The BCPITTAGS database is used to store data from an Oncorhynchus mykiss (steelhead/rainbow trout) population dynamics study in Big Creek, a coastal stream along the...

  14. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-01-01

on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also

  15. Reframing Open Big Data

    DEFF Research Database (Denmark)

    Marton, Attila; Avital, Michel; Jensen, Tina Blegind

    2013-01-01

Recent developments in the techniques and technologies of collecting, sharing and analysing data are challenging the field of information systems (IS) research, let alone the boundaries of organizations and the established practices of decision-making. Coined 'open data' and 'big data', these developments introduce an unprecedented level of societal and organizational engagement with the potential of computational data to generate new insights and information. Based on the commonalities shared by open data and big data, we develop a research framework that we refer to as open big data (OBD) by employing the dimensions of 'order' and 'relationality'. We argue that these dimensions offer a viable approach for IS research on open and big data because they address one of the core value propositions of IS; i.e. how to support organizing with computational data. We contrast these dimensions with two …

  16. Conociendo Big Data

    Directory of Open Access Journals (Sweden)

    Juan José Camargo-Vega

    2014-12-01

Given the importance that the term Big Data has acquired, this study set out to examine the state of the art of Big Data exhaustively; as a second objective, it analyzed the characteristics, tools, technologies, models and standards related to Big Data; and finally it sought to identify the most relevant characteristics in the management of Big Data, so as to cover everything concerning the central topic of the research. The methodology included reviewing the state of the art of Big Data and presenting its current situation; surveying Big Data technologies; presenting some of the NoSQL databases, which are those that allow the processing of data in unstructured formats; and showing data models and the technologies for analyzing them, ending with some of the benefits of Big Data. The methodological design used for the research was non-experimental, since no variables are manipulated, and exploratory, since this research begins to explore the Big Data landscape.

  17. Big Bang baryosynthesis

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1983-01-01

    In these lectures I briefly review Big Bang baryosynthesis. In the first lecture I discuss the evidence which exists for the BAU, the failure of non-GUT symmetrical cosmologies, the qualitative picture of baryosynthesis, and numerical results of detailed baryosynthesis calculations. In the second lecture I discuss the requisite CP violation in some detail, further the statistical mechanics of baryosynthesis, possible complications to the simplest scenario, and one cosmological implication of Big Bang baryosynthesis. (orig./HSI)

  18. Minsky on "Big Government"

    Directory of Open Access Journals (Sweden)

    Daniel de Santana Vasconcelos

    2014-03-01

The objective of this paper is to assess, in light of Minsky's main works, his view and analysis of what he called the "Big Government": that huge institution which, in parallel with the "Big Bank", was capable of ensuring stability in the capitalist system and regulating its inherently unstable financial system in the mid-20th century. In this work, we analyze how Minsky proposes an active role for the government in a complex economic system flawed by financial instability.

  19. Big Data and medicine: a big deal?

    Science.gov (United States)

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  20. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
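The bias pitfall described above can be illustrated with a minimal, self-contained simulation (the population and the selection rule are invented for illustration): a huge but selectively collected sample yields a precise-looking yet systematically wrong estimate, while a far smaller random sample stays centred on the truth.

```python
import random
import statistics

random.seed(42)

# Hypothetical population: 100,000 values uniform on [0, 100);
# the true mean is close to 50.
population = [random.uniform(0, 100) for _ in range(100_000)]
true_mean = statistics.mean(population)

# A small but unbiased random sample: noisy, yet centred on the truth.
unbiased = random.sample(population, 100)

# A huge but biased "big data" sample: a collection process that
# never observes values below 20 (e.g. a platform whose users are
# not representative of the population under consideration).
biased = [x for x in population if x > 20]

print(f"true mean        : {true_mean:.1f}")
print(f"unbiased (n=100) : {statistics.mean(unbiased):.1f}")
print(f"biased (n={len(biased)}): {statistics.mean(biased):.1f}")  # systematically too high
```

The biased sample's estimate is off by roughly ten units no matter how many records it contains, which is exactly the systematic error that sheer volume cannot repair.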

  1. Intravenous immunoglobulin and Alzheimer's disease immunotherapy.

    Science.gov (United States)

    Solomon, Beka

    2007-02-01

    Amyloid-beta peptide (Abeta) contributes to the acute progression of Alzheimer's disease (AD) and has become the main target for therapeutics. Active immunization with Abeta in individuals with AD has been efficacious; however, some patients developed side effects, possibly related to an autoimmune response. Evidence that intravenous immunoglobulin (IVIg), an FDA-approved purified immunoglobulin fraction from normal human donor blood, shows promise of passive immunotherapy for AD is reviewed. Investigations into the molecular effects of IVIg on Abeta clearance, using the BV-2 cellular microglia line, demonstrate that IVIg dissolves Abeta fibrils in vitro, increases cellular tolerance to Abeta, enhances microglial migration toward Abeta deposits, and mediates phagocytosis of Abeta. Preliminary clinical results indicate that IVIg, which contains natural antibodies against the Abeta, warrants further study into its potential to deliver a controlled immune attack on the peptide, avoiding the immune toxicities that have had a negative impact on the first clinical trials of vaccine against Abeta.

  2. THE FASTEST OODA LOOP: THE IMPLICATIONS OF BIG DATA FOR AIR POWER

    Science.gov (United States)

    2016-06-01

Air Command and Staff College, Air University: "The Fastest OODA Loop: The Implications of Big Data for Air Power", by Aaron J. Dove, Maj, USAF. ... need for a human interpreter. Until the rise of Big Data, automated translation only had a "small" library of several million words to pull from ...

  3. Privacy Challenges of Genomic Big Data.

    Science.gov (United States)

    Shen, Hong; Ma, Jian

    2017-01-01

With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently at low cost. However, such a massive amount of personal genomic data creates a tremendous challenge for privacy, especially given the emergence of the direct-to-consumer (DTC) industry that provides genetic testing services. Here we review recent developments in genomic big data and their implications for privacy. We also discuss the current dilemmas and future challenges of genomic privacy.

  4. Cognitive computing and big data analytics

    CERN Document Server

    Hurwitz, Judith; Bowles, Adrian

    2015-01-01

    MASTER THE ABILITY TO APPLY BIG DATA ANALYTICS TO MASSIVE AMOUNTS OF STRUCTURED AND UNSTRUCTURED DATA Cognitive computing is a technique that allows humans and computers to collaborate in order to gain insights and knowledge from data by uncovering patterns and anomalies. This comprehensive guide explains the underlying technologies, such as artificial intelligence, machine learning, natural language processing, and big data analytics. It then demonstrates how you can use these technologies to transform your organization. You will explore how different vendors and different industries are a

  5. BigDansing

    KAUST Repository

    Khayyat, Zuhair

    2015-06-02

Data cleansing approaches have usually focused on detecting and fixing errors with little attention to scaling to big datasets. This presents a serious impediment since data cleansing often involves costly computations such as enumerating pairs of tuples, handling inequality joins, and dealing with user-defined functions. In this paper, we present BigDansing, a Big Data Cleansing system to tackle efficiency, scalability, and ease-of-use issues in data cleansing. The system can run on top of most common general-purpose data processing platforms, ranging from DBMSs to MapReduce-like frameworks. A user-friendly programming interface allows users to express data quality rules both declaratively and procedurally, with no requirement of being aware of the underlying distributed platform. BigDansing translates these rules into a series of transformations that enable distributed computations and several optimizations, such as shared scans and specialized join operators. Experimental results on both synthetic and real datasets show that BigDansing outperforms existing baseline systems by up to more than two orders of magnitude without sacrificing the quality provided by the repair algorithms.
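As a rough single-machine sketch of the kind of rule BigDansing scales out (this is not BigDansing's actual API; the data, attribute names and helper are invented for illustration), a declarative data quality rule such as a functional dependency can be checked by enumerating pairs of tuples, exactly the quadratic computation the system optimizes with shared scans and specialized join operators:

```python
from itertools import combinations

# Toy relation and a declarative rule: the functional dependency
# zip -> city ("two rows with the same zip must have the same city").
rows = [
    {"id": 1, "city": "Jeddah", "zip": "23955"},
    {"id": 2, "city": "Thuwal", "zip": "23955"},  # violates the rule with id 1
    {"id": 3, "city": "Riyadh", "zip": "11564"},
]

def fd_violations(rows, lhs, rhs):
    """Return id pairs that agree on `lhs` but differ on `rhs`.

    Enumerating all tuple pairs is quadratic in the input; scaling
    this step is precisely what a distributed cleansing system must
    optimize (e.g. shared scans, specialized join operators).
    """
    return [
        (a["id"], b["id"])
        for a, b in combinations(rows, 2)
        if a[lhs] == b[lhs] and a[rhs] != b[rhs]
    ]

print(fd_violations(rows, lhs="zip", rhs="city"))  # [(1, 2)]
```

A distributed system avoids materializing all pairs, for instance by first grouping rows on the `lhs` attribute so that only rows within a group need to be compared.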

  6. Big data challenges

    DEFF Research Database (Denmark)

    Bachlechner, Daniel; Leimbach, Timo

    2016-01-01

Although reports on big data success stories have been accumulating in the media, most organizations dealing with high-volume, high-velocity and high-variety information assets still face challenges. Only a thorough understanding of these challenges puts organizations into a position in which they can make an informed decision for or against big data and, if the decision is positive, overcome the challenges smoothly. The combination of a series of interviews with leading experts from enterprises, associations and research institutions, and focused literature reviews allowed not only … framework are also relevant. For large enterprises and startups specialized in big data, it is typically easier to overcome the challenges than it is for other enterprises and public administration bodies.

  7. Thick-Big Descriptions

    DEFF Research Database (Denmark)

    Lai, Signe Sophus

The paper discusses the rewards and challenges of employing commercial audience measurement data, gathered by media industries for profit-making purposes, in ethnographic research on the Internet in everyday life. It questions claims to the objectivity of big data (Anderson 2008), the assumption … communication systems, language and behavior appear as texts, outputs, and discourses (data to be 'found'); big data then documents things that in earlier research required interviews and observations (data to be 'made') (Jensen 2014). However, web-measurement enterprises build audiences according to a commercial logic (boyd & Crawford 2011) and are as such directed by motives that call for specific types of sellable user data and specific segmentation strategies. In combining big data and 'thick descriptions' (Geertz 1973), scholars need to question how ethnographic fieldwork might map the 'data not seen …

  8. Big data in biomedicine.

    Science.gov (United States)

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Big data bioinformatics.

    Science.gov (United States)

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us into the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms, with "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
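The supervised/unsupervised distinction the abstract refers to can be sketched on a toy 1-D dataset (a minimal standard-library Python illustration; the data and method choices are invented here, not taken from the review): a nearest-centroid classifier learns from labels, while a 2-means clustering recovers the same group structure without them.

```python
import statistics

# Toy 1-D measurements (e.g. expression levels) from two groups.
data = [1.0, 1.2, 0.8, 5.0, 5.3, 4.9]
labels = ["low", "low", "low", "high", "high", "high"]

# Supervised: learn one centroid per known label, classify by nearest.
def nearest_centroid_fit(xs, ys):
    return {y: statistics.mean(x for x, lab in zip(xs, ys) if lab == y)
            for y in set(ys)}

def nearest_centroid_predict(model, x):
    return min(model, key=lambda y: abs(model[y] - x))

model = nearest_centroid_fit(data, labels)
print(nearest_centroid_predict(model, 4.5))  # -> high

# Unsupervised: 2-means clustering, using no labels at all.
def two_means(xs, iters=10):
    c1, c2 = min(xs), max(xs)  # simple initialisation
    for _ in range(iters):
        g1 = [x for x in xs if abs(x - c1) <= abs(x - c2)]
        g2 = [x for x in xs if abs(x - c1) > abs(x - c2)]
        c1, c2 = statistics.mean(g1), statistics.mean(g2)
    return sorted([c1, c2])

print(two_means(data))  # two centres, one near each group
```

The same contrast carries over to the R packages and webservers the review surveys: supervised tools need labelled training data, unsupervised tools only need the measurements.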

  10. Big Data: Concept, Potentialities and Vulnerabilities

    Directory of Open Access Journals (Sweden)

    Fernando Almeida

    2018-03-01

The evolution of information systems and the growth in the use of the Internet and social networks have caused an explosion in the amount of available data relevant to the activities of companies. Therefore, the treatment of these available data is vital to support operational, tactical and strategic decisions. This paper aims to present the concept of big data and the main technologies that support the analysis of large data volumes. The potential of big data is explored considering nine sectors of activity: financial, retail, healthcare, transport, agriculture, energy, manufacturing, public, and media and entertainment. In addition, the main current opportunities, vulnerabilities and privacy challenges of big data are discussed. It was possible to conclude that, despite the potential for big data to grow in the previously identified areas, there are still some challenges that need to be considered and mitigated, namely the privacy of information, the existence of qualified human resources to work with Big Data and the promotion of a data-driven organizational culture.

  11. INTRAVENOUS IMMUNOGLOBULIN IN PEDIATRIC RHEUMATOLOGY PRACTICE

    Directory of Open Access Journals (Sweden)

    E. I. Alexeeva

    2015-01-01

Modern successful treatment of rheumatic diseases is impossible without the use of intravenous immunoglobulin. The use of intravenous immunoglobulin is based on strict indications developed as a result of long-term multicenter controlled studies. The article highlights the issues of using immunoglobulin in pediatric rheumatology practice and reviews the literature, with results from evaluations of the efficiency of intravenous immunoglobulin confirming the efficiency of the drug only for certain rheumatic diseases.

  12. Rationale and design of the allogeneiC human mesenchymal stem cells (hMSC) in patients with aging fRAilTy via intravenoUS delivery (CRATUS) study: A phase I/II, randomized, blinded and placebo controlled trial to evaluate the safety and potential efficacy of allogeneic human mesenchymal stem cell infusion in patients with aging frailty.

    Science.gov (United States)

    Golpanian, Samuel; DiFede, Darcy L; Pujol, Marietsy V; Lowery, Maureen H; Levis-Dusseau, Silvina; Goldstein, Bradley J; Schulman, Ivonne H; Longsomboon, Bangon; Wolf, Ariel; Khan, Aisha; Heldman, Alan W; Goldschmidt-Clermont, Pascal J; Hare, Joshua M

    2016-03-15

    Frailty is a syndrome associated with reduced physiological reserves that increases an individual's vulnerability for developing increased morbidity and/or mortality. While most clinical trials have focused on exercise, nutrition, pharmacologic agents, or a multifactorial approach for the prevention and attenuation of frailty, none have studied the use of cell-based therapies. We hypothesize that the application of allogeneic human mesenchymal stem cells (allo-hMSCs) as a therapeutic agent for individuals with frailty is safe and efficacious. The CRATUS trial comprises an initial non-blinded phase I study, followed by a blinded, randomized phase I/II study (with an optional follow-up phase) that will address the safety and pre-specified beneficial effects in patients with the aging frailty syndrome. In the initial phase I protocol, allo-hMSCs will be administered in escalating doses via peripheral intravenous infusion (n=15) to patients allocated to three treatment groups: Group 1 (n=5, 20 million allo-hMSCs), Group 2 (n=5, 100 million allo-hMSCs), and Group 3 (n=5, 200 million allo-hMSCs). Subsequently, in the randomized phase, allo-hMSCs or matched placebo will be administered to patients (n=30) randomly allocated in a 1:1:1 ratio to one of two doses of MSCs versus placebo: Group A (n=10, 100 million allo-hMSCs), Group B (n=10, 200 million allo-hMSCs), and Group C (n=10, placebo). Primary and secondary objectives are, respectively, to demonstrate the safety and efficacy of allo-hMSCs administered in frail older individuals. This study will determine the safety of intravenous infusion of stem cells and compare phenotypic outcomes in patients with aging frailty.
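The 1:1:1 allocation described for the randomized phase can be sketched as a simple permuted assignment (an illustrative sketch only; the group labels are taken from the abstract, but the shuffling scheme is hypothetical and not the trial's actual randomization procedure):

```python
import random
from collections import Counter

random.seed(7)

# Illustrative sketch: 30 patients allocated 1:1:1 to Group A
# (100 million allo-hMSCs), Group B (200 million allo-hMSCs), or
# Group C (placebo). The shuffling scheme is hypothetical.
arms = ["A: 100M allo-hMSCs", "B: 200M allo-hMSCs", "C: placebo"]
allocation = arms * 10        # ten patients per arm, 30 in total
random.shuffle(allocation)    # random order of assignment

counts = Counter(allocation)
print(counts)                 # each arm appears exactly ten times
```

Shuffling a fixed multiset of arm labels guarantees the exact 10/10/10 split, whereas assigning each patient independently at random would only achieve it in expectation.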

  13. Use of intravenous immunoglobulin in neonates with haemolytic disease and immune thrombocytopenia

    Directory of Open Access Journals (Sweden)

    Marković-Sovtić Gordana

    2013-01-01

Background/Aim. Intravenous immunoglobulin is a blood product made of human polyclonal immunoglobulin G. Its mode of action is very complex. It is indicated in the treatment of neonatal immune thrombocytopenia and haemolytic disease of the newborn. The aim of the study was to present our experience with the use of intravenous immunoglobulin in a group of term neonates. Methods. We analysed all relevant clinical and laboratory data of 23 neonates who received intravenous immunoglobulin during their hospitalization in the Neonatal Intensive Care Unit of the Mother and Child Health Care Institute over a five-year period, from 2006 to 2010. Results. There were 11 patients with haemolytic disease of the newborn and 12 neonates with immune thrombocytopenia. All of them received 1-2 g/kg intravenous immunoglobulin in the course of their treatment. There were no adverse effects of intravenous immunoglobulin use. The use of intravenous immunoglobulin led to an increase in platelet number in thrombocytopenic patients, whereas in those with haemolytic disease the serum bilirubin level decreased significantly, so that some patients whose bilirubin level was very close to the exchange-transfusion criterion avoided this procedure. Conclusion. The use of intravenous immunoglobulin was shown to be effective in reducing the need for exchange transfusion, the duration of phototherapy and the length of hospital stay in neonates with haemolytic disease. When used in the treatment of neonatal immune thrombocytopenia, it leads to an increase in platelet number, thus decreasing the risk of serious complications of thrombocytopenia.

  14. Big Java late objects

    CERN Document Server

    Horstmann, Cay S

    2012-01-01

    Big Java: Late Objects is a comprehensive introduction to Java and computer programming, which focuses on the principles of programming, software engineering, and effective learning. It is designed for a two-semester first course in programming for computer science students.

  15. Big ideas: innovation policy

    OpenAIRE

    John Van Reenen

    2011-01-01

    In the last CentrePiece, John Van Reenen stressed the importance of competition and labour market flexibility for productivity growth. His latest in CEP's 'big ideas' series describes the impact of research on how policy-makers can influence innovation more directly - through tax credits for business spending on research and development.

  16. Big Data ethics

    NARCIS (Netherlands)

    Zwitter, Andrej

    2014-01-01

    The speed of development in Big Data and associated phenomena, such as social media, has surpassed the capacity of the average consumer to understand his or her actions and their knock-on effects. We are moving towards changes in how ethics has to be perceived: away from individual decisions with

  17. Big data in history

    CERN Document Server

    Manning, Patrick

    2013-01-01

    Big Data in History introduces the project to create a world-historical archive, tracing the last four centuries of historical dynamics and change. Chapters address the archive's overall plan, how to interpret the past through a global archive, the missions of gathering records, linking local data into global patterns, and exploring the results.

  18. The Big Sky inside

    Science.gov (United States)

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  19. Moving Another Big Desk.

    Science.gov (United States)

    Fawcett, Gay

    1996-01-01

    New ways of thinking about leadership require that leaders move their big desks and establish environments that encourage trust and open communication. Educational leaders must trust their colleagues to make wise choices. When teachers are treated democratically as leaders, classrooms will also become democratic learning organizations. (SM)

  20. A Big Bang Lab

    Science.gov (United States)

    Scheider, Walter

    2005-01-01

The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  1. New 'bigs' in cosmology

    International Nuclear Information System (INIS)

    Yurov, Artyom V.; Martin-Moruno, Prado; Gonzalez-Diaz, Pedro F.

    2006-01-01

    This paper contains a detailed discussion of new cosmic solutions describing the early and late evolution of a universe filled with a kind of dark energy that may or may not satisfy the energy conditions. The main distinctive property of the resulting space-times is that they cause the single singular events predicted by the corresponding quintessential (phantom) models to appear twice, in a manner which can be made symmetric with respect to the origin of cosmic time. Thus, the big bang and big rip singularities are shown to take place twice, once on the positive branch of time and once on the negative one. We have also considered dark energy and phantom energy accretion onto black holes and wormholes in the context of these new cosmic solutions. It is seen that the space-times of these holes would then undergo swelling processes, leading to big trip and big hole events taking place at distinct epochs along the evolution of the universe. In this way, the possibility is considered that the past and the future may be connected in a non-paradoxical manner in the universes described by means of the new symmetric solutions

  2. The Big Bang

    CERN Multimedia

    Moods, Patrick

    2006-01-01

    How did the Universe begin? The favoured theory is that everything - space, time, matter - came into existence at the same moment, around 13.7 thousand million years ago. This event was scornfully referred to as the "Big Bang" by Sir Fred Hoyle, who did not believe in it and maintained that the Universe had always existed.

  3. Big Data Analytics

    Indian Academy of Sciences (India)

    The volume and variety of data being generated using computers is doubling every two years. It is estimated that in 2015, 8 Zettabytes (Zetta = 10^21) were generated, which consisted mostly of unstructured data such as emails, blogs, Twitter, Facebook posts, images, and videos. This is called big data. It is possible to analyse ...
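    The doubling claim above lends itself to a back-of-the-envelope extrapolation. The sketch below (a hypothetical `projected_volume` helper, not from the abstract) uses the stated figure of 8 ZB in 2015:

    ```python
    def projected_volume(base_zb, base_year, year, doubling_years=2):
        """Volume in zettabytes implied by a doubling every
        `doubling_years` years, extrapolated from a known base year."""
        return base_zb * 2 ** ((year - base_year) / doubling_years)
    ```

    For example, `projected_volume(8, 2015, 2025)` gives 8 * 2**5 = 256 ZB, five doublings later.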

  4. Identifying Dwarfs Workloads in Big Data Analytics

    OpenAIRE

    Gao, Wanling; Luo, Chunjie; Zhan, Jianfeng; Ye, Hainan; He, Xiwen; Wang, Lei; Zhu, Yuqing; Tian, Xinhui

    2015-01-01

    Big data benchmarking is particularly important and provides applicable yardsticks for evaluating booming big data systems. However, wide coverage and great complexity of big data computing impose big challenges on big data benchmarking. How can we construct a benchmark suite using a minimum set of units of computation to represent diversity of big data analytics workloads? Big data dwarfs are abstractions of extracting frequently appearing operations in big data computing. One dwarf represen...

  5. Big Data and Chemical Education

    Science.gov (United States)

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  6. Scaling Big Data Cleansing

    KAUST Repository

    Khayyat, Zuhair

    2017-07-31

    Data cleansing approaches have usually focused on detecting and fixing errors with little attention to big data scaling. This presents a serious impediment since identifying and repairing dirty data often involves processing huge input datasets, handling sophisticated error discovery approaches and managing huge arbitrary errors. With large datasets, error detection becomes overly expensive and complicated especially when considering user-defined functions. Furthermore, a distinctive algorithm is desired to optimize inequality joins in sophisticated error discovery rather than naïvely parallelizing them. Also, when repairing large errors, their skewed distribution may obstruct effective error repairs. In this dissertation, I present solutions to overcome the above three problems in scaling data cleansing. First, I present BigDansing as a general system to tackle efficiency, scalability, and ease-of-use issues in data cleansing for Big Data. It automatically parallelizes the user’s code on top of general-purpose distributed platforms. Its programming interface allows users to express data quality rules independently from the requirements of parallel and distributed environments. Without sacrificing their quality, BigDansing also enables parallel execution of serial repair algorithms by exploiting the graph representation of discovered errors. The experimental results show that BigDansing outperforms existing baselines up to more than two orders of magnitude. Although BigDansing scales cleansing jobs, it still lacks the ability to handle sophisticated error discovery requiring inequality joins. Therefore, I developed IEJoin as an algorithm for fast inequality joins. It is based on sorted arrays and space-efficient bit-arrays to reduce the problem’s search space. By comparing IEJoin against well-known optimizations, I show that it is more scalable, and several orders of magnitude faster. BigDansing depends on vertex-centric graph systems, i.e., Pregel
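    The sorted-array-plus-bit-array idea behind IEJoin can be illustrated with a much-simplified self-join sketch (a hypothetical `ie_self_join` helper, not from the dissertation; it assumes distinct attribute values and omits the permutation-array and block-level optimizations of the real algorithm):

    ```python
    def ie_self_join(rows):
        """Find all index pairs (r, s) with rows[r].dur < rows[s].dur
        and rows[r].rev > rows[s].rev, where each row is (dur, rev).

        Sketch of the IEJoin idea: sort on one attribute, visit tuples
        in order of the other, and use a bit-array to mark tuples that
        already satisfy the second inequality. Assumes distinct values.
        """
        n = len(rows)
        by_dur = sorted(range(n), key=lambda i: rows[i][0])   # ascending dur
        pos_in_dur = {idx: p for p, idx in enumerate(by_dur)}
        by_rev = sorted(range(n), key=lambda i: -rows[i][1])  # descending rev
        bit = [0] * n
        result = []
        for i in by_rev:                 # i plays the role of s
            p = pos_in_dur[i]
            # every already-visited tuple has a larger rev; those whose
            # dur-position is < p also have a smaller dur, so they join
            for q in range(p):
                if bit[q]:
                    result.append((by_dur[q], i))
            bit[p] = 1
        return result
    ```

    For `rows = [(100, 12), (140, 9), (80, 11), (90, 5)]` this yields the index pairs (2, 1), (0, 1) and (2, 3), i.e. exactly the tuples satisfying both inequalities without ever testing the predicate pair-by-pair.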

  7. Big data analysis new algorithms for a new society

    CERN Document Server

    Stefanowski, Jerzy

    2016-01-01

    This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area. It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought upon by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued...

  8. Today and tomorrow of intravenous coronary angiography programme in Japan

    International Nuclear Information System (INIS)

    Ando, Masami; Hyodo, Kazuyuki

    1994-01-01

    Development of an intravenous coronary angiography system using monochromated synchrotron radiation at the Photon Factory is described. This comprises an asymmetrically cut silicon monochromator crystal to obtain a larger exposure area, a two-dimensional imaging system using an image intensifier coupled to a CCD TV camera, and a fast video data acquisition system. The whole system is under development using live dogs. A future system, including a dedicated insertion device, applicable to live humans is also proposed. (author)

  9. HIV antibodies among intravenous drug users in Bahrain.

    Science.gov (United States)

    al-Haddad, M K; Khashaba, A S; Baig, B Z; Khalfan, S

    1994-09-01

    A 12-month study was conducted to identify risk factors for human immunodeficiency virus (HIV) infection among intravenous drug users (IDUs) attending the drug rehabilitation clinic of the Psychiatric Hospital, Manama, Bahrain. Patients provided demographic and behavioural information based on a questionnaire. Two hundred and forty male IDUs participated in the study on a voluntary basis. The seroprevalence of HIV was 21.1 per cent. The presence of HIV antibody was associated with educational status, frequency of injecting drugs and needle sharing.

  10. Euthanasia of Small Animals with Nitrogen; Comparison with Intravenous Pentobarbital

    OpenAIRE

    Quine, John P.; Buckingham, William; Strunin, Leo

    1988-01-01

    Intravenous pentobarbital (with or without the addition of saturated potassium chloride) was compared with nitrogen gas exposure for euthanasia of small animals (dogs, cats, and rabbits) in a humane society environment. Initially, electrocardiographic and electroencephalographic monitoring were used to establish the time of death in presedated animals given either pentobarbital or exposed to nitrogen; later, nitrogen euthanasia alone was studied. Sedation with acepromazine delayed the effects of...

  11. Intentional intravenous mercury injection | Yudelowitz | South African ...

    African Journals Online (AJOL)

    Intravenous mercury injection is rarely seen, with few documented cases. Treatment strategies are not clearly defined for such cases, although a few options do show benefit. This case report describes a 29-year-old man suffering from bipolar disorder, who presented following self-inflicted intravenous injection of mercury.

  12. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    Science.gov (United States)

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  13. Semantic Web technologies for the big data in life sciences.

    Science.gov (United States)

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big-data-related projects and activities are being performed around the world. Life sciences data generated by new technologies are continuing to grow, with great speed, not only in size but also in variety and complexity. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources, and even across disciplines, is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are the two principal issues mainly discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information not only for humans but also for computers to semantically process large-scale data. The paper presents a survey of big data in the life sciences, big-data-related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for big data in the life sciences.

  14. Big Data-Survey

    Directory of Open Access Journals (Sweden)

    P.S.G. Aruna Sri

    2016-03-01

    Full Text Available Big data is the term for any collection of data sets so large and complex that they become difficult to process using conventional data-processing applications. The challenges include analysis, capture, curation, search, sharing, storage, transfer, visualization, and privacy violations. To spot business trends, predict diseases, combat crime, and so on, we require larger data sets than those traditionally used. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds, or even thousands of servers. In this paper we present an overview of the Hadoop architecture, the different tools used for big data, and its security issues.

  15. Finding the big bang

    CERN Document Server

    Page, Lyman A; Partridge, R Bruce

    2009-01-01

    Cosmology, the study of the universe as a whole, has become a precise physical science, the foundation of which is our understanding of the cosmic microwave background radiation (CMBR) left from the big bang. The story of the discovery and exploration of the CMBR in the 1960s is recalled for the first time in this collection of 44 essays by eminent scientists who pioneered the work. Two introductory chapters put the essays in context, explaining the general ideas behind the expanding universe and fossil remnants from the early stages of the expanding universe. The last chapter describes how the confusion of ideas and measurements in the 1960s grew into the present tight network of tests that demonstrate the accuracy of the big bang theory. This book is valuable to anyone interested in how science is done, and what it has taught us about the large-scale nature of the physical universe.

  16. Big Data as Governmentality

    DEFF Research Database (Denmark)

    Flyverbom, Mikkel; Klinkby Madsen, Anders; Rasche, Andreas

    This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion...

  17. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.; Billingon, D.E.; Cameron, R.F.; Curl, S.J.

    1983-09-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but just imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the risks of nuclear power. The paper reviews the way in which the probability and consequences of big nuclear accidents have been presented in the past and makes recommendations for the future, including the presentation of the long-term consequences of such accidents in terms of 'loss of life expectancy', 'increased chance of fatal cancer' and 'equivalent pattern of compulsory cigarette smoking'. The paper presents mathematical arguments, which show the derivation and validity of the proposed methods of presenting the consequences of imaginable big nuclear accidents. (author)

  18. Big Bounce and inhomogeneities

    International Nuclear Information System (INIS)

    Brizuela, David; Mena Marugan, Guillermo A; Pawlowski, Tomasz

    2010-01-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified. (fast track communication)

  19. Big Data and reality

    Directory of Open Access Journals (Sweden)

    Ryan Shaw

    2015-11-01

    Full Text Available DNA sequencers, Twitter, MRIs, Facebook, particle accelerators, Google Books, radio telescopes, Tumblr: what do these things have in common? According to the evangelists of “data science,” all of these are instruments for observing reality at unprecedentedly large scales and fine granularities. This perspective ignores the social reality of these very different technological systems, ignoring how they are made, how they work, and what they mean in favor of an exclusive focus on what they generate: Big Data. But no data, big or small, can be interpreted without an understanding of the process that generated them. Statistical data science is applicable to systems that have been designed as scientific instruments, but is likely to lead to confusion when applied to systems that have not. In those cases, a historical inquiry is preferable.

  20. Really big numbers

    CERN Document Server

    Schwartz, Richard Evan

    2014-01-01

    In the American Mathematical Society's first-ever book for kids (and kids at heart), mathematician and author Richard Evan Schwartz leads math lovers of all ages on an innovative and strikingly illustrated journey through the infinite number system. By means of engaging, imaginative visuals and endearing narration, Schwartz manages the monumental task of presenting the complex concept of Big Numbers in fresh and relatable ways. The book begins with small, easily observable numbers before building up to truly gigantic ones, like a nonillion, a tredecillion, a googol, and even ones too huge for names! Any person, regardless of age, can benefit from reading this book. Readers will find themselves returning to its pages for a very long time, perpetually learning from and growing with the narrative as their knowledge deepens. Really Big Numbers is a wonderful enrichment for any math education program and is enthusiastically recommended to every teacher, parent and grandparent, student, child, or other individual i...

  1. Big Bang Circus

    Science.gov (United States)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  2. Big Bang 5

    CERN Document Server

    Apolin, Martin

    2007-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented and narrative. Volume 5 RG covers the fundamentals (systems of measurement, orders of magnitude) and mechanics (translation, rotation, force, conservation laws).

  3. Big Bang 8

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented and narrative. Volume 8 conveys, in an accessible way, the theory of relativity, nuclear and particle physics (and their applications in cosmology and astrophysics), nanotechnology, and bionics.

  4. Big Bang 6

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented and narrative. Volume 6 RG covers gravitation, oscillations and waves, thermodynamics, and an introduction to electricity, using everyday examples and cross-links to other disciplines.

  5. Big Bang 7

    CERN Document Server

    Apolin, Martin

    2008-01-01

    Physics should be understandable and fun! That is why every chapter in Big Bang begins with a motivating overview and guiding questions, and then proceeds from the fundamentals to the applications, from the simple to the complicated. Throughout, the language remains simple, everyday-oriented and narrative. In Volume 7, alongside an introduction, many current aspects of quantum mechanics (e.g. teleportation) and electrodynamics (e.g. electrosmog) are treated, as well as the climate problem and chaos theory.

  6. Big Bang Darkleosynthesis

    OpenAIRE

    Krnjaic, Gordan; Sigurdson, Kris

    2014-01-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short-range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generica...

  7. Big³. Editorial.

    Science.gov (United States)

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model is provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions on the topic. For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice a year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  8. Recent big flare

    International Nuclear Information System (INIS)

    Moriyama, Fumio; Miyazawa, Masahide; Yamaguchi, Yoshisuke

    1978-01-01

    The features of three big solar flares observed at the Tokyo Observatory are described in this paper. The active region McMath 14943 caused a big flare on September 16, 1977. The flare appeared on both sides of a long dark line which runs along the boundary of the magnetic field. A two-ribbon structure was seen. The electron density of the flare observed at the Norikura Corona Observatory was 3 × 10¹²/cc. Several arc lines which connect the two bright regions of different magnetic polarity were seen in the H-α monochrome image. The active region McMath 15056 caused a big flare on December 10, 1977. At the beginning, several bright spots were observed in the region between the two main sunspots. Then the area and the brightness increased, and the bright spots became two ribbon-shaped bands. A solar flare was observed on April 8, 1978. At first, several bright spots were seen around the sunspot in the active region McMath 15221. Then these bright spots developed into a large bright region. On both sides of a dark line along the magnetic neutral line, bright regions were generated. These developed into a two-ribbon flare. The time required for growth was more than one hour. A bright arc connecting the two ribbons was seen, and this arc may be a loop prominence system. (Kato, T.)

  9. Big Data Technologies

    Science.gov (United States)

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  10. SETI as a part of Big History

    Science.gov (United States)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans, see in particular [17]) and Human History (Aztecs to USA, see [16]), and then we extrapolated even that into the future up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we show that the Statistical Drake Equation (namely the statistical extension of the classical Drake Equation typical of SETI) can be regarded as the “frozen in time” part of GBM. This makes SETI a subset of our Big History Theory based on GBMs: just as the GBM is the “movie” unfolding in time, so the Statistical Drake Equation is its “still picture”, static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may be easily described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50 million living species or more exist, each
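    The GBM the abstract refers to is the standard lognormal process N(t) = N0 · exp((μ − σ²/2)t + σW(t)). A minimal simulation sketch (the `gbm_path` helper and its parameter values are illustrative only, not taken from the paper):

    ```python
    import math
    import random

    def gbm_path(n0, mu, sigma, t_max, steps, seed=0):
        """Simulate one Geometric Brownian Motion path
        N(t) = n0 * exp((mu - sigma**2 / 2) * t + sigma * W(t)),
        e.g. the number of living species growing stochastically in time."""
        rng = random.Random(seed)
        dt = t_max / steps
        n, path = n0, [n0]
        for _ in range(steps):
            dw = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment ~ N(0, dt)
            n *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * dw)
            path.append(n)
        return path
    ```

    With sigma = 0 the path reduces to deterministic exponential growth n0 · e^{μt}, which is the mean trend; evaluating the process at one fixed instant, as in the abstract's "still picture" analogy, is what connects the GBM to the Statistical Drake Equation.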

  11. Big bang and big crunch in matrix string theory

    OpenAIRE

    Bedford, J; Papageorgakis, C; Rodríguez-Gómez, D; Ward, J

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  12. Intravenous Iron Carboxymaltose as a Potential Therapeutic in Anemia of Inflammation.

    Directory of Open Access Journals (Sweden)

    Niklas Lofruthe

    Full Text Available Intravenous iron supplementation is an effective therapy in iron deficiency anemia (IDA), but controversial in anemia of inflammation (AI). Unbound iron can be used by bacteria and viruses for their replication and can enhance the inflammatory response. The high-molecular-weight iron complexes now available for intravenous iron substitution, such as ferric carboxymaltose, might be useful in AI, as these pharmaceuticals deliver low doses of free iron over a prolonged period of time. We tested the effects of intravenous iron carboxymaltose in murine AI: wild-type mice were exposed to the heat-killed Brucella abortus (BA) model and treated with or without high-molecular-weight intravenous iron. At 4 h after BA injection, followed by 2 h after intravenous iron treatment, inflammatory cytokines were upregulated by BA but not further enhanced by iron treatment. In long-term experiments, mice were fed a regular or an iron-deficient diet and then treated with intravenous iron or saline 14 days after BA injection. Iron treatment in mice with BA-induced AI was effective 24 h after iron administration. In contrast, mice with IDA (fed an iron-deficient diet prior to BA injection) required 7 days to recover from AI. In these experiments, inflammatory markers were not further induced in iron-treated compared with vehicle-treated BA-injected mice. These results demonstrate that intravenous iron supplementation effectively treated murine BA-induced AI without further enhancing the inflammatory response. Studies in humans will have to reveal treatment options for AI in patients.

  13. Disaggregating asthma: Big investigation versus big data.

    Science.gov (United States)

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Man is not a big rat: concerns with traditional human risk assessment of phthalates based on their anti-androgenic effects observed in the rat foetus.

    Science.gov (United States)

    Habert, René; Livera, Gabriel; Rouiller-Fabre, Virginie

    2014-01-01

    Phthalates provide one of the best documented examples of why we must be cautious when using the traditional paradigm, based on extrapolation of experimental data from rodent studies, for human health risk assessment of endocrine disruptors (EDs). Since the foetal testis is known as one of the most sensitive targets of EDs, phthalate risk assessment is routinely based on the capacity of such compounds to decrease testosterone production by the testis or to impair masculinization in the rat during foetal life. In this paper, the well-established inhibitory effects of phthalates on foetal Leydig cell function in the rat are briefly reviewed. Then, data obtained in humans and other species are carefully analysed. As early as January 2009, using the organotypic culture system named the Fetal Testis Assay (FeTA) that we developed, we reported that phthalates might not affect testosterone production in human foetal testes. Several recent experimental studies using xenografts confirm the absence of a detectable anti-androgenic effect of phthalates in the human foetal testis. Epidemiological studies have led to contradictory results. Altogether, these findings suggest that the effects of phthalates on foetal Leydig cells are largely species-specific. Consequently, the phthalate threshold doses that disturb foetal steroidogenesis in rat testes, and that are presently used to define acceptable daily intake levels for human health protection, must be questioned. This does not mean that phthalates are safe, because these compounds have many deleterious effects upon germ cell development that may be common to the different species studied, including humans. More generally, the identification of common molecular, cellular or/and phenotypic targets in rat and human testes should precede the choice of the toxicological endpoint in the rat in order to accurately assess the safety threshold of any ED in humans.

  15. Big data and educational research

    OpenAIRE

    Beneito-Montagut, Roser

    2017-01-01

    Big data and data analytics offer the promise to enhance teaching and learning, improve educational research and progress education governance. This chapter aims to contribute to the conceptual and methodological understanding of big data and analytics within educational research. It describes the opportunities and challenges that big data and analytics bring to education and critically explores the perils of applying a data-driven approach to education. Despite the claimed value of the...

  16. The trashing of Big Green

    International Nuclear Information System (INIS)

    Felten, E.

    1990-01-01

    The Big Green initiative on California's ballot lost by a margin of 2-to-1. Green measures lost in five other states, shocking ecology-minded groups. According to the postmortem by environmentalists, Big Green was a victim of poor timing and big spending by the opposition. Now its supporters plan to break up the bill and try to pass some provisions in the Legislature

  17. Rectal dihydroartemisinin versus intravenous quinine in the ...

    African Journals Online (AJOL)

    Rectal dihydroartemisinin versus intravenous quinine in the treatment of severe malaria: A randomised clinical trial. F Esamai, P Ayuo, W Owino-Ongor, J Rotich, A Ngindu, A Obala, F Ogaro, L Quoqiao, G Xingbo, L Guangqian ...

  18. INFECTIVE ENDOCARDITIS IN INTRAVENOUS DRUGS ABUSED PATIENT

    Directory of Open Access Journals (Sweden)

    E. Y. Ponomareva

    2014-07-01

    Full Text Available Three-year observation of acute tricuspid infective endocarditis in an intravenous drug abuser: diagnosis, clinical features, visceral lesions, the possibility of cardiac surgery and conservative treatment, and outcome.

  19. INFECTIVE ENDOCARDITIS IN INTRAVENOUS DRUGS ABUSED PATIENT

    Directory of Open Access Journals (Sweden)

    E. Y. Ponomareva

    2011-01-01

    Full Text Available Three-year observation of acute tricuspid infective endocarditis in an intravenous drug abuser: diagnosis, clinical features, visceral lesions, the possibility of cardiac surgery and conservative treatment, and outcome.

  20. First-in-Human Study of PF-05212384 (PKI-587), a Small-Molecule, Intravenous, Dual Inhibitor of PI3K and mTOR in Patients with Advanced Cancer.

    Science.gov (United States)

    Shapiro, Geoffrey I; Bell-McGuinn, Katherine M; Molina, Julian R; Bendell, Johanna; Spicer, James; Kwak, Eunice L; Pandya, Susan S; Millham, Robert; Borzillo, Gary; Pierce, Kristen J; Han, Lixin; Houk, Brett E; Gallo, Jorge D; Alsina, Maria; Braña, Irene; Tabernero, Josep

    2015-04-15

    To evaluate safety (primary endpoint), tolerability, pharmacokinetics, pharmacodynamic profile, and preliminary activity of the intravenous, pan-class I isoform PI3K/mTOR inhibitor PF-05212384 in patients with advanced solid tumors. Part 1 of this open-label phase I study was designed to estimate the maximum-tolerated dose (MTD) in patients with nonselected solid tumors, using a modified continual reassessment method to guide dose escalation. Objectives of part 2 were MTD confirmation and assessment of preliminary activity in patients with selected tumor types and PI3K pathway dysregulation. Seventy-seven of the 78 enrolled patients received treatment. The MTD for PF-05212384, administered intravenously once weekly, was estimated to be 154 mg. The most common treatment-related adverse events (AE) were mucosal inflammation/stomatitis (58.4%), nausea (42.9%), hyperglycemia (26%), decreased appetite (24.7%), fatigue (24.7%), and vomiting (24.7%). The majority of patients treated at the MTD experienced only grade 1 treatment-related AEs. Grade 3 treatment-related AEs occurred in 23.8% of patients at the MTD. No treatment-related grade 4-5 AEs were reported at any dose level. Antitumor activity was noted in this heavily pretreated patient population, with two partial responses (PR) and an unconfirmed PR. Eight patients had long-lasting stable disease (>6 months). Pharmacokinetic analyses showed a biphasic concentration-time profile for PF-05212384 (half-life, 30-37 hours after multiple dosing). PF-05212384 inhibited downstream effectors of the PI3K pathway in paired tumor biopsies. These findings demonstrate the manageable safety profile and antitumor activity of the PI3K/mTOR inhibitor PF-05212384, supporting further clinical development for patients with advanced solid malignancies. ©2015 American Association for Cancer Research.
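For context on the once-weekly schedule, the reported terminal half-life of 30 to 37 hours implies minimal carry-over between weekly doses under simple first-order elimination. A back-of-the-envelope sketch (illustrative only; it is not an analysis from the study and ignores the biphasic concentration-time profile):

```python
def fraction_remaining(half_life_h: float, elapsed_h: float) -> float:
    """Fraction of drug remaining after elapsed_h hours of first-order decay."""
    return 0.5 ** (elapsed_h / half_life_h)

# With a terminal half-life of 30-37 h and once-weekly dosing (168 h between
# doses), only a few percent of a dose remains when the next one is given:
for t_half in (30, 37):
    print(t_half, round(fraction_remaining(t_half, 168), 3))
```

Because less than 5% of a dose remains after a week, weekly administration would not be expected to accumulate appreciably under this simplified model.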

  1. Big data in fashion industry

    Science.gov (United States)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data involves analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. Finally, the methodology and working of a system that will use this data are briefly described.

  2. Big Data Analytics An Overview

    Directory of Open Access Journals (Sweden)

    Jayshree Dwivedi

    2015-08-01

    Full Text Available Big data refers to data beyond the storage capacity and processing power of conventional systems. The term is used for data sets so large or complex that traditional tools cannot handle them. The size threshold for big data is a constantly moving target, ranging year by year from a few dozen terabytes to many petabytes; for example, the amount of data produced by people on social networking sites is growing rapidly every year. Big data is not only data: it has become a complete subject encompassing various tools, techniques and frameworks. It covers the rapid growth and evolution of data, both structured and unstructured. Big data is a set of techniques and technologies that require new forms of integration to uncover large hidden values from datasets that are diverse, complex and of massive scale. Such data are difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead massively parallel software running on tens, hundreds or even thousands of servers. A big data environment is used to acquire, organize and analyse these various types of data. In this paper we describe the applications, problems and tools of big data and give an overview of the field.

  3. Was there a big bang

    International Nuclear Information System (INIS)

    Narlikar, J.

    1981-01-01

    In discussing the viability of the big-bang model of the Universe, the relevant evidence is examined, including discrepancies in the age of the big-bang Universe, the redshifts of quasars, the microwave background radiation, general-relativity aspects such as the change of the gravitational constant with time, and quantum-theory considerations. It is argued that these considerations show the big-bang picture is not as soundly established, either theoretically or observationally, as is usually claimed, that the cosmological problem is still wide open, and that alternatives to the standard big-bang picture should be seriously investigated. (U.K.)

  4. Intravenous iron-containing products: EMA procrastination.

    Science.gov (United States)

    2014-07-01

    A European reassessment has led to identical changes in the summaries of product characteristics (SPCs) for all intravenous iron-containing products: the risk of serious adverse effects is now highlighted, underlining the fact that intravenous iron-containing products should only be used when the benefits clearly outweigh the harms. Unfortunately, iron dextran still remains on the market despite a higher risk of hypersensitivity reactions than with iron sucrose.

  5. How Big is Earth?

    Science.gov (United States)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a known distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through the online learning environment of the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss their ideas and solutions in a discussion forum. There is an ongoing database of student measurements and another database collecting data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
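The shadow-angle method described above reduces to a single proportion: the difference between the two local-noon shadow angles is the fraction of a full circle separating the two measurement sites. A minimal sketch (the function name and the classic Syene/Alexandria figures are illustrative, not taken from the project):

```python
def earth_circumference(shadow_angle_a_deg: float,
                        shadow_angle_b_deg: float,
                        distance_km: float) -> float:
    """Estimate Earth's circumference from two simultaneous local-noon
    shadow-angle measurements taken a known north-south distance apart."""
    delta_deg = abs(shadow_angle_a_deg - shadow_angle_b_deg)
    # The angular difference is the fraction of 360 degrees spanned by
    # the distance between the two sites.
    return 360.0 / delta_deg * distance_km

# Eratosthenes' classic figures: no shadow at Syene (0 degrees) versus
# about 7.2 degrees at Alexandria, roughly 800 km to the north.
print(earth_circumference(0.0, 7.2, 800.0))
```

With these numbers the proportion gives 360/7.2 = 50, so the circumference estimate is 50 × 800 km = 40,000 km, close to the modern value.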

  6. Privacy and Big Data

    CERN Document Server

    Craig, Terence

    2011-01-01

    Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today-truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants. Pri

  7. Visualizing big energy data

    DEFF Research Database (Denmark)

    Hyndman, Rob J.; Liu, Xueqin Amy; Pinson, Pierre

    2018-01-01

    Visualization is a crucial component of data analysis. It is always a good idea to plot the data before fitting models, making predictions, or drawing conclusions. As sensors of the electric grid are collecting large volumes of data from various sources, power industry professionals are facing the challenge of visualizing such data in a timely fashion. In this article, we demonstrate several data-visualization solutions for big energy data through three case studies involving smart-meter data, phasor measurement unit (PMU) data, and probabilistic forecasts, respectively.

  8. Big Data Challenges

    Directory of Open Access Journals (Sweden)

    Alexandru Adrian TOLE

    2013-10-01

    Full Text Available The amount of data travelling across the internet today is not only large but complex as well. Companies, institutions, healthcare systems, etc. all use piles of data which are further used to create reports in order to ensure continuity of the services they offer. The processing behind the results that these entities request represents a challenge for software developers and companies that provide IT infrastructure. The challenge is how to manipulate an impressive volume of data that has to be securely delivered through the internet and reach its destination intact. This paper treats the challenges that Big Data creates.

  9. A Matrix Big Bang

    OpenAIRE

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matr...

  10. Optimal composition of intravenous lipids

    African Journals Online (AJOL)

    fatty acids. All these fatty acids present a source of energy with a ..... Phagocytosis and killing of Candida albicans by human. Figure 4: ... Mechanisms of increased survival after lipopolysaccharide- induced .... Dietary alpha-linolenic acid and.

  11. BIG Data - BIG Gains? Understanding the Link Between Big Data Analytics and Innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance for product innovations. Since big data technologies provide new data information practices, they create new decision-making possibilities, which firms can use to realize innovations. Applying German firm-level data we find suggestive evidence that big data analytics matters for the likelihood of becoming a product innovator as well as the market success of the firms’ product innovat...

  12. BIG data - BIG gains? Empirical evidence on the link between big data analytics and innovation

    OpenAIRE

    Niebel, Thomas; Rasel, Fabienne; Viete, Steffen

    2017-01-01

    This paper analyzes the relationship between firms’ use of big data analytics and their innovative performance in terms of product innovations. Since big data technologies provide new data information practices, they create novel decision-making possibilities, which are widely believed to support firms’ innovation process. Applying German firm-level data within a knowledge production function framework we find suggestive evidence that big data analytics is a relevant determinant for the likel...

  13. Tiny tweaks, big changes: An alternative strategy to empower ethical culture of human research in anesthesia (A Taiwan Acta Anesthesiologica Taiwanica-Ethics Review Task Force Report).

    Science.gov (United States)

    Luk, Hsiang-Ning; Ennever, John F; Day, Yuan-Ji; Wong, Chih-Shung; Sun, Wei-Zen

    2015-03-01

    For this guidance article, the Ethics Review Task Force (ERTF) of the Journal reviewed and discussed the ethics issues related to publication of human research in the field of anesthesia. ERTF first introduced international ethics principles and minimal requirements for reporting of ethics practices, followed by a discussion of universal problems in publication ethics. ERTF then compared the accountability and methodology of several medical journals in assuring authors' ethics compliance. Using the Taiwan Institutional Review Board system as an example, ERTF emphasized the importance of institutional review board registration and accreditation in assuring human participant protection. ERTF presented four major cases of human research misconduct in the field of anesthesia in recent years. ERTF finally proposed a flow chart to guide journal peer reviewers and editors in ethics review during the editorial process in publishing. Examples of template language for the Ethics statement section of the manuscript are expected to strengthen the ethics compliance of authors and to set an ethical culture for all the stakeholders involved in human research. Copyright © 2015. Published by Elsevier B.V.

  14. [Big data in imaging].

    Science.gov (United States)

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  15. Big bang nucleosynthesis

    International Nuclear Information System (INIS)

    Fields, Brian D.; Olive, Keith A.

    2006-01-01

    We present an overview of the standard model of big bang nucleosynthesis (BBN), which describes the production of the light elements in the early universe. The theoretical predictions for the abundances of D, 3He, 4He, and 7Li are discussed. We emphasize the role of key nuclear reactions and the methods by which experimental cross-section uncertainties are propagated into uncertainties in the predicted abundances. The observational determination of the light nuclides is also discussed. Particular attention is given to the comparison between the predicted and observed abundances, which yields a measurement of the cosmic baryon content. The spectrum of anisotropies in the cosmic microwave background (CMB) now independently measures the baryon density to high precision; we show how the CMB data test BBN, and find that the CMB and the D and 4He observations paint a consistent picture. This concordance stands as a major success of the hot big bang. On the other hand, 7Li remains discrepant with the CMB-preferred baryon density; possible explanations are reviewed. Finally, moving beyond the standard model, primordial nucleosynthesis constraints on early universe and particle physics are also briefly discussed

  16. Heterogeneity of pituitary and plasma prolactin in man: decreased affinity of big prolactin in a radioreceptor assay and evidence for its secretion

    International Nuclear Information System (INIS)

    Garnier, P.E.; Aubert, M.L.; Kaplan, S.L.; Grumbach, M.M.

    1978-01-01

    Molecular heterogeneity of immunoreactive human PRL (IR-hPRL) in plasma was assessed by exclusion chromatography in blood from 4 normal adults, 3 newborn infants, 2 late-gestational women, 3 patients with primary hypothyroidism and high PRL levels, 2 with functional hyperprolactinemia, 3 with acromegaly, and 10 with PRL-secreting tumors. Three forms of PRL were detected: big-big hPRL, big hPRL, and little hPRL. In normal subjects, the proportions of the big-big, big, and little hPRL components were 5.1%, 9.1%, and 85.8%, respectively, without change in the distribution after TRF stimulation. In 8 of 10 patients with PRL-secreting tumors, we detected a significantly higher proportion of big PRL. In 2 additional patients with prolactinomas, the proportion of big PRL was much higher. In 3 of 10 patients, the molecular heterogeneity of the tumor PRL was similar to that in plasma. In 1 acromegalic patient, there was a very high proportion of big-big hPRL. The PRL fractions were tested in a radioreceptor assay (RRA) using membranes from rabbit mammary gland. Big PRL was much less active than little PRL in the RRA. The fractions were rechromatographed after storage. Big PRL partially redistributed as little or big-big PRL, while little PRL remained unchanged. Big-big PRL from tumor extract partially converted into big and little PRL. The big PRL obtained by rechromatography had low activity in the RRA. These observations suggest that at least part of the receptor activity of big PRL may arise from generation of, or contamination by, little PRL. The decreased binding affinity of big PRL in the RRA also indicates that big PRL has little, if any, biological activity. The evidence suggests that big PRL is a native PRL dimer linked by intermolecular disulfide bonds which arises in the lactotrope as a postsynthetic product or derivative and is not a true precursor prohormone.

  17. Big Data and Nursing: Implications for the Future.

    Science.gov (United States)

    Topaz, Maxim; Pruinelli, Lisiane

    2017-01-01

    Big data is becoming increasingly prevalent and it affects the way nurses learn, practice, conduct research and develop policy. The discipline of nursing needs to maximize the benefits of big data to advance the vision of promoting human health and wellbeing. However, current practicing nurses, educators and nurse scientists often lack the skills and competencies necessary for meaningful use of big data. Some of the key skills for further development include the ability to mine narrative and structured data for new care or outcome patterns, effective data visualization techniques, and further integration of nursing-sensitive data into artificial intelligence systems for better clinical decision support. We provide growth-path vision recommendations for big data competencies for practicing nurses, nurse educators, researchers, and policy makers to help prepare the next generation of nurses and improve patient outcomes through better-quality connected health.

  18. Was the big bang hot

    International Nuclear Information System (INIS)

    Wright, E.L.

    1983-01-01

    The author considers experiments to confirm the substantial deviations from a Planck curve in the Woody and Richards spectrum of the microwave background, and search for conducting needles in our galaxy. Spectral deviations and needle-shaped grains are expected for a cold Big Bang, but are not required by a hot Big Bang. (Auth.)

  19. [Ethical aspects of big data]

    NARCIS (Netherlands)

    N. (Niek) van Antwerpen; Klaas Jan Mollema

    2017-01-01

    Big data has not only led to challenging technical questions; it also brings with it all kinds of new ethical and moral issues. To handle big data responsibly, these issues must be considered as well, because poor use of data can have adverse consequences for

  20. [The farming of the future will be big business]

    DEFF Research Database (Denmark)

    Hansen, Henning Otte

    2016-01-01

    Agriculture's external environment and competitive conditions are changing, and this will necessitate a development towards "big business", in which farms become even larger, more industrialized and more concentrated. Big business will become a dominant development in Danish agriculture - but not the only one...

  1. Passport to the Big Bang

    CERN Multimedia

    De Melis, Cinzia

    2013-01-01

    On 2 June 2013, CERN inaugurates the Passport to the Big Bang, a scientific tourist trail through the Pays de Gex and the Canton of Geneva, at a big public event. Poster and programme.

  2. Applications of Big Data in Education

    OpenAIRE

    Faisal Kalota

    2015-01-01

    Big Data and analytics have gained a huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA) that may allow academic institutions to better understand the learners' needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in educa...

  3. Exploring complex and big data

    Directory of Open Access Journals (Sweden)

    Stefanowski Jerzy

    2017-12-01

    Full Text Available This paper shows how big data analysis opens a range of research and technological problems and calls for new approaches. We start with defining the essential properties of big data and discussing the main types of data involved. We then survey the dedicated solutions for storing and processing big data, including a data lake, virtual integration, and a polystore architecture. Difficulties in managing data quality and provenance are also highlighted. The characteristics of big data also imply specific requirements and challenges for data mining algorithms, which we address as well. The links with related areas, including data streams and deep learning, are discussed. The common theme that naturally emerges from this characterization is complexity. All in all, we consider it to be the truly defining feature of big data (posing particular research and technological challenges, which ultimately seems to be of greater importance than the sheer data volume.

  4. [Citizen media and big data: The emergence of data activism]

    OpenAIRE

    Milan, S.; Gutiérrez, M.

    2015-01-01

    Big data presents citizens with new challenges and opportunities. 'Data activism' practices emerge at the intersection of the social and technological dimensions of human action, whereby citizens take a critical approach to big data, and appropriate and manipulate data for advocacy and social change. This theoretical article explores the emergence of data activism as an empirical reality and a heuristic tool to study how people engage politically with big data. We ground the concept on a mult...

  5. [Three applications and the challenge of big data in otology].

    Science.gov (United States)

    Lei, Guanxiong; Li, Jianan; Shen, Weidong; Yang, Shiming

    2016-03-01

    With the expansion of human practical activities, more and more areas are encountering big data problems. The emergence of big data requires people to update their research paradigms and develop new technical methods. This review discusses the opportunities and challenges that big data may bring to the areas of auditory implantation, the deafness genome, and auditory pathophysiology, and points out that we need to find appropriate theories and methods to turn this kind of expectation into reality.

  6. The big data telescope

    International Nuclear Information System (INIS)

    Finkel, Elizabeth

    2017-01-01

    On a flat, red mulga plain in the outback of Western Australia, preparations are under way to build the most audacious telescope astronomers have ever dreamed of - the Square Kilometre Array (SKA). Next-generation telescopes usually aim to double the performance of their predecessors. The Australian arm of SKA will deliver a 168-fold leap on the best technology available today, to show us the universe as never before. It will tune into signals emitted just a million years after the Big Bang, when the universe was a sea of hydrogen gas, slowly percolating with the first galaxies. Their starlight illuminated the fledgling universe in what is referred to as the “cosmic dawn”.

  7. The Big Optical Array

    International Nuclear Information System (INIS)

    Mozurkewich, D.; Johnston, K.J.; Simon, R.S.

    1990-01-01

    This paper describes the design and the capabilities of the Naval Research Laboratory Big Optical Array (BOA), an interferometric optical array for high-resolution imaging of stars, stellar systems, and other celestial objects. There are four important differences between the BOA design and that of the Mark III Optical Interferometer on Mount Wilson (California): a long passive delay line will be used in BOA to do most of the delay compensation, so that the fast delay line will have a very short travel; beam combination in BOA will be done in triplets, to allow measurement of closure phase; the same light will be used for both star and fringe tracking; and the fringe tracker will use several wavelength channels

  8. Big nuclear accidents

    International Nuclear Information System (INIS)

    Marshall, W.

    1983-01-01

    Much of the debate on the safety of nuclear power focuses on the large number of fatalities that could, in theory, be caused by extremely unlikely but imaginable reactor accidents. This, along with the nuclear industry's inappropriate use of vocabulary during public debate, has given the general public a distorted impression of the safety of nuclear power. The way in which the probability and consequences of big nuclear accidents have been presented in the past is reviewed and recommendations for the future are made including the presentation of the long-term consequences of such accidents in terms of 'reduction in life expectancy', 'increased chance of fatal cancer' and the equivalent pattern of compulsory cigarette smoking. (author)

  9. Nonstandard big bang models

    International Nuclear Information System (INIS)

    Calvao, M.O.; Lima, J.A.S.

    1989-01-01

    The usual FRW hot big-bang cosmologies have been generalized by considering the equation of state ρ = Anm + (γ-1)^(-1) p, where m is the rest mass of the fluid particles and A is a dimensionless constant. Explicit analytic solutions are given for the flat case (ε=0). For large cosmological times these extended models behave as the standard Einstein-de Sitter universes regardless of the values of A and γ. Unlike the usual flat FRW case, the deceleration parameter q is a time-dependent function and its present value, q ≅ 1, obtained from the luminosity distance versus redshift relation, may be fitted by taking, for instance, A=1 and γ = 5/3 (monatomic relativistic gas with >> k_BT). In all cases the universe cools obeying the same temperature law as the FRW models and it is shown that the age of the universe is only slightly modified. (author) [pt
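For readability, the equation of state quoted above can be set out explicitly (a reconstruction from the abstract's inline notation, not text taken from the paper itself):

```latex
% rho: energy density, p: pressure, n: particle number density,
% m: rest mass of the fluid particles, A: dimensionless constant.
\[
  \rho = A\,n\,m + \frac{p}{\gamma - 1}
  \quad\Longleftrightarrow\quad
  p = (\gamma - 1)\left(\rho - A\,n\,m\right).
\]
% Setting A = 0 recovers the familiar linear law p = (gamma - 1) rho of
% the standard FRW models; the extra rest-mass term A n m is what drives
% these extended models toward Einstein--de Sitter behaviour at late times.
```

This makes clear why the models reduce to the standard case: as the universe expands, the pressure term becomes negligible relative to the rest-mass contribution.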

  10. The Last Big Bang

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, Austin D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  11. A matrix big bang

    International Nuclear Information System (INIS)

    Craps, Ben; Sethi, Savdeep; Verlinde, Erik

    2005-01-01

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control

  12. A matrix big bang

    Energy Technology Data Exchange (ETDEWEB)

    Craps, Ben [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands); Sethi, Savdeep [Enrico Fermi Institute, University of Chicago, Chicago, IL 60637 (United States); Verlinde, Erik [Instituut voor Theoretische Fysica, Universiteit van Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam (Netherlands)

    2005-10-15

    The light-like linear dilaton background represents a particularly simple time-dependent 1/2 BPS solution of critical type-IIA superstring theory in ten dimensions. Its lift to M-theory, as well as its Einstein frame metric, are singular in the sense that the geometry is geodesically incomplete and the Riemann tensor diverges along a light-like subspace of codimension one. We study this background as a model for a big bang type singularity in string theory/M-theory. We construct the dual Matrix theory description in terms of a (1+1)-d supersymmetric Yang-Mills theory on a time-dependent world-sheet given by the Milne orbifold of (1+1)-d Minkowski space. Our model provides a framework in which the physics of the singularity appears to be under control.

  13. DPF Big One

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  14. DPF Big One

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1993-01-15

    At its latest venue at Fermilab from 10-14 November, the American Physical Society's Division of Particles and Fields meeting entered a new dimension. These regular meetings, which allow younger researchers to communicate with their peers, have been gaining popularity over the years (this was the seventh in the series), but nobody had expected almost a thousand participants and nearly 500 requests to give talks. Thus Fermilab's 800-seat auditorium had to be supplemented with another room with a video hookup, while the parallel sessions were organized into nine bewildering streams covering fourteen major physics topics. With the conventionality of the Standard Model virtually unchallenged, physics does not move fast these days. While most of the physics results had already been covered in principle at the International Conference on High Energy Physics held in Dallas in August (October, page 1), the Fermilab DPF meeting had a very different atmosphere. Major international meetings like Dallas attract big names from far and wide, and it is difficult in such an august atmosphere for young researchers to find a receptive audience. This was not the case at the DPF parallel sessions. The meeting also adopted a novel approach, with the parallels sandwiched between an initial day of plenaries to set the scene, and a final day of summaries. With the whole world waiting for the sixth ('top') quark to be discovered at Fermilab's Tevatron proton-antiproton collider, the meeting began with updates from Avi Yagil and Ronald Madaras from the big detectors, CDF and D0 respectively. Although rumours flew thick and fast, the Tevatron has not yet reached the top, although Yagil could show one intriguing event of a type expected from the heaviest quark.

  15. Big bang and big crunch in matrix string theory

    International Nuclear Information System (INIS)

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-01-01

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe

  16. Some big ideas for some big problems.

    Science.gov (United States)

    Winter, D D

    2000-05-01

    Although most psychologists do not see sustainability as a psychological problem, our environmental predicament is caused largely by human behaviors, accompanied by relevant thoughts, feelings, attitudes, and values. The huge task of building sustainable cultures will require a great many psychologists from a variety of backgrounds. In an effort to stimulate the imaginations of a wide spectrum of psychologists to take on the crucial problem of sustainability, this article discusses 4 psychological approaches (neo-analytic, behavioral, social, and cognitive) and outlines some of their insights into environmentally relevant behavior. These models are useful for illuminating ways to increase environmentally responsible behaviors of clients, communities, and professional associations.

  17. [Peripheral intravenous catheter-related phlebitis].

    Science.gov (United States)

    van der Sar-van der Brugge, Simone; Posthuma, E F M Ward

    2011-01-01

    Phlebitis is a very common complication of the use of intravenous catheters. Two patients with an i.v. catheter complicated by thrombophlebitis are described. Patient A was immunocompromised due to chronic lymphatic leukaemia and developed septic thrombophlebitis with positive blood cultures for S. aureus. Patient B was being treated with flucloxacillin because of an S. aureus infection and developed chemical phlebitis. Septic phlebitis is rare, but potentially serious. Chemical or mechanical types of thrombophlebitis are usually less severe, but happen very frequently. Risk factors include: female sex, previous episode of phlebitis, insertion at (ventral) forearm, emergency placement and administration of antibiotics. Until recently, routine replacement of peripheral intravenous catheters after 72-96 h was recommended, but randomised controlled trials have not shown any benefit of this routine. A recent Cochrane Review recommends replacement of peripheral intravenous catheters when clinically indicated only.

  18. BigOP: Generating Comprehensive Big Data Workloads as a Benchmarking Framework

    OpenAIRE

    Zhu, Yuqing; Zhan, Jianfeng; Weng, Chuliang; Nambiar, Raghunath; Zhang, Jinchao; Chen, Xingzhen; Wang, Lei

    2014-01-01

    Big Data is considered proprietary asset of companies, organizations, and even nations. Turning big data into real treasure requires the support of big data systems. A variety of commercial and open source products have been unleashed for big data storage and processing. While big data users are facing the choice of which system best suits their needs, big data system developers are facing the question of how to evaluate their systems with regard to general big data processing needs. System b...

  19. Use of intravenous immunoglobulins in clinical practice

    Directory of Open Access Journals (Sweden)

    E.K. Donyush

    2011-01-01

    Full Text Available Immunoglobulins are a main component of immune defense; they take part in the anti-infectious resistance of the organism and regulate different immune reactions. Intravenous immunoglobulins are the most frequently used products made from donor blood plasma. The need for these drugs has been steadily increasing during the last 15-20 years, and indications are widening due to modern high-technology methods of production and purification. The article presents modern data on the formula, mechanisms of action and indications for different groups of intravenous immunoglobulins (standard, hyperimmune, fortified) and a description of possible adverse events. Key words: immunoglobulins, prophylaxis, treatment, adverse reactions, children.

  20. From big bang to big crunch and beyond

    International Nuclear Information System (INIS)

    Elitzur, Shmuel; Rabinovici, Eliezer; Giveon, Amit; Kutasov, David

    2002-01-01

    We study a quotient Conformal Field Theory, which describes a 3+1 dimensional cosmological spacetime. Part of this spacetime is the Nappi-Witten (NW) universe, which starts at a 'big bang' singularity, expands and then contracts to a 'big crunch' singularity at a finite time. The gauged WZW model contains a number of copies of the NW spacetime, with each copy connected to the preceding one and to the next one at the respective big bang/big crunch singularities. The sequence of NW spacetimes is further connected at the singularities to a series of non-compact static regions with closed timelike curves. These regions contain boundaries, on which the observables of the theory live. This suggests a holographic interpretation of the physics. (author)

  1. Google BigQuery analytics

    CERN Document Server

    Tigani, Jordan

    2014-01-01

    How to effectively use BigQuery, avoid common mistakes, and execute sophisticated queries against large datasets Google BigQuery Analytics is the perfect guide for business and data analysts who want the latest tips on running complex queries and writing code to communicate with the BigQuery API. The book uses real-world examples to demonstrate current best practices and techniques, and also explains and demonstrates streaming ingestion, transformation via Hadoop in Google Compute engine, AppEngine datastore integration, and using GViz with Tableau to generate charts of query results. In addit

  2. Empathy and the Big Five

    OpenAIRE

    Paulus, Christoph

    2016-01-01

    More than ten years ago, Del Barrio et al. (2004) attempted to establish a direct relationship between empathy and the Big Five. On average, the women in their sample had higher scores on empathy and on the Big Five factors, with the exception of the factor neuroticism. They found associations with empathy in the domains of openness, agreeableness, conscientiousness and extraversion. In our data, women show significantly higher...

  3. "Big data" in economic history.

    Science.gov (United States)

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  4. Big Data as Information Barrier

    Directory of Open Access Journals (Sweden)

    Victor Ya. Tsvetkov

    2014-07-01

    Full Text Available The article covers an analysis of 'Big Data', which has been discussed over the last 10 years. The reasons and factors behind the issue are revealed. It is shown that the factors creating the 'Big Data' issue have existed for quite a long time and, from time to time, have caused informational barriers. Such barriers were successfully overcome through science and technology. The conducted analysis classifies the 'Big Data' issue as a form of information barrier. Framed this way, the issue may be solved correctly and encourages the development of scientific and computational methods.

  5. Homogeneous and isotropic big rips?

    CERN Document Server

    Giovannini, Massimo

    2005-01-01

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behaviour is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.
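    As a point of reference (standard phantom-cosmology background, not taken from this record): for a homogeneous FRW universe filled with a barotropic fluid p = wρ with constant w < -1, the scale factor reaches the rip at a finite time t_rip,

```latex
a(t) \propto \left(t_{\mathrm{rip}} - t\right)^{\frac{2}{3(1+w)}},
```

    so that a(t), together with the energy density and pressure, diverges as t → t_rip. The 'supernegative' barotropic index mentioned in the abstract is precisely this w < -1 regime.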

  6. Big Data in Space Science

    OpenAIRE

    Barmby, Pauline

    2018-01-01

    It seems like “big data” is everywhere these days. In planetary science and astronomy, we’ve been dealing with large datasets for a long time. So how “big” is our data? How does it compare to the big data that a bank or an airline might have? What new tools do we need to analyze big datasets, and how can we make better use of existing tools? What kinds of science problems can we address with these? I’ll address these questions with examples including ESA’s Gaia mission, ...

  7. Rate Change Big Bang Theory

    Science.gov (United States)

    Strickland, Ken

    2013-04-01

    The Rate Change Big Bang Theory redefines the birth of the universe with a dramatic shift in energy direction and a new vision of the first moments. With rate change graph technology (RCGT) we can look back 13.7 billion years and experience every step of the big bang through geometrical intersection technology. The analysis of the Big Bang includes a visualization of the first objects, their properties, the astounding event that created space and time as well as a solution to the mystery of anti-matter.

  8. Urban Big Data and the Development of City Intelligence

    Directory of Open Access Journals (Sweden)

    Yunhe Pan

    2016-06-01

    Full Text Available This study provides a definition for urban big data while exploring its features and applications in China's city intelligence. The differences between city intelligence in China and the "smart city" concept in other countries are compared to highlight and contrast the unique definition and model for China's city intelligence in this paper. Furthermore, this paper examines the role of urban big data in city intelligence by showing that it not only serves as the cornerstone of this trend but also plays a core role in the diffusion of city intelligence technology and serves as an inexhaustible resource for the sustained development of city intelligence. This study also points out the challenges of shaping and developing China's urban big data. Considering the supporting and core role that urban big data plays in city intelligence, the study then expounds on the key points of urban big data, including infrastructure support, urban governance, public services, and economic and industrial development. Finally, this study points out the utility of city intelligence as an ideal policy tool for advancing the goals of China's urban development. In conclusion, it is imperative that China make full use of its unique advantages (including the nation's current state of development and resources, geographical advantages, and good human relations) in subjective and objective conditions to promote the development of city intelligence through the proper application of urban big data.

  9. Big Data is invading big places as CERN

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Big Data technologies are becoming more popular with the constant growth of data generation in different fields such as social networks, the internet of things and laboratories like CERN. How is CERN making use of such technologies? How is machine learning applied at CERN with Big Data technologies? How much data do we move and how is it analyzed? All these questions will be answered during the talk.

  10. The ethics of big data in big agriculture

    OpenAIRE

    Carbonell (Isabelle M.)

    2016-01-01

    This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique in...

  11. Big climate data analysis

    Science.gov (United States)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014, Climate Time Series Analysis: Classical Statistical and Bootstrap Methods, second edition, Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to
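    The bootstrap confidence-interval construction alluded to above (via the cited Mudelsee book) can be illustrated with a minimal percentile-bootstrap sketch; the function name, synthetic data, and parameter choices below are illustrative assumptions, not code or data from the cited work.

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for a statistic of a sample."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    n = len(sample)
    replicates = []
    for _ in range(n_boot):
        # resample with replacement and recompute the statistic
        resample = [sample[rng.randrange(n)] for _ in range(n)]
        replicates.append(stat(resample))
    replicates.sort()
    lo = replicates[int((alpha / 2) * n_boot)]
    hi = replicates[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Synthetic "anomaly" series (made-up values, not real climate data)
data = [0.1, 0.3, -0.2, 0.4, 0.0, 0.5, 0.2, -0.1, 0.3, 0.6]
low, high = bootstrap_ci(data)
print(low < statistics.mean(data) < high)  # the point estimate falls inside the CI
```

    The percentile method is the simplest of several bootstrap interval constructions; refinements (block bootstrap for autocorrelated series, BCa intervals) are what make the technique suitable for real climate time series.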

  12. Hey, big spender

    Energy Technology Data Exchange (ETDEWEB)

    Cope, G.

    2000-04-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $ 1.8 to $ 3.4 billion a year on annual gross revenues in excess of $ 30 billion. At present there are several teething problems to overcome such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $ 125 billion on procurement per year; they hope to save between 5 to 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $ 10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just

  13. Hey, big spender

    International Nuclear Information System (INIS)

    Cope, G.

    2000-01-01

    Business to business electronic commerce is looming large in the future of the oil industry. It is estimated that by adopting e-commerce the industry could achieve bottom line savings of between $ 1.8 to $ 3.4 billion a year on annual gross revenues in excess of $ 30 billion. At present there are several teething problems to overcome such as inter-operability standards, which are at least two or three years away. Tying in electronically with specific suppliers is also an expensive proposition, although the big benefits are in fact in doing business with the same suppliers on a continuing basis. Despite these problems, 14 of the world's largest energy and petrochemical companies joined forces in mid-April to create a single Internet procurement marketplace for the industry's complex supply chain. The exchange was designed by B2B (business-to-business) software provider Commerce One Inc.; it will leverage the buying clout of these industry giants (BP Amoco, Royal Dutch Shell Group, Conoco, Occidental Petroleum, Phillips Petroleum, Unocal Corporation and Statoil among them), currently about $ 125 billion on procurement per year; they hope to save between 5 to 30 per cent depending on the product and the region involved. Other similar schemes, such as Chevron and partners' Petrocosm Marketplace and Network Oil, a Houston-based Internet portal aimed at smaller petroleum companies, are also doing business in the $ 10 billion per annum range. e-Energy, a cooperative project between IBM, Ericsson and Telus Advertising, is another neutral, virtual marketplace targeted at the oil and gas sector. PetroTRAX, a Calgary-based website, plans to take online procurement and auction sales a big step forward by establishing a portal to handle any oil company's asset management needs. There are also a number of websites targeting specific needs: IndigoPool.com (acquisitions and divestitures) and WellBid.com (products related to upstream oil and gas operators) are just two examples. All in

  14. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  15. Measuring the Promise of Big Data Syllabi

    Science.gov (United States)

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  16. Medios ciudadanos y big data: La emergencia del activismo de datos

    NARCIS (Netherlands)

    Milan, S.; Gutiérrez, M.

    2015-01-01

    Big data presents citizens with new challenges and opportunities. ‘Data activism’ practices emerge at the intersection of the social and technological dimensions of human action, whereby citizens take a critical approach to big data, and appropriate and manipulate data for advocacy and social

  17. The BigBOSS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  18. The adverse effects of inadvertent intraoperative intravenous ...

    African Journals Online (AJOL)

    Inadvertent intravenous injection of 1% phenylephrine (10 mg) induced severe hypertension and tachycardia in a previously healthy female patient undergoing elective gynaecological surgery. This medical error was investigated using the critical-incident technique that is available in our department. This case report ...

  19. [Phlebitis associated to intravenous/infusional therapy].

    Science.gov (United States)

    Nicotera, Raffaela

    2011-01-01

    Phlebitis is a common problem associated to intravenous therapies, it may cause pain, sepsis and increased duration of hospitalization. Several factors can increase the risk of phlebitis. The literature review addresses the mechanisms of chemical phlebitis, the characteristics of drugs likely to cause a phlebitis and the main measures to be adopted for prevention and treatment.

  20. Intravenous iron supplementation in children on hemodialysis.

    NARCIS (Netherlands)

    Leijn, E.; Monnens, L.A.H.; Cornelissen, E.A.M.

    2004-01-01

    BACKGROUND: Children with end-stage renal disease (ESRD) on hemodialysis (HD) are often absolute or functional iron deficient. There is little experience in treating these children with intravenous (i.v.) iron-sucrose. In this prospective study, different i.v. iron-sucrose doses were tested in

  1. Campaign best practice in intravenous therapy.

    Science.gov (United States)

    Baldwin, Wayne; Murphy, Jayne; Shakespeare, David; Kelly, Chris; Fox, Louise; Kelly, Matthew

    Intravenous therapy is an integral part of nursing care but is associated with a high risk of infection. This article outlines a campaign that aimed to increase awareness of best practice for IV therapy and reduce the risks of healthcare-associated IV infections in hospital and community settings.

  2. Intravenous voriconazole after toxic oral administration

    NARCIS (Netherlands)

    Alffenaar, J.W.C.; Van Assen, S.; De Monchy, J.G.R.; Uges, D.R.A.; Kosterink, J.G.W.; Van Der Werf, T.S.

    In a male patient with rhinocerebral invasive aspergillosis, prolonged high-dosage oral administration of voriconazole led to hepatotoxicity combined with a severe cutaneous reaction while intravenous administration in the same patient did not. High concentrations in the portal blood precipitate

  3. Complex intravenous anesthesia in interventional procedures

    International Nuclear Information System (INIS)

    Xie Zonggui; Hu Yuanming; Huang Yunlong; You Yong; Wu Juan; Huang Zengping; Li Jian

    2006-01-01

    Objective: To evaluate the value and safety of intravenous Diprivan and Fentanyl analgesia in interventional procedures. Methods: Intravenous Diprivan with Fentanyl analgesia was used in eighty interventional procedures in sixty-five patients, without tracheal tube insertion. Vital signs, including HR, BP, arterial oxygen saturation (SpO2), and the patients' reactions to the operation, were recorded. Results: Intravenous anesthesia was carried out successfully in all eighty interventional procedures, with patients asleep during the operation and with no pain and no distressing memory of the procedure. The amount of Diprivan was 500±100 mg and of Fentanyl 0.2±0.025 mg. Mean arterial pressure and SpO2 were 11.4±2.2 kPa and 98±1.0 before the operation and 10.6±2.1 kPa and 96±1.5 ten minutes after it began, with no significant difference. Conclusions: Intravenous Diprivan with Fentanyl analgesia for interventional procedures provides good safety, no pain, and no distressing memory of the procedure, and therefore ought to be recommended. (authors)

  4. Intravenous paracetamol overdose in a paediatric patient

    NARCIS (Netherlands)

    Broeks, Ilse J.; Van Roon, Eric N.; Van Pinxteren-Nagler, Evelyn; De Vries, Tjalling W.

    2013-01-01

    BACKGROUND: Paracetamol is a widely used drug in children. In therapeutic doses, paracetamol has an excellent safety profile. Since the introduction of the intravenous form in 2004, only three reports of accidental overdose in children have been published. The low number probably is due to

  5. Administration and monitoring of intravenous anesthetics

    NARCIS (Netherlands)

    Sahinovic, Marko M.; Absalom, Anthony R.; Struys, Michel M. R. F.

    2010-01-01

    Purpose of review The importance of accuracy in controlling the dose-response relation for intravenous anesthetics is directly related to the importance of optimizing the efficacy and quality of anesthesia while minimizing adverse drug effects. Therefore, it is important to measure and control all

  6. A double-tracer technique to characterize absorption, distribution, metabolism and excretion (ADME) of [14C]-basimglurant and absolute bioavailability after oral administration and concomitant intravenous microdose administration of [13C6]-labeled basimglurant in humans.

    Science.gov (United States)

    Guerini, Elena; Schadt, Simone; Greig, Gerard; Haas, Ruth; Husser, Christophe; Zell, Manfred; Funk, Christoph; Hartung, Thomas; Gloge, Andreas; Mallalieu, Navita L

    2017-02-01

    1. The emerging technique of administering an intravenous microdose of an isotope tracer concomitantly with a [14C]-labeled oral dose was used to characterize the disposition and absolute bioavailability of a novel metabotropic glutamate 5 (mGlu5) receptor antagonist under clinical development for major depressive disorder (MDD). 2. Six healthy volunteers received a single 1 mg [12C/14C]-basimglurant (2.22 MBq) oral dose and a concomitant i.v. tracer dose of 100 μg of [13C6]-basimglurant. Concentrations of [12C]-basimglurant and the stable isotope [13C6]-basimglurant were determined in plasma by a specific LC/MS-MS method. Total [14C] radioactivity was determined in whole blood, plasma, urine and feces by liquid scintillation counting. Metabolic profiling was conducted in plasma, urine, blood cell pellet and feces samples. 3. The mean absolute oral bioavailability (F) of basimglurant was ∼67% (range 45.7-77.7%). The major route of [14C]-radioactivity excretion, primarily in the form of metabolites, was urine (mean recovery 73.4%), with the remainder excreted in feces (mean recovery 26.5%). The median tmax for [12C]-basimglurant after the oral administration was 0.71 h (range 0.58-1.00) and the mean terminal half-life was 77.2 ± 38.5 h. The terminal half-life for [14C]-basimglurant was 178 h, indicating the presence of metabolites with a longer terminal half-life. Five metabolites were identified, with M1-Glucuronide as the major one and the others in trace amounts. There was minimal binding of drug to RBCs. IV pharmacokinetics were characterized by a mean ± SD CL of 11.8 ± 7.4 mL/h and a Vss of 677 ± 229 L. 4. The double-tracer technique used in this study allowed the absolute bioavailability and disposition characteristics of the new oral molecular entity to be characterized simultaneously in a single study.
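The absolute bioavailability reported above is simply the dose-normalized ratio of oral to intravenous exposure. A minimal sketch follows; the AUC values are hypothetical, chosen only to illustrate the arithmetic (only the 1 mg oral / 100 μg i.v. doses come from the study design):

```python
def absolute_bioavailability(auc_oral, dose_oral, auc_iv, dose_iv):
    """Dose-normalized AUC ratio: F = (AUC_oral / Dose_oral) / (AUC_iv / Dose_iv)."""
    return (auc_oral / dose_oral) / (auc_iv / dose_iv)

# Hypothetical plasma AUCs (ng·h/mL); doses in µg (1 mg oral, 100 µg i.v. microdose).
F = absolute_bioavailability(auc_oral=670.0, dose_oral=1000.0,
                             auc_iv=100.0, dose_iv=100.0)
print(f"F = {F:.0%}")  # → F = 67%
```

Because the i.v. tracer is a stable isotope given at a microdose, both routes are measured in the same subject on the same occasion, removing between-period variability from the ratio.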

  7. Biophotonics: the big picture

    Science.gov (United States)

    Marcu, Laura; Boppart, Stephen A.; Hutchinson, Mark R.; Popp, Jürgen; Wilson, Brian C.

    2018-02-01

    The 5th International Conference on Biophotonics (ICOB) held April 30 to May 1, 2017, in Fremantle, Western Australia, brought together opinion leaders to discuss future directions for the field and opportunities to consider. The first session of the conference, "How to Set a Big Picture Biophotonics Agenda," was focused on setting the stage for developing a vision and strategies for translation and impact on society of biophotonic technologies. The invited speakers, panelists, and attendees engaged in discussions that focused on opportunities and promising applications for biophotonic techniques, challenges when working at the confluence of the physical and biological sciences, driving factors for advances of biophotonic technologies, and educational opportunities. We share a summary of the presentations and discussions. Three main themes from the conference are presented in this position paper that capture the current status, opportunities, challenges, and future directions of biophotonics research and key areas of applications: (1) biophotonics at the nano- to microscale level; (2) biophotonics at meso- to macroscale level; and (3) biophotonics and the clinical translation conundrum.

  8. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those produced by genomic studies, present specific challenges that affect the reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition, linking the number of tests and the sample size, that is necessary to guarantee error rate control at practical sample sizes; this condition, however, is rarely satisfied. Using methods based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor contributing to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing with small-sample data. We point out that Edgeworth expansions, providing higher-order approximations to the sampling distribution, offer a promising direction for data analysis that could improve the reliability of studies relying on large numbers of comparisons with modest sample sizes.
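The "very small tail probabilities" arise directly from multiplicity correction. A short standard-library sketch (not from the paper; the test count of 20,000 is an illustrative genome-scale figure) shows how far into the tails a Bonferroni-style correction pushes inference:

```python
from statistics import NormalDist

alpha, m = 0.05, 20_000          # family-wise error level; number of simultaneous tests
per_test = alpha / m             # Bonferroni per-test level: 2.5e-06
# Two-sided |z| rejection threshold under the asymptotic normal approximation.
# This is exactly the far-tail region where small-sample distributions deviate
# most from their normal/Edgeworth approximations, inflating actual error rates.
z_cut = NormalDist().inv_cdf(1 - per_test / 2)
print(f"per-test alpha = {per_test:.1e}, |z| cutoff ≈ {z_cut:.2f}")
```

With modest n, the true sampling distribution of the test statistic at |z| ≈ 4.7 can differ from the normal tail by orders of magnitude, which is the mechanism behind the excessive false positives described above.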

  9. Big bang darkleosynthesis

    Directory of Open Access Journals (Sweden)

    Gordan Krnjaic

    2015-12-01

    Full Text Available In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV/dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S≫3/2), whose discovery would be smoking gun evidence for dark nuclei.

  10. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8) × 10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  11. Big bang darkleosynthesis

    Science.gov (United States)

    Krnjaic, Gordan; Sigurdson, Kris

    2015-12-01

    In a popular class of models, dark matter comprises an asymmetric population of composite particles with short range interactions arising from a confined nonabelian gauge group. We show that coupling this sector to a well-motivated light mediator particle yields efficient darkleosynthesis, a dark-sector version of big-bang nucleosynthesis (BBN), in generic regions of parameter space. Dark matter self-interaction bounds typically require the confinement scale to be above ΛQCD, which generically yields large (≫MeV /dark-nucleon) binding energies. These bounds further suggest the mediator is relatively weakly coupled, so repulsive forces between dark-sector nuclei are much weaker than Coulomb repulsion between standard-model nuclei, which results in an exponential barrier-tunneling enhancement over standard BBN. Thus, darklei are easier to make and harder to break than visible species with comparable mass numbers. This process can efficiently yield a dominant population of states with masses significantly greater than the confinement scale and, in contrast to dark matter that is a fundamental particle, may allow the dominant form of dark matter to have high spin (S ≫ 3 / 2), whose discovery would be smoking gun evidence for dark nuclei.

  12. The role of big laboratories

    CERN Document Server

    Heuer, Rolf-Dieter

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward.

  13. The role of big laboratories

    International Nuclear Information System (INIS)

    Heuer, R-D

    2013-01-01

    This paper presents the role of big laboratories in their function as research infrastructures. Starting from the general definition and features of big laboratories, the paper goes on to present the key ingredients and issues, based on scientific excellence, for the successful realization of large-scale science projects at such facilities. The paper concludes by taking the example of scientific research in the field of particle physics and describing the structures and methods required to be implemented for the way forward. (paper)

  14. Challenges of Big Data Analysis.

    Science.gov (United States)

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible to detect with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.

  15. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristics of Big Data technologies. The existing definitions of the term "big data" are analyzed, and the elements of a generalized formal model of big data are proposed and described. The peculiarities of applying the proposed model components are analyzed, and the fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  16. Intravenous/oral ciprofloxacin therapy versus intravenous ceftazidime therapy for selected bacterial infections.

    Science.gov (United States)

    Gaut, P L; Carron, W C; Ching, W T; Meyer, R D

    1989-11-30

    The efficacy and toxicity of sequential intravenous and oral ciprofloxacin therapy was compared with intravenously administered ceftazidime in a prospective, randomized, controlled, non-blinded trial. Thirty-two patients (16 patients receiving ciprofloxacin and 16 patients receiving ceftazidime) with 38 infections caused by susceptible Pseudomonas aeruginosa, enteric gram-negative rods, Salmonella group B, Serratia marcescens, Pseudomonas cepacia, and Xanthomonas maltophilia at various sites were evaluable for determination of efficacy. Length of therapy varied from seven to 25 days. Concomitant antimicrobials included intravenously administered beta-lactams for gram-positive organisms, intravenous/oral metronidazole and clindamycin for anaerobes, and intravenous/local amphotericin B for Candida albicans. Intravenous administration of 200 mg ciprofloxacin every 12 hours to 11 patients produced peak serum levels between 1.15 and 3.12 micrograms/ml; trough levels ranged between 0.08 and 0.86 micrograms/ml. Overall response rates were similar for patients receiving ciprofloxacin and ceftazidime. Emergence of resistance was similar in both groups--one Enterobacter cloacae and two P. aeruginosa became resistant after ciprofloxacin therapy and two P. aeruginosa became resistant after ceftazidime therapy. The frequency of superinfection with a variety of organisms was also similar in both groups. Adverse events related to ciprofloxacin included transient pruritus at the infusion site and generalized rash leading to drug discontinuation (one patient each), and with ceftazidime adverse effects included pain at the site of infusion and the development of allergic interstitial nephritis (one patient each). Overall, intravenous/oral ciprofloxacin therapy appears to be as safe and effective as intravenous ceftazidime therapy in the treatment of a variety of infections due to susceptible aerobic gram-negative organisms.

  17. Disposition of nasal, intravenous, and oral methadone in healthy volunteers.

    Science.gov (United States)

    Dale, Ola; Hoffer, Christine; Sheffels, Pamela; Kharasch, Evan D

    2002-11-01

    Nasal administration of many opioids demonstrates rapid uptake and fast onset of action. Nasal administration may be an alternative to intravenous and oral administration of methadone and was therefore studied in human volunteers. The study was approved by the Institutional Review Board of the University of Washington, Seattle. Eight healthy volunteers (6 men and 2 women) aged 19 to 33 years were enrolled after informed written consent was obtained. Subjects received 10 mg methadone hydrochloride nasally, orally, or intravenously on 3 separate occasions in a crossover design. Nasal methadone (50 mg/mL in aqueous solution) was given as a 100-microL spray in each nostril (Pfeiffer BiDose sprayer). Blood samples for liquid chromatography-mass spectrometry analyses of methadone and the metabolite 2-ethyl-1,5-dimethyl-3,3-diphenylpyrrolinium were drawn for up to 96 hours. The methadone effect was measured by noninvasive infrared pupilometry coincident with blood sampling. Nasal uptake of methadone was rapid, with maximum plasma concentrations occurring within 7 minutes. The maximum effects of intravenous, nasal, and oral methadone, on the basis of dark-adapted pupil diameter, were reached in about 15 minutes, 30 minutes, and 2 hours, respectively. The respective durations were 24, 10, and 8 hours. Both nasal and oral bioavailabilities were 0.85. Subjects reported that nasal methadone caused a burning sensation. Nasal administration of methadone results in rapid absorption and onset of effect and high bioavailability, which was greater than that reported for other nasal opioids, with a similar duration of effect. Nasal administration may be an alternative route of methadone administration; however, improved formulations are desirable to reduce nasal irritation.

  18. Acute toxicity of intravenously administered titanium dioxide nanoparticles in mice.

    Directory of Open Access Journals (Sweden)

    Jiaying Xu

    Full Text Available BACKGROUND: With a wide range of applications, titanium dioxide (TiO₂) nanoparticles (NPs) are manufactured worldwide in large quantities. Recently, in the field of nanomedicine, intravenous injection of TiO₂ nanoparticulate carriers directly into the bloodstream has raised public concerns about their toxicity to humans. METHODS: In this study, mice were injected intravenously with a single dose of TiO₂ NPs at varying dose levels (0, 140, 300, 645, or 1387 mg/kg). Animal mortality, blood biochemistry, hematology, genotoxicity and histopathology were investigated 14 days after treatment. RESULTS: Death of mice in the highest dose (1387 mg/kg) group was observed at day two after TiO₂ NP injection. At day 7, acute toxicity symptoms, such as decreased physical activity and decreased intake of food and water, were observed in the highest dose group. Hematological analysis and the micronucleus test showed no significant acute hematological or genetic toxicity except an increase in the white blood cell (WBC) count among mice in the 645 mg/kg dose group. However, TiO₂ NP-treated mice showed significantly higher spleen weight/body weight (BW) coefficients, and lower liver and kidney coefficients, compared to controls. The biochemical parameters and histological tissue sections indicated that TiO₂ NP treatment could induce different degrees of damage in the brain, lung, spleen, liver and kidneys. However, no pathological effects were observed in the heart of TiO₂ NP-treated mice. CONCLUSIONS: Intravenous injection of TiO₂ NPs at high doses in mice could cause acute toxicity effects in the brain, lung, spleen, liver, and kidney. No significant hematological or genetic toxicity was observed.

  19. Spongiform leucoencephalopathy following intravenous heroin abuse: Radiological and histopathological findings

    International Nuclear Information System (INIS)

    Robertson, A.S.; Jain, S.; O'Neil, R.A.

    2001-01-01

    A case of spongiform leucoencephalopathy in a known intravenous heroin abuser is presented. To our knowledge, this is the only case of heroin-related spongiform leucoencephalopathy reported in Australia. The relationship to intravenous rather than inhaled heroin is particularly unusual, with only one other possible case documented in the literature. The imaging and histopathological findings are described. Neurological examination revealed disorientation in time and place, memory loss and cognitive impairment but no focal signs. Biochemical and haematological profiles were normal. Viral serology was positive for hepatitis C but negative for hepatitis B and human immunodeficiency virus (HIV). Cerebral CT revealed diffuse symmetrical hypodensity of the cerebral white matter. The ventricles and subarachnoid spaces were of normal size. Magnetic resonance imaging showed diffuse symmetrical signal abnormality in the cerebral white matter. These changes were hyperintense on proton density, T2-weighted, modified T2-weighted (FLAIR) and diffusion-weighted images. T1-weighted scans showed corresponding hypointensity. There was no enhancement after intravenous gadolinium. Cerebrospinal fluid (CSF) specimens were negative for a variety of virological, immunological and bacteriological markers. No viral or bacterial growth was demonstrated. Oligoclonal bands for multiple sclerosis and Protein 134 for Wilson's disease were negative. Right frontal brain biopsy showed spongiform white matter and degenerative change with prominent fibrous gliosis. In severely affected areas, loss of normal myelin staining and axonal loss were present, accompanied by scattered foamy macrophages. Loss of oligodendroglial nuclei was also present. There was no evidence of inflammation or progressive multifocal leucoencephalopathy. No bacteria or virus particles were seen on electron microscopic examination of the brain tissue.
Following the biopsy, the patient discharged himself from hospital and the

  20. Intravenous cidofovir for resistant cutaneous warts in a patient with psoriasis treated with monoclonal antibodies.

    LENUS (Irish Health Repository)

    McAleer, M A

    2012-02-01

    Human papilloma virus is a common and often distressing cutaneous disease. It can be therapeutically challenging, especially in immunocompromised patients. We report a case of recalcitrant cutaneous warts that resolved with intravenous cidofovir treatment. The patient was immunocompromised secondary to monoclonal antibody therapy for psoriasis.

  1. Incarcerated intravenous heroin users: predictors of post-release utilization of methadone maintenance treatment.

    Science.gov (United States)

    Lin, Huang-Chi; Wang, Peng-Wei; Yang, Yi-Hsin; Tsai, Jih-Jin; Yen, Cheng-Fang

    2016-01-01

    Incarcerated intravenous heroin users have more problematic patterns of heroin use, but are less likely to access methadone maintenance treatment on their own initiative than heroin users in the community. The present study examined predictors for receiving methadone maintenance treatment post-release among incarcerated intravenous heroin users within a 24-month period. This cohort study recruited 315 incarcerated intravenous heroin users detained in 4 prisons in southern Taiwan and followed them up within the 24-month period post-release. Cox proportional hazards regression analysis was applied to determine the predictive effects of sociodemographic and drug-use characteristics, attitude toward methadone maintenance treatment, human immunodeficiency virus serostatus, perceived family support, and depression for access to methadone maintenance treatment after release. In total, 295 (93.7%) incarcerated intravenous heroin users were released and entered the follow-up phase of the study. During the 24-month follow-up period, 50.8% of them received methadone maintenance treatment. After controlling for the effects of the detainment period before and after recruitment by Cox proportional hazards regression analysis, incarcerated intravenous heroin users who had positive human immunodeficiency virus serostatus (HR = 2.85, 95% CI = 1.80-4.52, p < …) and had received methadone maintenance treatment before committal (HR = 1.94, 95% CI = 1.23-3.05, p < …) were more likely to receive methadone maintenance treatment within the 24-month follow-up period. Positive human immunodeficiency virus serostatus with fully subsidized treatment and previous methadone maintenance treatment experience predicted access to methadone maintenance treatment post-release. Strategies for familiarizing detainees with methadone maintenance treatment during detainment, including providing methadone maintenance treatment prior to release and lowering the economic burden of receiving treatment, may facilitate entry into methadone maintenance treatment for incarcerated intravenous heroin

  2. Optimal timing for intravenous administration set replacement.

    Science.gov (United States)

    Gillies, D; O'Riordan, L; Wallen, M; Morrison, A; Rankin, K; Nagy, S

    2005-10-19

    Administration of intravenous therapy is a common occurrence within the hospital setting. Routine replacement of administration sets has been advocated to reduce intravenous infusion contamination. If decreasing the frequency of changing intravenous administration sets does not increase infection rates, a change in practice could result in considerable cost savings. The objective of this review was to identify the optimal interval for the routine replacement of intravenous administration sets when infusate or parenteral nutrition (lipid and non-lipid) solutions are administered to people in hospital via central or peripheral venous catheters. We searched The Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, CINAHL, EMBASE: all from inception to February 2004; reference lists of identified trials, and bibliographies of published reviews. We also contacted researchers in the field. We did not have a language restriction. We included all randomized or quasi-randomized controlled trials addressing the frequency of replacing intravenous administration sets when parenteral nutrition (lipid and non-lipid containing solutions) or infusions (excluding blood) were administered to people in hospital via a central or peripheral catheter. Two authors assessed all potentially relevant studies. We resolved disagreements between the two authors by discussion with a third author. We collected data for the outcomes: infusate contamination; infusate-related bloodstream infection; catheter contamination; catheter-related bloodstream infection; all-cause bloodstream infection and all-cause mortality. We identified 23 references for review. We excluded eight of these studies; five because they did not fit the inclusion criteria and three because of inadequate data. We extracted data from the remaining 15 references (13 studies) with 4783 participants. We conclude that there is no evidence that changing intravenous administration sets more often than every 96 hours

  3. Big Data Analytics

    Indian Academy of Sciences (India)

    IAS Admin

    2016-08-20


  4. Priming the Pump for Big Data at Sentara Healthcare.

    Science.gov (United States)

    Kern, Howard P; Reagin, Michael J; Reese, Bertram S

    2016-01-01

    Today's healthcare organizations are facing significant demands with respect to managing population health, demonstrating value, and accepting risk for clinical outcomes across the continuum of care. The patient's environment outside the walls of the hospital and physician's office-and outside the electronic health record (EHR)-has a substantial impact on clinical care outcomes. The use of big data is key to understanding factors that affect the patient's health status and enhancing clinicians' ability to anticipate how the patient will respond to various therapies. Big data is essential to delivering sustainable, high-quality, value-based healthcare, as well as to the success of new models of care such as clinically integrated networks (CINs) and accountable care organizations. Sentara Healthcare, based in Norfolk, Virginia, has been an early adopter of the technologies that have readied us for our big data journey: EHRs, telehealth-supported electronic intensive care units, and telehealth primary care support through MDLIVE. Although we would not say Sentara is at the cutting edge of the big data trend, it certainly is among the fast followers. Use of big data in healthcare is still at an early stage compared with other industries. Tools for data analytics are maturing, but traditional challenges such as heightened data security and limited human resources remain the primary focus for regional health systems to improve care and reduce costs. Sentara primarily makes actionable use of big data in our CIN, Sentara Quality Care Network, and at our health plan, Optima Health. Big data projects can be expensive, and justifying the expense organizationally has often been easier in times of crisis. We have developed an analytics strategic plan separate from but aligned with corporate system goals to ensure optimal investment and management of this essential asset.

  5. [Reducing fear in preschool children receiving intravenous injections].

    Science.gov (United States)

    Hsieh, Yi-Chuan; Liu, Hui-Tzu; Cho, Yen-Hua

    2012-06-01

    Our pediatric medical ward administers an average of 80 intravenous injections to preschool children. We found that 91.1% exhibit behavior indicative of fear and anxiety. Over three-quarters (77.8%) of this number suffer severe fear and actively resist receiving injections. Such behavior places a greater than normal burden on human and material resources and often gives family members negative impressions that lower their trust in the healthcare service while raising nurse-patient tensions. Using observation and interviews, we found the primary factors in injection fear to be: past negative experiences, lack of adequate prior communication, measures taken to preemptively control child resistance, and the default cognitive behavioral strategies of nursing staff. This project worked to develop a strategy to reduce the proportion of preschool children experiencing severe injection fear from 77.8% to 38.9% and to achieve a 50% capability-improvement target for team members. Our team identified several potential solutions from research papers and books between August 1st, 2009 and April 30th, 2010. Our proposed method included therapeutic games, self-selection of injection position, and cognitive behavioral strategies to divert attention. Other measures were also specified as standard operating procedures for administering pediatric intravenous injections. We applied the strategy to 45 preschool children and identified a post-injection "severe fear" level of 37.8%. This project was designed to reduce fear in children to make them more accepting of injections and to enhance children's positive treatment experience in order to raise nursing care quality.

  6. Cardiovascular effects of intravenous ghrelin infusion in healthy young men

    DEFF Research Database (Denmark)

    Vestergaard, Esben Thyssen; Andersen, Niels Holmark; Hansen, Troels Krarup

    2007-01-01

    Ghrelin infusion improves cardiac function in patients suffering from cardiac failure, and bolus administration of ghrelin increases cardiac output in healthy subjects. The cardiovascular effects of more continuous intravenous ghrelin exposure remain to be studied. We therefore studied the cardiovascular effects of a constant infusion of human ghrelin at a rate of 5 pmol/kg per minute for 180 min. Fifteen healthy, young (aged 23.2 ± 0.5 yr), normal-weight (23.0 ± 0.4 kg/m2) men volunteered in a randomized double-blind, placebo-controlled crossover study. With the subjects remaining fasting, peak myocardial systolic velocity S′, tissue tracking TT, left ventricular ejection fraction EF, and endothelium-dependent flow-mediated vasodilatation were measured. Ghrelin infusion increased S′ 9% (P = 0.002) and TT 10% (P

  7. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  8. GEOSS: Addressing Big Data Challenges

    Science.gov (United States)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  9. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  10. Quantum fields in a big-crunch-big-bang spacetime

    International Nuclear Information System (INIS)

    Tolley, Andrew J.; Turok, Neil

    2002-01-01

    We consider quantum field theory on a spacetime representing the big-crunch-big-bang transition postulated in ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it reexpands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included the particle production for fixed external momentum is finite at the tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime

  11. Poker Player Behavior After Big Wins and Big Losses

    OpenAIRE

    Gary Smith; Michael Levere; Robert Kurtzman

    2009-01-01

    We find that experienced poker players typically change their style of play after winning or losing a big pot--most notably, playing less cautiously after a big loss, evidently hoping for lucky cards that will erase their loss. This finding is consistent with Kahneman and Tversky's (Kahneman, D., A. Tversky. 1979. Prospect theory: An analysis of decision under risk. Econometrica 47(2) 263-292) break-even hypothesis and suggests that when investors incur a large loss, it might be time to take ...

  12. Turning big bang into big bounce: II. Quantum dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz, E-mail: pmalk@fuw.edu.p, E-mail: piech@fuw.edu.p [Theoretical Physics Department, Institute for Nuclear Studies, Hoza 69, 00-681 Warsaw (Poland)

    2010-11-21

    We analyze the big bounce transition of the quantum Friedmann-Robertson-Walker model in the setting of the nonstandard loop quantum cosmology (LQC). Elementary observables are used to quantize composite observables. The spectrum of the energy density operator is bounded and continuous. The spectrum of the volume operator is bounded from below and discrete. It has equally distant levels defining a quantum of the volume. The discreteness may imply a foamy structure of spacetime at a semiclassical level which may be detected in astro-cosmo observations. The nonstandard LQC method has a free parameter that should be fixed in some way to specify the big bounce transition.

  13. Characterization of an intravenously injected bolus

    International Nuclear Information System (INIS)

    Samuel, A.M.; Raikar, U.R.; Atmaram, S.H.; Ganatra, R.D.

    1976-01-01

    A study of some parameters affecting the time-activity histogram of an intravenous bolus injection of radioactivity was performed. A scoring system for bolus compactness was attempted; a score of 2 or above was considered a satisfactory bolus. Volumes of less than 1 ml tended to result in a satisfactory bolus. The nature of the radiopharmaceutical injected, the person performing the injection, and the age of the patient did not affect the score. Thyrotoxic patients gave the best bolus scores. (orig.) [de]

  14. Successful outcome after intravenous gasoline injection.

    Science.gov (United States)

    Domej, Wolfgang; Mitterhammer, Heike; Stauber, Rudolf; Kaufmann, Peter; Smolle, Karl Heinz

    2007-12-01

    Gasoline, ingested intentionally or accidentally, is toxic. The majority of reported cases of gasoline intoxication involve oral ingestion or inhalation. Data are scarce on complications and outcomes following hydrocarbon poisoning by intravenous injection. Following a suicide attempt by intravenous self-injection of 10 ml of gasoline, a 26-year-old medical student was admitted to the intensive care unit (ICU) with hemoptysis, symptoms of acute respiratory failure, chest pain, and severe abdominal cramps. Gas exchange was severely impaired and a chest x-ray indicated chemical pneumonitis. Initial treatment consisted of mechanical ventilation, supportive hyperventilation, administration of nitric oxide (NO), and prednisone. Unfortunately, the patient developed multi-organ dysfunction syndrome (MODS) complicated by life-threatening severe vasoplegia within 24 hours after gasoline injection. High doses of vasopressors along with massive amounts of parenteral fluids were necessary. Despite fluid replacement, renal function worsened and required hemofiltration on 5 sequential days. After 12 days of intensive care management, the patient recovered completely and was discharged to a psychiatric care facility. Intravenous gasoline injection causes major injury to the lungs, the organ bearing the first capillary bed encountered. Treatment of gasoline poisoning is symptomatic because no specific antidote is available. Early and aggressive supportive care may be conducive to a favorable outcome with minimal residual pulmonary sequelae.

  15. Contrast agent choice for intravenous coronary angiography

    International Nuclear Information System (INIS)

    Zeman, H.D.; Siddons, D.P.

    1989-01-01

    The screening of the general population for coronary artery disease would be practical if a method existed for visualizing the extent of occlusion after an intravenous injection of contrast agent. Measurements performed with monochromatic synchrotron radiation x-rays and an iodine containing contrast agent at the Stanford Synchrotron Radiation Laboratory have shown that such an intravenous angiography procedure would be possible with an adequately intense monochromatic x-ray source. Because of the size and cost of synchrotron radiation facilities it would be desirable to make the most efficient use of the intensity available, while reducing as much as possible the radiation dose experienced by the patient. By choosing contrast agents containing elements with a higher atomic number than iodine, it is possible to both improve the image quality and reduce the patient radiation dose, while using the same synchrotron source. By using Si monochromator crystals with a small mosaic spread, it is possible to increase the x-ray flux available for imaging by over an order of magnitude, without any changes in the storage ring or wiggler magnet. The most critical imaging task for intravenous coronary angiography utilizing synchrotron radiation x-rays is visualizing a coronary artery through the left ventricle or aorta which also contains a contrast agent. Calculations have been made of the signal to noise ratio expected for this imaging task for various contrast agents with atomic numbers between that of iodine and bismuth

  16. Intravenous Lipids for Preterm Infants: A Review

    Directory of Open Access Journals (Sweden)

    Ghassan S. A. Salama

    2015-01-01

    Full Text Available Extremely low birth weight (ELBW) infants are born at a time when the fetus is undergoing rapid intrauterine brain and body growth. Continuation of this growth in the first several weeks postnatally, while these infants are on ventilator support and receiving critical care, is often a challenge. These infants are usually highly stressed and at risk for catabolism. Parenteral nutrition is needed in these infants because most cannot meet the majority of their nutritional needs via the enteral route. Despite adoption of a more aggressive approach with amino acid infusions, there still appears to be a reluctance to use early intravenous lipids. This is based on several dogmas suggesting that lipid infusions may be associated with the development or exacerbation of lung disease, displace bilirubin from albumin, exacerbate sepsis, and cause CNS injury and thrombocytopenia. Several recent reviews have focused on intravenous nutrition for the premature neonate, but very little exists that provides a comprehensive review of intravenous lipids for very low birth weight and other critically ill neonates. Here, we provide a brief overview of lipid biochemistry and the metabolism of lipids, especially as they pertain to the preterm infant; discuss the origin of some current clinical practices; and provide a review of the literature that can be used as a basis for revising clinical care and that offers some clarity in this controversial area, where clinical care is often based more on tradition and dogma than on science.

  17. Influences on physicians' choices of intravenous colloids.

    Science.gov (United States)

    Miletin, Michael S; Stewart, Thomas E; Norton, Peter G

    2002-07-01

    Controversy over the optimal intravenous fluid for volume resuscitation continues unabated. Our objectives were to characterize the demographics of physicians who prescribe intravenous colloids and determine factors that enter into their decision to choose a colloid. Questionnaire with 61 items. Ten percent (n = 364) of frequent intravenous fluid prescribers in the province of Ontario, Canada. The response rate was 74%. Colloid use in the past year was reported by 79% of the responding physicians. Important reasons for choosing a colloid included blood loss and manipulation of oncotic pressure. Physicians tended to prefer either albumin or pentastarch, but no important reasons were found for choosing between the two. Albumin with or without crystalloid was preferred in 5/13 scenarios by more than 50% of the respondents, whereas pentastarch was not favored by more than 50% of respondents in any scenario. Physicians practising in critical care areas and teaching hospitals generally preferred pentastarch to albumin. Physicians reporting pentastarch as representing greater than 90% of total colloid use were more likely to have been visited by a drug detailer for pentastarch than those who used less synthetic colloid (54 vs 22%, p […] distribution. Although albumin appeared to be preferred in more clinical niches, most physicians did not state reasons for choosing between products. Marketing, specialty, location of practice and clinical scenario appear to play significant roles in the utilization of colloid products.

  18. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data with highlighting the big data analytics in medicine and healthcare. Big data characteristics: value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers integration and analysis of large amount of complex heterogeneous data such as various - omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues about big data privacy and security. Regarding big data characteristics, some directions of using suitable and promising open-source distributed data processing software platform are given.

  19. Big Five personality group differences across academic majors

    DEFF Research Database (Denmark)

    Vedel, Anna

    2016-01-01

    During the past decades, a number of studies have explored personality group differences in the Big Five personality traits among students in different academic majors. To date, though, this research has not been reviewed systematically. This was the aim of the present review. A systematic literature search identified twelve eligible studies yielding an aggregated sample size of 13,389. Eleven studies reported significant group differences in one or multiple Big Five personality traits. Consistent findings across studies were that students of arts/humanities and psychology scored high […] on Conscientiousness. Effect sizes were calculated to estimate the magnitude of the personality group differences. These effect sizes were consistent across studies comparing similar pairs of academic majors. For all Big Five personality traits medium effect sizes were found frequently, and for Openness even large […]

  20. The Shadow of Big Data: Data-Citizenship and Exclusion

    DEFF Research Database (Denmark)

    Rossi, Luca; Hjelholt, Morten; Neumayer, Christina

    2016-01-01

    Big data are understood as being able to provide insights on human behaviour at an individual as well as at an aggregated societal level (Manyka et al. 2011). These insights are expected to be more detailed and precise than anything before, thanks to the large volume of digital data and to the unobtrusive nature of the data collection (Fishleigh 2014). Within this perspective, these two dimensions (volume and unobtrusiveness) define contemporary big data techniques as a socio-technical offering to society, a live representation of itself […] this process "data-citizenship" emerges. Data-citizenship assumes that citizens will be visible to the state through the data they produce. On a general level, data-citizenship shifts citizenship from an intrinsic status of a group of people to a status achieved through action. This approach assumes equal […]

  1. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  2. From big data to deep insight in developmental science.

    Science.gov (United States)

    Gilmore, Rick O

    2016-01-01

    The use of the term 'big data' has grown substantially over the past several decades and is now widespread. In this review, I ask what makes data 'big' and what implications the size, density, or complexity of datasets have for the science of human development. A survey of existing datasets illustrates how existing large, complex, multilevel, and multimeasure data can reveal the complexities of developmental processes. At the same time, significant technical, policy, ethics, transparency, cultural, and conceptual issues associated with the use of big data must be addressed. Most big developmental science data are currently hard to find and cumbersome to access, the field lacks a culture of data sharing, and there is no consensus about who owns or should control research data. But, these barriers are dissolving. Developmental researchers are finding new ways to collect, manage, store, share, and enable others to reuse data. This promises a future in which big data can lead to deeper insights about some of the most profound questions in behavioral science. © 2016 The Authors. WIREs Cognitive Science published by Wiley Periodicals, Inc.

  3. The use of big data in transfusion medicine.

    Science.gov (United States)

    Pendry, K

    2015-06-01

    'Big data' refers to the huge quantities of digital information now available that describe much of human activity. The science of data management and analysis is rapidly developing to enable organisations to convert data into useful information and knowledge. Electronic health records and new developments in Pathology Informatics now support the collection of 'big laboratory and clinical data', and these digital innovations are now being applied to transfusion medicine. To use big data effectively, we must address concerns about confidentiality and the need for a change in culture and practice, remove barriers to adopting common operating systems and data standards and ensure the safe and secure storage of sensitive personal information. In the UK, the aim is to formulate a single set of data and standards for communicating test results and so enable pathology data to contribute to national datasets. In transfusion, big data has been used for benchmarking, detection of transfusion-related complications, determining patterns of blood use and definition of blood order schedules for surgery. More generally, rapidly available information can monitor compliance with key performance indicators for patient blood management and inventory management leading to better patient care and reduced use of blood. The challenges of enabling reliable systems and analysis of big data and securing funding in the restrictive financial climate are formidable, but not insurmountable. The promise is that digital information will soon improve the implementation of best practice in transfusion medicine and patient blood management globally. © 2015 British Blood Transfusion Society.

  4. A practical guide to big data research in psychology.

    Science.gov (United States)

    Chen, Eric Evan; Wojcik, Sean P

    2016-12-01

    The massive volume of data that now covers a wide variety of human behaviors offers researchers in psychology an unprecedented opportunity to conduct innovative theory- and data-driven field research. This article is a practical guide to conducting big data research, covering data management, acquisition, processing, and analytics (including key supervised and unsupervised learning data mining methods). It is accompanied by walkthrough tutorials on data acquisition, text analysis with latent Dirichlet allocation topic modeling, and classification with support vector machines. Big data practitioners in academia, industry, and the community have built a comprehensive base of tools and knowledge that makes big data research accessible to researchers in a broad range of fields. However, big data research does require knowledge of software programming and a different analytical mindset. For those willing to acquire the requisite skills, innovative analyses of unexpected or previously untapped data sources can offer fresh ways to develop, test, and extend theories. When conducted with care and respect, big data research can become an essential complement to traditional research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
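    The guide's pipeline (acquire text, vectorize it, train a classifier) can be illustrated with a toy sketch. The article's own tutorials use latent Dirichlet allocation and support vector machines; the stdlib-only substitute below swaps in bag-of-words counts and a nearest-centroid rule to keep the example self-contained, and all documents and labels are invented for illustration.

```python
# Toy text-classification pipeline: bag-of-words + nearest centroid.
# (A stand-in for the SVM/LDA workflow described in the guide.)
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words term counts for a whitespace-tokenized document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def train(docs_by_label):
    """One centroid per label: summed word counts of its documents."""
    centroids = {}
    for label, docs in docs_by_label.items():
        c = Counter()
        for d in docs:
            c.update(vectorize(d))
        centroids[label] = c
    return centroids

def classify(centroids, text):
    """Assign the label whose centroid is most similar to the text."""
    v = vectorize(text)
    return max(centroids, key=lambda lbl: cosine(centroids[lbl], v))

train_data = {
    "positive": ["great wonderful happy good", "love this great product"],
    "negative": ["terrible awful bad sad", "hate this bad product"],
}
model = train(train_data)
print(classify(model, "what a wonderful happy day"))  # -> positive
```

    In practice one would replace the centroid rule with a trained classifier (e.g. an SVM) and the raw counts with TF-IDF or topic-model features, but the data flow is the same.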

  5. Big Book of Windows Hacks

    CERN Document Server

    Gralla, Preston

    2008-01-01

    Bigger, better, and broader in scope, the Big Book of Windows Hacks gives you everything you need to get the most out of your Windows Vista or XP system, including its related applications and the hardware it runs on or connects to. Whether you want to tweak Vista's Aero interface, build customized sidebar gadgets and run them from a USB key, or hack the "unhackable" screensavers, you'll find quick and ingenious ways to bend these recalcitrant operating systems to your will. The Big Book of Windows Hacks focuses on Vista, the new bad boy on Microsoft's block, with hacks and workarounds that

  6. Sosiaalinen asiakassuhdejohtaminen ja big data

    OpenAIRE

    Toivonen, Topi-Antti

    2015-01-01

    This thesis examines social customer relationship management and the benefits that big data can provide for it. Social customer relationship management is a new term, unfamiliar to many. The research is motivated by the scarcity of prior work on the topic, the complete absence of Finnish-language research, and the potentially essential role of social customer relationship management in business operations in the future. Studies on big data often concentrate on its technical side rather than its applications […]

  7. Do big gods cause anything?

    DEFF Research Database (Denmark)

    Geertz, Armin W.

    2014-01-01

    This is a contribution to a review symposium on Ara Norenzayan's book Big Gods: How Religion Transformed Cooperation and Conflict (Princeton University Press 2013). The book is exciting but problematic with regard to causality, atheism, and stereotypes about hunter-gatherers.

  8. Big Data and Social Media

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    A critical analysis of the "keep everything" Big Data era, the impact on our lives of the information, at first glance "convenient for future use" that we make known about ourselves on the network. NB! The lecture will be recorded like all Academic Training lectures. Lecturer's biography: Father of the Internet, see https://internethalloffame.org/inductees/vint-cerf or https://en.wikipedia.org/wiki/Vint_Cerf The video on slide number 9 is from page https://www.gapminder.org/tools/#$state$time$value=2018&value;;&chart-type=bubbles   Keywords: Big Data, Internet, History, Applications, tools, privacy, technology, preservation, surveillance, google, Arpanet, CERN, Web  

  9. Baryon symmetric big bang cosmology

    International Nuclear Information System (INIS)

    Stecker, F.W.

    1978-01-01

    It is stated that the framework of baryon symmetric big bang (BSBB) cosmology offers our greatest potential for deducting the evolution of the Universe because its physical laws and processes have the minimum number of arbitrary assumptions about initial conditions in the big-bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the Universe and how galaxies and galaxy clusters are formed. BSBB cosmology also provides the only acceptable explanation at present for the origin of the cosmic γ-ray background radiation. (author)

  10. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  11. Small quarks make big nuggets

    International Nuclear Information System (INIS)

    Deligeorges, S.

    1985-01-01

    After a brief recap of the classification of subatomic particles, this paper deals with quark nuggets: particles with more than three quarks, a big bag, which is called a ''nuclearite''. Neutron stars, in fact, are big sacks of quarks, gigantic nuggets. Physicists are now trying to calculate which type of strange-quark-matter nugget is stable, and what influence quark nuggets had on primordial nucleosynthesis. At present, it is thought that if these ''nuggets'' exist, and in a large proportion, they may be candidates for the missing mass [fr]

  12. [Big Data- challenges and risks].

    Science.gov (United States)

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of these information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions of the effective use of the opportunities: proper infrastructure, well defined regulatory environment with particular emphasis on data protection and privacy. These issues and the current actions for solution are also presented.

  13. Towards a big crunch dual

    Energy Technology Data Exchange (ETDEWEB)

    Hertog, Thomas E-mail: hertog@vulcan2.physics.ucsb.edu; Horowitz, Gary T

    2004-07-01

    We show there exist smooth asymptotically anti-de Sitter initial data which evolve to a big crunch singularity in a low energy supergravity limit of string theory. This opens up the possibility of using the dual conformal field theory to obtain a fully quantum description of the cosmological singularity. A preliminary study of this dual theory suggests that the big crunch is an endpoint of evolution even in the full string theory. We also show that any theory with scalar solitons must have negative energy solutions. The results presented here clarify our earlier work on cosmic censorship violation in N=8 supergravity. (author)

  14. The Inverted Big-Bang

    OpenAIRE

    Vaas, Ruediger

    2004-01-01

    Our universe appears to have been created not out of nothing but from a strange space-time dust. Quantum geometry (loop quantum gravity) makes it possible to avoid the ominous beginning of our universe with its physically unrealistic (i.e. infinite) curvature, extreme temperature, and energy density. This could be the long sought after explanation of the big-bang and perhaps even opens a window into a time before the big-bang: Space itself may have come from an earlier collapsing universe tha...

  15. Integrative Analysis of Omics Big Data.

    Science.gov (United States)

    Yu, Xiang-Tian; Zeng, Tao

    2018-01-01

    The diversity and huge volume of omics data have taken biology and biomedicine research and application into a big data era, much as happened across human society a decade ago. They open a new challenge, from horizontal data ensembles (e.g., similar types of data collected from different labs or companies) to vertical data ensembles (e.g., different types of data collected for a group of people with matched information), which requires integrative analysis in biology and biomedicine and calls for the rapid development of data integration to address the shift from population-guided to individual-guided investigations. Data integration is an effective approach to solving complex problems and understanding complicated systems. Several benchmark studies have revealed the heterogeneity and trade-offs that exist in the analysis of omics data. Integrative analysis can combine and investigate many datasets in a cost-effective, reproducible way. Current integration approaches on biological data have two modes: one is a "bottom-up integration" mode with follow-up manual integration, and the other is a "top-down integration" mode with follow-up in silico integration. This paper first summarizes combinatory analysis approaches to give a candidate protocol for biological experiment design for effective integrative studies on genomics, and then surveys data fusion approaches to give helpful instruction on computational model development for detecting biological significance; these approaches have also provided new data resources and analysis tools to support precision medicine dependent on big biomedical data. Finally, problems and future directions are highlighted for the integrative analysis of omics big data.

  16. Big Cities, Big Problems: Reason for the Elderly to Move?

    NARCIS (Netherlands)

    Fokkema, T.; de Jong-Gierveld, J.; Nijkamp, P.

    1996-01-01

    In many European countries, data on geographical patterns of internal elderly migration show that the elderly (55+) are more likely to leave than to move to the big cities. Besides emphasising the attractive features of the destination areas (pull factors), it is often assumed that this negative

  17. Big-Eyed Bugs Have Big Appetite for Pests

    Science.gov (United States)

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  18. A survey on Big Data Stream Mining

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Big Data can be static on one machine or distributed ... decision making, and process automation. Big data .... Concept Drifting: concept drifting mean the classifier .... transactions generated by a prefix tree structure. EstDec ...

  19. BIG DATA-DRIVEN MARKETING: AN ABSTRACT

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: What are the key antecedents of big data customer analytics use? How, and to what extent, does big data customer an...

  20. Big data in Finnish financial services

    OpenAIRE

    Laurila, M. (Mikko)

    2017-01-01

    Abstract This thesis aims to explore the concept of big data, and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are “What kind of big data solutions are being implemented in the Finnish financial services sector?” and “Which factors impede faster implementation of big data solutions in the Finnish financial services sector?”. ...

  1. Mycotic aneurysms in intravenous drug abusers: the utility of intravenous digital subtraction angiography

    International Nuclear Information System (INIS)

    Shetty, P.C.; Krasicky, G.A.; Sharma, R.P.; Vemuri, B.R.; Burke, M.M.

    1985-01-01

    Two hundred thirteen intravenous digital subtraction angiographic (DSA) examinations were performed on 195 intravenous drug abusers to rule out the possibility of a mycotic aneurysm in a groin, neck, or upper extremity infection. Twenty-three surgically proved cases of mycotic aneurysm were correctly identified, with no false positive results. In addition, six cases of major venous occlusion were documented. The authors present the results of their experience and conclude that DSA is an effective and cost-efficient method of examining this high-risk patient population.

  2. China: Big Changes Coming Soon

    Science.gov (United States)

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  3. Big data and urban governance

    NARCIS (Netherlands)

    Taylor, L.; Richter, C.; Gupta, J.; Pfeffer, K.; Verrest, H.; Ros-Tonen, M.

    2015-01-01

    This chapter examines the ways in which big data is involved in the rise of smart cities. Mobile phones, sensors and online applications produce streams of data which are used to regulate and plan the city, often in real time, but which presents challenges as to how the city’s functions are seen and

  4. Big Data for personalized healthcare

    NARCIS (Netherlands)

    Siemons, Liseth; Sieverink, Floor; Vollenbroek, Wouter; van de Wijngaert, Lidwien; Braakman-Jansen, Annemarie; van Gemert-Pijnen, Lisette

    2016-01-01

    Big Data, often defined according to the 5V model (volume, velocity, variety, veracity and value), is seen as the key towards personalized healthcare. However, it also confronts us with new technological and ethical challenges that require more sophisticated data management tools and data analysis

  5. Big data en gelijke behandeling

    NARCIS (Netherlands)

    Lammerant, Hans; de Hert, Paul; Blok, P.H.; Blok, P.H.

    2017-01-01

    In this chapter we first review the main basic concepts of equal treatment and discrimination (section 6.2). We then look at the Dutch and European legal frameworks on non-discrimination (sections 6.3-6.5) and at how those rules should be applied to big

  6. Research Ethics in Big Data.

    Science.gov (United States)

    Hammer, Marilyn J

    2017-05-01

    The ethical conduct of research includes, in part, patient agreement to participate in studies and the protection of health information. In the evolving world of data science and the accessibility of large quantities of web-based data created by millions of individuals, novel methodologic approaches to answering research questions are emerging. This article explores research ethics in the context of big data.

  7. Big data e data science

    OpenAIRE

    Cavique, Luís

    2014-01-01

    This article presented the basic concepts of Big Data and the new field it gave rise to, Data Science. Within Data Science, the notion of reducing the dimensionality of data was discussed and illustrated with examples.

  8. The Case for "Big History."

    Science.gov (United States)

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  9. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  10. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    Gadepally, Vijay; Herr, Taylor; Johnson, Luke; Milechin, Lauren; Milosavljevic, Maja; Miller, Benjamin A. (Lincoln...)

    ...gories. These include edge sampling methods, where edges are selected by a predetermined criterion; snowball sampling methods, where algorithms start... process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and
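    The record above names snowball sampling as one category of graph-sampling method. As an illustrative sketch (not the paper's actual implementation; the adjacency-list representation and budget parameter are assumptions), a minimal breadth-first snowball sampler might look like:

```python
from collections import deque

def snowball_sample(adj, seeds, max_nodes):
    """Breadth-first 'snowball' sample of a graph.

    adj: dict mapping node -> iterable of neighbours.
    seeds: starting nodes; max_nodes: sample-size budget.
    Returns the set of sampled nodes.
    """
    sampled = set()
    frontier = deque(seeds)
    while frontier and len(sampled) < max_nodes:
        node = frontier.popleft()
        if node in sampled:
            continue
        sampled.add(node)
        # Grow the "snowball": queue every unseen neighbour
        for nb in adj.get(node, ()):
            if nb not in sampled:
                frontier.append(nb)
    return sampled
```

    For example, `snowball_sample({1: [2, 3], 2: [4], 3: [], 4: []}, [1], 3)` expands from node 1 until the three-node budget is exhausted.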

  11. NASA's Big Data Task Force

    Science.gov (United States)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group with the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the TF including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  12. Big Math for Little Kids

    Science.gov (United States)

    Greenes, Carole; Ginsburg, Herbert P.; Balfanz, Robert

    2004-01-01

    "Big Math for Little Kids," a comprehensive program for 4- and 5-year-olds, develops and expands on the mathematics that children know and are capable of doing. The program uses activities and stories to develop ideas about number, shape, pattern, logical reasoning, measurement, operations on numbers, and space. The activities introduce the…

  13. BIG DATA IN BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2015-06-01

    In recent years, dealing with large volumes of data originating from social media sites and mobile communications, alongside data from business environments and institutions, has led to the definition of a new concept, known as Big Data. The economic impact of the sheer amount of data produced in the last two years has increased rapidly. It is necessary to aggregate all types of data (structured and unstructured) in order to improve current transactions, to develop new business models, to provide a real image of supply and demand, and thereby to generate market advantages. Companies that turn to Big Data therefore have a competitive advantage over other firms. From the perspective of IT organizations, they must accommodate the storage and processing of Big Data and provide analysis tools that are easily integrated into business processes. This paper aims to discuss aspects of the Big Data concept and the principles for building, organizing, and analysing huge datasets in the business environment, offering a three-layer architecture based on current software solutions. The article also covers graphical tools for exploring and representing unstructured data, Gephi and NodeXL.

  14. From Big Bang to Eternity?

    Indian Academy of Sciences (India)

    at different distances (that is, at different epochs in the past) to come to this ... that the expansion started billions of years ago from an explosive Big Bang. Recent research sheds new light on the key cosmological question about the distant ...

  15. Banking Wyoming big sagebrush seeds

    Science.gov (United States)

    Robert P. Karrfalt; Nancy Shaw

    2013-01-01

    Five commercially produced seed lots of Wyoming big sagebrush (Artemisia tridentata Nutt. var. wyomingensis (Beetle & Young) S.L. Welsh [Asteraceae]) were stored under various conditions for 5 y. Purity, moisture content as measured by equilibrium relative humidity, and storage temperature were all important factors to successful seed storage. Our results indicate...

  16. Big Data: Implications for Health System Pharmacy.

    Science.gov (United States)

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  17. Big data: een zoektocht naar instituties

    NARCIS (Netherlands)

    van der Voort, H.G.; Crompvoets, J

    2016-01-01

    Big data is a well-known phenomenon, even a buzzword nowadays. It refers to an abundance of data and new possibilities to process and use them. Big data is subject of many publications. Some pay attention to the many possibilities of big data, others warn us for their consequences. This special

  18. A SWOT Analysis of Big Data

    Science.gov (United States)

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  19. Data, Data, Data : Big, Linked & Open

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Eckartz, S.M.

    2013-01-01

    The entire business and IT world is currently talking about Big Data, a trend that overtook Cloud Computing in mid-2013 (based on Google Trends). Policy makers are also actively engaged with Big Data. Neelie Kroes, vice-president of the European Commission, speaks of the 'Big Data

  20. A survey of big data research

    Science.gov (United States)

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  1. The BigBoss Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; et al.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  2. Contrast agent choice for intravenous coronary angiography

    International Nuclear Information System (INIS)

    Zeman, H.D.; Siddons, D.P.

    1990-01-01

    The screening of the general population for coronary artery disease would be practical if a method existed for visualizing the extent of occlusion after an intravenous injection of contrast agent. Measurements performed with monochromatic synchrotron radiation X-rays and an iodine-containing contrast agent at the Stanford Synchrotron Radiation Laboratory have shown that such an intravenous angiography procedure would be possible with an adequately intense monochromatic X-ray source. Because of the size and cost of synchrotron radiation facilities, it would be desirable to make the most efficient use of the intensity available while reducing as much as possible the radiation dose experienced by the patient. By choosing contrast agents containing elements with a higher atomic number than iodine, it is possible both to improve the image quality and to reduce the patient radiation dose while using the same synchrotron radiation source. By using Si monochromator crystals with a small mosaic spread, it is possible to increase the X-ray flux available for imaging by over an order of magnitude, without any changes in the storage ring or wiggler magnet. The most critical imaging task for intravenous coronary angiography utilizing synchrotron radiation X-rays is visualizing a coronary artery through the left ventricle or aorta, which also contain contrast agent. Calculations have been made of the signal-to-noise ratio expected for this imaging task for various contrast agents with atomic numbers between that of iodine and bismuth. The X-ray energy spectrum of the X-17 superconducting wiggler beamline at the National Synchrotron Light Source at Brookhaven National Laboratory has been used for these calculations. Both perfect Si crystals and Si crystals with a small mosaic spread are considered as monochromators. Contrast agents containing Gd or Yb seem to have about the optimal calculated signal-to-noise ratio. (orig./HSI)

  3. Continuous 24-hour intravenous infusion of recombinant human growth hormone (GH)-releasing hormone-(1-44)-amide augments pulsatile, entropic, and daily rhythmic GH secretion in postmenopausal women equally in the estrogen-withdrawn and estrogen-supplemented states.

    Science.gov (United States)

    Evans, W S; Anderson, S M; Hull, L T; Azimi, P P; Bowers, C Y; Veldhuis, J D

    2001-02-01

    How estrogen amplifies GH secretion in the human is not known. The present study tests the clinical hypothesis that estradiol modulates the stimulatory actions of a primary GH feedforward signal, GHRH. To this end, we investigated the ability of short-term (7- to 12-day) supplementation with oral estradiol vs. placebo to modulate basal, pulsatile, entropic, and 24-h rhythmic GH secretion driven by a continuous iv infusion of recombinant human GHRH-(1-44)-amide vs. saline in nine healthy postmenopausal women. Volunteers underwent concurrent blood sampling every 10 min for 24 h on four occasions in a prospectively randomized, single-blind, within-subject cross-over design (placebo/saline, placebo/GHRH, estradiol/saline, estradiol/GHRH). Intensively sampled serum GH concentrations were quantitated by ultrasensitive chemiluminescence assay. Basal, pulsatile, entropic (feedback-sensitive), and 24-h rhythmic modes of GH secretion were appraised by deconvolution analysis, the approximate entropy (ApEn) statistic, and cosine regression, respectively. ANOVA revealed that continuous iv infusion of GHRH in the estrogen-withdrawn (control) milieu 1) amplified individual basal (P = 0.00011) and pulsatile (P < 10^(-13)) GH secretion rates by 12- and 11-fold, respectively; 2) augmented GH secretory burst mass and amplitude each by 10-fold (P < 10^(-11)), without altering GH secretory burst frequency, duration, or half-life; 3) increased the disorderliness (ApEn) of GH release patterns (P = 0.0000002); 4) elevated the mesor (cosine mean) and amplitude of the 24-h rhythm in serum GH concentrations by nearly 30-fold (both P < 10^(-12)); 5) induced a phase advance in the clock time of the GH zenith (P = 0.021); and 6) evoked a new 24-h rhythm in GH secretory burst mass with a maximum at 0018 h (P < 10^(-3)), while damping the mesor of the 24-h rhythm in GH interpulse intervals (P < 0.025).
Estradiol supplementation alone 1) increased the 24-h mean and integrated serum GH concentration
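    The record above quantifies the "disorderliness" of GH release with the approximate entropy (ApEn) statistic. A minimal sketch of the standard Pincus ApEn computation is given below; it is illustrative, not the authors' exact code, and the parameter defaults m = 2 and r = 0.2 (tolerance as a fraction of the series standard deviation) are conventional assumptions:

```python
import math

def approximate_entropy(series, m=2, r=0.2):
    """Approximate entropy (ApEn) of a time series.

    m is the embedding dimension; r scales the tolerance by the
    series' standard deviation. Higher ApEn means a more irregular
    (less predictable) signal.
    """
    n = len(series)
    mean = sum(series) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    tol = r * sd

    def phi(m):
        # All overlapping templates of length m
        templates = [series[i:i + m] for i in range(n - m + 1)]
        log_counts = []
        for t1 in templates:
            # Fraction of templates within tolerance (Chebyshev distance);
            # a template always matches itself, so the count is never zero.
            c = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= tol
            )
            log_counts.append(math.log(c / len(templates)))
        return sum(log_counts) / len(templates)

    return phi(m) - phi(m + 1)
```

    A strictly alternating series scores near zero, while an irregular series of the same length scores substantially higher, which is the direction of change the study reports under GHRH infusion.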

  4. Game, cloud architecture and outreach for The BIG Bell Test

    Science.gov (United States)

    Abellan, Carlos; Tura, Jordi; Garcia, Marta; Beduini, Federica; Hirschmann, Alina; Pruneri, Valerio; Acin, Antonio; Marti, Maria; Mitchell, Morgan

    The BIG Bell test uses the input from the Bellsters, self-selected human participants introducing zeros and ones through an online videogame, to perform a suite of quantum physics experiments. In this talk, we will explore the videogame, the data infrastructure and the outreach efforts of the BIG Bell test collaboration. First, we will discuss how the game was designed so as to eliminate possible feedback mechanisms that could influence people's behavior. Second, we will discuss the cloud architecture design for scalability as well as explain how we sent each individual bit from the users to the labs. Also, and using all the bits collected via the BIG Bell test interface, we will show a data analysis on human randomness, e.g. are younger Bellsters more random than older Bellsters? Finally, we will talk about the outreach and communication efforts of the BIG Bell test collaboration, exploring both the social media campaigns as well as the close interaction with teachers and educators to bring the project into classrooms.
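    The record above mentions a data analysis of "human randomness" over the Bellsters' zeros and ones. One simple measure such an analysis might start from is the Shannon entropy per bit of a participant's stream; this sketch is an assumption for illustration, not the collaboration's actual pipeline:

```python
import math

def bit_entropy(bits):
    """Shannon entropy (bits per symbol) of a 0/1 sequence.

    A perfectly unpredictable stream scores 1.0; any bias toward
    0 or 1 lowers the score.
    """
    n = len(bits)
    p1 = sum(bits) / n
    if p1 in (0.0, 1.0):
        return 0.0  # constant stream carries no information
    p0 = 1.0 - p1
    return -(p0 * math.log2(p0) + p1 * math.log2(p1))
```

    Comparing this score across age groups would be one concrete way to ask whether younger Bellsters are "more random" than older ones.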

  5. Choice of intravenous contrast material for CT

    International Nuclear Information System (INIS)

    Cohen, M.D.; Herman, E.; Herron, D.; White, S.T.; Smith, J.A.; Cory, D.A.

    1989-01-01

    For CT, minor side effects (e.g., nausea, vomiting, pain) following intravenous administration of contrast medium may degrade image quality by causing patient motion or by delaying scanning. The objective of this study was to see if nonionic contrast agents offer advantages in reducing the incidence of such side effects. One hundred five pediatric patients randomly received iohexol (Omnipaque), iopamidol (Isovue), or diatrizoate sodium (Hypaque). Contrast medium was given in doses of 2 mL/kg body weight (300 mg of iodine per milliliter). The results are presented in the paper.

  6. [Intravenous ethyl alcohol in metabolic resuscitation].

    Science.gov (United States)

    Agolini, G; Lipartiti, T; Zaffiri, O; Musso, L; Belloni, G P

    1980-11-01

    Intravenously administered ethyl alcohol may be effective as an analgesic and as a hypotensive, peripherally vasoactive drug. In intensive care departments, parenteral ethanol administration is infrequent because no "safe dosage" can be recommended for adults or children. Liver, kidney, and CNS diseases can worsen, and foetopathy can follow. Drug-ethanol interactions may be particularly important for some patients admitted to intensive care departments. The potential caloric support often cannot be fully utilized ("empty" calories), and hyperventilation, hyperlactacidemia, and impaired protein synthesis can occasionally follow.

  7. Big Data - Smart Health Strategies

    Science.gov (United States)

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  8. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Science.gov (United States)

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species ofWild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  9. Fuzzy VIKOR approach for selection of big data analyst in procurement management

    Directory of Open Access Journals (Sweden)

    Surajit Bag

    2016-07-01

    Background: Big data and predictive analysis have been hailed as the fourth paradigm of science. Big data and analytics are critical to the future of business sustainability. The demand for data scientists is increasing with the dynamic nature of businesses, thus making it indispensable to manage big data, derive meaningful results and interpret management decisions. Objectives: The purpose of this study was to provide a brief conceptual review of big data and analytics and further illustrate the use of a multicriteria decision-making technique in selecting the right skilled candidate for big data and analytics in procurement management. Method: It is important for firms to select and recruit the right data analyst, both in terms of skill sets and scope of analysis. Such a problem is by nature a complex multicriteria decision-making problem, dealing with both qualitative and quantitative factors. In the current study, an application of the Fuzzy VIsekriterijumska optimizacija i KOmpromisno Resenje (VIKOR) method was used to solve the big data analyst selection problem. Results: From this study, it was identified that Technical knowledge (C1), Intellectual curiosity (C4) and Business acumen (C5) are the strongest influential criteria and must be present in the candidate for the big data and analytics job. Conclusion: Fuzzy VIKOR is well suited to this kind of multicriteria decision-making scenario. This study will assist human resource managers and procurement managers in selecting the right workforce for big data analytics.
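    The VIKOR method used in the study above follows a fixed recipe: for each alternative, compute a group-utility score S and an individual-regret score R against the best and worst values of each criterion, then blend them into a compromise score Q (lower is better). A minimal crisp-VIKOR sketch is shown below; the fuzzy variant defuzzifies the ratings first, and this version assumes benefit criteria whose values, S scores, and R scores are not all identical (otherwise the normalizing denominators vanish):

```python
def vikor(matrix, weights, v=0.5):
    """Crisp VIKOR compromise ranking.

    matrix: rows = alternatives, columns = benefit criteria.
    weights: one weight per criterion. v trades off group
    utility against individual regret. Returns Q scores,
    where a lower Q marks a better compromise.
    """
    ncols = len(matrix[0])
    best = [max(row[j] for row in matrix) for j in range(ncols)]
    worst = [min(row[j] for row in matrix) for j in range(ncols)]
    S, R = [], []
    for row in matrix:
        # Weighted, normalized distance from the best value per criterion
        terms = [
            weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
            for j in range(ncols)
        ]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    s_best, s_worst = min(S), max(S)
    r_best, r_worst = min(R), max(R)
    return [
        v * (S[i] - s_best) / (s_worst - s_best)
        + (1 - v) * (R[i] - r_best) / (r_worst - r_best)
        for i in range(len(matrix))
    ]
```

    For instance, with three candidates scored on three criteria, `vikor([[9, 7, 8], [6, 9, 5], [7, 6, 9]], [0.4, 0.3, 0.3])` ranks the first candidate as the best compromise (Q = 0) and the second as the worst (Q = 1).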

  10. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    CERN Multimedia

    2008-01-01

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  11. Antigravity and the big crunch/big bang transition

    Science.gov (United States)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  12. Antigravity and the big crunch/big bang transition

    Energy Technology Data Exchange (ETDEWEB)

    Bars, Itzhak [Department of Physics and Astronomy, University of Southern California, Los Angeles, CA 90089-2535 (United States); Chen, Shih-Hung [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Department of Physics and School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1404 (United States); Steinhardt, Paul J., E-mail: steinh@princeton.edu [Department of Physics and Princeton Center for Theoretical Physics, Princeton University, Princeton, NJ 08544 (United States); Turok, Neil [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada)

    2012-08-29

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  13. Antigravity and the big crunch/big bang transition

    International Nuclear Information System (INIS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-01-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  14. Solution of a braneworld big crunch/big bang cosmology

    International Nuclear Information System (INIS)

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-01-01

We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  15. The ethics of big data in big agriculture

    Directory of Open Access Journals (Sweden)

    Isabelle M. Carbonell

    2016-03-01

    Full Text Available This paper examines the ethics of big data in agriculture, focusing on the power asymmetry between farmers and large agribusinesses like Monsanto. Following the recent purchase of Climate Corp., Monsanto is currently the most prominent biotech agribusiness to buy into big data. With wireless sensors on tractors monitoring or dictating every decision a farmer makes, Monsanto can now aggregate large quantities of previously proprietary farming data, enabling a privileged position with unique insights on a field-by-field basis into a third or more of the US farmland. This power asymmetry may be rebalanced through open-sourced data, and publicly-funded data analytic tools which rival Climate Corp. in complexity and innovation for use in the public domain.

  16. Panlobular emphysema in young intravenous Ritalin abusers

    International Nuclear Information System (INIS)

    Schmidt, R.A.; Glenny, R.W.; Godwin, J.D.; Hampson, N.B.; Cantino, M.E.; Reichenbach, D.D.

    1991-01-01

    We studied a distinctive group of young intravenous Ritalin abusers with profound obstructive lung disease. Clinically, they seemed to have severe emphysema, but the pathologic basis of their symptoms had not been investigated previously. Seven patients have died and been autopsied: in four, the lungs were fixed, inflated, dried, and examined in detail radiologically, grossly, microscopically, and by electron probe X-ray microanalysis. All seven patients had severe panlobular (panacinar) emphysema that tended to be more severe in the lower lung zones and that was associated with microscopic talc granulomas. Vascular involvement by talc granulomas was variable, but significant interstitial fibrosis was not present. Five patients were tested for alpha-1-antitrypsin deficiency and found to be normal, as were six similar living patients. These findings indicate that some intravenous drug abusers develop emphysema that clinically, radiologically, and pathologically resembles that caused by alpha-1-antitrypsin deficiency but which must have a different pathogenesis. Talc from the Ritalin tablets may be important, but the mechanism remains to be elucidated

  17. Intravenous Carbamazepine for Adults With Seizures.

    Science.gov (United States)

    Vickery, P Brittany; Tillery, Erika E; DeFalco, Alicia Potter

    2018-03-01

To review the pharmacology, pharmacokinetics, efficacy, safety, dosage and administration, potential drug-drug interactions, and place in therapy of the intravenous (IV) formulation of carbamazepine (Carnexiv) for the treatment of seizures in adult patients. A comprehensive PubMed and EBSCOhost search (1945 to August 2017) was performed utilizing the keywords carbamazepine, Carnexiv, carbamazepine intravenous, IV carbamazepine, seizures, epilepsy, and seizure disorder. Additional data were obtained from literature review citations, manufacturer's product labeling, and the Lundbeck website, as well as Clinicaltrials.gov and governmental sources. All English-language trials evaluating IV carbamazepine were analyzed for this review. IV carbamazepine is FDA approved as temporary replacement therapy for the treatment of adult seizures. Based on a phase I trial and pooled data from 2 open-label bioavailability studies comparing oral with IV dosing, there was no indication of loss of seizure control in patients switched to short-term replacement antiepileptic drug therapy with IV carbamazepine. The recommended dose of IV carbamazepine is 70% of the patient's oral dose, given every 6 hours via 30-minute infusions. The adverse effect profile of IV carbamazepine is similar to that of the oral formulation, with the exception of added infusion-site reactions. IV carbamazepine is a reasonable option for adults with generalized tonic-clonic or focal seizures, previously stabilized on oral carbamazepine, who are unable to tolerate oral medications for up to 7 days. Unknown acquisition cost and lack of availability in the United States currently limit its use.
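The dosing rule in the abstract (70% of the total daily oral dose, divided into 30-minute infusions every 6 hours) reduces to simple arithmetic. The sketch below is illustrative only, not clinical guidance; the function name and rounding are assumptions, not part of the labeling:

```python
def iv_carbamazepine_schedule(total_daily_oral_mg: float) -> dict:
    """Convert a stabilized oral carbamazepine regimen to the IV replacement
    schedule described in the abstract: 70% of the total daily oral dose,
    split into four 30-minute infusions given every 6 hours."""
    total_iv = 0.70 * total_daily_oral_mg
    per_infusion = total_iv / 4  # every 6 hours -> 4 infusions per day
    return {
        "total_daily_iv_mg": round(total_iv, 1),
        "per_infusion_mg": round(per_infusion, 1),
        "infusions_per_day": 4,
        "infusion_minutes": 30,
    }

# Example: a patient stabilized on 800 mg/day oral carbamazepine
schedule = iv_carbamazepine_schedule(800)
print(schedule)  # total 560.0 mg/day IV, 140.0 mg per infusion
```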

  18. Adverse reactions to iotroxate at intravenous cholangiography

    International Nuclear Information System (INIS)

    Nilsson, U.

    1987-01-01

    The number and type of adverse reactions to meglumine iotroxate at intravenous infusion cholangiography, performed one day prior to elective cholecystectomy, were recorded in a prospective investigation of 196 asymptomatic, anicteric patients. One hundred ml (50 mg I/ml) of contrast medium was infused over a period of 30 minutes. Only 2 minor (1%) and no severe or fatal reactions were noted. A review of the literature on the use of iotroxate in 2492 patients, including those in the present investigation, revealed a complication rate of 3.5% (3.0% minor, 0.3% moderate and 0.2% severe reactions) at infusion of iotroxate (5.0-8.0 g I) over a period of 30 to 120 minutes. This compared favourably with the 5% complication rate (4% minor, 0.5% moderate and 0.5% severe reactions) at infusion of iodoxamate and the 9% complication rate (5% minor, 1% moderate and 3% severe reactions) at infusion of ioglycamide. Irrespective of the contrast agent used, the frequency of adverse reactions at infusion was found to be 3 times lower than when equal amounts (5.0-5.6 g I) of the same medium were injected. It is concluded that, at present, infusion of iotroxate in an amount which approximates to the transportation maximum of the liver is the least toxic way of performing intravenous cholangiography with an optimum filling of the bile ducts. (orig.)

  19. Intravenous dynamic nucleography of the brain

    International Nuclear Information System (INIS)

    Rosenthall, L.

    1972-01-01

The advent of stationary imaging devices has created interest in studying cerebral blood flows and transits with diffusible and nondiffusible radioactive indicators. Much of this has disclosed interesting pathophysiology, but not necessarily of significant diagnostic import to include in routine patient workup. The conventional static brain scan is one of the more useful tests in the nuclear medicine armamentarium for uncovering and localizing intracranial disease. Unfortunately, it does not as a rule clearly distinguish cerebral vascular accidents, neoplasms, arteriovenous malformations, and so forth, which is important from the standpoint of patient management. Aside from clinical impressions a diagnosis is often based on the appearance of the radiocontrast angiogram, which is not always desirable because of the implicit hazards. Thus it is incumbent upon investigators to search for innocuous intravenous methods of identifying the various intracranial afflictions. Intravenous 99mTc-pertechnetate comparisons of brain hemisphere perfusion as a routine complement to static brain imaging are useful. Estimations of disparate radioactive transits are made qualitatively from serial 4 to 5 sec exposure scintiphotographs. (U.S.)

  20. Intravenous immunoglobulin therapy and systemic lupus erythematosus.

    Science.gov (United States)

    Zandman-Goddard, Gisele; Levy, Yair; Shoenfeld, Yehuda

    2005-12-01

Systemic lupus erythematosus (SLE) is a multisystem autoimmune disease with diverse manifestations. We suggest that intravenous immunoglobulin (IVIg) therapy may be beneficial and safe for various manifestations in SLE. A structured literature search of articles published on the efficacy of IVIg in the treatment of SLE between 1983 and 2005 was conducted. We searched the terms "IVIg," "intravenous immunoglobulin," "lupus," "SLE," and "systemic lupus erythematosus." The various clinical manifestations of SLE that were reported to be successfully treated by IVIg in case reports include autoimmune hemolytic anemia, acquired factor VIII inhibitors, acquired von Willebrand disease, pure red cell aplasia, thrombocytopenia, pancytopenia, myelofibrosis, pneumonitis, pleural effusion, pericarditis, myocarditis, cardiogenic shock, nephritis, end-stage renal disease, encephalitis, neuropsychiatric lupus, psychosis, peripheral neuropathy, polyradiculoneuropathy, and vasculitis. The most extensive experience is with lupus nephritis. There are only a few case series of IVIg use in patients with SLE with various manifestations, in which the response rate to IVIg therapy ranged from 33 to 100%. We suggest that IVIg devoid of sucrose, at a dose of 2 g/kg over a 5-d period given uniformly and at a slow infusion rate in patients without an increased risk for thromboembolic events or renal failure, is a safe and beneficial adjunct therapy for patients with SLE whose disease is resistant to conventional treatment or who refuse it. The duration of therapy is yet to be established. Controlled trials are warranted.

  1. Intravenous ferric carboxymaltose for anaemia in pregnancy.

    Science.gov (United States)

    Froessler, Bernd; Collingwood, Joshua; Hodyl, Nicolette A; Dekker, Gustaaf

    2014-03-25

Iron deficiency is a common nutritional deficiency amongst women of childbearing age. Peri-partum iron deficiency anaemia (IDA) is associated with significant maternal, fetal and infant morbidity. Current options for treatment are limited: these include oral iron supplementation, which can be ineffective and poorly tolerated, and red blood cell transfusions, which carry an inherent risk and should be avoided. Ferric carboxymaltose is a new treatment option that may be better tolerated. The study was designed to assess the safety and efficacy of IDA correction with intravenous ferric carboxymaltose in pregnant women with mild, moderate and severe anaemia in the second and third trimester. Prospective observational study; 65 anaemic pregnant women received ferric carboxymaltose up to 15 mg/kg between 24 and 40 weeks of pregnancy (median 35 weeks gestational age, SD 3.6). Treatment effectiveness was assessed by repeat haemoglobin (Hb) measurements and patient report of well-being in the postpartum period. Safety was assessed by analysis of adverse drug reactions and fetal heart rate monitoring during the infusion. Intravenous ferric carboxymaltose infusion significantly increased Hb values, supporting intravenous ferric carboxymaltose as a treatment option for anaemia in pregnancy.
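The weight-based dose ceiling quoted above (up to 15 mg/kg) is a one-line calculation. This helper is an illustrative assumption, not part of the study protocol:

```python
def ferric_carboxymaltose_dose_mg(weight_kg: float, mg_per_kg: float = 15.0) -> float:
    """Weight-based dose ceiling: up to 15 mg/kg, per the abstract."""
    return weight_kg * mg_per_kg

print(ferric_carboxymaltose_dose_mg(70))  # 1050.0 mg ceiling for a 70 kg patient
```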

  2. Ultrasonography-guided peripheral intravenous access versus traditional approaches in patients with difficult intravenous access.

    Science.gov (United States)

    Costantino, Thomas G; Parikh, Aman K; Satz, Wayne A; Fojtik, John P

    2005-11-01

We assess the success rate of emergency physicians in placing peripheral intravenous catheters in difficult-access patients who were unsuccessfully cannulated by emergency nurses. A technique using real-time ultrasonographic guidance by 2 physicians was compared with traditional approaches using palpation and landmark guidance. This was a prospective, systematically allocated study of all patients requiring intravenous access who presented to 2 university hospitals between October 2003 and March 2004. Inclusion criterion was the inability of any available nurse to obtain intravenous access after at least 3 attempts on a subgroup of patients who had a history of difficult intravenous access because of obesity, history of intravenous drug abuse, or chronic medical problems. Exclusion criterion was the need for central venous access. Patients presenting on odd days were allocated to the ultrasonographic-guided group, and those presenting on even days were allocated to the traditional-approach group. Endpoints were successful cannulation, number of sticks, time, and patient satisfaction. Sixty patients were enrolled, 39 on odd days and 21 on even days. Success rate was greater for the ultrasonographic group (97%) versus control (33%), difference in proportions of 64% (95% confidence interval [CI] 39% to 71%). The ultrasonographic group required less overall time (13 minutes versus 30 minutes, for a difference of 17 [95% CI 0.8 to 25.6]), less time to successful cannulation from first percutaneous puncture (4 minutes versus 15 minutes, for a difference of 11 [95% CI 8.2 to 19.4]), and fewer percutaneous punctures (1.7 versus 3.7, for a difference of 2.0 [95% CI 1.27 to 2.82]) and had greater patient satisfaction (8.7 versus 5.7, for a difference of 3.0 [95% CI 1.82 to 4.29]) than the traditional landmark approach. Ultrasonographic-guided peripheral intravenous access is more successful than traditional "blind" techniques, requires less time, decreases the number of percutaneous punctures, and improves patient satisfaction.
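The headline comparison (97% vs. 33% success) is a difference of two proportions. A sketch using a normal-approximation (Wald) interval; the success counts (38/39 and 7/21) are back-calculated assumptions from the reported percentages, and the paper's published interval was evidently derived by a different method:

```python
from math import sqrt

def diff_proportions(success1: int, n1: int, success2: int, n2: int, z: float = 1.96):
    """Difference in success proportions with a Wald (normal-approximation)
    confidence interval at the given z value (1.96 for ~95%)."""
    p1, p2 = success1 / n1, success2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, (diff - z * se, diff + z * se)

# Assumed counts consistent with 97% (ultrasound, n=39) and 33% (landmark, n=21)
diff, ci = diff_proportions(38, 39, 7, 21)
print(round(diff, 2))  # 0.64
```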

  3. Big Data – Big Deal for Organization Design?

    OpenAIRE

    Janne J. Korhonen

    2014-01-01

    Analytics is an increasingly important source of competitive advantage. It has even been posited that big data will be the next strategic emphasis of organizations and that analytics capability will be manifested in organizational structure. In this article, I explore how analytics capability might be reflected in organizational structure using the notion of  “requisite organization” developed by Jaques (1998). Requisite organization argues that a new strategic emphasis requires the addition ...

  4. Nowcasting using news topics Big Data versus big bank

    OpenAIRE

    Thorsrud, Leif Anders

    2016-01-01

The agents in the economy use a plethora of high frequency information, including news media, to guide their actions and thereby shape aggregate economic fluctuations. Traditional nowcasting approaches have made relatively little use of such information. In this paper, I show how unstructured textual information in a business newspaper can be decomposed into daily news topics and used to nowcast quarterly GDP growth. Compared with a big bank of experts, here represented by official c...

  5. Experimental research on preventing mechanical phlebitis arising from indwelling needles in intravenous therapy by external application of mirabilite.

    Science.gov (United States)

    Lu, Yanyan; Hao, Chunyan; He, Wubin; Tang, Can; Shao, Zhenya

    2018-01-01

Various types of complications arising from intravenous indwelling needles have become a challenge in clinical care. It is urgent to seek a simple and cost-effective method for prevention and treatment of phlebitis. We investigated the role of mirabilite in preventing and treating phlebitis caused by intravenous indwelling needles and provide guidance for prevention and treatment of mechanical phlebitis caused by intravenous indwelling needles. A total of 57 healthy congeneric big-eared New Zealand rabbits were randomly divided into 3 groups: blank control, indwelling needle, and external application of mirabilite. The ear vein of each rabbit was punctured with an intravenous indwelling needle. The ear vein specimens were taken at 3, 5, and 7 days after indwelling. The hematoxylin and eosin stained pathological tissue sections of the ear veins of the rabbits in each group were observed. The expression levels of IL-1, IL-6, and tumour necrosis factor-α (TNF-α) in the vascular tissue of the ear veins of the rabbits in each group were detected with the immunofluorescence method. In the blank control group, there was no inflammatory cellular infiltration and no proliferation of fibrous tissue around the vascular wall. With the increase of the indwelling time, proliferation of fibrous tissue in the vascular wall, increased inflammatory cellular infiltration and organized thrombus in the vascular tissue occurred in the ear veins of the rabbits in the indwelling needle group and the group with external application of mirabilite. Compared with the indwelling needle group, the group with external application of mirabilite had significantly decreased fibrous tissue in the vascular wall and significantly decreased inflammatory cellular infiltration. At the same point in indwelling time, the expression levels of IL-1, IL-6, and TNF-α in the indwelling needle group and the group with external application of mirabilite were significantly higher than those in the blank control group.

  6. Big data is not a monolith

    CERN Document Server

    Ekbia, Hamid R; Mattioli, Michael

    2016-01-01

Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of text. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies. The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data's ramifications. The contributors look at big data's effect on individuals as it exerts social control throu...

  7. Research Dilemmas with Behavioral Big Data.

    Science.gov (United States)

    Shmueli, Galit

    2017-06-01

    Behavioral big data (BBD) refers to very large and rich multidimensional data sets on human and social behaviors, actions, and interactions, which have become available to companies, governments, and researchers. A growing number of researchers in social science and management fields acquire and analyze BBD for the purpose of extracting knowledge and scientific discoveries. However, the relationships between the researcher, data, subjects, and research questions differ in the BBD context compared to traditional behavioral data. Behavioral researchers using BBD face not only methodological and technical challenges but also ethical and moral dilemmas. In this article, we discuss several dilemmas, challenges, and trade-offs related to acquiring and analyzing BBD for causal behavioral research.

  8. Administration costs of intravenous biologic drugs for rheumatoid arthritis

    OpenAIRE

    Soini, Erkki J; Leussu, Miina; Hallinen, Taru

    2013-01-01

    Background Cost-effectiveness studies explicitly reporting infusion times, drug-specific administration costs for infusions or real-payer intravenous drug cost are few in number. Yet, administration costs for infusions are needed in the health economic evaluations assessing intravenously-administered drugs. Objectives To estimate the drug-specific administration and total cost of biologic intravenous rheumatoid arthritis (RA) drugs in the adult population and to compare the obtained costs wit...

  9. Big power from walking

    Science.gov (United States)

    Illenberger, Patrin K.; Madawala, Udaya K.; Anderson, Iain A.

    2016-04-01

    Dielectric Elastomer Generators (DEG) offer an opportunity to capture the energy otherwise wasted from human motion. By integrating a DEG into the heel of standard footwear, it is possible to harness this energy to power portable devices. DEGs require substantial auxiliary systems which are commonly large, heavy and inefficient. A unique challenge for these low power generators is the combination of high voltage and low current. A void exists in the semiconductor market for devices that can meet these requirements. Until these become available, existing devices must be used in an innovative way to produce an effective DEG system. Existing systems such as the Bi-Directional Flyback (BDFB) and Self Priming Circuit (SPC) are an excellent example of this. The BDFB allows full charging and discharging of the DEG, improving power gained. The SPC allows fully passive voltage boosting, removing the priming source and simplifying the electronics. This paper outlines the drawbacks and benefits of active and passive electronic solutions for maximizing power from walking.

  10. Comparison of the Effectiveness of a Virtual Simulator With a Plastic Arm Model in Teaching Intravenous Catheter Insertion Skills.

    Science.gov (United States)

    Günay İsmailoğlu, Elif; Zaybak, Ayten

    2018-02-01

    The objective of this study was to compare the effectiveness of a virtual intravenous simulator with a plastic arm model in teaching intravenous catheter insertion skills to nursing students. We used a randomized controlled quasi-experimental trial design and recruited 65 students who were assigned to the experimental (n = 33) and control (n = 32) groups using the simple random sampling method. The experimental group received intravenous catheterization skills training on the virtual intravenous simulator, and the control group received the same training on a plastic model of a human arm. Data were collected using the personal information form, intravenous catheterization knowledge assessment form, Intravenous Catheterization Skill Test, Self-Confidence and Satisfaction Scale, and Fear Symptoms Scale. In the study, the mean scores in the control group were 20.44 for psychomotor skills, 15.62 for clinical psychomotor skills, 31.78 for self-confidence, and 21.77 for satisfaction. The mean scores in the experimental group were 45.18 for psychomotor skills, 16.28 for clinical psychomotor skills, 34.18 for self-confidence, and 43.89 for satisfaction. The results indicated that psychomotor skills and satisfaction scores were higher in the experimental group, while the clinical psychomotor skills and self-confidence scores were similar in both groups. More students in the control group reported experiencing symptoms such as cold and sweaty hands, significant restlessness, and tense muscles than those in the experimental group.

  11. Optimizing the use of intravenous therapy in internal medicine.

    Science.gov (United States)

    Champion, Karine; Mouly, Stéphane; Lloret-Linares, Celia; Lopes, Amanda; Vicaut, Eric; Bergmann, Jean-François

    2013-10-01

We aimed to evaluate the impact of physicians' educational programs in the reduction of inappropriate intravenous lines in internal medicine. Fifty-six French internal medicine units were enrolled in a nationwide, prospective, blinded, randomized controlled trial. Forms describing the patients with an intravenous line and internal medicine department characteristics were filled out on 2 separate days in January and April 2007. Following the first visit, all units were randomly assigned to either a specific education program on the appropriate indications of an intravenous line, during February and March 2007, or no training (control group). The Investigators' Committee then blindly evaluated the clinical relevance of the intravenous line according to pre-established criteria. The primary outcome was the percentage of inappropriate intravenous lines. During January 2007, intravenous lines were used in 475 (24.9%) of the 1910 hospitalized patients. Of these, 80 (16.8%) were considered inappropriate. In April 2007, 416 (22.8%) of the 1823 hospitalized patients received an intravenous line, which was considered inappropriate in 10.2% (21/205) of patients managed by trained physicians, versus 16.6% (35/211) of patients in the control group (relative difference 39%; 95% confidence interval, -0.6-13.3; P = .05). Reduced intravenous administration of fluids, antibiotics, and analgesics accounted for the observed decrease. The use of a simple education program reduced the rate of inappropriate intravenous lines by almost 40% in an internal medicine setting (NCT01633307). Copyright © 2013 Elsevier Inc. All rights reserved.
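The "almost 40%" relative reduction follows directly from the April rates reported in the abstract. A quick check using the reported counts:

```python
# April 2007 rates of inappropriate IV lines, from the abstract
inappropriate_trained = 21 / 205   # units with trained physicians (10.2%)
inappropriate_control = 35 / 211   # control units (16.6%)

# Relative reduction attributable to the education program
relative_reduction = (inappropriate_control - inappropriate_trained) / inappropriate_control
print(round(relative_reduction * 100))  # 38 (%), i.e. "almost 40%"
```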

  12. Effects of intravenous diclofenac on postoperative sore throat in ...

    African Journals Online (AJOL)

    Effects of intravenous diclofenac on postoperative sore throat in patients undergoing laparoscopic surgery at Aga Khan University Hospital, Nairobi: A prospective, randomized, double blind controlled trial.

  13. Low-dose intravenous lidocaine as treatment for proctalgia fugax.

    Science.gov (United States)

    Peleg, Roni; Shvartzman, Pesach

    2002-01-01

Proctalgia fugax is characterized by sudden attacks of pain of short duration in the internal anal sphincter and anorectal ring. Description of the influence of intravenous lidocaine treatment for proctalgia fugax. A 28-year-old patient suffering from proctalgia fugax for 8 months. Conventional treatment efforts did not improve his condition. A single dose of an intravenous lidocaine infusion completely stopped his pain attacks. Based on the experience reported in this case and the potential benefit of this treatment for proctalgia fugax, controlled studies comparing intravenous lidocaine with placebo should be conducted to confirm the observation and to provide a more concrete basis for the use of intravenous lidocaine for this indication.

  14. Big Data for Precision Medicine

    Directory of Open Access Journals (Sweden)

    Daniel Richard Leff

    2015-09-01

    Full Text Available This article focuses on the potential impact of big data analysis to improve health, prevent and detect disease at an earlier stage, and personalize interventions. The role that big data analytics may have in interrogating the patient electronic health record toward improved clinical decision support is discussed. We examine developments in pharmacogenetics that have increased our appreciation of the reasons why patients respond differently to chemotherapy. We also assess the expansion of online health communications and the way in which this data may be capitalized on in order to detect public health threats and control or contain epidemics. Finally, we describe how a new generation of wearable and implantable body sensors may improve wellbeing, streamline management of chronic diseases, and improve the quality of surgical implants.

  15. Big Data hvor N=1

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2017-01-01

Research on the use of 'big data' in health has only just begun and may in time become a great help in organizing more personalized and holistic healthcare for patients with multiple chronic conditions. Personal health technology, briefly presented in this chapter, holds great potential for carrying out 'big data' analyses for the individual person, that is, where N=1. There are major technological challenges in building technologies and methods for collecting and handling personal data that can be shared across systems in a standardized, responsible, robust, secure and non...

  16. George and the big bang

    CERN Document Server

    Hawking, Lucy; Parsons, Gary

    2012-01-01

    George has problems. He has twin baby sisters at home who demand his parents’ attention. His beloved pig Freddy has been exiled to a farm, where he’s miserable. And worst of all, his best friend, Annie, has made a new friend whom she seems to like more than George. So George jumps at the chance to help Eric with his plans to run a big experiment in Switzerland that seeks to explore the earliest moment of the universe. But there is a conspiracy afoot, and a group of evildoers is planning to sabotage the experiment. Can George repair his friendship with Annie and piece together the clues before Eric’s experiment is destroyed forever? This engaging adventure features essays by Professor Stephen Hawking and other eminent physicists about the origins of the universe and ends with a twenty-page graphic novel that explains how the Big Bang happened—in reverse!

  17. Did the Big Bang begin?

    International Nuclear Information System (INIS)

    Levy-Leblond, J.

    1990-01-01

    It is argued that the age of the universe may well be numerically finite (20 billion years or so) and conceptually infinite. A new and natural time scale is defined on a physical basis using group-theoretical arguments. An additive notion of time is obtained according to which the age of the universe is indeed infinite. In other words, never did the Big Bang begin. This new time scale is not supposed to replace the ordinary cosmic time scale, but to supplement it (in the same way as rapidity has taken a place by the side of velocity in Einsteinian relativity). The question is discussed within the framework of conventional (big-bang) and classical (nonquantum) cosmology, but could easily be extended to more elaborate views, as the purpose is not so much to modify present theories as to reach a deeper understanding of their meaning

  18. Big Data in Drug Discovery.

    Science.gov (United States)

    Brown, Nathan; Cambruzzi, Jean; Cox, Peter J; Davies, Mark; Dunbar, James; Plumbley, Dean; Sellwood, Matthew A; Sim, Aaron; Williams-Jones, Bryn I; Zwierzyna, Magdalena; Sheppard, David W

    2018-01-01

    Interpretation of Big Data in the drug discovery community should enhance project timelines and reduce clinical attrition through improved early decision making. The issues we encounter start with the sheer volume of data and how we first ingest it before building an infrastructure to house it to make use of the data in an efficient and productive way. There are many problems associated with the data itself including general reproducibility, but often, it is the context surrounding an experiment that is critical to success. Help, in the form of artificial intelligence (AI), is required to understand and translate the context. On the back of natural language processing pipelines, AI is also used to prospectively generate new hypotheses by linking data together. We explain Big Data from the context of biology, chemistry and clinical trials, showcasing some of the impressive public domain sources and initiatives now available for interrogation. © 2018 Elsevier B.V. All rights reserved.

  19. Big Data and central banks

    OpenAIRE

    David Bholat

    2015-01-01

    This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  20. Big Bang or vacuum fluctuation

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.

    1980-01-01

    Some general properties of vacuum fluctuations in quantum field theory are described. The connection between the ''energy dominance'' of the energy density of vacuum fluctuations in curved space-time and the presence of singularity is discussed. It is pointed out that a de-Sitter space-time (with the energy density of the vacuum fluctuations in the Einstein equations) that matches the expanding Friedman solution may describe the history of the Universe before the Big Bang. (P.L.)

  1. Big bang is not needed

    Energy Technology Data Exchange (ETDEWEB)

    Allen, A.D.

    1976-02-01

    Recent computer simulations indicate that a system of n gravitating masses breaks up, even when the total energy is negative. As a result, almost any initial phase-space distribution results in a universe that eventually expands under the Hubble law. Hence Hubble expansion implies little regarding an initial cosmic state. Especially it does not imply the singularly dense superpositioned state used in the big bang model.

  2. Big data: the management revolution.

    Science.gov (United States)

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  3. Big Data Comes to School

    Directory of Open Access Journals (Sweden)

    Bill Cope

    2016-03-01

    Full Text Available The prospect of “big data” at once evokes optimistic views of an information-rich future and concerns about surveillance that adversely impacts our personal and private lives. This overview article explores the implications of big data in education, focusing by way of example on data generated by student writing. We have chosen writing because it presents particular complexities, highlighting the range of processes for collecting and interpreting evidence of learning in the era of computer-mediated instruction and assessment as well as the challenges. Writing is significant not only because it is central to the core subject area of literacy; it is also an ideal medium for the representation of deep disciplinary knowledge across a number of subject areas. After defining what big data entails in education, we map emerging sources of evidence of learning that separately and together have the potential to generate unprecedented amounts of data: machine assessments, structured data embedded in learning, and unstructured data collected incidental to learning activity. Our case is that these emerging sources of evidence of learning have significant implications for the traditional relationships between assessment and instruction. Moreover, for educational researchers, these data are in some senses quite different from traditional evidentiary sources, and this raises a number of methodological questions. The final part of the article discusses implications for practice in an emerging field of education data science, including publication of data, data standards, and research ethics.

  4. INTRAVENOUS REGIONAL ANTIBIOTIC PERFUSION THERAPY AS AN ADJUNCTIVE TREATMENT FOR DIGITAL LESIONS IN SEABIRDS.

    Science.gov (United States)

    Fiorello, Christine V

    2017-03-01

    Foot infections are a common problem among seabirds in wildlife rehabilitation. Pododermatitis and digital infections are often challenging to treat because of the presence of suboptimal substrates, abnormal weight-bearing due to injuries, and suboptimal nutritional or health status. Seabirds represent the majority of animals requiring rehabilitation after oil spills, and foot problems are a common reason for euthanasia among these birds. Antibiotic intravenous regional perfusion therapy is frequently used in humans and other species to treat infections of the distal extremities, but it has not been evaluated in seabirds. During the 2015 Refugio oil spill response, four birds with foot lesions (pododermatitis, osteomyelitis, or both) were treated with ampicillin/sulbactam administered intravenously to the affected limb(s) in addition to systemic antibiotics and anti-inflammatories. Three of the birds, all brown pelicans (Pelecanus occidentalis), recovered rapidly and were released. Two of these birds had acute pododermatitis and were treated once with intravenous regional perfusion. They were released approximately 3 wk after the perfusion therapy. The third pelican had osteomyelitis of a digit. It was treated twice with intravenous regional perfusion and was released about 1 mo after the initial perfusion therapy. The fourth bird, a Pacific loon (Gavia pacifica), was treated once with perfusion therapy but did not respond to treatment and was euthanatized. No serious adverse effects were observed. This technique should be explored further in avian species.

  5. Big Bayou Creek and Little Bayou Creek Watershed Monitoring Program

    Energy Technology Data Exchange (ETDEWEB)

    Kszos, L.A.; Peterson, M.J.; Ryon; Smith, J.G.

    1999-03-01

    Biological monitoring of Little Bayou and Big Bayou creeks, which border the Paducah Site, has been conducted since 1987. Biological monitoring was conducted by the University of Kentucky from 1987 to 1991 and by staff of the Environmental Sciences Division (ESD) at Oak Ridge National Laboratory (ORNL) from 1991 through March 1999. In March 1998, renewed Kentucky Pollutant Discharge Elimination System (KPDES) permits were issued to the US Department of Energy (DOE) and US Enrichment Corporation. The renewed DOE permit requires that a watershed monitoring program be developed for the Paducah Site within 90 days of the effective date of the renewed permit. This plan outlines the sampling and analysis that will be conducted for the watershed monitoring program. The objectives of the watershed monitoring are to (1) determine whether discharges from the Paducah Site and the Solid Waste Management Units (SWMUs) associated with the Paducah Site are adversely affecting instream fauna, (2) assess the ecological health of Little Bayou and Big Bayou creeks, (3) assess the degree to which abatement actions ecologically benefit Big Bayou Creek and Little Bayou Creek, (4) provide guidance for remediation, (5) provide an evaluation of changes in potential human health concerns, and (6) provide data which could be used to assess the impact of inadvertent spills or fish kills. According to the permit, cleanup should result in these watersheds [Big Bayou and Little Bayou creeks] achieving compliance with the applicable water quality criteria.

  6. Small Area Model-Based Estimators Using Big Data Sources

    Directory of Open Access Journals (Sweden)

    Marchetti Stefano

    2015-06-01

    Full Text Available The timely, accurate monitoring of social indicators, such as poverty or inequality, on a fine-grained spatial and temporal scale is a crucial tool for understanding social phenomena and policymaking, but poses a great challenge to official statistics. This article argues that an interdisciplinary approach, combining the body of statistical research in small area estimation with the body of research in social data mining based on Big Data, can provide novel means to tackle this problem successfully. Big Data derived from the digital crumbs that humans leave behind in their daily activities are in fact providing ever more accurate proxies of social life. Social data mining from these data, coupled with advanced model-based techniques for fine-grained estimates, have the potential to provide a novel microscope through which to view and understand social complexity. This article suggests three ways to use Big Data together with small area estimation techniques, and shows how Big Data has the potential to mirror aspects of well-being and other socioeconomic phenomena.

  7. Surface urban heat island across 419 global big cities.

    Science.gov (United States)

    Peng, Shushi; Piao, Shilong; Ciais, Philippe; Friedlingstein, Pierre; Ottle, Catherine; Bréon, François-Marie; Nan, Huijuan; Zhou, Liming; Myneni, Ranga B

    2012-01-17

    Urban heat island is among the most evident aspects of human impacts on the earth system. Here we assess the diurnal and seasonal variation of surface urban heat island intensity (SUHII) defined as the surface temperature difference between urban area and suburban area measured from the MODIS. Differences in SUHII are analyzed across 419 global big cities, and we assess several potential biophysical and socio-economic driving factors. Across the big cities, we show that the average annual daytime SUHII (1.5 ± 1.2 °C) is higher than the annual nighttime SUHII (1.1 ± 0.5 °C) (P < 0.001). But no correlation is found between daytime and nighttime SUHII across big cities (P = 0.84), suggesting different driving mechanisms between day and night. The distribution of nighttime SUHII correlates positively with the difference in albedo and nighttime light between urban area and suburban area, while the distribution of daytime SUHII correlates negatively across cities with the difference of vegetation cover and activity between urban and suburban areas. Our results emphasize the key role of vegetation feedbacks in attenuating SUHII of big cities during the day, in particular during the growing season, further highlighting that increasing urban vegetation cover could be one effective way to mitigate the urban heat island effect.
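The SUHII metric defined in this abstract is simply the mean urban land-surface temperature minus the mean suburban land-surface temperature. A minimal sketch of that computation (using made-up temperature values, not actual MODIS data):

```python
def suhii(urban_lst, suburban_lst):
    """Surface urban heat island intensity (deg C):
    mean urban LST minus mean suburban LST."""
    return sum(urban_lst) / len(urban_lst) - sum(suburban_lst) / len(suburban_lst)

# Hypothetical daytime land-surface temperatures (deg C) for one city
urban = [34.2, 35.1, 33.8, 34.9]
suburban = [32.9, 33.2, 33.0, 33.3]
print(round(suhii(urban, suburban), 2))  # → 1.4
```

A positive value indicates the urban core is warmer than its surroundings, matching the ~1.5 °C daytime average the study reports across 419 cities.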

  8. Solar urticaria successfully treated with intravenous immunoglobulin.

    LENUS (Irish Health Repository)

    Hughes, R

    2012-02-01

    Idiopathic solar urticaria (SU) is a rare, debilitating photodermatosis, which may be difficult to treat. First-line treatment with antihistamines is effective in mild cases, but remission after phototherapeutic induction of tolerance is often short-lived. Other treatment options include plasma exchange, photopheresis and cyclosporin. We present two cases of severe, idiopathic SU, which were resistant to conventional treatment. Both patients achieved remission after administration of intravenous immunoglobulin (IVIg) and have remained in remission at 13 months and 4 years, respectively. There are only two case reports of successful treatment of solar urticaria with IVIg. In our experience IVIg given at a total dose of 2 g/kg over several 5-day courses about a month apart is an effective treatment option for severe idiopathic SU. It is also generally safe, even if certainly subject to significant theoretical risks, such as induction of viral infection or anaphylaxis.
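The dosing described (a total of 2 g/kg per 5-day course) is straightforward weight-based arithmetic. As an illustration only, with a hypothetical 70 kg patient (this is not clinical guidance):

```python
def ivig_course(weight_kg, total_dose_g_per_kg=2.0, days=5):
    """Split a total weight-based IVIg dose evenly across one course.
    Returns (total grams, grams per day)."""
    total_g = total_dose_g_per_kg * weight_kg
    return total_g, total_g / days

total, daily = ivig_course(70)
print(total, daily)  # → 140.0 28.0  (grams total, grams per day)
```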

  9. Anaphylaxis after intravenous infusion of dexketoprofen trometamol

    Directory of Open Access Journals (Sweden)

    Sertac Guler

    2016-09-01

    Full Text Available Dexketoprofen trometamol (DT), a nonsteroidal anti-inflammatory drug, is a highly water-soluble salt and active enantiomer of rac-ketoprofen. Its parenteral form is commonly used for acute pain management in emergency departments of our country. Side effects such as diarrhea, indigestion, nausea, stomach pain, and vomiting may be seen after the use of DT. Anaphylactic shock (AS) secondary to infusion of DT is very rare and, to our knowledge, this is the first case report describing this side effect. This case report was presented to emphasize that AS may be seen after the use of DT. Keywords: Anaphylactic shock, Dexketoprofen trometamol, Intravenous infusion (MeSH database)

  10. Switching between intravenous and subcutaneous trastuzumab

    DEFF Research Database (Denmark)

    Gligorov, Joseph; Curigliano, Giuseppe; Müller, Volkmar

    2017-01-01

    AIM: To assess the safety and tolerability of switching between subcutaneous (SC) and intravenous (IV) trastuzumab in the PrefHer study (NCT01401166). PATIENTS AND METHODS: Patients with HER2-positive early breast cancer completed (neo)adjuvant chemotherapy and were randomised to receive four...... cycles of SC trastuzumab, via single-use injection device (SID; Cohort 1) or hand-held syringe (Cohort 2), followed by four cycles of IV, or vice versa (the crossover period presented here) as part of their 18 standard cycles of adjuvant trastuzumab treatment. Adverse events (AEs) were reported using....... Rates of clinically important events, including grade ≥3 AEs, serious AEs, AEs leading to study drug discontinuation and cardiac AEs, were low and similar between treatment arms (trastuzumab were observed. CONCLUSIONS: PrefHer revealed...

  11. Intravenous urography in children and youth

    International Nuclear Information System (INIS)

    Pedersen, H.K.; Gudmundsen, T.E.; Oestensen, H.; Pape, J.F.

    1987-01-01

    This report derives from Tromsoe in northern Norway. In a retrospective study of the indications for intravenous urography (IU) and the findings at IU in 740 patients (451 girls and 289 boys) aged 0-19 years, we found that urinary tract infections accounted for 69.4% of the IU in females and 30.1% of the IU in males, most often seen in the youngest patients. The pathological findings most frequently seen were anomalies (17 females and 10 males) and urinary tract obstruction (3 females and 15 males). The present study indicates the following: first, that the yield of IU in the primary investigation of children and youth suffering from enuresis and non-specific abdominal disturbances is small; and second, that the use of IU in children and youth with urinary tract infection and haematuria should be questioned and reconsidered. (orig.)

  12. Preparation of a Homologous (Human) Intravenous Botulinal Immune Globulin.

    Science.gov (United States)

    1983-05-01

    [Abstract text illegible in the scanned source; legible fragments mention plasma treated with fumed silica, sold under the brand names Aerosil (Degussa, Inc.) and Cab-O-Sil.]

  13. 1-3-7 minute intravenous urography

    International Nuclear Information System (INIS)

    Bahk, Yong Whee; Yoon, Sei Chul; Lee, Myung Hee

    1980-01-01

    Intravenous urography (IVU) as it is used widely today was probably started in the early 1950's after the introduction of triiodobenzoic acid compounds as contrast media. This long cherished traditional method consists of taking radiograms at 5, 15 and 25 minutes after the injection of contrast medium. There are a few modifications of this standard urographic examination such as five minute IVU (Woodruff, 1959), minute-sequence pyelogram (Maxwell et al., 1964), drip infusion pyelography (Schencker, 1964) and nephrotomography (Evans et al., 1955). The present study has been undertaken to test if the conventional standard IVU can be more rapidly performed without losing essential informational contents of urograms. In this new clinical trial, urograms were taken at the end of 1, 3 and 7 minutes instead of 5, 15 and 25 minutes after the intravenous injection of contrast medium. We injected 40 ml of meglumine diatrizoate solution within 30 seconds using an 18G iv needle. (The amount of injected contrast medium has been reduced recently to an ordinary single dose of 20 ml for subjects weighing less than 8 kg). Upon viewing the 7 minute film in front of an automatic processor, the examination was terminated after obtaining an upright view unless any further radiogram was indicated. As shown in Tables and Figures, our new 1-3-7 minute method has been proven to provide us with as much essential and useful information as conventional 5-15-25 minute urography. Thus, we were able to finish one examination within 10 minutes without losing any necessary diagnostic information. In some patients with obstructive uropathy, such as stones, the examination was extended as long as desired. Side reactions were occasional nausea, flushing and rare mild vomiting, which never prevented the examination.

  14. Big data reduction framework for value creation in sustainable enterprises

    OpenAIRE

    Rehman, Muhammad Habib ur; Chang, Victor; Batool, Aisha; Teh, Ying Wah

    2016-01-01

    Value creation is a major sustainability factor for enterprises, in addition to profit maximization and revenue generation. Modern enterprises collect big data from various inbound and outbound data sources. The inbound data sources handle data generated from the results of business operations, such as manufacturing, supply chain management, marketing, and human resource management, among others. Outbound data sources handle customer-generated data which are acquired directly or indirectly fr...

  15. Small data, data infrastructures and big data (Working Paper 1)

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2014-01-01

    The production of academic knowledge has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will, however, continue to be important in the future because of their utility in answer...

  16. Small data in the era of big data

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2015-01-01

    Academic knowledge building has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will however, we argue, continue to be popular and valuable in the fut...

  17. Real-Time Information Extraction from Big Data

    Science.gov (United States)

    2015-10-01

    Introduction Enormous amounts of data are being generated by a large number of sensors and devices (Internet of Things: IoT), and this data is...brief summary in Section 7. Data Access Patterns for Current and Big Data Systems Many current solution architectures rely on accessing data resident...by highly skilled human experts based on their intuition and vast knowledge. We do not have, and cannot produce enough experts to fill our

  18. An overview of big data and data science education at South African universities

    Directory of Open Access Journals (Sweden)

    Eduan Kotzé

    2016-02-01

    Full Text Available Man and machine are generating data electronically at an astronomical speed and in such a way that society is experiencing cognitive challenges to analyse this data meaningfully. Big data firms, such as Google and Facebook, identified this problem several years ago and are continuously developing new technologies or improving existing technologies in order to facilitate the cognitive analysis process of these large data sets. The purpose of this article is to contribute to our theoretical understanding of the role that big data might play in creating new training opportunities for South African universities. The article investigates emerging literature on the characteristics and main components of big data, together with the Hadoop application stack as an example of big data technology. Due to the rapid development of big data technology, a paradigm shift of human resources is required to analyse these data sets; therefore, this study examines the state of big data teaching at South African universities. This article also provides an overview of possible big data sources for South African universities, as well as relevant big data skills that data scientists need. The study also investigates existing academic programs in South Africa, where the focus is on teaching advanced database systems. The study found that big data and data science topics are introduced to students on a postgraduate level, but that the scope is very limited. This article contributes by proposing important theoretical topics that could be introduced as part of the existing academic programs. More research is required, however, to expand these programs in order to meet the growing demand for data scientists with big data skills.

  19. Big Data Innovation Challenge : Pioneering Approaches to Data-Driven Development

    OpenAIRE

    World Bank Group

    2016-01-01

    Big data can sound remote and lacking a human dimension, with few obvious links to development and impacting the lives of the poor. Concepts such as anti-poverty targeting, market access or rural electrification seem far more relevant – and easier to grasp. And yet some of today’s most groundbreaking initiatives in these areas rely on big data. This publication profiles these and more, sho...

  20. Comparison of postinfusion phlebitis in intravenous push versus intravenous piggyback cefazolin.

    Science.gov (United States)

    Biggar, Constance; Nichols, Cynthia

    2012-01-01

    Reducing health care costs without adversely affecting patient safety is a constant challenge for health care institutions. Cefazolin prophylaxis via intravenous push (IVP) is more cost-effective than via intravenous piggyback (IVPB). The purpose of this study was to determine whether patient safety would be compromised (ie, an increased rate of phlebitis) with a change to the IVP method. Rates of phlebitis in orthopedic surgical patients receiving cefazolin prophylaxis via IVP versus IVPB were evaluated in a prospective quasi-experimental design of 240 patients. The first 120 subjects received cefazolin via IVPB, and the second 120 subjects received it via IVP. Results indicated no statistically significant difference in phlebitis rates in the IVPB (3.4%) versus the IVP groups (3.3%).
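The statistical comparison in this abstract can be reproduced with a standard pooled two-proportion z-test. A stdlib-only sketch, assuming the reported rates (3.4% and 3.3% of 120 patients per arm) correspond to roughly 4 phlebitis events in each group (the exact counts are not given in the abstract):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Assumed counts: ~4 events per arm of 120 patients
z, p = two_proportion_z(4, 120, 4, 120)
print(round(z, 3), round(p, 3))  # → 0.0 1.0
```

With equal event counts the statistic is zero and the p-value is 1, consistent with the study's finding of no statistically significant difference between IVP and IVPB.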

  1. Turning big bang into big bounce. I. Classical dynamics

    Science.gov (United States)

    Dzierżak, Piotr; Małkiewicz, Przemysław; Piechocki, Włodzimierz

    2009-11-01

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.

  2. Breast abscess after intravenous methamphetamine injection into the breast.

    Science.gov (United States)

    Kistler, Amanda; Ajkay, Nicolas

    2018-05-01

    Intravenous drug use is a problem plaguing our society. We present a case of a young female who injected methamphetamine into her mammary vein, resulting in the formation of a breast abscess. This case demonstrates a rare but dangerous complication of intravenous drug use and a possible differential diagnosis in a patient presenting with a breast abscess. © 2017 Wiley Periodicals, Inc.

  3. Hydrothorax, hydromediastinum and pericardial effusion: a complication of intravenous alimentation.

    Science.gov (United States)

    Damtew, B; Lewandowski, B

    1984-01-01

    Complications secondary to intravenous alimentation are rare but potentially lethal. Massive bilateral pleural effusions and a pericardial effusion developed in a patient receiving prolonged intravenous alimentation. Severe respiratory distress and renal failure ensued. He recovered with appropriate treatment. PMID:6428731

  4. Microbiological quality of some brands of intravenous fluids ...

    African Journals Online (AJOL)

    Microbiological quality of some brands of intravenous fluids produced by some pharmaceutical companies in Nigeria was investigated. Membrane filtration method was used for concentration of contaminating organisms in the intravenous fluids. Thioglycollate medium, Tryptone Soya broth, Brilliant Green Agar ...

  5. Cost-minimization of mabthera intravenous versus subcutaneous administration

    NARCIS (Netherlands)

    Bax, P.; Postma, M.J.

    2013-01-01

    Objectives: To identify and compare all costs related to preparing and administrating MabThera for the intravenous and subcutaneous formulations in Dutch hematological patients. The a priori notion is that the costs of subcutaneous MabThera injections are lower compared to intravenous infusion due

  6. Big Data Knowledge in Global Health Education.

    Science.gov (United States)

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  7. The Uses of Big Data in Cities.

    Science.gov (United States)

    Bettencourt, Luís M A

    2014-03-01

    There is much enthusiasm currently about the possibilities created by new and more extensive sources of data to better understand and manage cities. Here, I explore how big data can be useful in urban planning by formalizing the planning process as a general computational problem. I show that, under general conditions, new sources of data coordinated with urban policy can be applied following fundamental principles of engineering to achieve new solutions to important age-old urban problems. I also show that comprehensive urban planning is computationally intractable (i.e., practically impossible) in large cities, regardless of the amounts of data available. This dilemma between the need for planning and coordination and its impossibility in detail is resolved by the recognition that cities are first and foremost self-organizing social networks embedded in space and enabled by urban infrastructure and services. As such, the primary role of big data in cities is to facilitate information flows and mechanisms of learning and coordination by heterogeneous individuals. However, processes of self-organization in cities, as well as of service improvement and expansion, must rely on general principles that enforce necessary conditions for cities to operate and evolve. Such ideas are the core of a developing scientific theory of cities, which is itself enabled by the growing availability of quantitative data on thousands of cities worldwide, across different geographies and levels of development. These three uses of data and information technologies in cities constitute then the necessary pillars for more successful urban policy and management that encourages, and does not stifle, the fundamental role of cities as engines of development and innovation in human societies.

  8. The Natural Science Underlying Big History

    Directory of Open Access Journals (Sweden)

    Eric J. Chaisson

    2014-01-01

    Full Text Available Nature’s many varied complex systems—including galaxies, stars, planets, life, and society—are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution’s major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density—contrasting with information content or entropy production—is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated.
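Energy rate density, the metric highlighted above, is just energy flow per unit time per unit mass. A back-of-the-envelope illustration for the Sun, using round textbook values (luminosity ≈ 3.8 × 10³³ erg/s, mass ≈ 2 × 10³³ g):

```python
def energy_rate_density(energy_flow_erg_per_s, mass_g):
    """Energy rate density in erg s^-1 g^-1 (energy flow per unit mass)."""
    return energy_flow_erg_per_s / mass_g

# The Sun: luminosity ~3.8e33 erg/s, mass ~2e33 g
print(round(energy_rate_density(3.8e33, 2e33), 2))  # → 1.9
```

A value of about 2 erg s⁻¹ g⁻¹ is typical of stars in this framework, far below the values associated with living systems and human society.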

  9. Big Data Strategy for Telco: Network Transformation

    OpenAIRE

    F. Amin; S. Feizi

    2014-01-01

    Big data has the potential to improve the quality of services; enable infrastructure that businesses depend on to adapt continually and efficiently; improve the performance of employees; help organizations better understand customers; and reduce liability risks. Analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big Data presents new method to reverse the way and improve profitability. The benefits of Big Data and ...

  10. Big Data in Shipping - Challenges and Opportunities

    OpenAIRE

    Rødseth, Ørnulf Jan; Perera, Lokukaluge Prasad; Mo, Brage

    2016-01-01

    Big Data is getting popular in shipping, where large amounts of information are collected to better understand and improve logistics, emissions, energy consumption and maintenance. Constraints on the use of big data include the cost and quality of on-board sensors and data acquisition systems, satellite communication, data ownership and technical obstacles to the effective collection and use of big data. New protocol standards may simplify the process of collecting and organizing the data, including in...

  11. Big Data in Action for Government : Big Data Innovation in Public Services, Policy, and Engagement

    OpenAIRE

    World Bank

    2017-01-01

    Governments have an opportunity to harness big data solutions to improve productivity, performance and innovation in service delivery and policymaking processes. In developing countries, governments have an opportunity to adopt big data solutions and leapfrog traditional administrative approaches

  12. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background for working with big data, by introducing novel optimization algorithms and codes capable of operating in the big data setting and presenting applications of big data optimization for interested academics and practitioners, to the benefit of society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyze large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  13. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    Full Text Available The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing. Big data focuses on the temporal stability of the association rather than on the causal relationship, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from the big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  14. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  15. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating rather than hypothesis-testing. Big data focuses on the temporal stability of the association rather than on the causal relationship, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from the big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  16. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  17. Big Data's Role in Precision Public Health.

    Science.gov (United States)

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  18. Big data analytics with R and Hadoop

    CERN Document Server

    Prajapati, Vignesh

    2013-01-01

    Big Data Analytics with R and Hadoop is a tutorial-style book that focuses on all the powerful big data tasks that can be achieved by integrating R and Hadoop. This book is ideal for R developers who are looking for a way to perform big data analytics with Hadoop. It is also aimed at those who know Hadoop and want to build intelligent applications over big data with R packages. It would be helpful if readers have a basic knowledge of R.

  19. Towards Geo-spatial Information Science in Big Data Era

    Directory of Open Access Journals (Sweden)

    LI Deren

    2016-04-01

    Full Text Available Since the 1990s, with the advent of the worldwide information revolution and the development of the internet, geospatial information science has also come of age, which pushed forward the building of the digital Earth and the cyber city. As we entered the 21st century, with the development and integration of global information technology and industrialization, the internet of things and cloud computing came into being, and human society entered the big data era. This article covers the key features (ubiquity, multi-dimensionality and dynamics, internet+networking, full automation and real-time operation, from sensing to recognition, crowdsourcing and VGI, and service orientation) of geospatial information science in the big data era and addresses the key technical issues (a non-linear four-dimensional Earth reference frame system, space-based enhanced GNSS, unified space-air-ground network communication techniques, on-board processing techniques for multi-source image data, smart interface service techniques for space-borne information, space-based resource scheduling and network security, and the design and development of a payload-based multi-functional satellite platform) that need to be resolved to provide a new definition of geospatial information science in the big data era. Based on the discussion in this paper, the author finally proposes a new definition of geospatial information science (geomatics), i.e.: Geomatics is a multi-discipline science and technology which, using a systematic approach, integrates all the means for spatio-temporal data acquisition, information extraction, networked management, knowledge discovering, spatial sensing and recognition, as well as intelligent location based services of any physical objects and human activities around the earth and its environment. Starting from this new definition, geospatial information science will find many more opportunities and tasks in the big data era for the generation of the smart earth and the smart city. Our profession

  20. Big Data; A Management Revolution : The emerging role of big data in businesses

    OpenAIRE

    Blasiak, Kevin

    2014-01-01

    Big data is a term that was coined in 2012 and has since emerged as one of the top trends in business and technology. Big data is an agglomeration of different technologies resulting in data processing capabilities that were previously unattainable. Big data is generally characterized by 4 factors: volume, velocity and variety. These three factors distinguish it from traditional data use. The possibilities to utilize this technology are vast. Big data technology has touch points in differ...

  1. BigDataBench: a Big Data Benchmark Suite from Internet Services

    OpenAIRE

    Wang, Lei; Zhan, Jianfeng; Luo, Chunjie; Zhu, Yuqing; Yang, Qiang; He, Yongqiang; Gao, Wanling; Jia, Zhen; Shi, Yingjie; Zhang, Shujie; Zheng, Chen; Lu, Gang; Zhan, Kent; Li, Xiaona; Qiu, Bizhu

    2014-01-01

    As architecture, systems, and data management communities pay greater attention to innovative big data systems and architectures, the pressure of benchmarking and evaluating these systems rises. Considering the broad use of big data systems, big data benchmarks must include diversity of data and workloads. Most of the state-of-the-art big data benchmarking efforts target evaluating specific types of applications or system software stacks, and hence they are not qualified for serving the purpo...

  2. The faces of Big Science.

    Science.gov (United States)

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  3. Big Data and central banks

    Directory of Open Access Journals (Sweden)

    David Bholat

    2015-04-01

    Full Text Available This commentary recaps a Centre for Central Banking Studies event held at the Bank of England on 2–3 July 2014. The article covers three main points. First, it situates the Centre for Central Banking Studies event within the context of the Bank’s Strategic Plan and initiatives. Second, it summarises and reflects on major themes from the event. Third, the article links central banks’ emerging interest in Big Data approaches with their broader uptake by other economic agents.

  4. Inhomogeneous Big Bang Nucleosynthesis Revisited

    OpenAIRE

    Lara, J. F.; Kajino, T.; Mathews, G. J.

    2006-01-01

    We reanalyze the allowed parameters for inhomogeneous big bang nucleosynthesis in light of the WMAP constraints on the baryon-to-photon ratio and a recent measurement which has set the neutron lifetime to be 878.5 +/- 0.7 +/- 0.3 seconds. For a set baryon-to-photon ratio the new lifetime reduces the mass fraction of He4 by 0.0015 but does not significantly change the abundances of other isotopes. This enlarges the region of concordance between He4 and deuterium in the parameter space of the b...

  5. Comparative validity of brief to medium-length Big Five and Big Six personality questionnaires

    NARCIS (Netherlands)

    Thalmayer, A.G.; Saucier, G.; Eigenhuis, A.

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five

  6. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    Science.gov (United States)

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  7. Big Data and surveillance, part 1: Definitions and discussions surrounding Big Data

    NARCIS (Netherlands)

    Timan, Tjerk

    2016-01-01

    Following a (fairly short) lecture on surveillance and Big Data, I was asked to go somewhat deeper into the theme, the definitions, and the various questions associated with big data. In this first part I will try to set out Big Data theory and

  8. 77 FR 27245 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN

    Science.gov (United States)

    2012-05-09

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N069; FXRS1265030000S3-123-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN AGENCY: Fish and... plan (CCP) and environmental assessment (EA) for Big Stone National Wildlife Refuge (Refuge, NWR) for...

  9. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Science.gov (United States)

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  10. An analysis of cross-sectional differences in big and non-big public accounting firms' audit programs

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans); Drieenhuizen, F.; Stein, M.T.; Simunic, D.A.

    2006-01-01

    A significant body of prior research has shown that audits by the Big 5 (now Big 4) public accounting firms are quality differentiated relative to non-Big 5 audits. This result can be derived analytically by assuming that Big 5 and non-Big 5 firms face different loss functions for "audit failures"

  11. Intravenous coronary angiography using the orbital radiation

    Energy Technology Data Exchange (ETDEWEB)

    Ohtsuka, Sadanori; Yamaguchi, Iwao; Wu, Jin; Takeda, Toru; Itai, Yuji; Maruhashi, Akira [Tsukuba Univ., Ibaraki (Japan). Inst. of Clinical Medicine; Hyodo, Kazuyuki; Ando, Masami [High Energy Accelerator Research Org., Tsukuba, Ibaraki (Japan). Inst. of Materials Structure Science

    2002-09-01

    This review described the progress and current status of intravenous coronary angiography (IVCAG) using the orbital radiation generated by a synchrotron. The authors diagnosed 4 patients with coronary artery disease in 1996 and 33 by 2000. A monochromatic 2-D X-ray beam (2.0 × 10¹⁰ photons/mm²/sec, 130 × 75 mm) at 37 keV was obtained by asymmetric reflection of synchrotron radiation generated by electrons accelerated to 5.0 GeV. The use of a 2-D beam made it possible to obtain dynamic IVCAG images, in contrast with the static images given by a 1-D slit beam. Intermittent irradiation (5 msec/100 msec) reduced the exposure dose to <750 mSv. Images were recorded on a Sony digital video recorder placed behind the Sony charge-coupled device (CCD) camera and a Toshiba photomultiplier, and resolved the arteries to a precision of 1 mm. IVCAG by synchrotron radiation reduced the burden on patients and is expected to be more widely used in the future. (K.H.)

  12. Intravenous injection of ioxilan, iohexol and diatrizoate

    International Nuclear Information System (INIS)

    Thomsen, H.S.; Dorph, S.; Mygind, T.; Sovak, M.; Nielsen, H.; Rygaard, H.; Larsen, S.; Skaarup, P.; Hemmingsen, L.; Holm, J.

    1988-01-01

    Effects of intravenous ioxilan, a new third generation non-ionic contrast medium, diatrizoate, iohexol and saline on urine profiles were compared. Albumin, glucose, sodium, phosphate, and the enzymes NAG, LDH and GGT were followed in 24 normal rats over 7 days. Diatrizoate significantly affected all profile components during the first two hours. Albuminuria was significantly greater after diatrizoate than after iohexol or ioxilan, and excretion of glucose, LDH and GGT was significantly higher than after ioxilan. Both iohexol and ioxilan increased the excretion of albumin, LDH and GGT, while iohexol also significantly increased excretion of glucose and sodium. There was a greater excretion of glucose and GGT after iohexol than after ioxilan. Saline did not induce any changes. At day 7, serum sodium, urea, creatinine, and albumin were normal for all test substances, and kidney histology revealed no difference between the groups of animals. It is thus concluded that both high osmolar ionic and low osmolar non-ionic contrast media may cause temporary glomerular and tubular dysfunction in rats. In this model, the kidney is affected most by diatrizoate, less by iohexol, and least by ioxilan. (orig.)

  13. Cardiac complications of intravenous digital subtraction angiography

    International Nuclear Information System (INIS)

    Neergaard, K.; Dirksen, K.L.; Andersen, I.; Galloee, A.M.; Madsen, E.B.

    1989-01-01

    In a prospective study of 103 patients the incidence of cardiac events during intravenous digital subtraction angiography (i.v. DSA) was investigated. Of the 103 patients, 17 had known ischaemic heart disease. The examination was performed with an ionic contrast medium, Urografin 76% (sodium megluminediatrizoate), administered by bolus injection into the right atrium. Patients with severe cardiac disease were examined only if the procedure was considered of vital importance. Cardiac events were defined as ST-segment changes of more than 0.1 mV, changes in heart rate of more than 20%, arrhythmias and such symptoms as chest pain and dyspnoea. Ischaemic ST-segment changes during i.v. DSA were observed in approximately 20% of the patients and were not related to the presence of known ischaemic heart disease. Three patients developed angina during the procedure. Among 12 patients with known angina only one patient developed angina during the procedure. In this study chest pain was infrequent (3%), but there was a relatively high frequency of ECG changes (20%) that was not confined to patients with ischaemic heart disease. It is concluded that there is a risk of cardiac events during i.v. DSA, but the risk is not increased in patients with known ischaemic heart disease (if they do not suffer from congestive heart failure) as compared with other patients without known ischaemic heart disease. (orig.)

  14. MYCOTIC FEMORAL PSEUDOANEURYSMS FROM INTRAVENOUS DRUG ABUSE

    Directory of Open Access Journals (Sweden)

    Vojko Flis

    2004-04-01

    Full Text Available Background. Parenteral drug abuse is the most common cause of infected femoral artery pseudoaneurysms (IFAP). This complication of intravenous drug abuse is not only limb threatening but can also be life threatening. The management of IFAP is difficult and controversial. Generally speaking, ligation and excision of the pseudoaneurysm without revascularization is the accepted procedure in the majority of patients. However, it is not regarded as an appropriate procedure in cases where a high probability of amputation is expected from acute interruption of femoral artery flow. Patients, methods and results. We present three cases of young (average 20 years, range 18–24 patients with IFAP, in whom a primary reconstruction was performed due to the absence of a Doppler signal over the pedal arteries after ligation of the common femoral artery. In two of them, complications in the form of haemorrhage and repeated infection developed in the late postoperative period. The first had excision and ligation, while the second had a reconstruction made with a silver-impregnated Dacron prosthesis. None of the patients required an amputation. Conclusions. The overall prognosis, and the prognosis of reconstruction, in parenteral drug abuse patients is uncertain because there is a high incidence of postoperative drug injection despite aggressive drug rehabilitation.

  15. Potential intravenous drug interactions in intensive care

    Directory of Open Access Journals (Sweden)

    Maiara Benevides Moreira

    Full Text Available Abstract OBJECTIVE To analyze potential intravenous drug interactions, and their level of severity, associated with the administration of these drugs, based on the prescriptions of an intensive care unit. METHOD Quantitative study, with a retrospective exploratory design and descriptive statistical analysis of the ICU prescriptions of a teaching hospital from March to June 2014. RESULTS The sample consisted of 319 prescriptions and subsamples of 50 prescriptions. The mean number of drugs per patient was 9.3, and a higher probability of drug interaction inherent to polypharmacy was evidenced. The study identified severe drug interactions, such as concomitant administration of Tramadol with selective serotonin reuptake inhibitor drugs (e.g., Metoclopramide and Fluconazole), increasing the risk of seizures due to their epileptogenic actions, as well as the simultaneous use of Ranitidine-Fentanyl®, which can lead to respiratory depression. CONCLUSION A prior mapping of prescriptions enables the characterization of the drug therapy, contributing to the prevention of potential drug interactions and their clinical consequences.

  16. Radiation dose measurements in intravenous pyelography

    International Nuclear Information System (INIS)

    Egeblad, M.; Gottlieb, E.

    1975-01-01

    Intravenous pyelography (IVP) and micturition cystourethrography (MCU) are the standard procedures in the radiological examination of children with urinary tract infections and in the follow-up of these children. Gonad protection against radiation is not possible in MCU, but is partly possible for girls in IVP. It is of major importance to know the radiation dose in these procedures, especially since the examination is often repeated in the same patients. All IVPs were done by means of the usual technique, including gonad protection where possible. The thermoluminescence dosimeter was placed rectally in the girls and fixed on the scrotum in the boys. A total of 50 children was studied. The gonad dose ranged from 140 to 200 mR in the girls and from 20 to 70 mR in the boys (mean values). The radiation dose in IVP is very low compared to that of MCU, and from this point of view IVP is a dose-saving examination in the follow-up of children with urinary tract infections. [fr]

  17. Repeated intravenous doxapram induces phrenic motor facilitation.

    Science.gov (United States)

    Sandhu, M S; Lee, K Z; Gonzalez-Rothi, E J; Fuller, D D

    2013-12-01

    Doxapram is a respiratory stimulant used to treat hypoventilation. Here we investigated whether doxapram could also trigger respiratory neuroplasticity. Specifically, we hypothesized that intermittent delivery of doxapram at low doses would lead to long-lasting increases (i.e., facilitation) of phrenic motor output in anesthetized, vagotomized, and mechanically-ventilated rats. Doxapram was delivered intravenously in a single bolus (2 or 6 mg/kg) or as a series of 3 injections (2 mg/kg) at 5 min intervals. Control groups received pH-matched saline injections (vehicle) or no treatment (anesthesia time control). Doxapram evoked an immediate increase in phrenic output in all groups, but a persistent increase in burst amplitude only occurred after repeated dosing with 2 mg/kg. At 60 min following the last injection, phrenic burst amplitude was 168±24% of baseline (%BL) in the group receiving 3 injections (P < 0.05). After denervation of the carotid chemoreceptors, the phrenic response to doxapram (2 mg/kg) was reduced by 68%, suggesting that at low doses the drug was acting primarily via the carotid chemoreceptors. We conclude that intermittent application of doxapram can trigger phrenic neuroplasticity, and this approach might be of use in the context of respiratory rehabilitation following neurologic injury. © 2013.

  18. Evidence of Big Five and Aggressive Personalities in Gait Biomechanics.

    Science.gov (United States)

    Satchell, Liam; Morris, Paul; Mills, Chris; O'Reilly, Liam; Marshman, Paul; Akehurst, Lucy

    2017-01-01

    Behavioral observation techniques which relate action to personality have long been neglected (Furr and Funder in Handbook of research methods in personality psychology, The Guilford Press, New York, 2007) and, when employed, often use human judges to code behavior. In the current study we used an alternative to human coding (biomechanical research techniques) to investigate how personality traits are manifest in gait. We used motion capture technology to record 29 participants walking on a treadmill at their natural speed. We analyzed their thorax and pelvis movements, as well as speed of gait. Participants completed personality questionnaires, including a Big Five measure and a trait aggression questionnaire. We found that gait related to several of our personality measures. The magnitude of upper body movement, lower body movement, and walking speed, were related to Big Five personality traits and aggression. Here, we present evidence that some gait measures can relate to Big Five and aggressive personalities. We know of no other examples of research where gait has been shown to correlate with self-reported measures of personality and suggest that more research should be conducted between largely automatic movement and personality.

  19. [Big data, medical language and biomedical terminology systems].

    Science.gov (United States)

    Schulz, Stefan; López-García, Pablo

    2015-08-01

    A variety of rich terminology systems, such as thesauri, classifications, nomenclatures and ontologies support information and knowledge processing in health care and biomedical research. Nevertheless, human language, manifested as individually written texts, persists as the primary carrier of information, in the description of disease courses or treatment episodes in electronic medical records, and in the description of biomedical research in scientific publications. In the context of the discussion about big data in biomedicine, we hypothesize that the abstraction of the individuality of natural language utterances into structured and semantically normalized information facilitates the use of statistical data analytics to distil new knowledge out of textual data from biomedical research and clinical routine. Computerized human language technologies are constantly evolving and are increasingly ready to annotate narratives with codes from biomedical terminology. However, this depends heavily on linguistic and terminological resources. The creation and maintenance of such resources is labor-intensive. Nevertheless, it is sensible to assume that big data methods can be used to support this process. Examples include the learning of hierarchical relationships, the grouping of synonymous terms into concepts and the disambiguation of homonyms. Although clear evidence is still lacking, the combination of natural language technologies, semantic resources, and big data analytics is promising.

  20. Baryon symmetric big bang cosmology

    Science.gov (United States)

    Stecker, F. W.

    1978-01-01

    Both quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.